Sep 12 17:23:12.030263 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
Sep 12 17:23:12.030280 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Sep 12 15:37:01 -00 2025
Sep 12 17:23:12.030287 kernel: KASLR enabled
Sep 12 17:23:12.030291 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Sep 12 17:23:12.030296 kernel: printk: legacy bootconsole [pl11] enabled
Sep 12 17:23:12.030299 kernel: efi: EFI v2.7 by EDK II
Sep 12 17:23:12.030304 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3f20e018 RNG=0x3fd5f998 MEMRESERVE=0x3e471598
Sep 12 17:23:12.030308 kernel: random: crng init done
Sep 12 17:23:12.030312 kernel: secureboot: Secure boot disabled
Sep 12 17:23:12.030316 kernel: ACPI: Early table checksum verification disabled
Sep 12 17:23:12.030320 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL)
Sep 12 17:23:12.030324 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:23:12.030328 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:23:12.030333 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Sep 12 17:23:12.030338 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:23:12.030342 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:23:12.030346 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:23:12.030351 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:23:12.030356 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:23:12.030360 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:23:12.030364 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Sep 12 17:23:12.030368 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:23:12.030372 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Sep 12 17:23:12.030377 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 12 17:23:12.030381 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Sep 12 17:23:12.030385 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
Sep 12 17:23:12.030389 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
Sep 12 17:23:12.030393 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Sep 12 17:23:12.030398 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Sep 12 17:23:12.030403 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Sep 12 17:23:12.030407 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Sep 12 17:23:12.030411 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Sep 12 17:23:12.030415 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Sep 12 17:23:12.030420 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Sep 12 17:23:12.030424 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Sep 12 17:23:12.030428 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Sep 12 17:23:12.030432 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
Sep 12 17:23:12.030437 kernel: NODE_DATA(0) allocated [mem 0x1bbaada00-0x1bbab4fff]
Sep 12 17:23:12.030441 kernel: Zone ranges:
Sep 12 17:23:12.030445 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Sep 12 17:23:12.030452 kernel: DMA32 empty
Sep 12 17:23:12.030456 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Sep 12 17:23:12.030460 kernel: Device empty
Sep 12 17:23:12.030465 kernel: Movable zone start for each node
Sep 12 17:23:12.030469 kernel: Early memory node ranges
Sep 12 17:23:12.030474 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Sep 12 17:23:12.030479 kernel: node 0: [mem 0x0000000000824000-0x000000003e45ffff]
Sep 12 17:23:12.030483 kernel: node 0: [mem 0x000000003e460000-0x000000003e46ffff]
Sep 12 17:23:12.030487 kernel: node 0: [mem 0x000000003e470000-0x000000003e54ffff]
Sep 12 17:23:12.030492 kernel: node 0: [mem 0x000000003e550000-0x000000003e87ffff]
Sep 12 17:23:12.030496 kernel: node 0: [mem 0x000000003e880000-0x000000003fc7ffff]
Sep 12 17:23:12.030500 kernel: node 0: [mem 0x000000003fc80000-0x000000003fcfffff]
Sep 12 17:23:12.030505 kernel: node 0: [mem 0x000000003fd00000-0x000000003fffffff]
Sep 12 17:23:12.030509 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Sep 12 17:23:12.030513 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Sep 12 17:23:12.030518 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Sep 12 17:23:12.030522 kernel: cma: Reserved 16 MiB at 0x000000003d400000 on node -1
Sep 12 17:23:12.030527 kernel: psci: probing for conduit method from ACPI.
Sep 12 17:23:12.030531 kernel: psci: PSCIv1.1 detected in firmware.
Sep 12 17:23:12.030536 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 12 17:23:12.030540 kernel: psci: MIGRATE_INFO_TYPE not supported.
Sep 12 17:23:12.030545 kernel: psci: SMC Calling Convention v1.4
Sep 12 17:23:12.030549 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Sep 12 17:23:12.030553 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Sep 12 17:23:12.030557 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 12 17:23:12.030562 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 12 17:23:12.030566 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 12 17:23:12.030571 kernel: Detected PIPT I-cache on CPU0
Sep 12 17:23:12.030576 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
Sep 12 17:23:12.030580 kernel: CPU features: detected: GIC system register CPU interface
Sep 12 17:23:12.030585 kernel: CPU features: detected: Spectre-v4
Sep 12 17:23:12.030589 kernel: CPU features: detected: Spectre-BHB
Sep 12 17:23:12.030593 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 12 17:23:12.030598 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 12 17:23:12.030602 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
Sep 12 17:23:12.030606 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 12 17:23:12.030611 kernel: alternatives: applying boot alternatives
Sep 12 17:23:12.030616 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9b01894f6bb04aff3ec9b8554b3ae56a087d51961f1a01981bc4d4f54ccefc09
Sep 12 17:23:12.030621 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 17:23:12.030626 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 12 17:23:12.030630 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 12 17:23:12.030635 kernel: Fallback order for Node 0: 0
Sep 12 17:23:12.030639 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
Sep 12 17:23:12.030643 kernel: Policy zone: Normal
Sep 12 17:23:12.030648 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 17:23:12.030652 kernel: software IO TLB: area num 2.
Sep 12 17:23:12.030656 kernel: software IO TLB: mapped [mem 0x0000000036290000-0x000000003a290000] (64MB)
Sep 12 17:23:12.030661 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 12 17:23:12.030665 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 17:23:12.030670 kernel: rcu: RCU event tracing is enabled.
Sep 12 17:23:12.030675 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 12 17:23:12.030680 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 17:23:12.030684 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 17:23:12.030688 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 17:23:12.030693 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 12 17:23:12.030697 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:23:12.030702 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:23:12.030706 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 12 17:23:12.030711 kernel: GICv3: 960 SPIs implemented
Sep 12 17:23:12.030715 kernel: GICv3: 0 Extended SPIs implemented
Sep 12 17:23:12.030719 kernel: Root IRQ handler: gic_handle_irq
Sep 12 17:23:12.030723 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Sep 12 17:23:12.030728 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
Sep 12 17:23:12.030733 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Sep 12 17:23:12.030737 kernel: ITS: No ITS available, not enabling LPIs
Sep 12 17:23:12.030742 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 12 17:23:12.030746 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
Sep 12 17:23:12.030750 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 12 17:23:12.030755 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
Sep 12 17:23:12.030759 kernel: Console: colour dummy device 80x25
Sep 12 17:23:12.030764 kernel: printk: legacy console [tty1] enabled
Sep 12 17:23:12.030777 kernel: ACPI: Core revision 20240827
Sep 12 17:23:12.030782 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
Sep 12 17:23:12.030788 kernel: pid_max: default: 32768 minimum: 301
Sep 12 17:23:12.030793 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 12 17:23:12.030797 kernel: landlock: Up and running.
Sep 12 17:23:12.030801 kernel: SELinux: Initializing.
Sep 12 17:23:12.030806 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 17:23:12.030814 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 17:23:12.030819 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0x1a0000e, misc 0x31e1
Sep 12 17:23:12.030824 kernel: Hyper-V: Host Build 10.0.26100.1261-1-0
Sep 12 17:23:12.030828 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Sep 12 17:23:12.030833 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 17:23:12.030838 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 17:23:12.030843 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 12 17:23:12.030848 kernel: Remapping and enabling EFI services.
Sep 12 17:23:12.030853 kernel: smp: Bringing up secondary CPUs ...
Sep 12 17:23:12.030857 kernel: Detected PIPT I-cache on CPU1
Sep 12 17:23:12.030862 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Sep 12 17:23:12.030868 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
Sep 12 17:23:12.030872 kernel: smp: Brought up 1 node, 2 CPUs
Sep 12 17:23:12.030877 kernel: SMP: Total of 2 processors activated.
Sep 12 17:23:12.030882 kernel: CPU: All CPU(s) started at EL1
Sep 12 17:23:12.030886 kernel: CPU features: detected: 32-bit EL0 Support
Sep 12 17:23:12.030891 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Sep 12 17:23:12.030896 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 12 17:23:12.030901 kernel: CPU features: detected: Common not Private translations
Sep 12 17:23:12.030906 kernel: CPU features: detected: CRC32 instructions
Sep 12 17:23:12.030911 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
Sep 12 17:23:12.030916 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 12 17:23:12.030920 kernel: CPU features: detected: LSE atomic instructions
Sep 12 17:23:12.030925 kernel: CPU features: detected: Privileged Access Never
Sep 12 17:23:12.030930 kernel: CPU features: detected: Speculation barrier (SB)
Sep 12 17:23:12.030934 kernel: CPU features: detected: TLB range maintenance instructions
Sep 12 17:23:12.030939 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 12 17:23:12.030944 kernel: CPU features: detected: Scalable Vector Extension
Sep 12 17:23:12.030949 kernel: alternatives: applying system-wide alternatives
Sep 12 17:23:12.030954 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Sep 12 17:23:12.030959 kernel: SVE: maximum available vector length 16 bytes per vector
Sep 12 17:23:12.030964 kernel: SVE: default vector length 16 bytes per vector
Sep 12 17:23:12.030969 kernel: Memory: 3959668K/4194160K available (11136K kernel code, 2440K rwdata, 9068K rodata, 38912K init, 1038K bss, 213304K reserved, 16384K cma-reserved)
Sep 12 17:23:12.030973 kernel: devtmpfs: initialized
Sep 12 17:23:12.030978 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 17:23:12.030983 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 12 17:23:12.030988 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 12 17:23:12.030992 kernel: 0 pages in range for non-PLT usage
Sep 12 17:23:12.030998 kernel: 508576 pages in range for PLT usage
Sep 12 17:23:12.031002 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 17:23:12.031007 kernel: SMBIOS 3.1.0 present.
Sep 12 17:23:12.031012 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024
Sep 12 17:23:12.031016 kernel: DMI: Memory slots populated: 2/2
Sep 12 17:23:12.031021 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 17:23:12.031026 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 12 17:23:12.031031 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 12 17:23:12.031035 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 12 17:23:12.031041 kernel: audit: initializing netlink subsys (disabled)
Sep 12 17:23:12.031045 kernel: audit: type=2000 audit(0.058:1): state=initialized audit_enabled=0 res=1
Sep 12 17:23:12.031050 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 17:23:12.031055 kernel: cpuidle: using governor menu
Sep 12 17:23:12.031059 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 12 17:23:12.031064 kernel: ASID allocator initialised with 32768 entries
Sep 12 17:23:12.031069 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 17:23:12.031073 kernel: Serial: AMBA PL011 UART driver
Sep 12 17:23:12.031078 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 17:23:12.031084 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 17:23:12.031089 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 12 17:23:12.031093 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 12 17:23:12.031098 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 17:23:12.031103 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 17:23:12.031107 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 12 17:23:12.031112 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 12 17:23:12.031117 kernel: ACPI: Added _OSI(Module Device)
Sep 12 17:23:12.031121 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 17:23:12.031127 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 17:23:12.031131 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 17:23:12.031136 kernel: ACPI: Interpreter enabled
Sep 12 17:23:12.031141 kernel: ACPI: Using GIC for interrupt routing
Sep 12 17:23:12.031145 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Sep 12 17:23:12.031150 kernel: printk: legacy console [ttyAMA0] enabled
Sep 12 17:23:12.031155 kernel: printk: legacy bootconsole [pl11] disabled
Sep 12 17:23:12.031160 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Sep 12 17:23:12.031164 kernel: ACPI: CPU0 has been hot-added
Sep 12 17:23:12.031170 kernel: ACPI: CPU1 has been hot-added
Sep 12 17:23:12.031174 kernel: iommu: Default domain type: Translated
Sep 12 17:23:12.031179 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 12 17:23:12.031184 kernel: efivars: Registered efivars operations
Sep 12 17:23:12.031188 kernel: vgaarb: loaded
Sep 12 17:23:12.031193 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 12 17:23:12.031198 kernel: VFS: Disk quotas dquot_6.6.0
Sep 12 17:23:12.031202 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 12 17:23:12.031207 kernel: pnp: PnP ACPI init
Sep 12 17:23:12.031212 kernel: pnp: PnP ACPI: found 0 devices
Sep 12 17:23:12.031217 kernel: NET: Registered PF_INET protocol family
Sep 12 17:23:12.031222 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 12 17:23:12.031227 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 12 17:23:12.031232 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 12 17:23:12.031236 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 12 17:23:12.031241 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 12 17:23:12.031246 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 12 17:23:12.031250 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 17:23:12.031256 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 17:23:12.031261 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 12 17:23:12.031265 kernel: PCI: CLS 0 bytes, default 64
Sep 12 17:23:12.031270 kernel: kvm [1]: HYP mode not available
Sep 12 17:23:12.031275 kernel: Initialise system trusted keyrings
Sep 12 17:23:12.031279 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 12 17:23:12.031284 kernel: Key type asymmetric registered
Sep 12 17:23:12.031289 kernel: Asymmetric key parser 'x509' registered
Sep 12 17:23:12.031293 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Sep 12 17:23:12.031299 kernel: io scheduler mq-deadline registered
Sep 12 17:23:12.031303 kernel: io scheduler kyber registered
Sep 12 17:23:12.031308 kernel: io scheduler bfq registered
Sep 12 17:23:12.031313 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 12 17:23:12.031317 kernel: thunder_xcv, ver 1.0
Sep 12 17:23:12.031322 kernel: thunder_bgx, ver 1.0
Sep 12 17:23:12.031327 kernel: nicpf, ver 1.0
Sep 12 17:23:12.031331 kernel: nicvf, ver 1.0
Sep 12 17:23:12.031431 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 12 17:23:12.031482 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-12T17:23:11 UTC (1757697791)
Sep 12 17:23:12.031488 kernel: efifb: probing for efifb
Sep 12 17:23:12.031493 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Sep 12 17:23:12.031498 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Sep 12 17:23:12.031503 kernel: efifb: scrolling: redraw
Sep 12 17:23:12.031507 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Sep 12 17:23:12.031512 kernel: Console: switching to colour frame buffer device 128x48
Sep 12 17:23:12.031517 kernel: fb0: EFI VGA frame buffer device
Sep 12 17:23:12.031523 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Sep 12 17:23:12.031527 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 12 17:23:12.031532 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Sep 12 17:23:12.031537 kernel: watchdog: NMI not fully supported
Sep 12 17:23:12.031542 kernel: watchdog: Hard watchdog permanently disabled
Sep 12 17:23:12.031546 kernel: NET: Registered PF_INET6 protocol family
Sep 12 17:23:12.031551 kernel: Segment Routing with IPv6
Sep 12 17:23:12.031556 kernel: In-situ OAM (IOAM) with IPv6
Sep 12 17:23:12.031560 kernel: NET: Registered PF_PACKET protocol family
Sep 12 17:23:12.031566 kernel: Key type dns_resolver registered
Sep 12 17:23:12.031570 kernel: registered taskstats version 1
Sep 12 17:23:12.031575 kernel: Loading compiled-in X.509 certificates
Sep 12 17:23:12.031580 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: 7675c1947f324bc6524fdc1ee0f8f5f343acfea7'
Sep 12 17:23:12.031585 kernel: Demotion targets for Node 0: null
Sep 12 17:23:12.031589 kernel: Key type .fscrypt registered
Sep 12 17:23:12.031594 kernel: Key type fscrypt-provisioning registered
Sep 12 17:23:12.031598 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 12 17:23:12.031603 kernel: ima: Allocated hash algorithm: sha1
Sep 12 17:23:12.031609 kernel: ima: No architecture policies found
Sep 12 17:23:12.031613 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 12 17:23:12.031618 kernel: clk: Disabling unused clocks
Sep 12 17:23:12.031623 kernel: PM: genpd: Disabling unused power domains
Sep 12 17:23:12.031628 kernel: Warning: unable to open an initial console.
Sep 12 17:23:12.031632 kernel: Freeing unused kernel memory: 38912K
Sep 12 17:23:12.031639 kernel: Run /init as init process
Sep 12 17:23:12.031644 kernel: with arguments:
Sep 12 17:23:12.031649 kernel: /init
Sep 12 17:23:12.031656 kernel: with environment:
Sep 12 17:23:12.031661 kernel: HOME=/
Sep 12 17:23:12.031666 kernel: TERM=linux
Sep 12 17:23:12.031672 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 12 17:23:12.031679 systemd[1]: Successfully made /usr/ read-only.
Sep 12 17:23:12.031687 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 12 17:23:12.031693 systemd[1]: Detected virtualization microsoft.
Sep 12 17:23:12.031700 systemd[1]: Detected architecture arm64.
Sep 12 17:23:12.031706 systemd[1]: Running in initrd.
Sep 12 17:23:12.031712 systemd[1]: No hostname configured, using default hostname.
Sep 12 17:23:12.031718 systemd[1]: Hostname set to .
Sep 12 17:23:12.031723 systemd[1]: Initializing machine ID from random generator.
Sep 12 17:23:12.031729 systemd[1]: Queued start job for default target initrd.target.
Sep 12 17:23:12.031736 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 17:23:12.031742 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 17:23:12.031748 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 12 17:23:12.031755 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 17:23:12.031762 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 12 17:23:12.031776 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 12 17:23:12.031782 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 12 17:23:12.031788 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 12 17:23:12.031793 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 17:23:12.031799 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 17:23:12.031804 systemd[1]: Reached target paths.target - Path Units.
Sep 12 17:23:12.031811 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 17:23:12.031817 systemd[1]: Reached target swap.target - Swaps.
Sep 12 17:23:12.031823 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 17:23:12.031829 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 17:23:12.031835 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 17:23:12.031841 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 12 17:23:12.031848 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 12 17:23:12.031855 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 17:23:12.031861 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 17:23:12.031867 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 17:23:12.031874 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 17:23:12.031879 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 12 17:23:12.031884 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 17:23:12.031889 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 12 17:23:12.031894 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 12 17:23:12.031901 systemd[1]: Starting systemd-fsck-usr.service...
Sep 12 17:23:12.031906 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 17:23:12.031911 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 17:23:12.031927 systemd-journald[224]: Collecting audit messages is disabled.
Sep 12 17:23:12.031942 systemd-journald[224]: Journal started
Sep 12 17:23:12.031955 systemd-journald[224]: Runtime Journal (/run/log/journal/faf617395f2240599e51563f6233fea4) is 8M, max 78.5M, 70.5M free.
Sep 12 17:23:12.039799 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:23:12.044345 systemd-modules-load[226]: Inserted module 'overlay'
Sep 12 17:23:12.065332 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 12 17:23:12.065362 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 17:23:12.068139 kernel: Bridge firewalling registered
Sep 12 17:23:12.068213 systemd-modules-load[226]: Inserted module 'br_netfilter'
Sep 12 17:23:12.072686 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 12 17:23:12.076853 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 17:23:12.085486 systemd[1]: Finished systemd-fsck-usr.service.
Sep 12 17:23:12.092182 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 17:23:12.100576 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:23:12.109106 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 17:23:12.126123 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 17:23:12.130219 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 12 17:23:12.149076 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 17:23:12.156012 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:23:12.161827 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 12 17:23:12.172262 systemd-tmpfiles[253]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 12 17:23:12.178374 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 17:23:12.192829 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 17:23:12.201666 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 17:23:12.216866 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 17:23:12.221325 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 17:23:12.238974 dracut-cmdline[256]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9b01894f6bb04aff3ec9b8554b3ae56a087d51961f1a01981bc4d4f54ccefc09
Sep 12 17:23:12.266521 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 17:23:12.285540 systemd-resolved[263]: Positive Trust Anchors:
Sep 12 17:23:12.285552 systemd-resolved[263]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 17:23:12.285572 systemd-resolved[263]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 17:23:12.287179 systemd-resolved[263]: Defaulting to hostname 'linux'.
Sep 12 17:23:12.288358 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 17:23:12.299756 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 17:23:12.362777 kernel: SCSI subsystem initialized
Sep 12 17:23:12.367782 kernel: Loading iSCSI transport class v2.0-870.
Sep 12 17:23:12.374796 kernel: iscsi: registered transport (tcp)
Sep 12 17:23:12.386582 kernel: iscsi: registered transport (qla4xxx)
Sep 12 17:23:12.386609 kernel: QLogic iSCSI HBA Driver
Sep 12 17:23:12.398507 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 17:23:12.418815 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 17:23:12.424729 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 17:23:12.470071 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 12 17:23:12.476887 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 12 17:23:12.533786 kernel: raid6: neonx8 gen() 18572 MB/s
Sep 12 17:23:12.549778 kernel: raid6: neonx4 gen() 18553 MB/s
Sep 12 17:23:12.568787 kernel: raid6: neonx2 gen() 17068 MB/s
Sep 12 17:23:12.588801 kernel: raid6: neonx1 gen() 15130 MB/s
Sep 12 17:23:12.607776 kernel: raid6: int64x8 gen() 10552 MB/s
Sep 12 17:23:12.626776 kernel: raid6: int64x4 gen() 10606 MB/s
Sep 12 17:23:12.646776 kernel: raid6: int64x2 gen() 8998 MB/s
Sep 12 17:23:12.667680 kernel: raid6: int64x1 gen() 7020 MB/s
Sep 12 17:23:12.667739 kernel: raid6: using algorithm neonx8 gen() 18572 MB/s
Sep 12 17:23:12.688544 kernel: raid6: .... xor() 14899 MB/s, rmw enabled
Sep 12 17:23:12.688551 kernel: raid6: using neon recovery algorithm
Sep 12 17:23:12.693776 kernel: xor: measuring software checksum speed
Sep 12 17:23:12.699078 kernel: 8regs : 27100 MB/sec
Sep 12 17:23:12.699086 kernel: 32regs : 28818 MB/sec
Sep 12 17:23:12.701396 kernel: arm64_neon : 37735 MB/sec
Sep 12 17:23:12.704198 kernel: xor: using function: arm64_neon (37735 MB/sec)
Sep 12 17:23:12.740785 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 12 17:23:12.746367 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 17:23:12.755521 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:23:12.787711 systemd-udevd[472]: Using default interface naming scheme 'v255'.
Sep 12 17:23:12.790555 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 17:23:12.802368 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 12 17:23:12.821711 dracut-pre-trigger[479]: rd.md=0: removing MD RAID activation
Sep 12 17:23:12.841383 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 17:23:12.850662 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 17:23:12.888247 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:23:12.898503 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 12 17:23:12.957663 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:23:12.962849 kernel: hv_vmbus: Vmbus version:5.3 Sep 12 17:23:12.957827 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:23:12.974888 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:23:12.998473 kernel: pps_core: LinuxPPS API ver. 1 registered Sep 12 17:23:12.998487 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Sep 12 17:23:12.998499 kernel: PTP clock support registered Sep 12 17:23:12.998506 kernel: hv_vmbus: registering driver hv_storvsc Sep 12 17:23:12.988961 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:23:13.017166 kernel: scsi host0: storvsc_host_t Sep 12 17:23:13.017354 kernel: hv_vmbus: registering driver hid_hyperv Sep 12 17:23:13.017364 kernel: scsi host1: storvsc_host_t Sep 12 17:23:13.017378 kernel: hv_vmbus: registering driver hyperv_keyboard Sep 12 17:23:13.021779 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5 Sep 12 17:23:13.022326 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:23:13.039864 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 Sep 12 17:23:13.039890 kernel: hv_vmbus: registering driver hv_netvsc Sep 12 17:23:13.039902 kernel: hv_utils: Registering HyperV Utility Driver Sep 12 17:23:13.039918 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input0 Sep 12 17:23:13.039925 kernel: hv_vmbus: registering driver hv_utils Sep 12 17:23:13.023793 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 12 17:23:13.069812 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input1 Sep 12 17:23:13.069826 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on Sep 12 17:23:13.069931 kernel: hv_utils: Heartbeat IC version 3.0 Sep 12 17:23:13.069943 kernel: hv_utils: Shutdown IC version 3.2 Sep 12 17:23:13.059119 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 17:23:13.406839 kernel: hv_utils: TimeSync IC version 4.0 Sep 12 17:23:13.061580 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:23:13.398281 systemd-resolved[263]: Clock change detected. Flushing caches. Sep 12 17:23:13.424823 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB) Sep 12 17:23:13.424985 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Sep 12 17:23:13.428726 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 12 17:23:13.450995 kernel: sd 0:0:0:0: [sda] Write Protect is off Sep 12 17:23:13.451216 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00 Sep 12 17:23:13.451306 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA Sep 12 17:23:13.451427 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#251 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001 Sep 12 17:23:13.456926 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#194 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001 Sep 12 17:23:13.462537 kernel: hv_netvsc 000d3af5-af47-000d-3af5-af47000d3af5 eth0: VF slot 1 added Sep 12 17:23:13.469324 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 17:23:13.469347 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Sep 12 17:23:13.478546 kernel: hv_vmbus: registering driver hv_pci Sep 12 17:23:13.478578 kernel: sr 0:0:0:2: [sr0] scsi-1 drive Sep 12 17:23:13.482443 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 12 17:23:13.482470 kernel: hv_pci b322da77-6071-4e24-b95b-205d791a039a: PCI VMBus probing: Using version 0x10004 Sep 12 17:23:13.488534 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 Sep 12 17:23:13.496469 kernel: hv_pci b322da77-6071-4e24-b95b-205d791a039a: PCI host bridge to bus 6071:00 Sep 12 17:23:13.496593 kernel: pci_bus 6071:00: root bus resource [mem 0xfc0000000-0xfc00fffff window] Sep 12 17:23:13.496667 kernel: pci_bus 6071:00: No busn resource found for root bus, will use [bus 00-ff] Sep 12 17:23:13.502431 kernel: pci 6071:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint Sep 12 17:23:13.516272 kernel: pci 6071:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref] Sep 12 17:23:13.516368 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#116 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Sep 12 17:23:13.516486 kernel: pci 6071:00:02.0: enabling Extended Tags Sep 12 17:23:13.534574 kernel: pci 6071:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 6071:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link) 
Sep 12 17:23:13.543941 kernel: pci_bus 6071:00: busn_res: [bus 00-ff] end is updated to 00 Sep 12 17:23:13.544057 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#92 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Sep 12 17:23:13.544123 kernel: pci 6071:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned Sep 12 17:23:13.604331 kernel: mlx5_core 6071:00:02.0: enabling device (0000 -> 0002) Sep 12 17:23:13.611392 kernel: mlx5_core 6071:00:02.0: PTM is not supported by PCIe Sep 12 17:23:13.611482 kernel: mlx5_core 6071:00:02.0: firmware version: 16.30.5006 Sep 12 17:23:13.776463 kernel: hv_netvsc 000d3af5-af47-000d-3af5-af47000d3af5 eth0: VF registering: eth1 Sep 12 17:23:13.776632 kernel: mlx5_core 6071:00:02.0 eth1: joined to eth0 Sep 12 17:23:13.781530 kernel: mlx5_core 6071:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic) Sep 12 17:23:13.789535 kernel: mlx5_core 6071:00:02.0 enP24689s1: renamed from eth1 Sep 12 17:23:14.012990 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM. Sep 12 17:23:14.062699 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Sep 12 17:23:14.078491 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT. Sep 12 17:23:14.112550 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A. Sep 12 17:23:14.117191 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A. Sep 12 17:23:14.128647 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 12 17:23:14.143609 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 17:23:14.149136 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Sep 12 17:23:14.157356 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:23:14.166189 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 17:23:14.183800 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#223 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001 Sep 12 17:23:14.183667 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 17:23:14.197534 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 17:23:14.210577 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 12 17:23:15.209542 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#116 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001 Sep 12 17:23:15.221290 disk-uuid[652]: The operation has completed successfully. Sep 12 17:23:15.225086 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 17:23:15.285171 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 17:23:15.285248 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 17:23:15.314493 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 17:23:15.335481 sh[818]: Success Sep 12 17:23:15.369509 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 12 17:23:15.369543 kernel: device-mapper: uevent: version 1.0.3 Sep 12 17:23:15.373910 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 12 17:23:15.382543 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Sep 12 17:23:15.684819 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 17:23:15.689443 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 12 17:23:15.709911 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Sep 12 17:23:15.729603 kernel: BTRFS: device fsid 752cb955-bdfa-486a-ad02-b54d5e61d194 devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (836) Sep 12 17:23:15.738439 kernel: BTRFS info (device dm-0): first mount of filesystem 752cb955-bdfa-486a-ad02-b54d5e61d194 Sep 12 17:23:15.738461 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Sep 12 17:23:16.089315 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 17:23:16.089380 kernel: BTRFS info (device dm-0): enabling free space tree Sep 12 17:23:16.124555 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 17:23:16.128281 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 12 17:23:16.134993 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 12 17:23:16.135629 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 12 17:23:16.158595 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 12 17:23:16.184712 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (862) Sep 12 17:23:16.195350 kernel: BTRFS info (device sda6): first mount of filesystem 5f4a7913-42f7-487c-8331-8ab180fe9df7 Sep 12 17:23:16.195378 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 12 17:23:16.245313 kernel: BTRFS info (device sda6): turning on async discard Sep 12 17:23:16.245343 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 17:23:16.246879 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 17:23:16.260463 kernel: BTRFS info (device sda6): last unmount of filesystem 5f4a7913-42f7-487c-8331-8ab180fe9df7 Sep 12 17:23:16.263545 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
Sep 12 17:23:16.267905 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 17:23:16.281380 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 17:23:16.305147 systemd-networkd[1005]: lo: Link UP Sep 12 17:23:16.305156 systemd-networkd[1005]: lo: Gained carrier Sep 12 17:23:16.306197 systemd-networkd[1005]: Enumeration completed Sep 12 17:23:16.306740 systemd-networkd[1005]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:23:16.306742 systemd-networkd[1005]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:23:16.307622 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 17:23:16.314531 systemd[1]: Reached target network.target - Network. Sep 12 17:23:16.378240 kernel: mlx5_core 6071:00:02.0 enP24689s1: Link up Sep 12 17:23:16.378419 kernel: buffer_size[0]=0 is not enough for lossless buffer Sep 12 17:23:16.412535 kernel: hv_netvsc 000d3af5-af47-000d-3af5-af47000d3af5 eth0: Data path switched to VF: enP24689s1 Sep 12 17:23:16.412595 systemd-networkd[1005]: enP24689s1: Link UP Sep 12 17:23:16.412648 systemd-networkd[1005]: eth0: Link UP Sep 12 17:23:16.412722 systemd-networkd[1005]: eth0: Gained carrier Sep 12 17:23:16.412730 systemd-networkd[1005]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Sep 12 17:23:16.428910 systemd-networkd[1005]: enP24689s1: Gained carrier Sep 12 17:23:16.438543 systemd-networkd[1005]: eth0: DHCPv4 address 10.200.20.21/24, gateway 10.200.20.1 acquired from 168.63.129.16 Sep 12 17:23:17.648872 ignition[1004]: Ignition 2.21.0 Sep 12 17:23:17.648883 ignition[1004]: Stage: fetch-offline Sep 12 17:23:17.648949 ignition[1004]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:23:17.654068 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 17:23:17.648954 ignition[1004]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 12 17:23:17.662759 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Sep 12 17:23:17.649026 ignition[1004]: parsed url from cmdline: "" Sep 12 17:23:17.649029 ignition[1004]: no config URL provided Sep 12 17:23:17.649032 ignition[1004]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 17:23:17.649036 ignition[1004]: no config at "/usr/lib/ignition/user.ign" Sep 12 17:23:17.649039 ignition[1004]: failed to fetch config: resource requires networking Sep 12 17:23:17.649156 ignition[1004]: Ignition finished successfully Sep 12 17:23:17.693236 ignition[1017]: Ignition 2.21.0 Sep 12 17:23:17.693241 ignition[1017]: Stage: fetch Sep 12 17:23:17.693407 ignition[1017]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:23:17.693415 ignition[1017]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 12 17:23:17.693487 ignition[1017]: parsed url from cmdline: "" Sep 12 17:23:17.693489 ignition[1017]: no config URL provided Sep 12 17:23:17.693492 ignition[1017]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 17:23:17.693498 ignition[1017]: no config at "/usr/lib/ignition/user.ign" Sep 12 17:23:17.693531 ignition[1017]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 Sep 12 17:23:17.805559 ignition[1017]: GET result: OK Sep 12 17:23:17.805612 ignition[1017]: config has been read from IMDS userdata 
Sep 12 17:23:17.805628 ignition[1017]: parsing config with SHA512: d2eb40ad780eb11199621747c17ef07fd9a80d4b103c8a0d16f1f875058f1d529be4cabaafa77041ebc8dc4f7f91fa6ca43a198cc9f939e07769821f28b99ee3 Sep 12 17:23:17.811411 unknown[1017]: fetched base config from "system" Sep 12 17:23:17.811419 unknown[1017]: fetched base config from "system" Sep 12 17:23:17.811715 ignition[1017]: fetch: fetch complete Sep 12 17:23:17.811423 unknown[1017]: fetched user config from "azure" Sep 12 17:23:17.811719 ignition[1017]: fetch: fetch passed Sep 12 17:23:17.815903 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 12 17:23:17.811772 ignition[1017]: Ignition finished successfully Sep 12 17:23:17.824168 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 12 17:23:17.853151 ignition[1024]: Ignition 2.21.0 Sep 12 17:23:17.853165 ignition[1024]: Stage: kargs Sep 12 17:23:17.853291 ignition[1024]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:23:17.859558 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 17:23:17.853297 ignition[1024]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 12 17:23:17.864584 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 12 17:23:17.854111 ignition[1024]: kargs: kargs passed Sep 12 17:23:17.854154 ignition[1024]: Ignition finished successfully Sep 12 17:23:17.894104 ignition[1030]: Ignition 2.21.0 Sep 12 17:23:17.894119 ignition[1030]: Stage: disks Sep 12 17:23:17.894246 ignition[1030]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:23:17.899602 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 12 17:23:17.894253 ignition[1030]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 12 17:23:17.907023 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. 
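In the fetch stage above, Ignition pulls user data from the Azure IMDS endpoint shown in the log and records a SHA-512 of the raw config before parsing it. A rough sketch of those two steps in Python; the URL is taken verbatim from the log, the `Metadata: true` header is the standard IMDS requirement, and the request is built but deliberately not sent here (the digest is shown on an empty body since the real config is not reproduced in the log):

```python
import hashlib
import urllib.request

# Endpoint exactly as logged by ignition[1017] during the fetch stage.
IMDS_URL = ("http://169.254.169.254/metadata/instance/compute/userData"
            "?api-version=2021-01-01&format=text")

def build_imds_request() -> urllib.request.Request:
    # Azure IMDS rejects requests that lack the Metadata header.
    return urllib.request.Request(IMDS_URL, headers={"Metadata": "true"})

def config_digest(raw: bytes) -> str:
    # Ignition logs 'parsing config with SHA512: <hex>' over the raw bytes.
    return hashlib.sha512(raw).hexdigest()

req = build_imds_request()
print(req.full_url)
print(config_digest(b""))  # digest of an empty stand-in config body
```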
Sep 12 17:23:17.894763 ignition[1030]: disks: disks passed Sep 12 17:23:17.914512 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 17:23:17.894802 ignition[1030]: Ignition finished successfully Sep 12 17:23:17.923015 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 17:23:17.930481 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 17:23:17.936785 systemd[1]: Reached target basic.target - Basic System. Sep 12 17:23:17.944851 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 12 17:23:18.011297 systemd-fsck[1038]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks Sep 12 17:23:18.019860 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 17:23:18.030284 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 17:23:18.321838 systemd-networkd[1005]: eth0: Gained IPv6LL Sep 12 17:23:20.097532 kernel: EXT4-fs (sda9): mounted filesystem c902100c-52b7-422c-84ac-d834d4db2717 r/w with ordered data mode. Quota mode: none. Sep 12 17:23:20.098707 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 17:23:20.101979 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 17:23:20.134443 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 17:23:20.151364 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 17:23:20.156605 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... 
Sep 12 17:23:20.186243 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1052) Sep 12 17:23:20.186258 kernel: BTRFS info (device sda6): first mount of filesystem 5f4a7913-42f7-487c-8331-8ab180fe9df7 Sep 12 17:23:20.186265 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 12 17:23:20.168264 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 17:23:20.168288 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 17:23:20.195322 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 12 17:23:20.213733 kernel: BTRFS info (device sda6): turning on async discard Sep 12 17:23:20.213748 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 17:23:20.210210 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 12 17:23:20.223312 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 17:23:20.874283 coreos-metadata[1054]: Sep 12 17:23:20.874 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Sep 12 17:23:20.882176 coreos-metadata[1054]: Sep 12 17:23:20.882 INFO Fetch successful Sep 12 17:23:20.882176 coreos-metadata[1054]: Sep 12 17:23:20.882 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 Sep 12 17:23:20.893797 coreos-metadata[1054]: Sep 12 17:23:20.893 INFO Fetch successful Sep 12 17:23:20.906625 coreos-metadata[1054]: Sep 12 17:23:20.906 INFO wrote hostname ci-4426.1.0-a-2d28ed79c9 to /sysroot/etc/hostname Sep 12 17:23:20.912787 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. 
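The flatcar-metadata-hostname step above fetches the instance name from the metadata endpoint and writes it to /sysroot/etc/hostname. A small sketch of the write half under an alternate root, assuming Python; the IMDS fetch is stubbed out with the hostname value from the log rather than performed, and a temp directory stands in for /sysroot:

```python
import pathlib
import tempfile

def write_hostname(root: str, name: str) -> pathlib.Path:
    """Write <name> to <root>/etc/hostname, as the metadata agent does under /sysroot."""
    path = pathlib.Path(root) / "etc" / "hostname"
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(name + "\n")
    return path

# Stand-in for the IMDS-provided compute/name value seen in the log.
sysroot = tempfile.mkdtemp()
written = write_hostname(sysroot, "ci-4426.1.0-a-2d28ed79c9")
print(written.read_text().strip())
```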
Sep 12 17:23:21.081791 initrd-setup-root[1082]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 17:23:21.140309 initrd-setup-root[1089]: cut: /sysroot/etc/group: No such file or directory Sep 12 17:23:21.159315 initrd-setup-root[1096]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 17:23:21.165535 initrd-setup-root[1103]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 17:23:22.203902 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 17:23:22.209101 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 12 17:23:22.224988 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 17:23:22.233050 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 12 17:23:22.241549 kernel: BTRFS info (device sda6): last unmount of filesystem 5f4a7913-42f7-487c-8331-8ab180fe9df7 Sep 12 17:23:22.253119 ignition[1176]: INFO : Ignition 2.21.0 Sep 12 17:23:22.257451 ignition[1176]: INFO : Stage: mount Sep 12 17:23:22.257451 ignition[1176]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:23:22.257451 ignition[1176]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 12 17:23:22.269851 ignition[1176]: INFO : mount: mount passed Sep 12 17:23:22.269851 ignition[1176]: INFO : Ignition finished successfully Sep 12 17:23:22.272658 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 17:23:22.282948 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 17:23:22.303695 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 12 17:23:22.311552 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Sep 12 17:23:22.343061 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1188) Sep 12 17:23:22.343085 kernel: BTRFS info (device sda6): first mount of filesystem 5f4a7913-42f7-487c-8331-8ab180fe9df7 Sep 12 17:23:22.347185 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 12 17:23:22.355523 kernel: BTRFS info (device sda6): turning on async discard Sep 12 17:23:22.355541 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 17:23:22.356892 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 17:23:22.379801 ignition[1205]: INFO : Ignition 2.21.0 Sep 12 17:23:22.379801 ignition[1205]: INFO : Stage: files Sep 12 17:23:22.386238 ignition[1205]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:23:22.386238 ignition[1205]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 12 17:23:22.386238 ignition[1205]: DEBUG : files: compiled without relabeling support, skipping Sep 12 17:23:22.406319 ignition[1205]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 12 17:23:22.406319 ignition[1205]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 12 17:23:22.459162 ignition[1205]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 12 17:23:22.464227 ignition[1205]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 12 17:23:22.464227 ignition[1205]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 12 17:23:22.459451 unknown[1205]: wrote ssh authorized keys file for user: core Sep 12 17:23:22.499595 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Sep 12 17:23:22.506742 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 
Sep 12 17:23:22.524413 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 12 17:23:22.623491 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Sep 12 17:23:22.630700 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 12 17:23:22.630700 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 12 17:23:22.630700 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 12 17:23:22.630700 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 12 17:23:22.630700 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 17:23:22.630700 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 17:23:22.630700 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 17:23:22.630700 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 17:23:22.682299 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 17:23:22.682299 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 17:23:22.682299 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" 
Sep 12 17:23:22.682299 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 12 17:23:22.682299 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 12 17:23:22.682299 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt #1 Sep 12 17:23:23.196423 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 12 17:23:23.410032 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 12 17:23:23.410032 ignition[1205]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 12 17:23:23.472433 ignition[1205]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 17:23:23.484128 ignition[1205]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 17:23:23.484128 ignition[1205]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 12 17:23:23.496792 ignition[1205]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 12 17:23:23.496792 ignition[1205]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 12 17:23:23.496792 ignition[1205]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" 
Sep 12 17:23:23.496792 ignition[1205]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 12 17:23:23.496792 ignition[1205]: INFO : files: files passed Sep 12 17:23:23.496792 ignition[1205]: INFO : Ignition finished successfully Sep 12 17:23:23.493803 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 12 17:23:23.501489 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 17:23:23.524103 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 12 17:23:23.533452 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 17:23:23.533562 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 12 17:23:23.565564 initrd-setup-root-after-ignition[1235]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:23:23.565564 initrd-setup-root-after-ignition[1235]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:23:23.581956 initrd-setup-root-after-ignition[1239]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:23:23.566049 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 17:23:23.576120 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 17:23:23.586655 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 12 17:23:23.636980 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 17:23:23.637069 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 12 17:23:23.645316 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 12 17:23:23.653486 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 12 17:23:23.660967 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. 
Sep 12 17:23:23.661454 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 12 17:23:23.691176 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 17:23:23.696949 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 17:23:23.716233 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:23:23.720759 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:23:23.729325 systemd[1]: Stopped target timers.target - Timer Units. Sep 12 17:23:23.737056 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 17:23:23.737130 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 17:23:23.748035 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 17:23:23.751974 systemd[1]: Stopped target basic.target - Basic System. Sep 12 17:23:23.759701 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 17:23:23.767417 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 17:23:23.775257 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 12 17:23:23.783285 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 12 17:23:23.791604 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 17:23:23.799372 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 17:23:23.808201 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 17:23:23.815838 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 12 17:23:23.824098 systemd[1]: Stopped target swap.target - Swaps. Sep 12 17:23:23.830715 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 12 17:23:23.830794 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. 
Sep 12 17:23:23.841088 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:23:23.845253 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 17:23:23.853375 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 12 17:23:23.853420 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 17:23:23.861649 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 12 17:23:23.861718 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 12 17:23:23.874163 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 12 17:23:23.874234 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 17:23:23.879191 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 17:23:23.879252 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 12 17:23:23.886549 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 12 17:23:23.945546 ignition[1259]: INFO : Ignition 2.21.0 Sep 12 17:23:23.945546 ignition[1259]: INFO : Stage: umount Sep 12 17:23:23.945546 ignition[1259]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:23:23.945546 ignition[1259]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 12 17:23:23.945546 ignition[1259]: INFO : umount: umount passed Sep 12 17:23:23.945546 ignition[1259]: INFO : Ignition finished successfully Sep 12 17:23:23.886612 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 12 17:23:23.896746 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 17:23:23.919972 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 17:23:23.927187 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
Sep 12 17:23:23.927310 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 17:23:23.938252 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 12 17:23:23.938342 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 17:23:23.954606 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 17:23:23.954675 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 17:23:23.959964 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 12 17:23:23.960030 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 17:23:23.967427 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 12 17:23:23.967468 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 17:23:23.975345 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 12 17:23:23.975372 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 12 17:23:23.981798 systemd[1]: Stopped target network.target - Network. Sep 12 17:23:23.989289 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 17:23:23.989333 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 17:23:23.998329 systemd[1]: Stopped target paths.target - Path Units. Sep 12 17:23:24.005090 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 12 17:23:24.008544 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 17:23:24.017060 systemd[1]: Stopped target slices.target - Slice Units. Sep 12 17:23:24.021160 systemd[1]: Stopped target sockets.target - Socket Units. Sep 12 17:23:24.028625 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 17:23:24.028662 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 17:23:24.036085 systemd[1]: iscsiuio.socket: Deactivated successfully. 
Sep 12 17:23:24.036110 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 17:23:24.043680 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 17:23:24.043713 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 12 17:23:24.050858 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 17:23:24.050882 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 17:23:24.058451 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 12 17:23:24.065265 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 17:23:24.076127 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 12 17:23:24.082165 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 12 17:23:24.082258 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 17:23:24.091352 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 12 17:23:24.091559 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 12 17:23:24.091637 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 17:23:24.101711 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 12 17:23:24.270826 kernel: hv_netvsc 000d3af5-af47-000d-3af5-af47000d3af5 eth0: Data path switched from VF: enP24689s1 Sep 12 17:23:24.101893 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 12 17:23:24.101972 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 12 17:23:24.109618 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 17:23:24.109681 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 17:23:24.119184 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 12 17:23:24.124590 systemd[1]: systemd-networkd.socket: Deactivated successfully. 
Sep 12 17:23:24.124638 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 12 17:23:24.132853 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 17:23:24.132896 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 17:23:24.142977 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 17:23:24.154468 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 17:23:24.154527 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 17:23:24.162540 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 17:23:24.162581 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 12 17:23:24.170019 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 17:23:24.170050 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 12 17:23:24.174107 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 12 17:23:24.174135 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 17:23:24.185428 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 17:23:24.192435 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 12 17:23:24.192480 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 12 17:23:24.217227 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 12 17:23:24.217345 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:23:24.225317 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 12 17:23:24.225349 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 12 17:23:24.232660 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. 
Sep 12 17:23:24.232685 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 17:23:24.240541 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 12 17:23:24.240580 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 17:23:24.258555 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 12 17:23:24.258598 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 17:23:24.270760 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 17:23:24.270818 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 17:23:24.283643 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 12 17:23:24.296482 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 12 17:23:24.296800 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 17:23:24.310922 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 12 17:23:24.310965 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 17:23:24.323442 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 12 17:23:24.323490 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 17:23:24.490184 systemd-journald[224]: Received SIGTERM from PID 1 (systemd). Sep 12 17:23:24.331754 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 17:23:24.331786 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 17:23:24.336679 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:23:24.336711 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 12 17:23:24.349178 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 12 17:23:24.349224 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Sep 12 17:23:24.349244 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 12 17:23:24.349265 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 17:23:24.349507 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 17:23:24.349602 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 17:23:24.357380 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 17:23:24.357449 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 12 17:23:24.365552 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 17:23:24.373325 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 17:23:24.394284 systemd[1]: Switching root. 
Sep 12 17:23:24.551811 systemd-journald[224]: Journal stopped Sep 12 17:23:32.247626 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 17:23:32.247649 kernel: SELinux: policy capability open_perms=1 Sep 12 17:23:32.247656 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 17:23:32.247662 kernel: SELinux: policy capability always_check_network=0 Sep 12 17:23:32.247668 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 17:23:32.247674 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 17:23:32.247682 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 17:23:32.247687 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 17:23:32.247693 kernel: SELinux: policy capability userspace_initial_context=0 Sep 12 17:23:32.247698 kernel: audit: type=1403 audit(1757697805.933:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 17:23:32.247705 systemd[1]: Successfully loaded SELinux policy in 252.746ms. Sep 12 17:23:32.247713 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.175ms. Sep 12 17:23:32.247720 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 17:23:32.247726 systemd[1]: Detected virtualization microsoft. Sep 12 17:23:32.247732 systemd[1]: Detected architecture arm64. Sep 12 17:23:32.247739 systemd[1]: Detected first boot. Sep 12 17:23:32.247745 systemd[1]: Hostname set to . Sep 12 17:23:32.247751 systemd[1]: Initializing machine ID from random generator. Sep 12 17:23:32.247757 zram_generator::config[1301]: No configuration found. 
Sep 12 17:23:32.247763 kernel: NET: Registered PF_VSOCK protocol family Sep 12 17:23:32.247769 systemd[1]: Populated /etc with preset unit settings. Sep 12 17:23:32.247775 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 12 17:23:32.247782 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 17:23:32.247788 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 12 17:23:32.247793 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 17:23:32.247799 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 17:23:32.247806 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 17:23:32.247812 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 17:23:32.247818 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 12 17:23:32.247825 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 12 17:23:32.247831 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 17:23:32.247838 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 12 17:23:32.247844 systemd[1]: Created slice user.slice - User and Session Slice. Sep 12 17:23:32.247850 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 17:23:32.247856 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 17:23:32.247862 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 12 17:23:32.247868 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 12 17:23:32.247874 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. 
Sep 12 17:23:32.247880 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 17:23:32.247887 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Sep 12 17:23:32.247894 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 17:23:32.247900 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:23:32.247906 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 12 17:23:32.247913 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 12 17:23:32.247919 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 12 17:23:32.247926 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 12 17:23:32.247932 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:23:32.247938 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 17:23:32.247944 systemd[1]: Reached target slices.target - Slice Units. Sep 12 17:23:32.247951 systemd[1]: Reached target swap.target - Swaps. Sep 12 17:23:32.247957 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 12 17:23:32.247963 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 12 17:23:32.247970 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 12 17:23:32.247977 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 17:23:32.247983 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 17:23:32.247989 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 17:23:32.247995 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 12 17:23:32.248001 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... 
Sep 12 17:23:32.248009 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 12 17:23:32.248015 systemd[1]: Mounting media.mount - External Media Directory... Sep 12 17:23:32.248021 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 12 17:23:32.248027 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 12 17:23:32.248033 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 12 17:23:32.248040 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 12 17:23:32.248046 systemd[1]: Reached target machines.target - Containers. Sep 12 17:23:32.248052 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 12 17:23:32.248059 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:23:32.248066 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 17:23:32.248072 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 12 17:23:32.248078 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:23:32.248084 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 17:23:32.248091 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:23:32.248097 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 12 17:23:32.248103 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:23:32.248109 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 12 17:23:32.248116 systemd[1]: systemd-fsck-root.service: Deactivated successfully. 
Sep 12 17:23:32.248122 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 12 17:23:32.248128 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 12 17:23:32.248134 systemd[1]: Stopped systemd-fsck-usr.service. Sep 12 17:23:32.248141 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:23:32.248147 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 17:23:32.248153 kernel: fuse: init (API version 7.41) Sep 12 17:23:32.248159 kernel: loop: module loaded Sep 12 17:23:32.248166 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 17:23:32.248172 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 17:23:32.248179 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 12 17:23:32.248197 kernel: ACPI: bus type drm_connector registered Sep 12 17:23:32.248203 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 12 17:23:32.248209 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 17:23:32.248234 systemd-journald[1391]: Collecting audit messages is disabled. Sep 12 17:23:32.248250 systemd[1]: verity-setup.service: Deactivated successfully. Sep 12 17:23:32.248257 systemd-journald[1391]: Journal started Sep 12 17:23:32.248272 systemd-journald[1391]: Runtime Journal (/run/log/journal/a2bb947978b545db889ced777059a19b) is 8M, max 78.5M, 70.5M free. Sep 12 17:23:31.525961 systemd[1]: Queued start job for default target multi-user.target. Sep 12 17:23:31.537855 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. 
Sep 12 17:23:31.538191 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 12 17:23:31.538435 systemd[1]: systemd-journald.service: Consumed 2.188s CPU time. Sep 12 17:23:32.252541 systemd[1]: Stopped verity-setup.service. Sep 12 17:23:32.264140 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 17:23:32.265964 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 12 17:23:32.270385 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 12 17:23:32.274575 systemd[1]: Mounted media.mount - External Media Directory. Sep 12 17:23:32.278171 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 12 17:23:32.282440 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 12 17:23:32.286712 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 12 17:23:32.290357 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 17:23:32.296987 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 17:23:32.302011 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 12 17:23:32.302133 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 12 17:23:32.306665 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:23:32.306777 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:23:32.311286 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 17:23:32.311396 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 17:23:32.315681 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:23:32.315794 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:23:32.320444 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 12 17:23:32.320555 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. 
Sep 12 17:23:32.324678 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:23:32.324793 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:23:32.329186 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 17:23:32.333663 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 17:23:32.338611 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 12 17:23:32.343593 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 12 17:23:32.348347 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 17:23:32.361057 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 17:23:32.366023 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 12 17:23:32.375881 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 12 17:23:32.381778 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 12 17:23:32.381861 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 17:23:32.386578 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 12 17:23:32.394605 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 12 17:23:32.398341 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:23:32.403995 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 12 17:23:32.408771 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... 
Sep 12 17:23:32.413022 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:23:32.414639 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 12 17:23:32.419236 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:23:32.420015 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 17:23:32.438581 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 12 17:23:32.445291 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 17:23:32.450642 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 12 17:23:32.456720 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 12 17:23:32.464063 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 12 17:23:32.469823 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 12 17:23:32.476538 systemd-journald[1391]: Time spent on flushing to /var/log/journal/a2bb947978b545db889ced777059a19b is 8.329ms for 944 entries. Sep 12 17:23:32.476538 systemd-journald[1391]: System Journal (/var/log/journal/a2bb947978b545db889ced777059a19b) is 8M, max 2.6G, 2.6G free. Sep 12 17:23:32.509664 systemd-journald[1391]: Received client request to flush runtime journal. Sep 12 17:23:32.509700 kernel: loop0: detected capacity change from 0 to 29264 Sep 12 17:23:32.476642 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 12 17:23:32.510826 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 12 17:23:32.543398 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Sep 12 17:23:32.548692 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 12 17:23:32.551562 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 12 17:23:32.645128 systemd-tmpfiles[1442]: ACLs are not supported, ignoring. Sep 12 17:23:32.645139 systemd-tmpfiles[1442]: ACLs are not supported, ignoring. Sep 12 17:23:32.647836 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 17:23:32.653546 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 12 17:23:32.931541 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 12 17:23:33.061573 kernel: loop1: detected capacity change from 0 to 119320 Sep 12 17:23:33.277288 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 12 17:23:33.282235 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 17:23:33.295679 systemd-tmpfiles[1459]: ACLs are not supported, ignoring. Sep 12 17:23:33.295890 systemd-tmpfiles[1459]: ACLs are not supported, ignoring. Sep 12 17:23:33.298049 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 17:23:33.538540 kernel: loop2: detected capacity change from 0 to 100608 Sep 12 17:23:34.002591 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 12 17:23:34.010174 kernel: loop3: detected capacity change from 0 to 203944 Sep 12 17:23:34.013651 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 17:23:34.034776 systemd-udevd[1465]: Using default interface naming scheme 'v255'. 
Sep 12 17:23:34.035590 kernel: loop4: detected capacity change from 0 to 29264 Sep 12 17:23:34.046547 kernel: loop5: detected capacity change from 0 to 119320 Sep 12 17:23:34.058535 kernel: loop6: detected capacity change from 0 to 100608 Sep 12 17:23:34.069537 kernel: loop7: detected capacity change from 0 to 203944 Sep 12 17:23:34.079566 (sd-merge)[1467]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'. Sep 12 17:23:34.079908 (sd-merge)[1467]: Merged extensions into '/usr'. Sep 12 17:23:34.082121 systemd[1]: Reload requested from client PID 1441 ('systemd-sysext') (unit systemd-sysext.service)... Sep 12 17:23:34.082132 systemd[1]: Reloading... Sep 12 17:23:34.126562 zram_generator::config[1493]: No configuration found. Sep 12 17:23:34.293228 systemd[1]: Reloading finished in 210 ms. Sep 12 17:23:34.317505 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 12 17:23:34.328391 systemd[1]: Starting ensure-sysext.service... Sep 12 17:23:34.332428 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 17:23:34.364868 systemd[1]: Reload requested from client PID 1548 ('systemctl') (unit ensure-sysext.service)... Sep 12 17:23:34.364878 systemd[1]: Reloading... Sep 12 17:23:34.370644 systemd-tmpfiles[1549]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 12 17:23:34.370895 systemd-tmpfiles[1549]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 12 17:23:34.371126 systemd-tmpfiles[1549]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 12 17:23:34.371258 systemd-tmpfiles[1549]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 12 17:23:34.371687 systemd-tmpfiles[1549]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. 
Sep 12 17:23:34.371820 systemd-tmpfiles[1549]: ACLs are not supported, ignoring. Sep 12 17:23:34.371847 systemd-tmpfiles[1549]: ACLs are not supported, ignoring. Sep 12 17:23:34.427662 zram_generator::config[1595]: No configuration found. Sep 12 17:23:34.435266 systemd-tmpfiles[1549]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 17:23:34.435275 systemd-tmpfiles[1549]: Skipping /boot Sep 12 17:23:34.439635 systemd-tmpfiles[1549]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 17:23:34.439717 systemd-tmpfiles[1549]: Skipping /boot Sep 12 17:23:34.533371 systemd[1]: Reloading finished in 168 ms. Sep 12 17:23:34.549295 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 17:23:34.567749 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 17:23:34.609511 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 12 17:23:34.622880 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 12 17:23:34.635920 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 17:23:34.640600 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 12 17:23:34.646289 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:23:34.652671 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:23:34.663035 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:23:34.668672 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:23:34.672867 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Sep 12 17:23:34.672948 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:23:34.673658 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:23:34.673782 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:23:34.678881 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:23:34.678990 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:23:34.683716 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:23:34.683819 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:23:34.689160 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:23:34.706563 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 12 17:23:34.720239 systemd[1]: Expecting device dev-ptp_hyperv.device - /dev/ptp_hyperv... Sep 12 17:23:34.725882 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:23:34.728225 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:23:34.736397 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 17:23:34.743841 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:23:34.779864 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:23:34.786909 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Sep 12 17:23:34.787214 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:23:34.789194 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 17:23:34.796092 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 17:23:34.805177 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 12 17:23:34.812183 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:23:34.813107 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:23:34.819572 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 17:23:34.819692 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 17:23:34.824328 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:23:34.824438 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:23:34.830085 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:23:34.830230 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:23:34.836883 systemd[1]: Finished ensure-sysext.service. Sep 12 17:23:34.847222 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:23:34.847271 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:23:34.849611 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Sep 12 17:23:34.866971 augenrules[1718]: No rules Sep 12 17:23:34.868584 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 17:23:34.868763 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
Sep 12 17:23:34.888556 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#226 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Sep 12 17:23:34.904282 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 12 17:23:34.958531 kernel: hv_vmbus: registering driver hv_balloon Sep 12 17:23:34.958588 kernel: mousedev: PS/2 mouse device common for all mice Sep 12 17:23:34.960055 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 12 17:23:34.970592 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0 Sep 12 17:23:34.974624 kernel: hv_balloon: Memory hot add disabled on ARM64 Sep 12 17:23:34.987002 systemd[1]: Condition check resulted in dev-ptp_hyperv.device - /dev/ptp_hyperv being skipped. Sep 12 17:23:35.010732 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:23:35.030567 kernel: hv_vmbus: registering driver hyperv_fb Sep 12 17:23:35.034536 kernel: hyperv_fb: Synthvid Version major 3, minor 5 Sep 12 17:23:35.041575 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608 Sep 12 17:23:35.044596 kernel: Console: switching to colour dummy device 80x25 Sep 12 17:23:35.051566 kernel: Console: switching to colour frame buffer device 128x48 Sep 12 17:23:35.088622 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:23:35.088800 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:23:35.094561 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 17:23:35.096058 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:23:35.121804 systemd-resolved[1643]: Positive Trust Anchors: Sep 12 17:23:35.122046 systemd-resolved[1643]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 17:23:35.122109 systemd-resolved[1643]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 17:23:35.188284 systemd-resolved[1643]: Using system hostname 'ci-4426.1.0-a-2d28ed79c9'. Sep 12 17:23:35.190577 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 17:23:35.197066 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:23:35.203857 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Sep 12 17:23:35.212131 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 17:23:35.251272 systemd-networkd[1695]: lo: Link UP Sep 12 17:23:35.251480 systemd-networkd[1695]: lo: Gained carrier Sep 12 17:23:35.252534 systemd-networkd[1695]: Enumeration completed Sep 12 17:23:35.252915 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 17:23:35.253317 systemd-networkd[1695]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:23:35.253384 systemd-networkd[1695]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:23:35.257429 systemd[1]: Reached target network.target - Network. Sep 12 17:23:35.263618 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... 
Sep 12 17:23:35.268729 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 17:23:35.302905 kernel: mlx5_core 6071:00:02.0 enP24689s1: Link up Sep 12 17:23:35.303128 kernel: buffer_size[0]=0 is not enough for lossless buffer Sep 12 17:23:35.345608 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 17:23:35.355617 kernel: hv_netvsc 000d3af5-af47-000d-3af5-af47000d3af5 eth0: Data path switched to VF: enP24689s1 Sep 12 17:23:35.356344 systemd-networkd[1695]: enP24689s1: Link UP Sep 12 17:23:35.356445 systemd-networkd[1695]: eth0: Link UP Sep 12 17:23:35.356447 systemd-networkd[1695]: eth0: Gained carrier Sep 12 17:23:35.356461 systemd-networkd[1695]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:23:35.360686 systemd-networkd[1695]: enP24689s1: Gained carrier Sep 12 17:23:35.367544 systemd-networkd[1695]: eth0: DHCPv4 address 10.200.20.21/24, gateway 10.200.20.1 acquired from 168.63.129.16 Sep 12 17:23:35.405539 kernel: MACsec IEEE 802.1AE Sep 12 17:23:35.408767 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 12 17:23:36.386475 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:23:36.817681 systemd-networkd[1695]: eth0: Gained IPv6LL Sep 12 17:23:36.819752 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 17:23:36.825727 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 17:23:37.207917 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. 
Sep 12 17:23:37.213003 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 17:23:40.682537 ldconfig[1435]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 12 17:23:40.691595 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 12 17:23:40.697359 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 12 17:23:40.743927 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 17:23:40.748213 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 17:23:40.752318 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 17:23:40.756918 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 17:23:40.761883 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 17:23:40.765913 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 17:23:40.770634 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 17:23:40.775196 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 17:23:40.775217 systemd[1]: Reached target paths.target - Path Units. Sep 12 17:23:40.778590 systemd[1]: Reached target timers.target - Timer Units. Sep 12 17:23:40.797581 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 17:23:40.802983 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 12 17:23:40.807812 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). 
Sep 12 17:23:40.812625 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 12 17:23:40.817441 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 12 17:23:40.825975 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 17:23:40.841880 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 12 17:23:40.846826 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 17:23:40.850974 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 17:23:40.854401 systemd[1]: Reached target basic.target - Basic System. Sep 12 17:23:40.857897 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:23:40.857916 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:23:40.893912 systemd[1]: Starting chronyd.service - NTP client/server... Sep 12 17:23:40.908604 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 17:23:40.913250 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 12 17:23:40.918626 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 12 17:23:40.931034 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 12 17:23:40.938692 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 17:23:40.944216 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 17:23:40.948163 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 17:23:40.949029 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon. 
Sep 12 17:23:40.954214 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss). Sep 12 17:23:40.954970 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:23:40.960872 jq[1843]: false Sep 12 17:23:40.961610 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 17:23:40.966808 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 17:23:40.973511 KVP[1845]: KVP starting; pid is:1845 Sep 12 17:23:40.974006 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 17:23:40.976413 chronyd[1835]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Sep 12 17:23:40.979458 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 17:23:40.985656 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 12 17:23:40.995869 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 17:23:40.999773 kernel: hv_utils: KVP IC version 4.0 Sep 12 17:23:40.996584 KVP[1845]: KVP LIC Version: 3.1 Sep 12 17:23:41.000602 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 12 17:23:41.000994 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 12 17:23:41.001613 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 17:23:41.007307 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 17:23:41.015787 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. 
Sep 12 17:23:41.018148 jq[1861]: true Sep 12 17:23:41.020583 chronyd[1835]: Timezone right/UTC failed leap second check, ignoring Sep 12 17:23:41.020885 chronyd[1835]: Loaded seccomp filter (level 2) Sep 12 17:23:41.022281 systemd[1]: Started chronyd.service - NTP client/server. Sep 12 17:23:41.026616 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 17:23:41.029303 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 17:23:41.031708 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 17:23:41.031835 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 17:23:41.038099 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 17:23:41.038621 extend-filesystems[1844]: Found /dev/sda6 Sep 12 17:23:41.042722 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 17:23:41.064693 extend-filesystems[1844]: Found /dev/sda9 Sep 12 17:23:41.070509 (ntainerd)[1876]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 17:23:41.071592 extend-filesystems[1844]: Checking size of /dev/sda9 Sep 12 17:23:41.073065 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 17:23:41.083480 jq[1875]: true Sep 12 17:23:41.112823 extend-filesystems[1844]: Old size kept for /dev/sda9 Sep 12 17:23:41.117105 systemd-logind[1855]: New seat seat0. Sep 12 17:23:41.117710 systemd-logind[1855]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard) Sep 12 17:23:41.117860 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 17:23:41.125067 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 17:23:41.125243 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Sep 12 17:23:41.146487 update_engine[1860]: I20250912 17:23:41.145941 1860 main.cc:92] Flatcar Update Engine starting Sep 12 17:23:41.152058 tar[1871]: linux-arm64/helm Sep 12 17:23:41.217410 bash[1911]: Updated "/home/core/.ssh/authorized_keys" Sep 12 17:23:41.219634 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 17:23:41.228173 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 12 17:23:41.365328 dbus-daemon[1838]: [system] SELinux support is enabled Sep 12 17:23:41.365595 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 17:23:41.374628 update_engine[1860]: I20250912 17:23:41.374577 1860 update_check_scheduler.cc:74] Next update check in 5m34s Sep 12 17:23:41.375392 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 17:23:41.375417 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 12 17:23:41.376245 dbus-daemon[1838]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 12 17:23:41.382905 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 17:23:41.382923 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 17:23:41.389692 systemd[1]: Started update-engine.service - Update Engine. Sep 12 17:23:41.397239 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 17:23:41.420739 sshd_keygen[1872]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 17:23:41.449046 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. 
Sep 12 17:23:41.455784 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 17:23:41.462177 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... Sep 12 17:23:41.466830 coreos-metadata[1837]: Sep 12 17:23:41.466 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Sep 12 17:23:41.475904 coreos-metadata[1837]: Sep 12 17:23:41.475 INFO Fetch successful Sep 12 17:23:41.475904 coreos-metadata[1837]: Sep 12 17:23:41.475 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1 Sep 12 17:23:41.481874 coreos-metadata[1837]: Sep 12 17:23:41.480 INFO Fetch successful Sep 12 17:23:41.481874 coreos-metadata[1837]: Sep 12 17:23:41.481 INFO Fetching http://168.63.129.16/machine/83b83ced-d422-4003-8af6-87eeecd84e07/f2ceb0fc%2D8ede%2D4fd7%2Da90c%2De0a0801c4d2a.%5Fci%2D4426.1.0%2Da%2D2d28ed79c9?comp=config&type=sharedConfig&incarnation=1: Attempt #1 Sep 12 17:23:41.482405 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 17:23:41.483594 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 17:23:41.494755 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 17:23:41.508623 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. Sep 12 17:23:41.517202 coreos-metadata[1837]: Sep 12 17:23:41.516 INFO Fetch successful Sep 12 17:23:41.517392 coreos-metadata[1837]: Sep 12 17:23:41.517 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1 Sep 12 17:23:41.526854 coreos-metadata[1837]: Sep 12 17:23:41.526 INFO Fetch successful Sep 12 17:23:41.530335 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 17:23:41.540479 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 17:23:41.547867 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Sep 12 17:23:41.554704 systemd[1]: Reached target getty.target - Login Prompts. 
Sep 12 17:23:41.574132 tar[1871]: linux-arm64/LICENSE Sep 12 17:23:41.574244 tar[1871]: linux-arm64/README.md Sep 12 17:23:41.577052 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 12 17:23:41.582726 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 17:23:41.592145 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 17:23:41.610920 locksmithd[1981]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 17:23:41.895249 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:23:41.907711 containerd[1876]: time="2025-09-12T17:23:41Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 12 17:23:41.908190 containerd[1876]: time="2025-09-12T17:23:41.908164020Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 12 17:23:41.913235 containerd[1876]: time="2025-09-12T17:23:41.913206756Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.52µs" Sep 12 17:23:41.914262 containerd[1876]: time="2025-09-12T17:23:41.913289996Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 12 17:23:41.914262 containerd[1876]: time="2025-09-12T17:23:41.913310268Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 12 17:23:41.914262 containerd[1876]: time="2025-09-12T17:23:41.913434820Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 12 17:23:41.914262 containerd[1876]: time="2025-09-12T17:23:41.913445956Z" level=info msg="loading plugin" id=io.containerd.content.v1.content 
type=io.containerd.content.v1 Sep 12 17:23:41.914262 containerd[1876]: time="2025-09-12T17:23:41.913467532Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 17:23:41.914262 containerd[1876]: time="2025-09-12T17:23:41.913501860Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 17:23:41.914262 containerd[1876]: time="2025-09-12T17:23:41.913515036Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 17:23:41.914262 containerd[1876]: time="2025-09-12T17:23:41.913678956Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 17:23:41.914262 containerd[1876]: time="2025-09-12T17:23:41.913692772Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 17:23:41.914262 containerd[1876]: time="2025-09-12T17:23:41.913700060Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 17:23:41.914262 containerd[1876]: time="2025-09-12T17:23:41.913705236Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 12 17:23:41.914262 containerd[1876]: time="2025-09-12T17:23:41.913757236Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 12 17:23:41.914442 containerd[1876]: time="2025-09-12T17:23:41.913896684Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 17:23:41.914442 containerd[1876]: 
time="2025-09-12T17:23:41.913914292Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 17:23:41.914442 containerd[1876]: time="2025-09-12T17:23:41.913920124Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 12 17:23:41.914442 containerd[1876]: time="2025-09-12T17:23:41.913948108Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 12 17:23:41.914442 containerd[1876]: time="2025-09-12T17:23:41.914089852Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 12 17:23:41.914442 containerd[1876]: time="2025-09-12T17:23:41.914146044Z" level=info msg="metadata content store policy set" policy=shared Sep 12 17:23:41.927529 containerd[1876]: time="2025-09-12T17:23:41.927494924Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 12 17:23:41.927623 containerd[1876]: time="2025-09-12T17:23:41.927609436Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 12 17:23:41.927666 containerd[1876]: time="2025-09-12T17:23:41.927658100Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 12 17:23:41.927724 containerd[1876]: time="2025-09-12T17:23:41.927714076Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 12 17:23:41.927770 containerd[1876]: time="2025-09-12T17:23:41.927759940Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 12 17:23:41.927812 containerd[1876]: time="2025-09-12T17:23:41.927802276Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service 
type=io.containerd.service.v1 Sep 12 17:23:41.927849 containerd[1876]: time="2025-09-12T17:23:41.927839932Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 12 17:23:41.927884 containerd[1876]: time="2025-09-12T17:23:41.927875740Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 12 17:23:41.927927 containerd[1876]: time="2025-09-12T17:23:41.927917412Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 12 17:23:41.927962 containerd[1876]: time="2025-09-12T17:23:41.927953364Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 12 17:23:41.927994 containerd[1876]: time="2025-09-12T17:23:41.927984820Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 12 17:23:41.928029 containerd[1876]: time="2025-09-12T17:23:41.928019956Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 12 17:23:41.928158 containerd[1876]: time="2025-09-12T17:23:41.928143388Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 12 17:23:41.928216 containerd[1876]: time="2025-09-12T17:23:41.928205404Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 12 17:23:41.928269 containerd[1876]: time="2025-09-12T17:23:41.928258508Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 12 17:23:41.928308 containerd[1876]: time="2025-09-12T17:23:41.928297812Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 12 17:23:41.928348 containerd[1876]: time="2025-09-12T17:23:41.928338236Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 12 
17:23:41.928391 containerd[1876]: time="2025-09-12T17:23:41.928380260Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 12 17:23:41.928436 containerd[1876]: time="2025-09-12T17:23:41.928425356Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 12 17:23:41.928477 containerd[1876]: time="2025-09-12T17:23:41.928466964Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 12 17:23:41.928513 containerd[1876]: time="2025-09-12T17:23:41.928503708Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 12 17:23:41.928564 containerd[1876]: time="2025-09-12T17:23:41.928553644Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 12 17:23:41.928599 containerd[1876]: time="2025-09-12T17:23:41.928589764Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 12 17:23:41.928691 containerd[1876]: time="2025-09-12T17:23:41.928678660Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 12 17:23:41.928740 containerd[1876]: time="2025-09-12T17:23:41.928731620Z" level=info msg="Start snapshots syncer" Sep 12 17:23:41.928806 containerd[1876]: time="2025-09-12T17:23:41.928793868Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 12 17:23:41.929011 containerd[1876]: time="2025-09-12T17:23:41.928982508Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 12 17:23:41.929137 containerd[1876]: time="2025-09-12T17:23:41.929124260Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 12 17:23:41.929247 containerd[1876]: time="2025-09-12T17:23:41.929233940Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 12 17:23:41.929386 containerd[1876]: time="2025-09-12T17:23:41.929372220Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 17:23:41.929451 containerd[1876]: time="2025-09-12T17:23:41.929441172Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 17:23:41.929487 containerd[1876]: time="2025-09-12T17:23:41.929480492Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 17:23:41.929536 containerd[1876]: time="2025-09-12T17:23:41.929528340Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 17:23:41.929585 containerd[1876]: time="2025-09-12T17:23:41.929576524Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 17:23:41.929624 containerd[1876]: time="2025-09-12T17:23:41.929615820Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 17:23:41.929663 containerd[1876]: time="2025-09-12T17:23:41.929654516Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 17:23:41.929712 containerd[1876]: time="2025-09-12T17:23:41.929702716Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 17:23:41.929759 containerd[1876]: time="2025-09-12T17:23:41.929749188Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 17:23:41.929804 containerd[1876]: time="2025-09-12T17:23:41.929792828Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 17:23:41.929887 containerd[1876]: time="2025-09-12T17:23:41.929873652Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 17:23:41.929945 containerd[1876]: time="2025-09-12T17:23:41.929931516Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 17:23:41.929991 containerd[1876]: time="2025-09-12T17:23:41.929978948Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 17:23:41.930030 containerd[1876]: time="2025-09-12T17:23:41.930018172Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 17:23:41.930068 containerd[1876]: time="2025-09-12T17:23:41.930058972Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 17:23:41.930102 containerd[1876]: time="2025-09-12T17:23:41.930093692Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 17:23:41.930139 containerd[1876]: time="2025-09-12T17:23:41.930130204Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 17:23:41.930187 containerd[1876]: time="2025-09-12T17:23:41.930179660Z" level=info msg="runtime interface created" Sep 12 17:23:41.930223 containerd[1876]: time="2025-09-12T17:23:41.930215148Z" level=info msg="created NRI interface" Sep 12 17:23:41.930256 containerd[1876]: time="2025-09-12T17:23:41.930248452Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 12 17:23:41.930298 containerd[1876]: time="2025-09-12T17:23:41.930288756Z" level=info msg="Connect containerd service" Sep 12 17:23:41.930356 containerd[1876]: time="2025-09-12T17:23:41.930344844Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 17:23:41.931158 
containerd[1876]: time="2025-09-12T17:23:41.930988588Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 17:23:42.140981 (kubelet)[2028]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:23:42.473833 kubelet[2028]: E0912 17:23:42.473781 2028 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:23:42.475868 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:23:42.476074 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:23:42.477627 systemd[1]: kubelet.service: Consumed 532ms CPU time, 255.5M memory peak. Sep 12 17:23:42.499715 containerd[1876]: time="2025-09-12T17:23:42.499679100Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 17:23:42.499773 containerd[1876]: time="2025-09-12T17:23:42.499734964Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Sep 12 17:23:42.499862 containerd[1876]: time="2025-09-12T17:23:42.499828164Z" level=info msg="Start subscribing containerd event" Sep 12 17:23:42.499951 containerd[1876]: time="2025-09-12T17:23:42.499935492Z" level=info msg="Start recovering state" Sep 12 17:23:42.500074 containerd[1876]: time="2025-09-12T17:23:42.500062428Z" level=info msg="Start event monitor" Sep 12 17:23:42.500121 containerd[1876]: time="2025-09-12T17:23:42.500113332Z" level=info msg="Start cni network conf syncer for default" Sep 12 17:23:42.500161 containerd[1876]: time="2025-09-12T17:23:42.500151412Z" level=info msg="Start streaming server" Sep 12 17:23:42.500206 containerd[1876]: time="2025-09-12T17:23:42.500195060Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 17:23:42.500262 containerd[1876]: time="2025-09-12T17:23:42.500237500Z" level=info msg="runtime interface starting up..." Sep 12 17:23:42.500361 containerd[1876]: time="2025-09-12T17:23:42.500253164Z" level=info msg="starting plugins..." Sep 12 17:23:42.500361 containerd[1876]: time="2025-09-12T17:23:42.500300100Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 17:23:42.501192 containerd[1876]: time="2025-09-12T17:23:42.500760940Z" level=info msg="containerd successfully booted in 0.593321s" Sep 12 17:23:42.500842 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 17:23:42.505981 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 17:23:42.513575 systemd[1]: Startup finished in 1.582s (kernel) + 13.745s (initrd) + 16.831s (userspace) = 32.159s. Sep 12 17:23:43.060627 login[2006]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying Sep 12 17:23:43.088463 login[2007]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:23:43.099310 systemd-logind[1855]: New session 1 of user core. 
Sep 12 17:23:43.100446 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 17:23:43.102038 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 17:23:43.132877 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 17:23:43.134669 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 12 17:23:43.156789 (systemd)[2055]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 17:23:43.158382 systemd-logind[1855]: New session c1 of user core. Sep 12 17:23:43.422043 systemd[2055]: Queued start job for default target default.target. Sep 12 17:23:43.436171 systemd[2055]: Created slice app.slice - User Application Slice. Sep 12 17:23:43.436552 systemd[2055]: Reached target paths.target - Paths. Sep 12 17:23:43.436662 systemd[2055]: Reached target timers.target - Timers. Sep 12 17:23:43.437686 systemd[2055]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 17:23:43.444411 systemd[2055]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 17:23:43.444455 systemd[2055]: Reached target sockets.target - Sockets. Sep 12 17:23:43.444666 systemd[2055]: Reached target basic.target - Basic System. Sep 12 17:23:43.444732 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 17:23:43.445500 systemd[2055]: Reached target default.target - Main User Target. Sep 12 17:23:43.445539 systemd[2055]: Startup finished in 283ms. Sep 12 17:23:43.453934 systemd[1]: Started session-1.scope - Session 1 of User core. 
Sep 12 17:23:43.631002 waagent[2001]: 2025-09-12T17:23:43.630944Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 Sep 12 17:23:43.634870 waagent[2001]: 2025-09-12T17:23:43.634833Z INFO Daemon Daemon OS: flatcar 4426.1.0 Sep 12 17:23:43.637854 waagent[2001]: 2025-09-12T17:23:43.637828Z INFO Daemon Daemon Python: 3.11.13 Sep 12 17:23:43.641604 waagent[2001]: 2025-09-12T17:23:43.641566Z INFO Daemon Daemon Run daemon Sep 12 17:23:43.644333 waagent[2001]: 2025-09-12T17:23:43.644303Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4426.1.0' Sep 12 17:23:43.650325 waagent[2001]: 2025-09-12T17:23:43.650287Z INFO Daemon Daemon Using waagent for provisioning Sep 12 17:23:43.653728 waagent[2001]: 2025-09-12T17:23:43.653699Z INFO Daemon Daemon Activate resource disk Sep 12 17:23:43.656705 waagent[2001]: 2025-09-12T17:23:43.656682Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Sep 12 17:23:43.664166 waagent[2001]: 2025-09-12T17:23:43.664128Z INFO Daemon Daemon Found device: None Sep 12 17:23:43.666993 waagent[2001]: 2025-09-12T17:23:43.666967Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Sep 12 17:23:43.672349 waagent[2001]: 2025-09-12T17:23:43.672303Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Sep 12 17:23:43.679905 waagent[2001]: 2025-09-12T17:23:43.679872Z INFO Daemon Daemon Clean protocol and wireserver endpoint Sep 12 17:23:43.683612 waagent[2001]: 2025-09-12T17:23:43.683586Z INFO Daemon Daemon Running default provisioning handler Sep 12 17:23:43.691219 waagent[2001]: 2025-09-12T17:23:43.691173Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. 
Sep 12 17:23:43.700137 waagent[2001]: 2025-09-12T17:23:43.700103Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Sep 12 17:23:43.706306 waagent[2001]: 2025-09-12T17:23:43.706278Z INFO Daemon Daemon cloud-init is enabled: False Sep 12 17:23:43.709647 waagent[2001]: 2025-09-12T17:23:43.709622Z INFO Daemon Daemon Copying ovf-env.xml Sep 12 17:23:43.806890 waagent[2001]: 2025-09-12T17:23:43.806844Z INFO Daemon Daemon Successfully mounted dvd Sep 12 17:23:43.831444 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Sep 12 17:23:43.833165 waagent[2001]: 2025-09-12T17:23:43.833125Z INFO Daemon Daemon Detect protocol endpoint Sep 12 17:23:43.836600 waagent[2001]: 2025-09-12T17:23:43.836567Z INFO Daemon Daemon Clean protocol and wireserver endpoint Sep 12 17:23:43.840324 waagent[2001]: 2025-09-12T17:23:43.840298Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler Sep 12 17:23:43.844640 waagent[2001]: 2025-09-12T17:23:43.844606Z INFO Daemon Daemon Test for route to 168.63.129.16 Sep 12 17:23:43.848105 waagent[2001]: 2025-09-12T17:23:43.848078Z INFO Daemon Daemon Route to 168.63.129.16 exists Sep 12 17:23:43.851510 waagent[2001]: 2025-09-12T17:23:43.851479Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Sep 12 17:23:43.892690 waagent[2001]: 2025-09-12T17:23:43.892654Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Sep 12 17:23:43.897058 waagent[2001]: 2025-09-12T17:23:43.897039Z INFO Daemon Daemon Wire protocol version:2012-11-30 Sep 12 17:23:43.900653 waagent[2001]: 2025-09-12T17:23:43.900633Z INFO Daemon Daemon Server preferred version:2015-04-05 Sep 12 17:23:44.047409 waagent[2001]: 2025-09-12T17:23:44.047302Z INFO Daemon Daemon Initializing goal state during protocol detection Sep 12 17:23:44.051766 waagent[2001]: 2025-09-12T17:23:44.051736Z INFO Daemon Daemon Forcing an update of the goal state. 
Sep 12 17:23:44.058764 waagent[2001]: 2025-09-12T17:23:44.058730Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Sep 12 17:23:44.064425 login[2006]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:23:44.068540 systemd-logind[1855]: New session 2 of user core. Sep 12 17:23:44.074616 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 12 17:23:44.077036 waagent[2001]: 2025-09-12T17:23:44.077001Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.175 Sep 12 17:23:44.082534 waagent[2001]: 2025-09-12T17:23:44.081125Z INFO Daemon Sep 12 17:23:44.083109 waagent[2001]: 2025-09-12T17:23:44.083078Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: d8c74284-704c-4e97-b14d-e7b8027d9bbe eTag: 15604660883237070561 source: Fabric] Sep 12 17:23:44.090801 waagent[2001]: 2025-09-12T17:23:44.090772Z INFO Daemon The vmSettings originated via Fabric; will ignore them. Sep 12 17:23:44.095118 waagent[2001]: 2025-09-12T17:23:44.095092Z INFO Daemon Sep 12 17:23:44.097474 waagent[2001]: 2025-09-12T17:23:44.096971Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Sep 12 17:23:44.105392 waagent[2001]: 2025-09-12T17:23:44.105361Z INFO Daemon Daemon Downloading artifacts profile blob Sep 12 17:23:44.220885 waagent[2001]: 2025-09-12T17:23:44.220838Z INFO Daemon Downloaded certificate {'thumbprint': 'DDD14F12658954299460AE05CD2E3E277979A9F8', 'hasPrivateKey': True} Sep 12 17:23:44.227461 waagent[2001]: 2025-09-12T17:23:44.227430Z INFO Daemon Fetch goal state completed Sep 12 17:23:44.268563 waagent[2001]: 2025-09-12T17:23:44.268515Z INFO Daemon Daemon Starting provisioning Sep 12 17:23:44.271806 waagent[2001]: 2025-09-12T17:23:44.271779Z INFO Daemon Daemon Handle ovf-env.xml. 
Sep 12 17:23:44.274759 waagent[2001]: 2025-09-12T17:23:44.274738Z INFO Daemon Daemon Set hostname [ci-4426.1.0-a-2d28ed79c9] Sep 12 17:23:44.305905 waagent[2001]: 2025-09-12T17:23:44.305870Z INFO Daemon Daemon Publish hostname [ci-4426.1.0-a-2d28ed79c9] Sep 12 17:23:44.310349 waagent[2001]: 2025-09-12T17:23:44.310313Z INFO Daemon Daemon Examine /proc/net/route for primary interface Sep 12 17:23:44.317739 waagent[2001]: 2025-09-12T17:23:44.314552Z INFO Daemon Daemon Primary interface is [eth0] Sep 12 17:23:44.336742 systemd-networkd[1695]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:23:44.336947 systemd-networkd[1695]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:23:44.336989 systemd-networkd[1695]: eth0: DHCP lease lost Sep 12 17:23:44.337743 waagent[2001]: 2025-09-12T17:23:44.337707Z INFO Daemon Daemon Create user account if not exists Sep 12 17:23:44.341866 waagent[2001]: 2025-09-12T17:23:44.341832Z INFO Daemon Daemon User core already exists, skip useradd Sep 12 17:23:44.345664 waagent[2001]: 2025-09-12T17:23:44.345633Z INFO Daemon Daemon Configure sudoer Sep 12 17:23:44.355187 waagent[2001]: 2025-09-12T17:23:44.355149Z INFO Daemon Daemon Configure sshd Sep 12 17:23:44.359559 systemd-networkd[1695]: eth0: DHCPv4 address 10.200.20.21/24, gateway 10.200.20.1 acquired from 168.63.129.16 Sep 12 17:23:44.361742 waagent[2001]: 2025-09-12T17:23:44.361708Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Sep 12 17:23:44.370011 waagent[2001]: 2025-09-12T17:23:44.369978Z INFO Daemon Daemon Deploy ssh public key. 
Sep 12 17:23:45.487537 waagent[2001]: 2025-09-12T17:23:45.484829Z INFO Daemon Daemon Provisioning complete Sep 12 17:23:45.496974 waagent[2001]: 2025-09-12T17:23:45.496946Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Sep 12 17:23:45.500969 waagent[2001]: 2025-09-12T17:23:45.500939Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. Sep 12 17:23:45.507103 waagent[2001]: 2025-09-12T17:23:45.507075Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent Sep 12 17:23:45.600567 waagent[2107]: 2025-09-12T17:23:45.599812Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) Sep 12 17:23:45.600567 waagent[2107]: 2025-09-12T17:23:45.599915Z INFO ExtHandler ExtHandler OS: flatcar 4426.1.0 Sep 12 17:23:45.600567 waagent[2107]: 2025-09-12T17:23:45.599950Z INFO ExtHandler ExtHandler Python: 3.11.13 Sep 12 17:23:45.600567 waagent[2107]: 2025-09-12T17:23:45.599981Z INFO ExtHandler ExtHandler CPU Arch: aarch64 Sep 12 17:23:45.660548 waagent[2107]: 2025-09-12T17:23:45.659979Z INFO ExtHandler ExtHandler Distro: flatcar-4426.1.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; Sep 12 17:23:45.660548 waagent[2107]: 2025-09-12T17:23:45.660140Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Sep 12 17:23:45.660548 waagent[2107]: 2025-09-12T17:23:45.660183Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Sep 12 17:23:45.665578 waagent[2107]: 2025-09-12T17:23:45.665509Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Sep 12 17:23:45.670321 waagent[2107]: 2025-09-12T17:23:45.670293Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.175 Sep 12 17:23:45.670678 waagent[2107]: 2025-09-12T17:23:45.670648Z INFO ExtHandler Sep 12 17:23:45.670728 waagent[2107]: 
2025-09-12T17:23:45.670711Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 35085fe9-85c2-4bdd-9b5c-0ace524b12b3 eTag: 15604660883237070561 source: Fabric] Sep 12 17:23:45.670939 waagent[2107]: 2025-09-12T17:23:45.670916Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. Sep 12 17:23:45.671313 waagent[2107]: 2025-09-12T17:23:45.671286Z INFO ExtHandler Sep 12 17:23:45.671347 waagent[2107]: 2025-09-12T17:23:45.671332Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Sep 12 17:23:45.674654 waagent[2107]: 2025-09-12T17:23:45.674630Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Sep 12 17:23:45.720132 waagent[2107]: 2025-09-12T17:23:45.720083Z INFO ExtHandler Downloaded certificate {'thumbprint': 'DDD14F12658954299460AE05CD2E3E277979A9F8', 'hasPrivateKey': True} Sep 12 17:23:45.720437 waagent[2107]: 2025-09-12T17:23:45.720406Z INFO ExtHandler Fetch goal state completed Sep 12 17:23:45.731269 waagent[2107]: 2025-09-12T17:23:45.731228Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.1 11 Feb 2025 (Library: OpenSSL 3.4.1 11 Feb 2025) Sep 12 17:23:45.734211 waagent[2107]: 2025-09-12T17:23:45.734172Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2107 Sep 12 17:23:45.734300 waagent[2107]: 2025-09-12T17:23:45.734278Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Sep 12 17:23:45.734536 waagent[2107]: 2025-09-12T17:23:45.734492Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** Sep 12 17:23:45.735507 waagent[2107]: 2025-09-12T17:23:45.735475Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4426.1.0', '', 'Flatcar Container Linux by Kinvolk'] Sep 12 17:23:45.735829 waagent[2107]: 2025-09-12T17:23:45.735802Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', 
'4426.1.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Sep 12 17:23:45.735927 waagent[2107]: 2025-09-12T17:23:45.735907Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Sep 12 17:23:45.736316 waagent[2107]: 2025-09-12T17:23:45.736290Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules Sep 12 17:23:45.791588 waagent[2107]: 2025-09-12T17:23:45.791512Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Sep 12 17:23:45.791678 waagent[2107]: 2025-09-12T17:23:45.791651Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Sep 12 17:23:45.795681 waagent[2107]: 2025-09-12T17:23:45.795646Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Sep 12 17:23:45.799913 systemd[1]: Reload requested from client PID 2122 ('systemctl') (unit waagent.service)... Sep 12 17:23:45.800106 systemd[1]: Reloading... Sep 12 17:23:45.871576 zram_generator::config[2182]: No configuration found. Sep 12 17:23:46.001208 systemd[1]: Reloading finished in 200 ms. Sep 12 17:23:46.015370 waagent[2107]: 2025-09-12T17:23:46.013096Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Sep 12 17:23:46.015370 waagent[2107]: 2025-09-12T17:23:46.013212Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Sep 12 17:23:46.939889 waagent[2107]: 2025-09-12T17:23:46.939819Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. Sep 12 17:23:46.940175 waagent[2107]: 2025-09-12T17:23:46.940139Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. 
cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] Sep 12 17:23:46.940815 waagent[2107]: 2025-09-12T17:23:46.940755Z INFO ExtHandler ExtHandler Starting env monitor service. Sep 12 17:23:46.940955 waagent[2107]: 2025-09-12T17:23:46.940858Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Sep 12 17:23:46.941008 waagent[2107]: 2025-09-12T17:23:46.940987Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Sep 12 17:23:46.941165 waagent[2107]: 2025-09-12T17:23:46.941141Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Sep 12 17:23:46.941424 waagent[2107]: 2025-09-12T17:23:46.941388Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. Sep 12 17:23:46.941594 waagent[2107]: 2025-09-12T17:23:46.941509Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Sep 12 17:23:46.941594 waagent[2107]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Sep 12 17:23:46.941594 waagent[2107]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0 Sep 12 17:23:46.941594 waagent[2107]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Sep 12 17:23:46.941594 waagent[2107]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Sep 12 17:23:46.941594 waagent[2107]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Sep 12 17:23:46.941594 waagent[2107]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Sep 12 17:23:46.942002 waagent[2107]: 2025-09-12T17:23:46.941966Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Sep 12 17:23:46.942140 waagent[2107]: 2025-09-12T17:23:46.942023Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Sep 12 17:23:46.942140 waagent[2107]: 2025-09-12T17:23:46.942064Z INFO ExtHandler ExtHandler Start Extension Telemetry service. 
Sep 12 17:23:46.942204 waagent[2107]: 2025-09-12T17:23:46.942175Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Sep 12 17:23:46.942515 waagent[2107]: 2025-09-12T17:23:46.942462Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Sep 12 17:23:46.942638 waagent[2107]: 2025-09-12T17:23:46.942598Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. Sep 12 17:23:46.942707 waagent[2107]: 2025-09-12T17:23:46.942683Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Sep 12 17:23:46.942862 waagent[2107]: 2025-09-12T17:23:46.942838Z INFO EnvHandler ExtHandler Configure routes Sep 12 17:23:46.942903 waagent[2107]: 2025-09-12T17:23:46.942888Z INFO EnvHandler ExtHandler Gateway:None Sep 12 17:23:46.943381 waagent[2107]: 2025-09-12T17:23:46.943352Z INFO EnvHandler ExtHandler Routes:None Sep 12 17:23:46.947735 waagent[2107]: 2025-09-12T17:23:46.947699Z INFO ExtHandler ExtHandler Sep 12 17:23:46.947890 waagent[2107]: 2025-09-12T17:23:46.947822Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 9facebeb-727a-4e04-a105-ad2fab451349 correlation 96a7106d-f54e-4174-aac6-e37aa58a2733 created: 2025-09-12T17:22:30.725838Z] Sep 12 17:23:46.948593 waagent[2107]: 2025-09-12T17:23:46.948558Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Sep 12 17:23:46.949083 waagent[2107]: 2025-09-12T17:23:46.949054Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 1 ms] Sep 12 17:23:46.996692 waagent[2107]: 2025-09-12T17:23:46.996652Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command Sep 12 17:23:46.996692 waagent[2107]: Try `iptables -h' or 'iptables --help' for more information.) Sep 12 17:23:46.996950 waagent[2107]: 2025-09-12T17:23:46.996919Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: E32BC451-361A-4A28-B4CA-CE85BDAF60EC;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] Sep 12 17:23:47.063842 waagent[2107]: 2025-09-12T17:23:47.063796Z INFO MonitorHandler ExtHandler Network interfaces: Sep 12 17:23:47.063842 waagent[2107]: Executing ['ip', '-a', '-o', 'link']: Sep 12 17:23:47.063842 waagent[2107]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Sep 12 17:23:47.063842 waagent[2107]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:f5:af:47 brd ff:ff:ff:ff:ff:ff Sep 12 17:23:47.063842 waagent[2107]: 3: enP24689s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:f5:af:47 brd ff:ff:ff:ff:ff:ff\ altname enP24689p0s2 Sep 12 17:23:47.063842 waagent[2107]: Executing ['ip', '-4', '-a', '-o', 'address']: Sep 12 17:23:47.063842 waagent[2107]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Sep 12 17:23:47.063842 waagent[2107]: 2: eth0 inet 10.200.20.21/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever Sep 12 17:23:47.063842 waagent[2107]: Executing ['ip', '-6', '-a', 
'-o', 'address']: Sep 12 17:23:47.063842 waagent[2107]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Sep 12 17:23:47.063842 waagent[2107]: 2: eth0 inet6 fe80::20d:3aff:fef5:af47/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Sep 12 17:23:47.120540 waagent[2107]: 2025-09-12T17:23:47.120349Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: Sep 12 17:23:47.120540 waagent[2107]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Sep 12 17:23:47.120540 waagent[2107]: pkts bytes target prot opt in out source destination Sep 12 17:23:47.120540 waagent[2107]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Sep 12 17:23:47.120540 waagent[2107]: pkts bytes target prot opt in out source destination Sep 12 17:23:47.120540 waagent[2107]: Chain OUTPUT (policy ACCEPT 3 packets, 354 bytes) Sep 12 17:23:47.120540 waagent[2107]: pkts bytes target prot opt in out source destination Sep 12 17:23:47.120540 waagent[2107]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Sep 12 17:23:47.120540 waagent[2107]: 3 534 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Sep 12 17:23:47.120540 waagent[2107]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Sep 12 17:23:47.123258 waagent[2107]: 2025-09-12T17:23:47.123221Z INFO EnvHandler ExtHandler Current Firewall rules: Sep 12 17:23:47.123258 waagent[2107]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Sep 12 17:23:47.123258 waagent[2107]: pkts bytes target prot opt in out source destination Sep 12 17:23:47.123258 waagent[2107]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Sep 12 17:23:47.123258 waagent[2107]: pkts bytes target prot opt in out source destination Sep 12 17:23:47.123258 waagent[2107]: Chain OUTPUT (policy ACCEPT 3 packets, 354 bytes) Sep 12 17:23:47.123258 waagent[2107]: pkts bytes target prot opt in out source destination Sep 12 17:23:47.123258 waagent[2107]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 
tcp dpt:53 Sep 12 17:23:47.123258 waagent[2107]: 7 950 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Sep 12 17:23:47.123258 waagent[2107]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Sep 12 17:23:47.123436 waagent[2107]: 2025-09-12T17:23:47.123413Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300 Sep 12 17:23:47.648923 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 17:23:47.649748 systemd[1]: Started sshd@0-10.200.20.21:22-10.200.16.10:51944.service - OpenSSH per-connection server daemon (10.200.16.10:51944). Sep 12 17:23:48.274721 sshd[2249]: Accepted publickey for core from 10.200.16.10 port 51944 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E Sep 12 17:23:48.275693 sshd-session[2249]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:23:48.279147 systemd-logind[1855]: New session 3 of user core. Sep 12 17:23:48.285636 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 17:23:48.667991 systemd[1]: Started sshd@1-10.200.20.21:22-10.200.16.10:51954.service - OpenSSH per-connection server daemon (10.200.16.10:51954). Sep 12 17:23:49.083206 sshd[2255]: Accepted publickey for core from 10.200.16.10 port 51954 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E Sep 12 17:23:49.084028 sshd-session[2255]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:23:49.087174 systemd-logind[1855]: New session 4 of user core. Sep 12 17:23:49.098757 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 17:23:49.382711 sshd[2258]: Connection closed by 10.200.16.10 port 51954 Sep 12 17:23:49.383097 sshd-session[2255]: pam_unix(sshd:session): session closed for user core Sep 12 17:23:49.386073 systemd[1]: sshd@1-10.200.20.21:22-10.200.16.10:51954.service: Deactivated successfully. Sep 12 17:23:49.387265 systemd[1]: session-4.scope: Deactivated successfully. 
Sep 12 17:23:49.388712 systemd-logind[1855]: Session 4 logged out. Waiting for processes to exit. Sep 12 17:23:49.389590 systemd-logind[1855]: Removed session 4. Sep 12 17:23:49.462490 systemd[1]: Started sshd@2-10.200.20.21:22-10.200.16.10:51970.service - OpenSSH per-connection server daemon (10.200.16.10:51970). Sep 12 17:23:49.913262 sshd[2264]: Accepted publickey for core from 10.200.16.10 port 51970 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E Sep 12 17:23:49.914127 sshd-session[2264]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:23:49.917228 systemd-logind[1855]: New session 5 of user core. Sep 12 17:23:49.925692 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 12 17:23:50.248904 sshd[2267]: Connection closed by 10.200.16.10 port 51970 Sep 12 17:23:50.248365 sshd-session[2264]: pam_unix(sshd:session): session closed for user core Sep 12 17:23:50.250818 systemd-logind[1855]: Session 5 logged out. Waiting for processes to exit. Sep 12 17:23:50.250999 systemd[1]: sshd@2-10.200.20.21:22-10.200.16.10:51970.service: Deactivated successfully. Sep 12 17:23:50.252308 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 17:23:50.254192 systemd-logind[1855]: Removed session 5. Sep 12 17:23:50.325447 systemd[1]: Started sshd@3-10.200.20.21:22-10.200.16.10:58006.service - OpenSSH per-connection server daemon (10.200.16.10:58006). Sep 12 17:23:50.735498 sshd[2273]: Accepted publickey for core from 10.200.16.10 port 58006 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E Sep 12 17:23:50.736352 sshd-session[2273]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:23:50.739409 systemd-logind[1855]: New session 6 of user core. Sep 12 17:23:50.750619 systemd[1]: Started session-6.scope - Session 6 of User core. 
Sep 12 17:23:51.051406 sshd[2276]: Connection closed by 10.200.16.10 port 58006 Sep 12 17:23:51.051781 sshd-session[2273]: pam_unix(sshd:session): session closed for user core Sep 12 17:23:51.054353 systemd[1]: sshd@3-10.200.20.21:22-10.200.16.10:58006.service: Deactivated successfully. Sep 12 17:23:51.055787 systemd[1]: session-6.scope: Deactivated successfully. Sep 12 17:23:51.056475 systemd-logind[1855]: Session 6 logged out. Waiting for processes to exit. Sep 12 17:23:51.057611 systemd-logind[1855]: Removed session 6. Sep 12 17:23:51.125701 systemd[1]: Started sshd@4-10.200.20.21:22-10.200.16.10:58020.service - OpenSSH per-connection server daemon (10.200.16.10:58020). Sep 12 17:23:51.536324 sshd[2282]: Accepted publickey for core from 10.200.16.10 port 58020 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E Sep 12 17:23:51.537203 sshd-session[2282]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:23:51.540508 systemd-logind[1855]: New session 7 of user core. Sep 12 17:23:51.548635 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 12 17:23:52.006961 sudo[2286]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 17:23:52.007158 sudo[2286]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:23:52.031071 sudo[2286]: pam_unix(sudo:session): session closed for user root Sep 12 17:23:52.101560 sshd[2285]: Connection closed by 10.200.16.10 port 58020 Sep 12 17:23:52.102014 sshd-session[2282]: pam_unix(sshd:session): session closed for user core Sep 12 17:23:52.104948 systemd[1]: sshd@4-10.200.20.21:22-10.200.16.10:58020.service: Deactivated successfully. Sep 12 17:23:52.106403 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 17:23:52.109448 systemd-logind[1855]: Session 7 logged out. Waiting for processes to exit. Sep 12 17:23:52.110494 systemd-logind[1855]: Removed session 7. 
Sep 12 17:23:52.196706 systemd[1]: Started sshd@5-10.200.20.21:22-10.200.16.10:58028.service - OpenSSH per-connection server daemon (10.200.16.10:58028). Sep 12 17:23:52.602356 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 17:23:52.604264 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:23:52.689393 sshd[2292]: Accepted publickey for core from 10.200.16.10 port 58028 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E Sep 12 17:23:52.690128 sshd-session[2292]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:23:52.693542 systemd-logind[1855]: New session 8 of user core. Sep 12 17:23:52.703623 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 12 17:23:52.745024 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:23:52.747345 (kubelet)[2304]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:23:52.822726 kubelet[2304]: E0912 17:23:52.822688 2304 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:23:52.825210 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:23:52.825294 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:23:52.825503 systemd[1]: kubelet.service: Consumed 103ms CPU time, 105.9M memory peak. 
Sep 12 17:23:52.963418 sudo[2312]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 12 17:23:52.963626 sudo[2312]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 17:23:53.104478 sudo[2312]: pam_unix(sudo:session): session closed for user root
Sep 12 17:23:53.108073 sudo[2311]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 12 17:23:53.108265 sudo[2311]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 17:23:53.115340 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 17:23:53.138982 augenrules[2334]: No rules
Sep 12 17:23:53.140012 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 17:23:53.140239 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 17:23:53.142356 sudo[2311]: pam_unix(sudo:session): session closed for user root
Sep 12 17:23:53.213640 sshd[2298]: Connection closed by 10.200.16.10 port 58028
Sep 12 17:23:53.214088 sshd-session[2292]: pam_unix(sshd:session): session closed for user core
Sep 12 17:23:53.217348 systemd[1]: sshd@5-10.200.20.21:22-10.200.16.10:58028.service: Deactivated successfully.
Sep 12 17:23:53.218681 systemd[1]: session-8.scope: Deactivated successfully.
Sep 12 17:23:53.219361 systemd-logind[1855]: Session 8 logged out. Waiting for processes to exit.
Sep 12 17:23:53.220479 systemd-logind[1855]: Removed session 8.
Sep 12 17:23:53.299404 systemd[1]: Started sshd@6-10.200.20.21:22-10.200.16.10:58040.service - OpenSSH per-connection server daemon (10.200.16.10:58040).
Sep 12 17:23:53.747197 sshd[2343]: Accepted publickey for core from 10.200.16.10 port 58040 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:23:53.748103 sshd-session[2343]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:23:53.751222 systemd-logind[1855]: New session 9 of user core.
Sep 12 17:23:53.758615 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 12 17:23:54.001723 sudo[2347]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 12 17:23:54.001907 sudo[2347]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 17:23:55.529309 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 12 17:23:55.540881 (dockerd)[2364]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 12 17:23:56.703548 dockerd[2364]: time="2025-09-12T17:23:56.703370140Z" level=info msg="Starting up"
Sep 12 17:23:56.705513 dockerd[2364]: time="2025-09-12T17:23:56.705179972Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 12 17:23:56.712872 dockerd[2364]: time="2025-09-12T17:23:56.712849004Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 12 17:23:56.849250 dockerd[2364]: time="2025-09-12T17:23:56.849227988Z" level=info msg="Loading containers: start."
Sep 12 17:23:56.904533 kernel: Initializing XFRM netlink socket
Sep 12 17:23:57.320231 systemd-networkd[1695]: docker0: Link UP
Sep 12 17:23:57.343600 dockerd[2364]: time="2025-09-12T17:23:57.343570284Z" level=info msg="Loading containers: done."
Sep 12 17:23:57.352021 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2373861543-merged.mount: Deactivated successfully.
Sep 12 17:23:57.362860 dockerd[2364]: time="2025-09-12T17:23:57.362622580Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 12 17:23:57.362860 dockerd[2364]: time="2025-09-12T17:23:57.362673916Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 12 17:23:57.362860 dockerd[2364]: time="2025-09-12T17:23:57.362740996Z" level=info msg="Initializing buildkit"
Sep 12 17:23:57.415797 dockerd[2364]: time="2025-09-12T17:23:57.415777612Z" level=info msg="Completed buildkit initialization"
Sep 12 17:23:57.420533 dockerd[2364]: time="2025-09-12T17:23:57.420499852Z" level=info msg="Daemon has completed initialization"
Sep 12 17:23:57.420697 dockerd[2364]: time="2025-09-12T17:23:57.420667052Z" level=info msg="API listen on /run/docker.sock"
Sep 12 17:23:57.421579 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 12 17:23:58.307095 containerd[1876]: time="2025-09-12T17:23:58.306923308Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\""
Sep 12 17:23:59.205897 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4267595624.mount: Deactivated successfully.
Sep 12 17:24:00.437443 containerd[1876]: time="2025-09-12T17:24:00.437388020Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:00.439890 containerd[1876]: time="2025-09-12T17:24:00.439865212Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.13: active requests=0, bytes read=25687325"
Sep 12 17:24:00.442573 containerd[1876]: time="2025-09-12T17:24:00.442551412Z" level=info msg="ImageCreate event name:\"sha256:0b1c07d8fd4a3526d5c44502e682df3627a3b01c1e608e5e24c3519c8fb337b6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:00.446607 containerd[1876]: time="2025-09-12T17:24:00.446564708Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:00.447060 containerd[1876]: time="2025-09-12T17:24:00.446909380Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.13\" with image id \"sha256:0b1c07d8fd4a3526d5c44502e682df3627a3b01c1e608e5e24c3519c8fb337b6\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.13\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\", size \"25683924\" in 2.139951504s"
Sep 12 17:24:00.447060 containerd[1876]: time="2025-09-12T17:24:00.446938716Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\" returns image reference \"sha256:0b1c07d8fd4a3526d5c44502e682df3627a3b01c1e608e5e24c3519c8fb337b6\""
Sep 12 17:24:00.448056 containerd[1876]: time="2025-09-12T17:24:00.448035500Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\""
Sep 12 17:24:01.680430 containerd[1876]: time="2025-09-12T17:24:01.680380580Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:01.682559 containerd[1876]: time="2025-09-12T17:24:01.682537556Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.13: active requests=0, bytes read=22459767"
Sep 12 17:24:01.685628 containerd[1876]: time="2025-09-12T17:24:01.685556140Z" level=info msg="ImageCreate event name:\"sha256:c359cb88f3d2147f2cb4c5ada4fbdeadc4b1c009d66c8f33f3856efaf04ee6ef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:01.689111 containerd[1876]: time="2025-09-12T17:24:01.689088524Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:01.689680 containerd[1876]: time="2025-09-12T17:24:01.689540500Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.13\" with image id \"sha256:c359cb88f3d2147f2cb4c5ada4fbdeadc4b1c009d66c8f33f3856efaf04ee6ef\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\", size \"24028542\" in 1.241480864s"
Sep 12 17:24:01.689680 containerd[1876]: time="2025-09-12T17:24:01.689566452Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\" returns image reference \"sha256:c359cb88f3d2147f2cb4c5ada4fbdeadc4b1c009d66c8f33f3856efaf04ee6ef\""
Sep 12 17:24:01.690002 containerd[1876]: time="2025-09-12T17:24:01.689969284Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\""
Sep 12 17:24:02.655241 containerd[1876]: time="2025-09-12T17:24:02.655193388Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:02.658002 containerd[1876]: time="2025-09-12T17:24:02.657977436Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.13: active requests=0, bytes read=17127506"
Sep 12 17:24:02.660780 containerd[1876]: time="2025-09-12T17:24:02.660746476Z" level=info msg="ImageCreate event name:\"sha256:5e3cbe2ba7db787c6aebfcf4484156dd4ebd7ede811ef72e8929593e59a5fa27\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:02.666478 containerd[1876]: time="2025-09-12T17:24:02.666447740Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:02.667536 containerd[1876]: time="2025-09-12T17:24:02.667364556Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.13\" with image id \"sha256:5e3cbe2ba7db787c6aebfcf4484156dd4ebd7ede811ef72e8929593e59a5fa27\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\", size \"18696299\" in 977.375872ms"
Sep 12 17:24:02.667536 containerd[1876]: time="2025-09-12T17:24:02.667388828Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\" returns image reference \"sha256:5e3cbe2ba7db787c6aebfcf4484156dd4ebd7ede811ef72e8929593e59a5fa27\""
Sep 12 17:24:02.668075 containerd[1876]: time="2025-09-12T17:24:02.668028340Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\""
Sep 12 17:24:02.871898 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 12 17:24:02.873420 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:24:02.971094 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:24:02.973503 (kubelet)[2650]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 17:24:03.059905 kubelet[2650]: E0912 17:24:03.059859    2650 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 17:24:03.061723 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 17:24:03.061816 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 17:24:03.062229 systemd[1]: kubelet.service: Consumed 99ms CPU time, 105.3M memory peak.
Sep 12 17:24:04.804914 chronyd[1835]: Selected source PHC0
Sep 12 17:24:04.980451 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount216529842.mount: Deactivated successfully.
Sep 12 17:24:05.255913 containerd[1876]: time="2025-09-12T17:24:05.255869748Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:05.258579 containerd[1876]: time="2025-09-12T17:24:05.258557729Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.13: active requests=0, bytes read=26954907"
Sep 12 17:24:05.261692 containerd[1876]: time="2025-09-12T17:24:05.261668814Z" level=info msg="ImageCreate event name:\"sha256:c15699f0b7002450249485b10f20211982dfd2bec4d61c86c35acebc659e794e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:05.265347 containerd[1876]: time="2025-09-12T17:24:05.265307776Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:05.265721 containerd[1876]: time="2025-09-12T17:24:05.265539356Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.13\" with image id \"sha256:c15699f0b7002450249485b10f20211982dfd2bec4d61c86c35acebc659e794e\", repo tag \"registry.k8s.io/kube-proxy:v1.31.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\", size \"26953926\" in 2.597311391s"
Sep 12 17:24:05.265721 containerd[1876]: time="2025-09-12T17:24:05.265562510Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\" returns image reference \"sha256:c15699f0b7002450249485b10f20211982dfd2bec4d61c86c35acebc659e794e\""
Sep 12 17:24:05.265928 containerd[1876]: time="2025-09-12T17:24:05.265898790Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 12 17:24:06.015923 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3570130335.mount: Deactivated successfully.
Sep 12 17:24:06.875549 containerd[1876]: time="2025-09-12T17:24:06.875313352Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:06.878473 containerd[1876]: time="2025-09-12T17:24:06.878440032Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951622"
Sep 12 17:24:06.881371 containerd[1876]: time="2025-09-12T17:24:06.881329488Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:06.885753 containerd[1876]: time="2025-09-12T17:24:06.885714192Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:06.890051 containerd[1876]: time="2025-09-12T17:24:06.890015944Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.62409172s"
Sep 12 17:24:06.890289 containerd[1876]: time="2025-09-12T17:24:06.890231736Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
Sep 12 17:24:06.891941 containerd[1876]: time="2025-09-12T17:24:06.891918168Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 12 17:24:07.526120 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount590446137.mount: Deactivated successfully.
Sep 12 17:24:07.542543 containerd[1876]: time="2025-09-12T17:24:07.542127910Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 17:24:07.544427 containerd[1876]: time="2025-09-12T17:24:07.544410998Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703"
Sep 12 17:24:07.546724 containerd[1876]: time="2025-09-12T17:24:07.546707470Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 17:24:07.550654 containerd[1876]: time="2025-09-12T17:24:07.550628230Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 17:24:07.551046 containerd[1876]: time="2025-09-12T17:24:07.551027182Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 659.081398ms"
Sep 12 17:24:07.551122 containerd[1876]: time="2025-09-12T17:24:07.551110254Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 12 17:24:07.551503 containerd[1876]: time="2025-09-12T17:24:07.551487806Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Sep 12 17:24:08.193932 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount752005242.mount: Deactivated successfully.
Sep 12 17:24:10.125544 containerd[1876]: time="2025-09-12T17:24:10.125195893Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:10.130055 containerd[1876]: time="2025-09-12T17:24:10.130028629Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66537161"
Sep 12 17:24:10.133182 containerd[1876]: time="2025-09-12T17:24:10.133147125Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:10.136853 containerd[1876]: time="2025-09-12T17:24:10.136612637Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:10.137203 containerd[1876]: time="2025-09-12T17:24:10.137183261Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 2.585586335s"
Sep 12 17:24:10.137203 containerd[1876]: time="2025-09-12T17:24:10.137204285Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\""
Sep 12 17:24:12.864844 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:24:12.865120 systemd[1]: kubelet.service: Consumed 99ms CPU time, 105.3M memory peak.
Sep 12 17:24:12.868088 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:24:12.886679 systemd[1]: Reload requested from client PID 2800 ('systemctl') (unit session-9.scope)...
Sep 12 17:24:12.886771 systemd[1]: Reloading...
Sep 12 17:24:12.982719 zram_generator::config[2864]: No configuration found.
Sep 12 17:24:13.109829 systemd[1]: Reloading finished in 222 ms.
Sep 12 17:24:13.147839 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 12 17:24:13.147897 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 12 17:24:13.148092 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:24:13.148130 systemd[1]: kubelet.service: Consumed 70ms CPU time, 95M memory peak.
Sep 12 17:24:13.149168 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:24:13.368612 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:24:13.373746 (kubelet)[2913]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 12 17:24:13.493036 kubelet[2913]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 17:24:13.493036 kubelet[2913]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 12 17:24:13.493036 kubelet[2913]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 17:24:13.493262 kubelet[2913]: I0912 17:24:13.493100    2913 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 12 17:24:13.913710 kubelet[2913]: I0912 17:24:13.913672    2913 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 12 17:24:13.913710 kubelet[2913]: I0912 17:24:13.913698    2913 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 12 17:24:13.913979 kubelet[2913]: I0912 17:24:13.913960    2913 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 12 17:24:13.931249 kubelet[2913]: E0912 17:24:13.931227    2913 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.20.21:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.21:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:24:13.931417 kubelet[2913]: I0912 17:24:13.931303    2913 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 12 17:24:13.936564 kubelet[2913]: I0912 17:24:13.936475    2913 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 12 17:24:13.939633 kubelet[2913]: I0912 17:24:13.939570    2913 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 12 17:24:13.940533 kubelet[2913]: I0912 17:24:13.940249    2913 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 12 17:24:13.940533 kubelet[2913]: I0912 17:24:13.940346    2913 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 12 17:24:13.940533 kubelet[2913]: I0912 17:24:13.940364    2913 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426.1.0-a-2d28ed79c9","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 12 17:24:13.940533 kubelet[2913]: I0912 17:24:13.940470    2913 topology_manager.go:138] "Creating topology manager with none policy"
Sep 12 17:24:13.940679 kubelet[2913]: I0912 17:24:13.940477    2913 container_manager_linux.go:300] "Creating device plugin manager"
Sep 12 17:24:13.940768 kubelet[2913]: I0912 17:24:13.940757    2913 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 17:24:13.942673 kubelet[2913]: I0912 17:24:13.942657    2913 kubelet.go:408] "Attempting to sync node with API server"
Sep 12 17:24:13.942751 kubelet[2913]: I0912 17:24:13.942741    2913 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 12 17:24:13.942804 kubelet[2913]: I0912 17:24:13.942798    2913 kubelet.go:314] "Adding apiserver pod source"
Sep 12 17:24:13.942854 kubelet[2913]: I0912 17:24:13.942846    2913 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 12 17:24:13.946477 kubelet[2913]: W0912 17:24:13.946342    2913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.20.21:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.1.0-a-2d28ed79c9&limit=500&resourceVersion=0": dial tcp 10.200.20.21:6443: connect: connection refused
Sep 12 17:24:13.946477 kubelet[2913]: E0912 17:24:13.946383    2913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.20.21:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.1.0-a-2d28ed79c9&limit=500&resourceVersion=0\": dial tcp 10.200.20.21:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:24:13.946992 kubelet[2913]: W0912 17:24:13.946663    2913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.20.21:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.20.21:6443: connect: connection refused
Sep 12 17:24:13.946992 kubelet[2913]: E0912 17:24:13.946690    2913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.20.21:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.21:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:24:13.946992 kubelet[2913]: I0912 17:24:13.946977    2913 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 12 17:24:13.947269 kubelet[2913]: I0912 17:24:13.947252    2913 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 12 17:24:13.947301 kubelet[2913]: W0912 17:24:13.947293    2913 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 12 17:24:13.948324 kubelet[2913]: I0912 17:24:13.948305    2913 server.go:1274] "Started kubelet"
Sep 12 17:24:13.954988 kubelet[2913]: I0912 17:24:13.954788    2913 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 12 17:24:13.955397 kubelet[2913]: I0912 17:24:13.955380    2913 server.go:449] "Adding debug handlers to kubelet server"
Sep 12 17:24:13.956177 kubelet[2913]: I0912 17:24:13.956136    2913 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 17:24:13.956753 kubelet[2913]: I0912 17:24:13.956347    2913 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 12 17:24:13.956753 kubelet[2913]: E0912 17:24:13.955948    2913 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.21:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.21:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4426.1.0-a-2d28ed79c9.186498e2af0de043 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4426.1.0-a-2d28ed79c9,UID:ci-4426.1.0-a-2d28ed79c9,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4426.1.0-a-2d28ed79c9,},FirstTimestamp:2025-09-12 17:24:13.948289091 +0000 UTC m=+0.572130357,LastTimestamp:2025-09-12 17:24:13.948289091 +0000 UTC m=+0.572130357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4426.1.0-a-2d28ed79c9,}"
Sep 12 17:24:13.956954 kubelet[2913]: I0912 17:24:13.956932    2913 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 12 17:24:13.957450 kubelet[2913]: I0912 17:24:13.957432    2913 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 12 17:24:13.959502 kubelet[2913]: I0912 17:24:13.958640    2913 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 12 17:24:13.959502 kubelet[2913]: I0912 17:24:13.958712    2913 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 12 17:24:13.959502 kubelet[2913]: I0912 17:24:13.958753    2913 reconciler.go:26] "Reconciler: start to sync state"
Sep 12 17:24:13.959502 kubelet[2913]: W0912 17:24:13.958943    2913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.20.21:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.21:6443: connect: connection refused
Sep 12 17:24:13.959502 kubelet[2913]: E0912 17:24:13.958967    2913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.20.21:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.21:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:24:13.959502 kubelet[2913]: E0912 17:24:13.959085    2913 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-2d28ed79c9\" not found"
Sep 12 17:24:13.959502 kubelet[2913]: E0912 17:24:13.959130    2913 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.21:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.1.0-a-2d28ed79c9?timeout=10s\": dial tcp 10.200.20.21:6443: connect: connection refused" interval="200ms"
Sep 12 17:24:13.960190 kubelet[2913]: I0912 17:24:13.960174    2913 factory.go:221] Registration of the systemd container factory successfully
Sep 12 17:24:13.960360 kubelet[2913]: I0912 17:24:13.960344    2913 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 12 17:24:13.961028 kubelet[2913]: E0912 17:24:13.961015    2913 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 12 17:24:13.961379 kubelet[2913]: I0912 17:24:13.961367    2913 factory.go:221] Registration of the containerd container factory successfully
Sep 12 17:24:13.979688 kubelet[2913]: I0912 17:24:13.979672    2913 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 12 17:24:13.979688 kubelet[2913]: I0912 17:24:13.979684    2913 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 12 17:24:13.979792 kubelet[2913]: I0912 17:24:13.979713    2913 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 17:24:13.983784 kubelet[2913]: I0912 17:24:13.983768    2913 policy_none.go:49] "None policy: Start"
Sep 12 17:24:13.984168 kubelet[2913]: I0912 17:24:13.984152    2913 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 12 17:24:13.984168 kubelet[2913]: I0912 17:24:13.984170    2913 state_mem.go:35] "Initializing new in-memory state store"
Sep 12 17:24:13.991770 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 12 17:24:14.000515 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 12 17:24:14.002983 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 12 17:24:14.013281 kubelet[2913]: I0912 17:24:14.013255 2913 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 17:24:14.013403 kubelet[2913]: I0912 17:24:14.013387 2913 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 17:24:14.013443 kubelet[2913]: I0912 17:24:14.013399 2913 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 17:24:14.013861 kubelet[2913]: I0912 17:24:14.013761 2913 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 17:24:14.015110 kubelet[2913]: I0912 17:24:14.015092 2913 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 17:24:14.015948 kubelet[2913]: I0912 17:24:14.015934 2913 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 17:24:14.016188 kubelet[2913]: I0912 17:24:14.016080 2913 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 12 17:24:14.016188 kubelet[2913]: I0912 17:24:14.016097 2913 kubelet.go:2321] "Starting kubelet main sync loop" Sep 12 17:24:14.016188 kubelet[2913]: E0912 17:24:14.016129 2913 kubelet.go:2345] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Sep 12 17:24:14.016893 kubelet[2913]: W0912 17:24:14.016832 2913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.20.21:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.21:6443: connect: connection refused Sep 12 17:24:14.016893 kubelet[2913]: E0912 17:24:14.016870 2913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.20.21:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.21:6443: connect: connection refused" 
logger="UnhandledError" Sep 12 17:24:14.017108 kubelet[2913]: E0912 17:24:14.017045 2913 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4426.1.0-a-2d28ed79c9\" not found" Sep 12 17:24:14.115743 kubelet[2913]: I0912 17:24:14.115717 2913 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:14.116057 kubelet[2913]: E0912 17:24:14.116036 2913 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.20.21:6443/api/v1/nodes\": dial tcp 10.200.20.21:6443: connect: connection refused" node="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:14.124113 systemd[1]: Created slice kubepods-burstable-podfa380bcdb1988a1bc9e53a2b42cbe617.slice - libcontainer container kubepods-burstable-podfa380bcdb1988a1bc9e53a2b42cbe617.slice. Sep 12 17:24:14.141024 systemd[1]: Created slice kubepods-burstable-pod2f1989cec625b545d52ba26e35dfc086.slice - libcontainer container kubepods-burstable-pod2f1989cec625b545d52ba26e35dfc086.slice. Sep 12 17:24:14.146942 systemd[1]: Created slice kubepods-burstable-podf2364b5959a20451d072a1601d2089a1.slice - libcontainer container kubepods-burstable-podf2364b5959a20451d072a1601d2089a1.slice. 
Sep 12 17:24:14.159610 kubelet[2913]: E0912 17:24:14.159571 2913 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.21:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.1.0-a-2d28ed79c9?timeout=10s\": dial tcp 10.200.20.21:6443: connect: connection refused" interval="400ms" Sep 12 17:24:14.264497 kubelet[2913]: I0912 17:24:14.264252 2913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fa380bcdb1988a1bc9e53a2b42cbe617-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426.1.0-a-2d28ed79c9\" (UID: \"fa380bcdb1988a1bc9e53a2b42cbe617\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:14.264497 kubelet[2913]: I0912 17:24:14.264285 2913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2f1989cec625b545d52ba26e35dfc086-k8s-certs\") pod \"kube-controller-manager-ci-4426.1.0-a-2d28ed79c9\" (UID: \"2f1989cec625b545d52ba26e35dfc086\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:14.264497 kubelet[2913]: I0912 17:24:14.264301 2913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2f1989cec625b545d52ba26e35dfc086-kubeconfig\") pod \"kube-controller-manager-ci-4426.1.0-a-2d28ed79c9\" (UID: \"2f1989cec625b545d52ba26e35dfc086\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:14.264497 kubelet[2913]: I0912 17:24:14.264310 2913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f2364b5959a20451d072a1601d2089a1-kubeconfig\") pod \"kube-scheduler-ci-4426.1.0-a-2d28ed79c9\" (UID: \"f2364b5959a20451d072a1601d2089a1\") " 
pod="kube-system/kube-scheduler-ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:14.264497 kubelet[2913]: I0912 17:24:14.264320 2913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2f1989cec625b545d52ba26e35dfc086-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426.1.0-a-2d28ed79c9\" (UID: \"2f1989cec625b545d52ba26e35dfc086\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:14.264644 kubelet[2913]: I0912 17:24:14.264332 2913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fa380bcdb1988a1bc9e53a2b42cbe617-ca-certs\") pod \"kube-apiserver-ci-4426.1.0-a-2d28ed79c9\" (UID: \"fa380bcdb1988a1bc9e53a2b42cbe617\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:14.264644 kubelet[2913]: I0912 17:24:14.264341 2913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fa380bcdb1988a1bc9e53a2b42cbe617-k8s-certs\") pod \"kube-apiserver-ci-4426.1.0-a-2d28ed79c9\" (UID: \"fa380bcdb1988a1bc9e53a2b42cbe617\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:14.264644 kubelet[2913]: I0912 17:24:14.264350 2913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2f1989cec625b545d52ba26e35dfc086-ca-certs\") pod \"kube-controller-manager-ci-4426.1.0-a-2d28ed79c9\" (UID: \"2f1989cec625b545d52ba26e35dfc086\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:14.264644 kubelet[2913]: I0912 17:24:14.264360 2913 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/2f1989cec625b545d52ba26e35dfc086-flexvolume-dir\") pod \"kube-controller-manager-ci-4426.1.0-a-2d28ed79c9\" (UID: \"2f1989cec625b545d52ba26e35dfc086\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:14.317875 kubelet[2913]: I0912 17:24:14.317846 2913 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:14.318118 kubelet[2913]: E0912 17:24:14.318093 2913 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.20.21:6443/api/v1/nodes\": dial tcp 10.200.20.21:6443: connect: connection refused" node="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:14.439943 containerd[1876]: time="2025-09-12T17:24:14.439642942Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426.1.0-a-2d28ed79c9,Uid:fa380bcdb1988a1bc9e53a2b42cbe617,Namespace:kube-system,Attempt:0,}" Sep 12 17:24:14.445685 containerd[1876]: time="2025-09-12T17:24:14.445659974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426.1.0-a-2d28ed79c9,Uid:2f1989cec625b545d52ba26e35dfc086,Namespace:kube-system,Attempt:0,}" Sep 12 17:24:14.449269 containerd[1876]: time="2025-09-12T17:24:14.449201772Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426.1.0-a-2d28ed79c9,Uid:f2364b5959a20451d072a1601d2089a1,Namespace:kube-system,Attempt:0,}" Sep 12 17:24:14.528247 containerd[1876]: time="2025-09-12T17:24:14.528083529Z" level=info msg="connecting to shim 66a1053039b2a129b6ec001f094b4826e6a86444af0ca0ff54f3680fa74ebe97" address="unix:///run/containerd/s/24a066906490cad0b92621183b744491951d9db26c87fd7127503289e8c52530" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:24:14.538445 containerd[1876]: time="2025-09-12T17:24:14.538417555Z" level=info msg="connecting to shim 46e7195c6552753b1d04979813a9f9b84e0a66baa3ee81431f0486e776a2d1e3" 
address="unix:///run/containerd/s/7f5e405d6076e5284677238b3d9247ceb82efeb187eceb8daaea531ccd2abcf0" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:24:14.547529 containerd[1876]: time="2025-09-12T17:24:14.547484053Z" level=info msg="connecting to shim b0431debbe17f4191a15c2d4937dfe23b6c36060f759f49fc7c8b439de2814fe" address="unix:///run/containerd/s/631d3dc6e57313fb54708b50af1572ee33b0bbf9c0cb59d310b9969bb8ecbe17" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:24:14.558773 systemd[1]: Started cri-containerd-46e7195c6552753b1d04979813a9f9b84e0a66baa3ee81431f0486e776a2d1e3.scope - libcontainer container 46e7195c6552753b1d04979813a9f9b84e0a66baa3ee81431f0486e776a2d1e3. Sep 12 17:24:14.561059 systemd[1]: Started cri-containerd-66a1053039b2a129b6ec001f094b4826e6a86444af0ca0ff54f3680fa74ebe97.scope - libcontainer container 66a1053039b2a129b6ec001f094b4826e6a86444af0ca0ff54f3680fa74ebe97. Sep 12 17:24:14.561314 kubelet[2913]: E0912 17:24:14.561251 2913 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.21:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.1.0-a-2d28ed79c9?timeout=10s\": dial tcp 10.200.20.21:6443: connect: connection refused" interval="800ms" Sep 12 17:24:14.578627 systemd[1]: Started cri-containerd-b0431debbe17f4191a15c2d4937dfe23b6c36060f759f49fc7c8b439de2814fe.scope - libcontainer container b0431debbe17f4191a15c2d4937dfe23b6c36060f759f49fc7c8b439de2814fe. 
Sep 12 17:24:14.615759 containerd[1876]: time="2025-09-12T17:24:14.615721715Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426.1.0-a-2d28ed79c9,Uid:2f1989cec625b545d52ba26e35dfc086,Namespace:kube-system,Attempt:0,} returns sandbox id \"46e7195c6552753b1d04979813a9f9b84e0a66baa3ee81431f0486e776a2d1e3\"" Sep 12 17:24:14.617817 containerd[1876]: time="2025-09-12T17:24:14.617717737Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426.1.0-a-2d28ed79c9,Uid:fa380bcdb1988a1bc9e53a2b42cbe617,Namespace:kube-system,Attempt:0,} returns sandbox id \"66a1053039b2a129b6ec001f094b4826e6a86444af0ca0ff54f3680fa74ebe97\"" Sep 12 17:24:14.621736 containerd[1876]: time="2025-09-12T17:24:14.621710029Z" level=info msg="CreateContainer within sandbox \"66a1053039b2a129b6ec001f094b4826e6a86444af0ca0ff54f3680fa74ebe97\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 17:24:14.622344 containerd[1876]: time="2025-09-12T17:24:14.622221871Z" level=info msg="CreateContainer within sandbox \"46e7195c6552753b1d04979813a9f9b84e0a66baa3ee81431f0486e776a2d1e3\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 17:24:14.634510 containerd[1876]: time="2025-09-12T17:24:14.634455741Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426.1.0-a-2d28ed79c9,Uid:f2364b5959a20451d072a1601d2089a1,Namespace:kube-system,Attempt:0,} returns sandbox id \"b0431debbe17f4191a15c2d4937dfe23b6c36060f759f49fc7c8b439de2814fe\"" Sep 12 17:24:14.638609 containerd[1876]: time="2025-09-12T17:24:14.638567794Z" level=info msg="CreateContainer within sandbox \"b0431debbe17f4191a15c2d4937dfe23b6c36060f759f49fc7c8b439de2814fe\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 17:24:14.640926 containerd[1876]: time="2025-09-12T17:24:14.640904852Z" level=info msg="Container 30858165e438434a4a8eaaf7fca60c6cc8169801fcc64e046bde461d2d81662c: CDI devices 
from CRI Config.CDIDevices: []" Sep 12 17:24:14.643484 kubelet[2913]: E0912 17:24:14.643403 2913 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.21:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.21:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4426.1.0-a-2d28ed79c9.186498e2af0de043 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4426.1.0-a-2d28ed79c9,UID:ci-4426.1.0-a-2d28ed79c9,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4426.1.0-a-2d28ed79c9,},FirstTimestamp:2025-09-12 17:24:13.948289091 +0000 UTC m=+0.572130357,LastTimestamp:2025-09-12 17:24:13.948289091 +0000 UTC m=+0.572130357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4426.1.0-a-2d28ed79c9,}" Sep 12 17:24:14.657287 containerd[1876]: time="2025-09-12T17:24:14.657262982Z" level=info msg="Container 11b459affb9922f75101b5bcc9cee8d03b4b5a81548b68575e46a773ab02c5ec: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:24:14.666221 containerd[1876]: time="2025-09-12T17:24:14.666196271Z" level=info msg="Container 183df6cefb6abae81dac89392877bf22a659dc9234a46f421d6f54c767f75679: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:24:14.676977 containerd[1876]: time="2025-09-12T17:24:14.676952065Z" level=info msg="CreateContainer within sandbox \"66a1053039b2a129b6ec001f094b4826e6a86444af0ca0ff54f3680fa74ebe97\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"30858165e438434a4a8eaaf7fca60c6cc8169801fcc64e046bde461d2d81662c\"" Sep 12 17:24:14.677345 containerd[1876]: time="2025-09-12T17:24:14.677323371Z" level=info msg="StartContainer for \"30858165e438434a4a8eaaf7fca60c6cc8169801fcc64e046bde461d2d81662c\"" Sep 12 17:24:14.678025 containerd[1876]: 
time="2025-09-12T17:24:14.678000789Z" level=info msg="connecting to shim 30858165e438434a4a8eaaf7fca60c6cc8169801fcc64e046bde461d2d81662c" address="unix:///run/containerd/s/24a066906490cad0b92621183b744491951d9db26c87fd7127503289e8c52530" protocol=ttrpc version=3 Sep 12 17:24:14.689160 containerd[1876]: time="2025-09-12T17:24:14.688793492Z" level=info msg="CreateContainer within sandbox \"46e7195c6552753b1d04979813a9f9b84e0a66baa3ee81431f0486e776a2d1e3\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"11b459affb9922f75101b5bcc9cee8d03b4b5a81548b68575e46a773ab02c5ec\"" Sep 12 17:24:14.689735 containerd[1876]: time="2025-09-12T17:24:14.689649907Z" level=info msg="StartContainer for \"11b459affb9922f75101b5bcc9cee8d03b4b5a81548b68575e46a773ab02c5ec\"" Sep 12 17:24:14.690883 containerd[1876]: time="2025-09-12T17:24:14.690860342Z" level=info msg="connecting to shim 11b459affb9922f75101b5bcc9cee8d03b4b5a81548b68575e46a773ab02c5ec" address="unix:///run/containerd/s/7f5e405d6076e5284677238b3d9247ceb82efeb187eceb8daaea531ccd2abcf0" protocol=ttrpc version=3 Sep 12 17:24:14.691633 systemd[1]: Started cri-containerd-30858165e438434a4a8eaaf7fca60c6cc8169801fcc64e046bde461d2d81662c.scope - libcontainer container 30858165e438434a4a8eaaf7fca60c6cc8169801fcc64e046bde461d2d81662c. 
Sep 12 17:24:14.694500 containerd[1876]: time="2025-09-12T17:24:14.694429538Z" level=info msg="CreateContainer within sandbox \"b0431debbe17f4191a15c2d4937dfe23b6c36060f759f49fc7c8b439de2814fe\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"183df6cefb6abae81dac89392877bf22a659dc9234a46f421d6f54c767f75679\"" Sep 12 17:24:14.696567 containerd[1876]: time="2025-09-12T17:24:14.696337453Z" level=info msg="StartContainer for \"183df6cefb6abae81dac89392877bf22a659dc9234a46f421d6f54c767f75679\"" Sep 12 17:24:14.704452 containerd[1876]: time="2025-09-12T17:24:14.704355675Z" level=info msg="connecting to shim 183df6cefb6abae81dac89392877bf22a659dc9234a46f421d6f54c767f75679" address="unix:///run/containerd/s/631d3dc6e57313fb54708b50af1572ee33b0bbf9c0cb59d310b9969bb8ecbe17" protocol=ttrpc version=3 Sep 12 17:24:14.713632 systemd[1]: Started cri-containerd-11b459affb9922f75101b5bcc9cee8d03b4b5a81548b68575e46a773ab02c5ec.scope - libcontainer container 11b459affb9922f75101b5bcc9cee8d03b4b5a81548b68575e46a773ab02c5ec. Sep 12 17:24:14.720667 kubelet[2913]: I0912 17:24:14.720536 2913 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:14.721034 kubelet[2913]: E0912 17:24:14.721012 2913 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.20.21:6443/api/v1/nodes\": dial tcp 10.200.20.21:6443: connect: connection refused" node="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:14.730634 systemd[1]: Started cri-containerd-183df6cefb6abae81dac89392877bf22a659dc9234a46f421d6f54c767f75679.scope - libcontainer container 183df6cefb6abae81dac89392877bf22a659dc9234a46f421d6f54c767f75679. 
Sep 12 17:24:14.741125 containerd[1876]: time="2025-09-12T17:24:14.741088073Z" level=info msg="StartContainer for \"30858165e438434a4a8eaaf7fca60c6cc8169801fcc64e046bde461d2d81662c\" returns successfully" Sep 12 17:24:14.768428 kubelet[2913]: W0912 17:24:14.768322 2913 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.20.21:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.1.0-a-2d28ed79c9&limit=500&resourceVersion=0": dial tcp 10.200.20.21:6443: connect: connection refused Sep 12 17:24:14.769614 kubelet[2913]: E0912 17:24:14.768508 2913 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.20.21:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.1.0-a-2d28ed79c9&limit=500&resourceVersion=0\": dial tcp 10.200.20.21:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:24:14.773296 containerd[1876]: time="2025-09-12T17:24:14.773172600Z" level=info msg="StartContainer for \"11b459affb9922f75101b5bcc9cee8d03b4b5a81548b68575e46a773ab02c5ec\" returns successfully" Sep 12 17:24:14.788394 containerd[1876]: time="2025-09-12T17:24:14.788266793Z" level=info msg="StartContainer for \"183df6cefb6abae81dac89392877bf22a659dc9234a46f421d6f54c767f75679\" returns successfully" Sep 12 17:24:15.525145 kubelet[2913]: I0912 17:24:15.524629 2913 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:15.906038 kubelet[2913]: E0912 17:24:15.905966 2913 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4426.1.0-a-2d28ed79c9\" not found" node="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:16.155512 kubelet[2913]: I0912 17:24:16.155372 2913 kubelet_node_status.go:75] "Successfully registered node" node="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:16.155512 kubelet[2913]: E0912 17:24:16.155404 2913 kubelet_node_status.go:535] 
"Error updating node status, will retry" err="error getting node \"ci-4426.1.0-a-2d28ed79c9\": node \"ci-4426.1.0-a-2d28ed79c9\" not found" Sep 12 17:24:16.306076 kubelet[2913]: E0912 17:24:16.305948 2913 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-2d28ed79c9\" not found" Sep 12 17:24:16.406624 kubelet[2913]: E0912 17:24:16.406593 2913 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-2d28ed79c9\" not found" Sep 12 17:24:16.506701 kubelet[2913]: E0912 17:24:16.506672 2913 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-2d28ed79c9\" not found" Sep 12 17:24:16.607133 kubelet[2913]: E0912 17:24:16.607033 2913 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-2d28ed79c9\" not found" Sep 12 17:24:16.707780 kubelet[2913]: E0912 17:24:16.707745 2913 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-2d28ed79c9\" not found" Sep 12 17:24:16.808173 kubelet[2913]: E0912 17:24:16.808145 2913 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-2d28ed79c9\" not found" Sep 12 17:24:16.908541 kubelet[2913]: E0912 17:24:16.908503 2913 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-2d28ed79c9\" not found" Sep 12 17:24:17.009162 kubelet[2913]: E0912 17:24:17.009135 2913 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-2d28ed79c9\" not found" Sep 12 17:24:17.109503 kubelet[2913]: E0912 17:24:17.109477 2913 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-2d28ed79c9\" not found" Sep 12 17:24:17.210177 kubelet[2913]: E0912 17:24:17.210002 2913 kubelet_node_status.go:453] "Error getting the current node from lister" err="node 
\"ci-4426.1.0-a-2d28ed79c9\" not found" Sep 12 17:24:17.948344 kubelet[2913]: I0912 17:24:17.948146 2913 apiserver.go:52] "Watching apiserver" Sep 12 17:24:17.958914 kubelet[2913]: I0912 17:24:17.958893 2913 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 17:24:18.424187 systemd[1]: Reload requested from client PID 3179 ('systemctl') (unit session-9.scope)... Sep 12 17:24:18.424418 systemd[1]: Reloading... Sep 12 17:24:18.494542 zram_generator::config[3226]: No configuration found. Sep 12 17:24:18.653992 systemd[1]: Reloading finished in 229 ms. Sep 12 17:24:18.676957 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:24:18.692958 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 17:24:18.693182 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:24:18.693220 systemd[1]: kubelet.service: Consumed 685ms CPU time, 125.3M memory peak. Sep 12 17:24:18.696704 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:24:18.786035 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:24:18.793749 (kubelet)[3290]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 17:24:18.821844 kubelet[3290]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:24:18.822547 kubelet[3290]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 12 17:24:18.822547 kubelet[3290]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:24:18.822547 kubelet[3290]: I0912 17:24:18.822121 3290 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 17:24:18.828472 kubelet[3290]: I0912 17:24:18.828448 3290 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 12 17:24:18.828589 kubelet[3290]: I0912 17:24:18.828575 3290 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 17:24:18.828805 kubelet[3290]: I0912 17:24:18.828788 3290 server.go:934] "Client rotation is on, will bootstrap in background" Sep 12 17:24:18.829822 kubelet[3290]: I0912 17:24:18.829803 3290 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 12 17:24:18.901843 kubelet[3290]: I0912 17:24:18.901801 3290 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 17:24:18.905657 kubelet[3290]: I0912 17:24:18.905643 3290 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 17:24:18.907914 kubelet[3290]: I0912 17:24:18.907897 3290 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 17:24:18.908255 kubelet[3290]: I0912 17:24:18.908240 3290 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 12 17:24:18.908422 kubelet[3290]: I0912 17:24:18.908398 3290 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 17:24:18.908610 kubelet[3290]: I0912 17:24:18.908477 3290 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426.1.0-a-2d28ed79c9","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 17:24:18.908732 kubelet[3290]: I0912 17:24:18.908719 3290 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 17:24:18.908778 kubelet[3290]: I0912 17:24:18.908772 3290 container_manager_linux.go:300] "Creating device plugin manager" Sep 12 17:24:18.908842 kubelet[3290]: I0912 17:24:18.908836 3290 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:24:18.908973 kubelet[3290]: I0912 17:24:18.908964 3290 kubelet.go:408] "Attempting to sync node with API server" Sep 12 17:24:18.909026 kubelet[3290]: I0912 17:24:18.909020 3290 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 17:24:18.909077 kubelet[3290]: I0912 17:24:18.909071 3290 kubelet.go:314] "Adding apiserver pod source" Sep 12 17:24:18.909125 kubelet[3290]: I0912 17:24:18.909117 3290 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 17:24:18.911698 kubelet[3290]: I0912 17:24:18.911679 3290 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 17:24:18.911969 kubelet[3290]: I0912 17:24:18.911952 3290 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 17:24:18.912504 kubelet[3290]: I0912 17:24:18.912200 3290 server.go:1274] "Started kubelet" Sep 12 17:24:18.914334 kubelet[3290]: I0912 17:24:18.914299 3290 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 17:24:18.918049 kubelet[3290]: I0912 17:24:18.918030 3290 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 17:24:18.918658 kubelet[3290]: I0912 17:24:18.918598 3290 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 17:24:18.919449 kubelet[3290]: I0912 17:24:18.919432 3290 server.go:449] "Adding debug handlers to kubelet server" Sep 12 
17:24:18.922007 kubelet[3290]: I0912 17:24:18.919637 3290 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 17:24:18.922285 kubelet[3290]: I0912 17:24:18.922272 3290 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 12 17:24:18.922397 kubelet[3290]: I0912 17:24:18.919881 3290 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 12 17:24:18.925330 kubelet[3290]: I0912 17:24:18.919873 3290 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 12 17:24:18.925451 kubelet[3290]: E0912 17:24:18.919976 3290 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-2d28ed79c9\" not found"
Sep 12 17:24:18.926106 kubelet[3290]: I0912 17:24:18.926090 3290 reconciler.go:26] "Reconciler: start to sync state"
Sep 12 17:24:18.931547 kubelet[3290]: I0912 17:24:18.931470 3290 factory.go:221] Registration of the systemd container factory successfully
Sep 12 17:24:18.932109 kubelet[3290]: I0912 17:24:18.932082 3290 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 12 17:24:18.938692 kubelet[3290]: I0912 17:24:18.938674 3290 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 12 17:24:18.940266 kubelet[3290]: I0912 17:24:18.940241 3290 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 12 17:24:18.940266 kubelet[3290]: I0912 17:24:18.940262 3290 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 12 17:24:18.940347 kubelet[3290]: I0912 17:24:18.940275 3290 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 12 17:24:18.940347 kubelet[3290]: E0912 17:24:18.940305 3290 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 12 17:24:18.945978 kubelet[3290]: I0912 17:24:18.945953 3290 factory.go:221] Registration of the containerd container factory successfully
Sep 12 17:24:18.988867 kubelet[3290]: I0912 17:24:18.988847 3290 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 12 17:24:18.988867 kubelet[3290]: I0912 17:24:18.988860 3290 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 12 17:24:18.988997 kubelet[3290]: I0912 17:24:18.988876 3290 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 17:24:18.988997 kubelet[3290]: I0912 17:24:18.988974 3290 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 12 17:24:18.988997 kubelet[3290]: I0912 17:24:18.988981 3290 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 12 17:24:18.988997 kubelet[3290]: I0912 17:24:18.988993 3290 policy_none.go:49] "None policy: Start"
Sep 12 17:24:18.989881 kubelet[3290]: I0912 17:24:18.989865 3290 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 12 17:24:18.989881 kubelet[3290]: I0912 17:24:18.989885 3290 state_mem.go:35] "Initializing new in-memory state store"
Sep 12 17:24:18.990014 kubelet[3290]: I0912 17:24:18.989996 3290 state_mem.go:75] "Updated machine memory state"
Sep 12 17:24:18.993236 kubelet[3290]: I0912 17:24:18.993217 3290 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 12 17:24:18.993362 kubelet[3290]: I0912 17:24:18.993345 3290 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 12 17:24:18.993406 kubelet[3290]: I0912 17:24:18.993359 3290 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 12 17:24:18.993844 kubelet[3290]: I0912 17:24:18.993824 3290 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 12 17:24:19.052188 kubelet[3290]: W0912 17:24:19.052116 3290 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 12 17:24:19.057645 kubelet[3290]: W0912 17:24:19.057538 3290 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 12 17:24:19.057645 kubelet[3290]: W0912 17:24:19.057578 3290 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 12 17:24:19.095779 kubelet[3290]: I0912 17:24:19.095742 3290 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:24:19.111066 kubelet[3290]: I0912 17:24:19.110850 3290 kubelet_node_status.go:111] "Node was previously registered" node="ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:24:19.111066 kubelet[3290]: I0912 17:24:19.110910 3290 kubelet_node_status.go:75] "Successfully registered node" node="ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:24:19.126970 kubelet[3290]: I0912 17:24:19.126948 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fa380bcdb1988a1bc9e53a2b42cbe617-ca-certs\") pod \"kube-apiserver-ci-4426.1.0-a-2d28ed79c9\" (UID: \"fa380bcdb1988a1bc9e53a2b42cbe617\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:24:19.127091 kubelet[3290]: I0912 17:24:19.127074 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fa380bcdb1988a1bc9e53a2b42cbe617-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426.1.0-a-2d28ed79c9\" (UID: \"fa380bcdb1988a1bc9e53a2b42cbe617\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:24:19.127184 kubelet[3290]: I0912 17:24:19.127168 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/2f1989cec625b545d52ba26e35dfc086-flexvolume-dir\") pod \"kube-controller-manager-ci-4426.1.0-a-2d28ed79c9\" (UID: \"2f1989cec625b545d52ba26e35dfc086\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:24:19.127256 kubelet[3290]: I0912 17:24:19.127245 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2f1989cec625b545d52ba26e35dfc086-kubeconfig\") pod \"kube-controller-manager-ci-4426.1.0-a-2d28ed79c9\" (UID: \"2f1989cec625b545d52ba26e35dfc086\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:24:19.127329 kubelet[3290]: I0912 17:24:19.127317 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2f1989cec625b545d52ba26e35dfc086-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426.1.0-a-2d28ed79c9\" (UID: \"2f1989cec625b545d52ba26e35dfc086\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:24:19.127397 kubelet[3290]: I0912 17:24:19.127384 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f2364b5959a20451d072a1601d2089a1-kubeconfig\") pod \"kube-scheduler-ci-4426.1.0-a-2d28ed79c9\" (UID: \"f2364b5959a20451d072a1601d2089a1\") " pod="kube-system/kube-scheduler-ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:24:19.127463 kubelet[3290]: I0912 17:24:19.127453 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fa380bcdb1988a1bc9e53a2b42cbe617-k8s-certs\") pod \"kube-apiserver-ci-4426.1.0-a-2d28ed79c9\" (UID: \"fa380bcdb1988a1bc9e53a2b42cbe617\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:24:19.127580 kubelet[3290]: I0912 17:24:19.127534 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2f1989cec625b545d52ba26e35dfc086-ca-certs\") pod \"kube-controller-manager-ci-4426.1.0-a-2d28ed79c9\" (UID: \"2f1989cec625b545d52ba26e35dfc086\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:24:19.127580 kubelet[3290]: I0912 17:24:19.127563 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2f1989cec625b545d52ba26e35dfc086-k8s-certs\") pod \"kube-controller-manager-ci-4426.1.0-a-2d28ed79c9\" (UID: \"2f1989cec625b545d52ba26e35dfc086\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:24:19.910069 kubelet[3290]: I0912 17:24:19.909888 3290 apiserver.go:52] "Watching apiserver"
Sep 12 17:24:19.923087 kubelet[3290]: I0912 17:24:19.923045 3290 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Sep 12 17:24:19.992647 kubelet[3290]: W0912 17:24:19.992619 3290 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 12 17:24:19.992801 kubelet[3290]: E0912 17:24:19.992771 3290 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4426.1.0-a-2d28ed79c9\" already exists" pod="kube-system/kube-apiserver-ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:24:19.993045 kubelet[3290]: W0912 17:24:19.992630 3290 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 12 17:24:19.993045 kubelet[3290]: E0912 17:24:19.992942 3290 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-4426.1.0-a-2d28ed79c9\" already exists" pod="kube-system/kube-scheduler-ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:24:20.025809 kubelet[3290]: I0912 17:24:20.025741 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4426.1.0-a-2d28ed79c9" podStartSLOduration=1.025728295 podStartE2EDuration="1.025728295s" podCreationTimestamp="2025-09-12 17:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:24:20.008150804 +0000 UTC m=+1.211373319" watchObservedRunningTime="2025-09-12 17:24:20.025728295 +0000 UTC m=+1.228950802"
Sep 12 17:24:20.044810 kubelet[3290]: I0912 17:24:20.044719 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4426.1.0-a-2d28ed79c9" podStartSLOduration=1.044709963 podStartE2EDuration="1.044709963s" podCreationTimestamp="2025-09-12 17:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:24:20.026463637 +0000 UTC m=+1.229686144" watchObservedRunningTime="2025-09-12 17:24:20.044709963 +0000 UTC m=+1.247932470"
Sep 12 17:24:20.055804 kubelet[3290]: I0912 17:24:20.055770 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4426.1.0-a-2d28ed79c9" podStartSLOduration=1.055760852 podStartE2EDuration="1.055760852s" podCreationTimestamp="2025-09-12 17:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:24:20.04574996 +0000 UTC m=+1.248972475" watchObservedRunningTime="2025-09-12 17:24:20.055760852 +0000 UTC m=+1.258983359"
Sep 12 17:24:22.796744 kubelet[3290]: I0912 17:24:22.796656 3290 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 12 17:24:22.797373 containerd[1876]: time="2025-09-12T17:24:22.797288742Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 12 17:24:22.797923 kubelet[3290]: I0912 17:24:22.797494 3290 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 12 17:24:22.950915 systemd[1]: Created slice kubepods-besteffort-poda6c2c81f_2ea1_4da3_aa21_934956a80450.slice - libcontainer container kubepods-besteffort-poda6c2c81f_2ea1_4da3_aa21_934956a80450.slice.
Sep 12 17:24:23.054136 kubelet[3290]: I0912 17:24:23.053953 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/a6c2c81f-2ea1-4da3-aa21-934956a80450-kube-proxy\") pod \"kube-proxy-4nvrk\" (UID: \"a6c2c81f-2ea1-4da3-aa21-934956a80450\") " pod="kube-system/kube-proxy-4nvrk"
Sep 12 17:24:23.054136 kubelet[3290]: I0912 17:24:23.053983 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9grt2\" (UniqueName: \"kubernetes.io/projected/a6c2c81f-2ea1-4da3-aa21-934956a80450-kube-api-access-9grt2\") pod \"kube-proxy-4nvrk\" (UID: \"a6c2c81f-2ea1-4da3-aa21-934956a80450\") " pod="kube-system/kube-proxy-4nvrk"
Sep 12 17:24:23.054136 kubelet[3290]: I0912 17:24:23.054003 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a6c2c81f-2ea1-4da3-aa21-934956a80450-xtables-lock\") pod \"kube-proxy-4nvrk\" (UID: \"a6c2c81f-2ea1-4da3-aa21-934956a80450\") " pod="kube-system/kube-proxy-4nvrk"
Sep 12 17:24:23.054136 kubelet[3290]: I0912 17:24:23.054014 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a6c2c81f-2ea1-4da3-aa21-934956a80450-lib-modules\") pod \"kube-proxy-4nvrk\" (UID: \"a6c2c81f-2ea1-4da3-aa21-934956a80450\") " pod="kube-system/kube-proxy-4nvrk"
Sep 12 17:24:23.065956 kernel: hv_balloon: Max. dynamic memory size: 4096 MB
Sep 12 17:24:23.159066 kubelet[3290]: E0912 17:24:23.158829 3290 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found
Sep 12 17:24:23.159066 kubelet[3290]: E0912 17:24:23.158855 3290 projected.go:194] Error preparing data for projected volume kube-api-access-9grt2 for pod kube-system/kube-proxy-4nvrk: configmap "kube-root-ca.crt" not found
Sep 12 17:24:23.159066 kubelet[3290]: E0912 17:24:23.158909 3290 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6c2c81f-2ea1-4da3-aa21-934956a80450-kube-api-access-9grt2 podName:a6c2c81f-2ea1-4da3-aa21-934956a80450 nodeName:}" failed. No retries permitted until 2025-09-12 17:24:23.65888832 +0000 UTC m=+4.862110827 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9grt2" (UniqueName: "kubernetes.io/projected/a6c2c81f-2ea1-4da3-aa21-934956a80450-kube-api-access-9grt2") pod "kube-proxy-4nvrk" (UID: "a6c2c81f-2ea1-4da3-aa21-934956a80450") : configmap "kube-root-ca.crt" not found
Sep 12 17:24:23.758670 kubelet[3290]: E0912 17:24:23.758569 3290 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found
Sep 12 17:24:23.758670 kubelet[3290]: E0912 17:24:23.758598 3290 projected.go:194] Error preparing data for projected volume kube-api-access-9grt2 for pod kube-system/kube-proxy-4nvrk: configmap "kube-root-ca.crt" not found
Sep 12 17:24:23.758670 kubelet[3290]: E0912 17:24:23.758636 3290 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6c2c81f-2ea1-4da3-aa21-934956a80450-kube-api-access-9grt2 podName:a6c2c81f-2ea1-4da3-aa21-934956a80450 nodeName:}" failed. No retries permitted until 2025-09-12 17:24:24.758623558 +0000 UTC m=+5.961846065 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9grt2" (UniqueName: "kubernetes.io/projected/a6c2c81f-2ea1-4da3-aa21-934956a80450-kube-api-access-9grt2") pod "kube-proxy-4nvrk" (UID: "a6c2c81f-2ea1-4da3-aa21-934956a80450") : configmap "kube-root-ca.crt" not found
Sep 12 17:24:23.950842 systemd[1]: Created slice kubepods-besteffort-pod19bc89da_3d2e_4683_96bf_a97e2072b4d8.slice - libcontainer container kubepods-besteffort-pod19bc89da_3d2e_4683_96bf_a97e2072b4d8.slice.
Sep 12 17:24:23.959216 kubelet[3290]: I0912 17:24:23.959155 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/19bc89da-3d2e-4683-96bf-a97e2072b4d8-var-lib-calico\") pod \"tigera-operator-58fc44c59b-plq26\" (UID: \"19bc89da-3d2e-4683-96bf-a97e2072b4d8\") " pod="tigera-operator/tigera-operator-58fc44c59b-plq26"
Sep 12 17:24:23.959216 kubelet[3290]: I0912 17:24:23.959185 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frd8m\" (UniqueName: \"kubernetes.io/projected/19bc89da-3d2e-4683-96bf-a97e2072b4d8-kube-api-access-frd8m\") pod \"tigera-operator-58fc44c59b-plq26\" (UID: \"19bc89da-3d2e-4683-96bf-a97e2072b4d8\") " pod="tigera-operator/tigera-operator-58fc44c59b-plq26"
Sep 12 17:24:24.255154 containerd[1876]: time="2025-09-12T17:24:24.255106452Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-plq26,Uid:19bc89da-3d2e-4683-96bf-a97e2072b4d8,Namespace:tigera-operator,Attempt:0,}"
Sep 12 17:24:24.291858 containerd[1876]: time="2025-09-12T17:24:24.291805481Z" level=info msg="connecting to shim e9bd2434ffe7c33609ade6362c4dea2483bdfc54d2845578f94f072fac6f6286" address="unix:///run/containerd/s/bcffca2c5e584dbc395fc9ba04c27828dfe1050329cd4da6fccaad9e9e888660" namespace=k8s.io protocol=ttrpc version=3
Sep 12 17:24:24.315635 systemd[1]: Started cri-containerd-e9bd2434ffe7c33609ade6362c4dea2483bdfc54d2845578f94f072fac6f6286.scope - libcontainer container e9bd2434ffe7c33609ade6362c4dea2483bdfc54d2845578f94f072fac6f6286.
Sep 12 17:24:24.340470 containerd[1876]: time="2025-09-12T17:24:24.340440620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-plq26,Uid:19bc89da-3d2e-4683-96bf-a97e2072b4d8,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e9bd2434ffe7c33609ade6362c4dea2483bdfc54d2845578f94f072fac6f6286\""
Sep 12 17:24:24.342891 containerd[1876]: time="2025-09-12T17:24:24.342826125Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 12 17:24:25.059246 containerd[1876]: time="2025-09-12T17:24:25.059163492Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4nvrk,Uid:a6c2c81f-2ea1-4da3-aa21-934956a80450,Namespace:kube-system,Attempt:0,}"
Sep 12 17:24:25.102872 containerd[1876]: time="2025-09-12T17:24:25.102833888Z" level=info msg="connecting to shim dde5b14eb307ffe88a7024ead5fd43d093e339ebf30023f90d3d60d503868bc7" address="unix:///run/containerd/s/cbe3f39552a812e03cd7273ce3849b5d6e96b643458303f0d68d632100db5ad9" namespace=k8s.io protocol=ttrpc version=3
Sep 12 17:24:25.120739 systemd[1]: Started cri-containerd-dde5b14eb307ffe88a7024ead5fd43d093e339ebf30023f90d3d60d503868bc7.scope - libcontainer container dde5b14eb307ffe88a7024ead5fd43d093e339ebf30023f90d3d60d503868bc7.
Sep 12 17:24:25.139856 containerd[1876]: time="2025-09-12T17:24:25.139829603Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4nvrk,Uid:a6c2c81f-2ea1-4da3-aa21-934956a80450,Namespace:kube-system,Attempt:0,} returns sandbox id \"dde5b14eb307ffe88a7024ead5fd43d093e339ebf30023f90d3d60d503868bc7\""
Sep 12 17:24:25.142459 containerd[1876]: time="2025-09-12T17:24:25.142438895Z" level=info msg="CreateContainer within sandbox \"dde5b14eb307ffe88a7024ead5fd43d093e339ebf30023f90d3d60d503868bc7\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 12 17:24:25.161334 containerd[1876]: time="2025-09-12T17:24:25.161308334Z" level=info msg="Container 25bebf8a87cbd9d269e1ccd5997461b58b83dd4472f356f6989aeca119e07635: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:24:25.180221 containerd[1876]: time="2025-09-12T17:24:25.180189093Z" level=info msg="CreateContainer within sandbox \"dde5b14eb307ffe88a7024ead5fd43d093e339ebf30023f90d3d60d503868bc7\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"25bebf8a87cbd9d269e1ccd5997461b58b83dd4472f356f6989aeca119e07635\""
Sep 12 17:24:25.181673 containerd[1876]: time="2025-09-12T17:24:25.180662065Z" level=info msg="StartContainer for \"25bebf8a87cbd9d269e1ccd5997461b58b83dd4472f356f6989aeca119e07635\""
Sep 12 17:24:25.181673 containerd[1876]: time="2025-09-12T17:24:25.181470064Z" level=info msg="connecting to shim 25bebf8a87cbd9d269e1ccd5997461b58b83dd4472f356f6989aeca119e07635" address="unix:///run/containerd/s/cbe3f39552a812e03cd7273ce3849b5d6e96b643458303f0d68d632100db5ad9" protocol=ttrpc version=3
Sep 12 17:24:25.197615 systemd[1]: Started cri-containerd-25bebf8a87cbd9d269e1ccd5997461b58b83dd4472f356f6989aeca119e07635.scope - libcontainer container 25bebf8a87cbd9d269e1ccd5997461b58b83dd4472f356f6989aeca119e07635.
Sep 12 17:24:25.224865 containerd[1876]: time="2025-09-12T17:24:25.224834934Z" level=info msg="StartContainer for \"25bebf8a87cbd9d269e1ccd5997461b58b83dd4472f356f6989aeca119e07635\" returns successfully"
Sep 12 17:24:25.994508 kubelet[3290]: I0912 17:24:25.994457 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-4nvrk" podStartSLOduration=3.994444875 podStartE2EDuration="3.994444875s" podCreationTimestamp="2025-09-12 17:24:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:24:25.994277325 +0000 UTC m=+7.197499832" watchObservedRunningTime="2025-09-12 17:24:25.994444875 +0000 UTC m=+7.197667382"
Sep 12 17:24:26.353596 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2640085853.mount: Deactivated successfully.
Sep 12 17:24:26.355600 update_engine[1860]: I20250912 17:24:26.355556 1860 update_attempter.cc:509] Updating boot flags...
Sep 12 17:24:26.766724 containerd[1876]: time="2025-09-12T17:24:26.766677027Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:26.769218 containerd[1876]: time="2025-09-12T17:24:26.769195468Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365"
Sep 12 17:24:26.771994 containerd[1876]: time="2025-09-12T17:24:26.771965191Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:26.775932 containerd[1876]: time="2025-09-12T17:24:26.775889684Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:26.776845 containerd[1876]: time="2025-09-12T17:24:26.776750384Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 2.433901965s"
Sep 12 17:24:26.776845 containerd[1876]: time="2025-09-12T17:24:26.776774343Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\""
Sep 12 17:24:26.779238 containerd[1876]: time="2025-09-12T17:24:26.779200078Z" level=info msg="CreateContainer within sandbox \"e9bd2434ffe7c33609ade6362c4dea2483bdfc54d2845578f94f072fac6f6286\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 12 17:24:26.798541 containerd[1876]: time="2025-09-12T17:24:26.797887432Z" level=info msg="Container 4e3376ebced5f112eb6d011babd832a4ce1dad1a634a1f4fb3a0e71fab6ce972: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:24:26.808114 containerd[1876]: time="2025-09-12T17:24:26.808091062Z" level=info msg="CreateContainer within sandbox \"e9bd2434ffe7c33609ade6362c4dea2483bdfc54d2845578f94f072fac6f6286\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"4e3376ebced5f112eb6d011babd832a4ce1dad1a634a1f4fb3a0e71fab6ce972\""
Sep 12 17:24:26.808923 containerd[1876]: time="2025-09-12T17:24:26.808469023Z" level=info msg="StartContainer for \"4e3376ebced5f112eb6d011babd832a4ce1dad1a634a1f4fb3a0e71fab6ce972\""
Sep 12 17:24:26.810257 containerd[1876]: time="2025-09-12T17:24:26.810208519Z" level=info msg="connecting to shim 4e3376ebced5f112eb6d011babd832a4ce1dad1a634a1f4fb3a0e71fab6ce972" address="unix:///run/containerd/s/bcffca2c5e584dbc395fc9ba04c27828dfe1050329cd4da6fccaad9e9e888660" protocol=ttrpc version=3
Sep 12 17:24:26.825619 systemd[1]: Started cri-containerd-4e3376ebced5f112eb6d011babd832a4ce1dad1a634a1f4fb3a0e71fab6ce972.scope - libcontainer container 4e3376ebced5f112eb6d011babd832a4ce1dad1a634a1f4fb3a0e71fab6ce972.
Sep 12 17:24:26.849126 containerd[1876]: time="2025-09-12T17:24:26.849096785Z" level=info msg="StartContainer for \"4e3376ebced5f112eb6d011babd832a4ce1dad1a634a1f4fb3a0e71fab6ce972\" returns successfully"
Sep 12 17:24:27.792274 kubelet[3290]: I0912 17:24:27.792146 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-plq26" podStartSLOduration=2.356426167 podStartE2EDuration="4.792133376s" podCreationTimestamp="2025-09-12 17:24:23 +0000 UTC" firstStartedPulling="2025-09-12 17:24:24.341574408 +0000 UTC m=+5.544796923" lastFinishedPulling="2025-09-12 17:24:26.777281625 +0000 UTC m=+7.980504132" observedRunningTime="2025-09-12 17:24:26.996956404 +0000 UTC m=+8.200178927" watchObservedRunningTime="2025-09-12 17:24:27.792133376 +0000 UTC m=+8.995355883"
Sep 12 17:24:31.950970 sudo[2347]: pam_unix(sudo:session): session closed for user root
Sep 12 17:24:32.033230 sshd[2346]: Connection closed by 10.200.16.10 port 58040
Sep 12 17:24:32.035467 sshd-session[2343]: pam_unix(sshd:session): session closed for user core
Sep 12 17:24:32.040006 systemd[1]: sshd@6-10.200.20.21:22-10.200.16.10:58040.service: Deactivated successfully.
Sep 12 17:24:32.043380 systemd[1]: session-9.scope: Deactivated successfully.
Sep 12 17:24:32.043772 systemd[1]: session-9.scope: Consumed 3.330s CPU time, 221.8M memory peak.
Sep 12 17:24:32.045698 systemd-logind[1855]: Session 9 logged out. Waiting for processes to exit.
Sep 12 17:24:32.048096 systemd-logind[1855]: Removed session 9.
Sep 12 17:24:35.308998 systemd[1]: Created slice kubepods-besteffort-podc8077ec5_da6b_4442_9025_a77afdd4e448.slice - libcontainer container kubepods-besteffort-podc8077ec5_da6b_4442_9025_a77afdd4e448.slice.
Sep 12 17:24:35.333515 kubelet[3290]: I0912 17:24:35.333485 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8077ec5-da6b-4442-9025-a77afdd4e448-tigera-ca-bundle\") pod \"calico-typha-5dfdb7ff87-zl6cx\" (UID: \"c8077ec5-da6b-4442-9025-a77afdd4e448\") " pod="calico-system/calico-typha-5dfdb7ff87-zl6cx"
Sep 12 17:24:35.333515 kubelet[3290]: I0912 17:24:35.333524 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr7sv\" (UniqueName: \"kubernetes.io/projected/c8077ec5-da6b-4442-9025-a77afdd4e448-kube-api-access-cr7sv\") pod \"calico-typha-5dfdb7ff87-zl6cx\" (UID: \"c8077ec5-da6b-4442-9025-a77afdd4e448\") " pod="calico-system/calico-typha-5dfdb7ff87-zl6cx"
Sep 12 17:24:35.333798 kubelet[3290]: I0912 17:24:35.333540 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/c8077ec5-da6b-4442-9025-a77afdd4e448-typha-certs\") pod \"calico-typha-5dfdb7ff87-zl6cx\" (UID: \"c8077ec5-da6b-4442-9025-a77afdd4e448\") " pod="calico-system/calico-typha-5dfdb7ff87-zl6cx"
Sep 12 17:24:35.434693 systemd[1]: Created slice kubepods-besteffort-pod19c17f46_3f73_483c_8fac_01ab1bcd015e.slice - libcontainer container kubepods-besteffort-pod19c17f46_3f73_483c_8fac_01ab1bcd015e.slice.
Sep 12 17:24:35.534216 kubelet[3290]: I0912 17:24:35.534185 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/19c17f46-3f73-483c-8fac-01ab1bcd015e-policysync\") pod \"calico-node-jcfnq\" (UID: \"19c17f46-3f73-483c-8fac-01ab1bcd015e\") " pod="calico-system/calico-node-jcfnq"
Sep 12 17:24:35.534216 kubelet[3290]: I0912 17:24:35.534215 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/19c17f46-3f73-483c-8fac-01ab1bcd015e-node-certs\") pod \"calico-node-jcfnq\" (UID: \"19c17f46-3f73-483c-8fac-01ab1bcd015e\") " pod="calico-system/calico-node-jcfnq"
Sep 12 17:24:35.534342 kubelet[3290]: I0912 17:24:35.534227 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/19c17f46-3f73-483c-8fac-01ab1bcd015e-var-run-calico\") pod \"calico-node-jcfnq\" (UID: \"19c17f46-3f73-483c-8fac-01ab1bcd015e\") " pod="calico-system/calico-node-jcfnq"
Sep 12 17:24:35.534342 kubelet[3290]: I0912 17:24:35.534241 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19c17f46-3f73-483c-8fac-01ab1bcd015e-lib-modules\") pod \"calico-node-jcfnq\" (UID: \"19c17f46-3f73-483c-8fac-01ab1bcd015e\") " pod="calico-system/calico-node-jcfnq"
Sep 12 17:24:35.534342 kubelet[3290]: I0912 17:24:35.534258 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8cgt\" (UniqueName: \"kubernetes.io/projected/19c17f46-3f73-483c-8fac-01ab1bcd015e-kube-api-access-w8cgt\") pod \"calico-node-jcfnq\" (UID: \"19c17f46-3f73-483c-8fac-01ab1bcd015e\") " pod="calico-system/calico-node-jcfnq"
Sep 12 17:24:35.534342 kubelet[3290]: I0912 17:24:35.534268 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19c17f46-3f73-483c-8fac-01ab1bcd015e-tigera-ca-bundle\") pod \"calico-node-jcfnq\" (UID: \"19c17f46-3f73-483c-8fac-01ab1bcd015e\") " pod="calico-system/calico-node-jcfnq"
Sep 12 17:24:35.534342 kubelet[3290]: I0912 17:24:35.534282 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/19c17f46-3f73-483c-8fac-01ab1bcd015e-var-lib-calico\") pod \"calico-node-jcfnq\" (UID: \"19c17f46-3f73-483c-8fac-01ab1bcd015e\") " pod="calico-system/calico-node-jcfnq"
Sep 12 17:24:35.534427 kubelet[3290]: I0912 17:24:35.534292 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/19c17f46-3f73-483c-8fac-01ab1bcd015e-xtables-lock\") pod \"calico-node-jcfnq\" (UID: \"19c17f46-3f73-483c-8fac-01ab1bcd015e\") " pod="calico-system/calico-node-jcfnq"
Sep 12 17:24:35.534427 kubelet[3290]: I0912 17:24:35.534300 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/19c17f46-3f73-483c-8fac-01ab1bcd015e-cni-bin-dir\") pod \"calico-node-jcfnq\" (UID: \"19c17f46-3f73-483c-8fac-01ab1bcd015e\") " pod="calico-system/calico-node-jcfnq"
Sep 12 17:24:35.534427 kubelet[3290]: I0912 17:24:35.534308 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/19c17f46-3f73-483c-8fac-01ab1bcd015e-cni-net-dir\") pod \"calico-node-jcfnq\" (UID: \"19c17f46-3f73-483c-8fac-01ab1bcd015e\") " pod="calico-system/calico-node-jcfnq"
Sep 12 17:24:35.534427 kubelet[3290]: I0912 17:24:35.534318 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/19c17f46-3f73-483c-8fac-01ab1bcd015e-flexvol-driver-host\") pod \"calico-node-jcfnq\" (UID: \"19c17f46-3f73-483c-8fac-01ab1bcd015e\") " pod="calico-system/calico-node-jcfnq"
Sep 12 17:24:35.534427 kubelet[3290]: I0912 17:24:35.534333 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/19c17f46-3f73-483c-8fac-01ab1bcd015e-cni-log-dir\") pod \"calico-node-jcfnq\" (UID: \"19c17f46-3f73-483c-8fac-01ab1bcd015e\") " pod="calico-system/calico-node-jcfnq"
Sep 12 17:24:35.566416 kubelet[3290]: E0912 17:24:35.565904 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9s9cm" podUID="c4043ee3-d7ce-4761-b52a-0afce638c51a"
Sep 12 17:24:35.617455 containerd[1876]: time="2025-09-12T17:24:35.617394567Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5dfdb7ff87-zl6cx,Uid:c8077ec5-da6b-4442-9025-a77afdd4e448,Namespace:calico-system,Attempt:0,}"
Sep 12 17:24:35.635546 kubelet[3290]: I0912 17:24:35.635344 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c4043ee3-d7ce-4761-b52a-0afce638c51a-kubelet-dir\") pod \"csi-node-driver-9s9cm\" (UID: \"c4043ee3-d7ce-4761-b52a-0afce638c51a\") " pod="calico-system/csi-node-driver-9s9cm"
Sep 12 17:24:35.635546 kubelet[3290]: I0912 17:24:35.635431 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c4043ee3-d7ce-4761-b52a-0afce638c51a-socket-dir\") pod \"csi-node-driver-9s9cm\" (UID: \"c4043ee3-d7ce-4761-b52a-0afce638c51a\") " pod="calico-system/csi-node-driver-9s9cm"
Sep 12 17:24:35.635546 kubelet[3290]: I0912 17:24:35.635445 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/c4043ee3-d7ce-4761-b52a-0afce638c51a-varrun\") pod \"csi-node-driver-9s9cm\" (UID: \"c4043ee3-d7ce-4761-b52a-0afce638c51a\") " pod="calico-system/csi-node-driver-9s9cm"
Sep 12 17:24:35.636507 kubelet[3290]: I0912 17:24:35.635462 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54x45\" (UniqueName: \"kubernetes.io/projected/c4043ee3-d7ce-4761-b52a-0afce638c51a-kube-api-access-54x45\") pod \"csi-node-driver-9s9cm\" (UID: \"c4043ee3-d7ce-4761-b52a-0afce638c51a\") " pod="calico-system/csi-node-driver-9s9cm"
Sep 12 17:24:35.636653 kubelet[3290]: E0912 17:24:35.636641 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.636703 kubelet[3290]: W0912 17:24:35.636693 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.636761 kubelet[3290]: E0912 17:24:35.636752 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.637317 kubelet[3290]: E0912 17:24:35.637305 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.637534 kubelet[3290]: W0912 17:24:35.637384 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.637534 kubelet[3290]: E0912 17:24:35.637409 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.637935 kubelet[3290]: E0912 17:24:35.637919 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.638199 kubelet[3290]: W0912 17:24:35.637995 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.638199 kubelet[3290]: E0912 17:24:35.638017 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.638825 kubelet[3290]: E0912 17:24:35.638729 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.638948 kubelet[3290]: W0912 17:24:35.638897 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.638948 kubelet[3290]: E0912 17:24:35.638935 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.639235 kubelet[3290]: E0912 17:24:35.639168 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.639235 kubelet[3290]: W0912 17:24:35.639178 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.639235 kubelet[3290]: E0912 17:24:35.639210 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 12 17:24:35.639630 kubelet[3290]: E0912 17:24:35.639515 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.639749 kubelet[3290]: W0912 17:24:35.639698 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.639749 kubelet[3290]: E0912 17:24:35.639736 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.640078 kubelet[3290]: E0912 17:24:35.640030 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.640078 kubelet[3290]: W0912 17:24:35.640041 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.640078 kubelet[3290]: E0912 17:24:35.640066 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:35.640556 kubelet[3290]: E0912 17:24:35.640422 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.640556 kubelet[3290]: W0912 17:24:35.640444 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.640556 kubelet[3290]: E0912 17:24:35.640474 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.640997 kubelet[3290]: E0912 17:24:35.640982 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.641097 kubelet[3290]: W0912 17:24:35.641048 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.641097 kubelet[3290]: E0912 17:24:35.641085 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:35.641441 kubelet[3290]: E0912 17:24:35.641393 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.641441 kubelet[3290]: W0912 17:24:35.641403 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.641441 kubelet[3290]: E0912 17:24:35.641426 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.641709 kubelet[3290]: E0912 17:24:35.641677 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.641709 kubelet[3290]: W0912 17:24:35.641688 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.642255 kubelet[3290]: E0912 17:24:35.641790 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:35.642255 kubelet[3290]: E0912 17:24:35.641918 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.642255 kubelet[3290]: W0912 17:24:35.641928 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.642255 kubelet[3290]: E0912 17:24:35.641949 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.642414 kubelet[3290]: E0912 17:24:35.642397 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.642455 kubelet[3290]: W0912 17:24:35.642416 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.642543 kubelet[3290]: E0912 17:24:35.642487 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:35.642795 kubelet[3290]: E0912 17:24:35.642779 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.642795 kubelet[3290]: W0912 17:24:35.642792 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.642970 kubelet[3290]: E0912 17:24:35.642805 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.643593 kubelet[3290]: E0912 17:24:35.643578 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.643593 kubelet[3290]: W0912 17:24:35.643590 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.643760 kubelet[3290]: E0912 17:24:35.643603 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:35.643956 kubelet[3290]: E0912 17:24:35.643931 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.643956 kubelet[3290]: W0912 17:24:35.643943 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.644060 kubelet[3290]: E0912 17:24:35.643992 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.644184 kubelet[3290]: E0912 17:24:35.644170 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.644184 kubelet[3290]: W0912 17:24:35.644181 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.644301 kubelet[3290]: E0912 17:24:35.644242 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:35.644538 kubelet[3290]: E0912 17:24:35.644510 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.644538 kubelet[3290]: W0912 17:24:35.644532 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.644829 kubelet[3290]: E0912 17:24:35.644756 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.644933 kubelet[3290]: E0912 17:24:35.644917 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.644933 kubelet[3290]: W0912 17:24:35.644930 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.645035 kubelet[3290]: E0912 17:24:35.644980 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:35.645369 kubelet[3290]: E0912 17:24:35.645354 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.645369 kubelet[3290]: W0912 17:24:35.645367 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.645510 kubelet[3290]: E0912 17:24:35.645426 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.645815 kubelet[3290]: E0912 17:24:35.645708 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.645939 kubelet[3290]: W0912 17:24:35.645816 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.645939 kubelet[3290]: E0912 17:24:35.645870 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:35.646836 kubelet[3290]: E0912 17:24:35.646750 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.646836 kubelet[3290]: W0912 17:24:35.646765 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.647699 kubelet[3290]: E0912 17:24:35.647620 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.647853 kubelet[3290]: W0912 17:24:35.647633 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.648685 kubelet[3290]: E0912 17:24:35.648666 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.648904 kubelet[3290]: W0912 17:24:35.648826 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.649093 kubelet[3290]: E0912 17:24:35.649082 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.649179 kubelet[3290]: W0912 17:24:35.649169 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.649443 kubelet[3290]: E0912 17:24:35.649383 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON 
input Sep 12 17:24:35.649443 kubelet[3290]: W0912 17:24:35.649393 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.656769 kubelet[3290]: E0912 17:24:35.656568 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.656769 kubelet[3290]: W0912 17:24:35.656587 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.656769 kubelet[3290]: E0912 17:24:35.656601 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.656769 kubelet[3290]: E0912 17:24:35.656622 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.658195 kubelet[3290]: E0912 17:24:35.657877 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.658195 kubelet[3290]: W0912 17:24:35.657890 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.658195 kubelet[3290]: E0912 17:24:35.657900 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:35.658195 kubelet[3290]: E0912 17:24:35.657920 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.659015 kubelet[3290]: E0912 17:24:35.658852 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.659015 kubelet[3290]: E0912 17:24:35.658866 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.659015 kubelet[3290]: E0912 17:24:35.658875 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.659015 kubelet[3290]: I0912 17:24:35.658886 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c4043ee3-d7ce-4761-b52a-0afce638c51a-registration-dir\") pod \"csi-node-driver-9s9cm\" (UID: \"c4043ee3-d7ce-4761-b52a-0afce638c51a\") " pod="calico-system/csi-node-driver-9s9cm" Sep 12 17:24:35.659261 kubelet[3290]: E0912 17:24:35.659248 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.659405 kubelet[3290]: W0912 17:24:35.659391 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.659565 kubelet[3290]: E0912 17:24:35.659552 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, 
skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.660233 kubelet[3290]: E0912 17:24:35.660205 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.660233 kubelet[3290]: W0912 17:24:35.660219 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.660593 kubelet[3290]: E0912 17:24:35.660494 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.661201 kubelet[3290]: E0912 17:24:35.661061 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.661201 kubelet[3290]: W0912 17:24:35.661074 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.661201 kubelet[3290]: E0912 17:24:35.661092 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:35.661741 kubelet[3290]: E0912 17:24:35.661475 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.661824 kubelet[3290]: W0912 17:24:35.661803 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.662002 kubelet[3290]: E0912 17:24:35.661931 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.663665 kubelet[3290]: E0912 17:24:35.663352 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.663665 kubelet[3290]: W0912 17:24:35.663367 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.663665 kubelet[3290]: E0912 17:24:35.663556 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.663665 kubelet[3290]: W0912 17:24:35.663564 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.663879 kubelet[3290]: E0912 17:24:35.663866 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:35.664207 kubelet[3290]: E0912 17:24:35.663948 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.664279 kubelet[3290]: W0912 17:24:35.664267 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.665134 kubelet[3290]: E0912 17:24:35.664714 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.665299 kubelet[3290]: W0912 17:24:35.665225 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.665454 kubelet[3290]: E0912 17:24:35.665112 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.665454 kubelet[3290]: E0912 17:24:35.663953 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.665454 kubelet[3290]: E0912 17:24:35.665395 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:35.665669 kubelet[3290]: E0912 17:24:35.665659 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.665797 kubelet[3290]: W0912 17:24:35.665743 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.665797 kubelet[3290]: E0912 17:24:35.665762 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.666013 kubelet[3290]: E0912 17:24:35.666002 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.666111 kubelet[3290]: W0912 17:24:35.666091 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.666236 kubelet[3290]: E0912 17:24:35.666224 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:35.667671 kubelet[3290]: E0912 17:24:35.667646 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.667671 kubelet[3290]: W0912 17:24:35.667659 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.667804 kubelet[3290]: E0912 17:24:35.667760 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.668332 kubelet[3290]: E0912 17:24:35.668187 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.668332 kubelet[3290]: W0912 17:24:35.668203 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.668332 kubelet[3290]: E0912 17:24:35.668278 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:35.668483 kubelet[3290]: E0912 17:24:35.668462 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.668483 kubelet[3290]: W0912 17:24:35.668471 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.668665 kubelet[3290]: E0912 17:24:35.668613 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.669119 containerd[1876]: time="2025-09-12T17:24:35.669093248Z" level=info msg="connecting to shim 42cb651e7e310d92f95cbf7b83744a6a8c9091d69806bd9e71a02098b1a321bd" address="unix:///run/containerd/s/e37b57e774bb6a95667f025d6e420be02edb7e05661806153be72597f0c45db1" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:24:35.669389 kubelet[3290]: E0912 17:24:35.669241 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.669389 kubelet[3290]: W0912 17:24:35.669255 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.669389 kubelet[3290]: E0912 17:24:35.669265 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:35.670554 kubelet[3290]: E0912 17:24:35.670493 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.670554 kubelet[3290]: W0912 17:24:35.670507 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.670554 kubelet[3290]: E0912 17:24:35.670528 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.679612 kubelet[3290]: E0912 17:24:35.679599 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:35.679726 kubelet[3290]: W0912 17:24:35.679713 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:35.679857 kubelet[3290]: E0912 17:24:35.679775 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:35.701183 systemd[1]: Started cri-containerd-42cb651e7e310d92f95cbf7b83744a6a8c9091d69806bd9e71a02098b1a321bd.scope - libcontainer container 42cb651e7e310d92f95cbf7b83744a6a8c9091d69806bd9e71a02098b1a321bd. 
Sep 12 17:24:35.738746 containerd[1876]: time="2025-09-12T17:24:35.738715467Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5dfdb7ff87-zl6cx,Uid:c8077ec5-da6b-4442-9025-a77afdd4e448,Namespace:calico-system,Attempt:0,} returns sandbox id \"42cb651e7e310d92f95cbf7b83744a6a8c9091d69806bd9e71a02098b1a321bd\""
Sep 12 17:24:35.740255 containerd[1876]: time="2025-09-12T17:24:35.740241600Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 12 17:24:35.744931 containerd[1876]: time="2025-09-12T17:24:35.744874019Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jcfnq,Uid:19c17f46-3f73-483c-8fac-01ab1bcd015e,Namespace:calico-system,Attempt:0,}"
Sep 12 17:24:35.767406 kubelet[3290]: E0912 17:24:35.767363 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.767406 kubelet[3290]: W0912 17:24:35.767378 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.767406 kubelet[3290]: E0912 17:24:35.767392 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.767751 kubelet[3290]: E0912 17:24:35.767736 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.767904 kubelet[3290]: W0912 17:24:35.767847 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.767904 kubelet[3290]: E0912 17:24:35.767867 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.768119 kubelet[3290]: E0912 17:24:35.768096 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.768119 kubelet[3290]: W0912 17:24:35.768108 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.768231 kubelet[3290]: E0912 17:24:35.768196 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.768457 kubelet[3290]: E0912 17:24:35.768426 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.768457 kubelet[3290]: W0912 17:24:35.768445 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.768586 kubelet[3290]: E0912 17:24:35.768544 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.769142 kubelet[3290]: E0912 17:24:35.768759 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.769142 kubelet[3290]: W0912 17:24:35.768767 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.769142 kubelet[3290]: E0912 17:24:35.768776 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.769142 kubelet[3290]: E0912 17:24:35.768919 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.769142 kubelet[3290]: W0912 17:24:35.768925 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.769142 kubelet[3290]: E0912 17:24:35.768935 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.769413 kubelet[3290]: E0912 17:24:35.769068 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.769413 kubelet[3290]: W0912 17:24:35.769315 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.769413 kubelet[3290]: E0912 17:24:35.769328 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.769602 kubelet[3290]: E0912 17:24:35.769591 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.769754 kubelet[3290]: W0912 17:24:35.769675 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.769754 kubelet[3290]: E0912 17:24:35.769694 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.770160 kubelet[3290]: E0912 17:24:35.770136 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.770160 kubelet[3290]: W0912 17:24:35.770148 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.770443 kubelet[3290]: E0912 17:24:35.770370 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.770567 kubelet[3290]: E0912 17:24:35.770546 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.770567 kubelet[3290]: W0912 17:24:35.770556 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.770737 kubelet[3290]: E0912 17:24:35.770712 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.770830 kubelet[3290]: E0912 17:24:35.770823 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.770932 kubelet[3290]: W0912 17:24:35.770872 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.770989 kubelet[3290]: E0912 17:24:35.770976 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.771186 kubelet[3290]: E0912 17:24:35.771165 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.771186 kubelet[3290]: W0912 17:24:35.771175 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.771375 kubelet[3290]: E0912 17:24:35.771344 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.771481 kubelet[3290]: E0912 17:24:35.771463 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.771481 kubelet[3290]: W0912 17:24:35.771472 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.771688 kubelet[3290]: E0912 17:24:35.771662 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.771838 kubelet[3290]: E0912 17:24:35.771829 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.771925 kubelet[3290]: W0912 17:24:35.771881 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.771990 kubelet[3290]: E0912 17:24:35.771964 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.772152 kubelet[3290]: E0912 17:24:35.772132 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.772152 kubelet[3290]: W0912 17:24:35.772142 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.772312 kubelet[3290]: E0912 17:24:35.772284 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.772426 kubelet[3290]: E0912 17:24:35.772406 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.772426 kubelet[3290]: W0912 17:24:35.772415 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.772570 kubelet[3290]: E0912 17:24:35.772553 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.773459 kubelet[3290]: E0912 17:24:35.772702 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.773459 kubelet[3290]: W0912 17:24:35.772710 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.773459 kubelet[3290]: E0912 17:24:35.772720 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.773459 kubelet[3290]: E0912 17:24:35.772822 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.773459 kubelet[3290]: W0912 17:24:35.772827 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.773459 kubelet[3290]: E0912 17:24:35.772845 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.773459 kubelet[3290]: E0912 17:24:35.772988 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.773459 kubelet[3290]: W0912 17:24:35.772994 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.773459 kubelet[3290]: E0912 17:24:35.773004 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.773459 kubelet[3290]: E0912 17:24:35.773124 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.773790 kubelet[3290]: W0912 17:24:35.773129 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.773790 kubelet[3290]: E0912 17:24:35.773140 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.773790 kubelet[3290]: E0912 17:24:35.773271 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.773790 kubelet[3290]: W0912 17:24:35.773279 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.773790 kubelet[3290]: E0912 17:24:35.773288 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.773790 kubelet[3290]: E0912 17:24:35.773546 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.773790 kubelet[3290]: W0912 17:24:35.773556 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.773790 kubelet[3290]: E0912 17:24:35.773572 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.774065 kubelet[3290]: E0912 17:24:35.774042 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.774065 kubelet[3290]: W0912 17:24:35.774053 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.774227 kubelet[3290]: E0912 17:24:35.774202 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.774453 kubelet[3290]: E0912 17:24:35.774353 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.774453 kubelet[3290]: W0912 17:24:35.774363 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.774453 kubelet[3290]: E0912 17:24:35.774371 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.774851 kubelet[3290]: E0912 17:24:35.774839 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.774997 kubelet[3290]: W0912 17:24:35.774892 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.774997 kubelet[3290]: E0912 17:24:35.774907 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.782130 kubelet[3290]: E0912 17:24:35.782087 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:24:35.782130 kubelet[3290]: W0912 17:24:35.782099 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:24:35.782130 kubelet[3290]: E0912 17:24:35.782109 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:24:35.788203 containerd[1876]: time="2025-09-12T17:24:35.787846421Z" level=info msg="connecting to shim a8756d1a491310fea37d51d7bdaefea1ce9d60c7867f9ab592288bb11bbd1827" address="unix:///run/containerd/s/6f0b60d28eea817d8cc41ee87c0124c9684bb74e660add9c33a8b6925d2363de" namespace=k8s.io protocol=ttrpc version=3
Sep 12 17:24:35.803937 systemd[1]: Started cri-containerd-a8756d1a491310fea37d51d7bdaefea1ce9d60c7867f9ab592288bb11bbd1827.scope - libcontainer container a8756d1a491310fea37d51d7bdaefea1ce9d60c7867f9ab592288bb11bbd1827.
Sep 12 17:24:35.823701 containerd[1876]: time="2025-09-12T17:24:35.823628453Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jcfnq,Uid:19c17f46-3f73-483c-8fac-01ab1bcd015e,Namespace:calico-system,Attempt:0,} returns sandbox id \"a8756d1a491310fea37d51d7bdaefea1ce9d60c7867f9ab592288bb11bbd1827\""
Sep 12 17:24:37.236561 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3701028162.mount: Deactivated successfully.
Sep 12 17:24:37.585905 containerd[1876]: time="2025-09-12T17:24:37.585713820Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:37.588246 containerd[1876]: time="2025-09-12T17:24:37.588221766Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775"
Sep 12 17:24:37.594073 containerd[1876]: time="2025-09-12T17:24:37.594039352Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:37.601538 containerd[1876]: time="2025-09-12T17:24:37.601068358Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:37.601538 containerd[1876]: time="2025-09-12T17:24:37.601433471Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 1.861040104s"
Sep 12 17:24:37.601538 containerd[1876]: time="2025-09-12T17:24:37.601449542Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\""
Sep 12 17:24:37.602417 containerd[1876]: time="2025-09-12T17:24:37.602397746Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 12 17:24:37.614244 containerd[1876]: time="2025-09-12T17:24:37.614206228Z" level=info msg="CreateContainer within sandbox \"42cb651e7e310d92f95cbf7b83744a6a8c9091d69806bd9e71a02098b1a321bd\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 12 17:24:37.637392 containerd[1876]: time="2025-09-12T17:24:37.636865818Z" level=info msg="Container 0929b15ebd7a17b4805d1ed1e293b0dd59163013a8c46c927af0199838770cf4: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:24:37.653406 containerd[1876]: time="2025-09-12T17:24:37.653369932Z" level=info msg="CreateContainer within sandbox \"42cb651e7e310d92f95cbf7b83744a6a8c9091d69806bd9e71a02098b1a321bd\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"0929b15ebd7a17b4805d1ed1e293b0dd59163013a8c46c927af0199838770cf4\""
Sep 12 17:24:37.654528 containerd[1876]: time="2025-09-12T17:24:37.654473303Z" level=info msg="StartContainer for \"0929b15ebd7a17b4805d1ed1e293b0dd59163013a8c46c927af0199838770cf4\""
Sep 12 17:24:37.655350 containerd[1876]: time="2025-09-12T17:24:37.655320489Z" level=info msg="connecting to shim 0929b15ebd7a17b4805d1ed1e293b0dd59163013a8c46c927af0199838770cf4" address="unix:///run/containerd/s/e37b57e774bb6a95667f025d6e420be02edb7e05661806153be72597f0c45db1" protocol=ttrpc version=3
Sep 12 17:24:37.675686 systemd[1]: Started cri-containerd-0929b15ebd7a17b4805d1ed1e293b0dd59163013a8c46c927af0199838770cf4.scope - libcontainer container 0929b15ebd7a17b4805d1ed1e293b0dd59163013a8c46c927af0199838770cf4.
Sep 12 17:24:37.705270 containerd[1876]: time="2025-09-12T17:24:37.705062256Z" level=info msg="StartContainer for \"0929b15ebd7a17b4805d1ed1e293b0dd59163013a8c46c927af0199838770cf4\" returns successfully" Sep 12 17:24:37.941779 kubelet[3290]: E0912 17:24:37.941500 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9s9cm" podUID="c4043ee3-d7ce-4761-b52a-0afce638c51a" Sep 12 17:24:38.032534 kubelet[3290]: E0912 17:24:38.032427 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.032534 kubelet[3290]: W0912 17:24:38.032444 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.032534 kubelet[3290]: E0912 17:24:38.032457 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:38.032803 kubelet[3290]: E0912 17:24:38.032715 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.032803 kubelet[3290]: W0912 17:24:38.032726 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.032803 kubelet[3290]: E0912 17:24:38.032735 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:38.033003 kubelet[3290]: E0912 17:24:38.032924 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.033003 kubelet[3290]: W0912 17:24:38.032933 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.033003 kubelet[3290]: E0912 17:24:38.032942 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:38.033136 kubelet[3290]: E0912 17:24:38.033126 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.033184 kubelet[3290]: W0912 17:24:38.033175 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.033303 kubelet[3290]: E0912 17:24:38.033219 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:38.033399 kubelet[3290]: E0912 17:24:38.033389 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.033446 kubelet[3290]: W0912 17:24:38.033437 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.033488 kubelet[3290]: E0912 17:24:38.033478 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:38.033726 kubelet[3290]: E0912 17:24:38.033656 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.033726 kubelet[3290]: W0912 17:24:38.033665 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.033726 kubelet[3290]: E0912 17:24:38.033673 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:38.033870 kubelet[3290]: E0912 17:24:38.033861 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.033918 kubelet[3290]: W0912 17:24:38.033910 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.033966 kubelet[3290]: E0912 17:24:38.033955 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:38.034209 kubelet[3290]: E0912 17:24:38.034126 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.034209 kubelet[3290]: W0912 17:24:38.034135 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.034209 kubelet[3290]: E0912 17:24:38.034143 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:38.034348 kubelet[3290]: E0912 17:24:38.034339 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.034396 kubelet[3290]: W0912 17:24:38.034387 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.034500 kubelet[3290]: E0912 17:24:38.034427 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:38.034612 kubelet[3290]: E0912 17:24:38.034602 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.034736 kubelet[3290]: W0912 17:24:38.034653 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.034736 kubelet[3290]: E0912 17:24:38.034665 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:38.034848 kubelet[3290]: E0912 17:24:38.034838 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.034978 kubelet[3290]: W0912 17:24:38.034891 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.034978 kubelet[3290]: E0912 17:24:38.034903 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:38.035164 kubelet[3290]: E0912 17:24:38.035083 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.035164 kubelet[3290]: W0912 17:24:38.035092 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.035164 kubelet[3290]: E0912 17:24:38.035100 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:38.035355 kubelet[3290]: E0912 17:24:38.035280 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.035355 kubelet[3290]: W0912 17:24:38.035288 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.035355 kubelet[3290]: E0912 17:24:38.035296 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:38.035532 kubelet[3290]: E0912 17:24:38.035480 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.035532 kubelet[3290]: W0912 17:24:38.035490 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.035532 kubelet[3290]: E0912 17:24:38.035498 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:38.035734 kubelet[3290]: E0912 17:24:38.035725 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.035839 kubelet[3290]: W0912 17:24:38.035789 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.035839 kubelet[3290]: E0912 17:24:38.035804 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:24:38.084449 kubelet[3290]: E0912 17:24:38.084429 3290 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:24:38.084449 kubelet[3290]: W0912 17:24:38.084444 3290 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:24:38.084544 kubelet[3290]: E0912 17:24:38.084454 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:39.009403 kubelet[3290]: I0912 17:24:39.009368 3290 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:24:39.095888 kubelet[3290]: E0912 17:24:39.095857 3290 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:24:39.097995 containerd[1876]: time="2025-09-12T17:24:39.097591140Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:39.100145 containerd[1876]: time="2025-09-12T17:24:39.100125237Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 12 17:24:39.103269 containerd[1876]: time="2025-09-12T17:24:39.103153422Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:39.108931 containerd[1876]: time="2025-09-12T17:24:39.108891181Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:39.109958 containerd[1876]: time="2025-09-12T17:24:39.109886719Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.507379171s" Sep 12 17:24:39.109958 containerd[1876]: time="2025-09-12T17:24:39.109911565Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 12 17:24:39.112558 containerd[1876]: time="2025-09-12T17:24:39.111684974Z" level=info msg="CreateContainer within sandbox \"a8756d1a491310fea37d51d7bdaefea1ce9d60c7867f9ab592288bb11bbd1827\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 17:24:39.129691 containerd[1876]: time="2025-09-12T17:24:39.129663795Z" level=info msg="Container e6b4e602e147946b40e6311b85f6fce5a17105f82c9b40b630e562c291019d69: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:24:39.134327 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1954270312.mount: Deactivated successfully. Sep 12 17:24:39.148681 containerd[1876]: time="2025-09-12T17:24:39.148651064Z" level=info msg="CreateContainer within sandbox \"a8756d1a491310fea37d51d7bdaefea1ce9d60c7867f9ab592288bb11bbd1827\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e6b4e602e147946b40e6311b85f6fce5a17105f82c9b40b630e562c291019d69\"" Sep 12 17:24:39.149248 containerd[1876]: time="2025-09-12T17:24:39.149130506Z" level=info msg="StartContainer for \"e6b4e602e147946b40e6311b85f6fce5a17105f82c9b40b630e562c291019d69\"" Sep 12 17:24:39.150453 containerd[1876]: time="2025-09-12T17:24:39.150395939Z" level=info msg="connecting to shim e6b4e602e147946b40e6311b85f6fce5a17105f82c9b40b630e562c291019d69" address="unix:///run/containerd/s/6f0b60d28eea817d8cc41ee87c0124c9684bb74e660add9c33a8b6925d2363de" protocol=ttrpc version=3 Sep 12 17:24:39.174640 systemd[1]: Started cri-containerd-e6b4e602e147946b40e6311b85f6fce5a17105f82c9b40b630e562c291019d69.scope - libcontainer container e6b4e602e147946b40e6311b85f6fce5a17105f82c9b40b630e562c291019d69. Sep 12 17:24:39.201300 containerd[1876]: time="2025-09-12T17:24:39.201025090Z" level=info msg="StartContainer for \"e6b4e602e147946b40e6311b85f6fce5a17105f82c9b40b630e562c291019d69\" returns successfully" Sep 12 17:24:39.211550 systemd[1]: cri-containerd-e6b4e602e147946b40e6311b85f6fce5a17105f82c9b40b630e562c291019d69.scope: Deactivated successfully. 
Sep 12 17:24:39.216278 containerd[1876]: time="2025-09-12T17:24:39.216188360Z" level=info msg="received exit event container_id:\"e6b4e602e147946b40e6311b85f6fce5a17105f82c9b40b630e562c291019d69\" id:\"e6b4e602e147946b40e6311b85f6fce5a17105f82c9b40b630e562c291019d69\" pid:4038 exited_at:{seconds:1757697879 nanos:215886619}" Sep 12 17:24:39.216625 containerd[1876]: time="2025-09-12T17:24:39.216589479Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e6b4e602e147946b40e6311b85f6fce5a17105f82c9b40b630e562c291019d69\" id:\"e6b4e602e147946b40e6311b85f6fce5a17105f82c9b40b630e562c291019d69\" pid:4038 exited_at:{seconds:1757697879 nanos:215886619}" Sep 12 17:24:39.231614 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e6b4e602e147946b40e6311b85f6fce5a17105f82c9b40b630e562c291019d69-rootfs.mount: Deactivated successfully. Sep 12 17:24:39.941539 kubelet[3290]: E0912 17:24:39.941493 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9s9cm" podUID="c4043ee3-d7ce-4761-b52a-0afce638c51a" Sep 12 17:24:40.036613 kubelet[3290]: I0912 17:24:40.036566 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5dfdb7ff87-zl6cx" podStartSLOduration=3.17421309 podStartE2EDuration="5.036553443s" podCreationTimestamp="2025-09-12 17:24:35 +0000 UTC" firstStartedPulling="2025-09-12 17:24:35.739793946 +0000 UTC m=+16.943016461" lastFinishedPulling="2025-09-12 17:24:37.602134307 +0000 UTC m=+18.805356814" observedRunningTime="2025-09-12 17:24:38.020438754 +0000 UTC m=+19.223661277" watchObservedRunningTime="2025-09-12 17:24:40.036553443 +0000 UTC m=+21.239775950" Sep 12 17:24:41.020775 containerd[1876]: time="2025-09-12T17:24:41.020549801Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 17:24:41.941478 kubelet[3290]: E0912 17:24:41.941436 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9s9cm" podUID="c4043ee3-d7ce-4761-b52a-0afce638c51a" Sep 12 17:24:42.142533 kubelet[3290]: I0912 17:24:42.142424 3290 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:24:43.397422 containerd[1876]: time="2025-09-12T17:24:43.397033487Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:43.404016 containerd[1876]: time="2025-09-12T17:24:43.403992457Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 12 17:24:43.409557 containerd[1876]: time="2025-09-12T17:24:43.409532213Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:43.413381 containerd[1876]: time="2025-09-12T17:24:43.413355084Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:43.413912 containerd[1876]: time="2025-09-12T17:24:43.413643306Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 2.393065707s" Sep 12 17:24:43.413912 containerd[1876]: 
time="2025-09-12T17:24:43.413666609Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 12 17:24:43.416141 containerd[1876]: time="2025-09-12T17:24:43.416115839Z" level=info msg="CreateContainer within sandbox \"a8756d1a491310fea37d51d7bdaefea1ce9d60c7867f9ab592288bb11bbd1827\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 17:24:43.433534 containerd[1876]: time="2025-09-12T17:24:43.433165470Z" level=info msg="Container bd3f48d4dc8257efe77b0d50888a4ec9536ece10238dce292674267454d8fad2: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:24:43.449346 containerd[1876]: time="2025-09-12T17:24:43.449313766Z" level=info msg="CreateContainer within sandbox \"a8756d1a491310fea37d51d7bdaefea1ce9d60c7867f9ab592288bb11bbd1827\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"bd3f48d4dc8257efe77b0d50888a4ec9536ece10238dce292674267454d8fad2\"" Sep 12 17:24:43.450501 containerd[1876]: time="2025-09-12T17:24:43.450473613Z" level=info msg="StartContainer for \"bd3f48d4dc8257efe77b0d50888a4ec9536ece10238dce292674267454d8fad2\"" Sep 12 17:24:43.451606 containerd[1876]: time="2025-09-12T17:24:43.451582976Z" level=info msg="connecting to shim bd3f48d4dc8257efe77b0d50888a4ec9536ece10238dce292674267454d8fad2" address="unix:///run/containerd/s/6f0b60d28eea817d8cc41ee87c0124c9684bb74e660add9c33a8b6925d2363de" protocol=ttrpc version=3 Sep 12 17:24:43.471631 systemd[1]: Started cri-containerd-bd3f48d4dc8257efe77b0d50888a4ec9536ece10238dce292674267454d8fad2.scope - libcontainer container bd3f48d4dc8257efe77b0d50888a4ec9536ece10238dce292674267454d8fad2. 
Sep 12 17:24:43.500616 containerd[1876]: time="2025-09-12T17:24:43.500591397Z" level=info msg="StartContainer for \"bd3f48d4dc8257efe77b0d50888a4ec9536ece10238dce292674267454d8fad2\" returns successfully" Sep 12 17:24:43.941460 kubelet[3290]: E0912 17:24:43.941323 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9s9cm" podUID="c4043ee3-d7ce-4761-b52a-0afce638c51a" Sep 12 17:24:44.871320 containerd[1876]: time="2025-09-12T17:24:44.871283274Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 17:24:44.873300 systemd[1]: cri-containerd-bd3f48d4dc8257efe77b0d50888a4ec9536ece10238dce292674267454d8fad2.scope: Deactivated successfully. Sep 12 17:24:44.873863 systemd[1]: cri-containerd-bd3f48d4dc8257efe77b0d50888a4ec9536ece10238dce292674267454d8fad2.scope: Consumed 315ms CPU time, 184.9M memory peak, 165.8M written to disk. 
Sep 12 17:24:44.875153 containerd[1876]: time="2025-09-12T17:24:44.874857478Z" level=info msg="received exit event container_id:\"bd3f48d4dc8257efe77b0d50888a4ec9536ece10238dce292674267454d8fad2\" id:\"bd3f48d4dc8257efe77b0d50888a4ec9536ece10238dce292674267454d8fad2\" pid:4098 exited_at:{seconds:1757697884 nanos:874141768}" Sep 12 17:24:44.875749 containerd[1876]: time="2025-09-12T17:24:44.875729450Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bd3f48d4dc8257efe77b0d50888a4ec9536ece10238dce292674267454d8fad2\" id:\"bd3f48d4dc8257efe77b0d50888a4ec9536ece10238dce292674267454d8fad2\" pid:4098 exited_at:{seconds:1757697884 nanos:874141768}" Sep 12 17:24:44.892315 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bd3f48d4dc8257efe77b0d50888a4ec9536ece10238dce292674267454d8fad2-rootfs.mount: Deactivated successfully. Sep 12 17:24:44.938924 kubelet[3290]: I0912 17:24:44.938892 3290 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 12 17:24:45.143731 kubelet[3290]: I0912 17:24:45.027407 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t42z2\" (UniqueName: \"kubernetes.io/projected/dc6ee434-cb48-4791-b62b-b85df19a6240-kube-api-access-t42z2\") pod \"goldmane-7988f88666-x2xqz\" (UID: \"dc6ee434-cb48-4791-b62b-b85df19a6240\") " pod="calico-system/goldmane-7988f88666-x2xqz" Sep 12 17:24:45.143731 kubelet[3290]: I0912 17:24:45.027431 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6063725a-5f6c-46c2-a715-57fd6390cb34-calico-apiserver-certs\") pod \"calico-apiserver-64d4864b98-lq477\" (UID: \"6063725a-5f6c-46c2-a715-57fd6390cb34\") " pod="calico-apiserver/calico-apiserver-64d4864b98-lq477" Sep 12 17:24:45.143731 kubelet[3290]: I0912 17:24:45.027442 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-88gcs\" (UniqueName: \"kubernetes.io/projected/3cad8a41-7137-4312-9228-3e6bff3d94f5-kube-api-access-88gcs\") pod \"coredns-7c65d6cfc9-qnr8m\" (UID: \"3cad8a41-7137-4312-9228-3e6bff3d94f5\") " pod="kube-system/coredns-7c65d6cfc9-qnr8m" Sep 12 17:24:45.143731 kubelet[3290]: I0912 17:24:45.027455 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw96z\" (UniqueName: \"kubernetes.io/projected/6063725a-5f6c-46c2-a715-57fd6390cb34-kube-api-access-jw96z\") pod \"calico-apiserver-64d4864b98-lq477\" (UID: \"6063725a-5f6c-46c2-a715-57fd6390cb34\") " pod="calico-apiserver/calico-apiserver-64d4864b98-lq477" Sep 12 17:24:45.143731 kubelet[3290]: I0912 17:24:45.027466 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19c62e4a-5240-4d3f-8cb7-6612cf508617-config-volume\") pod \"coredns-7c65d6cfc9-fn7z8\" (UID: \"19c62e4a-5240-4d3f-8cb7-6612cf508617\") " pod="kube-system/coredns-7c65d6cfc9-fn7z8" Sep 12 17:24:44.976834 systemd[1]: Created slice kubepods-burstable-pod19c62e4a_5240_4d3f_8cb7_6612cf508617.slice - libcontainer container kubepods-burstable-pod19c62e4a_5240_4d3f_8cb7_6612cf508617.slice. 
Sep 12 17:24:45.144120 kubelet[3290]: I0912 17:24:45.027478 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4fdp\" (UniqueName: \"kubernetes.io/projected/19c62e4a-5240-4d3f-8cb7-6612cf508617-kube-api-access-m4fdp\") pod \"coredns-7c65d6cfc9-fn7z8\" (UID: \"19c62e4a-5240-4d3f-8cb7-6612cf508617\") " pod="kube-system/coredns-7c65d6cfc9-fn7z8" Sep 12 17:24:45.144120 kubelet[3290]: I0912 17:24:45.027489 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc6ee434-cb48-4791-b62b-b85df19a6240-goldmane-ca-bundle\") pod \"goldmane-7988f88666-x2xqz\" (UID: \"dc6ee434-cb48-4791-b62b-b85df19a6240\") " pod="calico-system/goldmane-7988f88666-x2xqz" Sep 12 17:24:45.144120 kubelet[3290]: I0912 17:24:45.027500 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/dc6ee434-cb48-4791-b62b-b85df19a6240-goldmane-key-pair\") pod \"goldmane-7988f88666-x2xqz\" (UID: \"dc6ee434-cb48-4791-b62b-b85df19a6240\") " pod="calico-system/goldmane-7988f88666-x2xqz" Sep 12 17:24:45.144120 kubelet[3290]: I0912 17:24:45.027509 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf55d22c-6583-49a5-840e-385d6703a230-whisker-ca-bundle\") pod \"whisker-5fd6ffdb49-f5wfw\" (UID: \"cf55d22c-6583-49a5-840e-385d6703a230\") " pod="calico-system/whisker-5fd6ffdb49-f5wfw" Sep 12 17:24:45.144120 kubelet[3290]: I0912 17:24:45.027528 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3cad8a41-7137-4312-9228-3e6bff3d94f5-config-volume\") pod \"coredns-7c65d6cfc9-qnr8m\" (UID: \"3cad8a41-7137-4312-9228-3e6bff3d94f5\") " 
pod="kube-system/coredns-7c65d6cfc9-qnr8m" Sep 12 17:24:44.991255 systemd[1]: Created slice kubepods-besteffort-podcf55d22c_6583_49a5_840e_385d6703a230.slice - libcontainer container kubepods-besteffort-podcf55d22c_6583_49a5_840e_385d6703a230.slice. Sep 12 17:24:45.144224 kubelet[3290]: I0912 17:24:45.027539 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cf55d22c-6583-49a5-840e-385d6703a230-whisker-backend-key-pair\") pod \"whisker-5fd6ffdb49-f5wfw\" (UID: \"cf55d22c-6583-49a5-840e-385d6703a230\") " pod="calico-system/whisker-5fd6ffdb49-f5wfw" Sep 12 17:24:45.144224 kubelet[3290]: I0912 17:24:45.027548 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vshf\" (UniqueName: \"kubernetes.io/projected/cf55d22c-6583-49a5-840e-385d6703a230-kube-api-access-4vshf\") pod \"whisker-5fd6ffdb49-f5wfw\" (UID: \"cf55d22c-6583-49a5-840e-385d6703a230\") " pod="calico-system/whisker-5fd6ffdb49-f5wfw" Sep 12 17:24:45.144224 kubelet[3290]: I0912 17:24:45.027559 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc6ee434-cb48-4791-b62b-b85df19a6240-config\") pod \"goldmane-7988f88666-x2xqz\" (UID: \"dc6ee434-cb48-4791-b62b-b85df19a6240\") " pod="calico-system/goldmane-7988f88666-x2xqz" Sep 12 17:24:45.144224 kubelet[3290]: I0912 17:24:45.127972 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmfbv\" (UniqueName: \"kubernetes.io/projected/a4c4cdec-52cb-4465-af8e-2767ba38a041-kube-api-access-zmfbv\") pod \"calico-apiserver-64d4864b98-zbsdh\" (UID: \"a4c4cdec-52cb-4465-af8e-2767ba38a041\") " pod="calico-apiserver/calico-apiserver-64d4864b98-zbsdh" Sep 12 17:24:45.144224 kubelet[3290]: I0912 17:24:45.128118 3290 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b05511d-a6ed-42cc-83cb-70388a48965a-tigera-ca-bundle\") pod \"calico-kube-controllers-774f5bcb5b-lmnmr\" (UID: \"1b05511d-a6ed-42cc-83cb-70388a48965a\") " pod="calico-system/calico-kube-controllers-774f5bcb5b-lmnmr" Sep 12 17:24:45.005489 systemd[1]: Created slice kubepods-besteffort-pod6063725a_5f6c_46c2_a715_57fd6390cb34.slice - libcontainer container kubepods-besteffort-pod6063725a_5f6c_46c2_a715_57fd6390cb34.slice. Sep 12 17:24:45.144326 kubelet[3290]: I0912 17:24:45.128297 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a4c4cdec-52cb-4465-af8e-2767ba38a041-calico-apiserver-certs\") pod \"calico-apiserver-64d4864b98-zbsdh\" (UID: \"a4c4cdec-52cb-4465-af8e-2767ba38a041\") " pod="calico-apiserver/calico-apiserver-64d4864b98-zbsdh" Sep 12 17:24:45.144326 kubelet[3290]: I0912 17:24:45.128982 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhkmv\" (UniqueName: \"kubernetes.io/projected/1b05511d-a6ed-42cc-83cb-70388a48965a-kube-api-access-rhkmv\") pod \"calico-kube-controllers-774f5bcb5b-lmnmr\" (UID: \"1b05511d-a6ed-42cc-83cb-70388a48965a\") " pod="calico-system/calico-kube-controllers-774f5bcb5b-lmnmr" Sep 12 17:24:45.012412 systemd[1]: Created slice kubepods-besteffort-poddc6ee434_cb48_4791_b62b_b85df19a6240.slice - libcontainer container kubepods-besteffort-poddc6ee434_cb48_4791_b62b_b85df19a6240.slice. Sep 12 17:24:45.020604 systemd[1]: Created slice kubepods-burstable-pod3cad8a41_7137_4312_9228_3e6bff3d94f5.slice - libcontainer container kubepods-burstable-pod3cad8a41_7137_4312_9228_3e6bff3d94f5.slice. 
Sep 12 17:24:45.024932 systemd[1]: Created slice kubepods-besteffort-pod1b05511d_a6ed_42cc_83cb_70388a48965a.slice - libcontainer container kubepods-besteffort-pod1b05511d_a6ed_42cc_83cb_70388a48965a.slice. Sep 12 17:24:45.031184 systemd[1]: Created slice kubepods-besteffort-poda4c4cdec_52cb_4465_af8e_2767ba38a041.slice - libcontainer container kubepods-besteffort-poda4c4cdec_52cb_4465_af8e_2767ba38a041.slice. Sep 12 17:24:45.446151 containerd[1876]: time="2025-09-12T17:24:45.446050330Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-fn7z8,Uid:19c62e4a-5240-4d3f-8cb7-6612cf508617,Namespace:kube-system,Attempt:0,}" Sep 12 17:24:45.451553 containerd[1876]: time="2025-09-12T17:24:45.451481040Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5fd6ffdb49-f5wfw,Uid:cf55d22c-6583-49a5-840e-385d6703a230,Namespace:calico-system,Attempt:0,}" Sep 12 17:24:45.451616 containerd[1876]: time="2025-09-12T17:24:45.451606536Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-774f5bcb5b-lmnmr,Uid:1b05511d-a6ed-42cc-83cb-70388a48965a,Namespace:calico-system,Attempt:0,}" Sep 12 17:24:45.451947 containerd[1876]: time="2025-09-12T17:24:45.451926253Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-qnr8m,Uid:3cad8a41-7137-4312-9228-3e6bff3d94f5,Namespace:kube-system,Attempt:0,}" Sep 12 17:24:45.452032 containerd[1876]: time="2025-09-12T17:24:45.452016088Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-x2xqz,Uid:dc6ee434-cb48-4791-b62b-b85df19a6240,Namespace:calico-system,Attempt:0,}" Sep 12 17:24:45.452083 containerd[1876]: time="2025-09-12T17:24:45.452070653Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64d4864b98-lq477,Uid:6063725a-5f6c-46c2-a715-57fd6390cb34,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:24:45.463544 containerd[1876]: time="2025-09-12T17:24:45.463372781Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-64d4864b98-zbsdh,Uid:a4c4cdec-52cb-4465-af8e-2767ba38a041,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:24:45.491670 containerd[1876]: time="2025-09-12T17:24:45.491646550Z" level=error msg="Failed to destroy network for sandbox \"3aea577c58f8047fcdcbc8c4bdaffe67a495c72c27eecbaac588961935a58b4f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.500542 containerd[1876]: time="2025-09-12T17:24:45.500488544Z" level=error msg="Failed to destroy network for sandbox \"3a121ebb704328c524efa0f93ebe4dfaaa0ee135d687acf2243d5ea69d828b5d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.537790 containerd[1876]: time="2025-09-12T17:24:45.537579445Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5fd6ffdb49-f5wfw,Uid:cf55d22c-6583-49a5-840e-385d6703a230,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a121ebb704328c524efa0f93ebe4dfaaa0ee135d687acf2243d5ea69d828b5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.537790 containerd[1876]: time="2025-09-12T17:24:45.537653561Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-fn7z8,Uid:19c62e4a-5240-4d3f-8cb7-6612cf508617,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3aea577c58f8047fcdcbc8c4bdaffe67a495c72c27eecbaac588961935a58b4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.537917 kubelet[3290]: E0912 17:24:45.537773 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a121ebb704328c524efa0f93ebe4dfaaa0ee135d687acf2243d5ea69d828b5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.537917 kubelet[3290]: E0912 17:24:45.537840 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a121ebb704328c524efa0f93ebe4dfaaa0ee135d687acf2243d5ea69d828b5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5fd6ffdb49-f5wfw" Sep 12 17:24:45.537917 kubelet[3290]: E0912 17:24:45.537854 3290 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a121ebb704328c524efa0f93ebe4dfaaa0ee135d687acf2243d5ea69d828b5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5fd6ffdb49-f5wfw" Sep 12 17:24:45.537984 kubelet[3290]: E0912 17:24:45.537881 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5fd6ffdb49-f5wfw_calico-system(cf55d22c-6583-49a5-840e-385d6703a230)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5fd6ffdb49-f5wfw_calico-system(cf55d22c-6583-49a5-840e-385d6703a230)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"3a121ebb704328c524efa0f93ebe4dfaaa0ee135d687acf2243d5ea69d828b5d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5fd6ffdb49-f5wfw" podUID="cf55d22c-6583-49a5-840e-385d6703a230" Sep 12 17:24:45.538379 kubelet[3290]: E0912 17:24:45.537772 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3aea577c58f8047fcdcbc8c4bdaffe67a495c72c27eecbaac588961935a58b4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.538379 kubelet[3290]: E0912 17:24:45.538055 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3aea577c58f8047fcdcbc8c4bdaffe67a495c72c27eecbaac588961935a58b4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-fn7z8" Sep 12 17:24:45.538379 kubelet[3290]: E0912 17:24:45.538070 3290 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3aea577c58f8047fcdcbc8c4bdaffe67a495c72c27eecbaac588961935a58b4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-fn7z8" Sep 12 17:24:45.538467 kubelet[3290]: E0912 17:24:45.538096 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"coredns-7c65d6cfc9-fn7z8_kube-system(19c62e4a-5240-4d3f-8cb7-6612cf508617)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-fn7z8_kube-system(19c62e4a-5240-4d3f-8cb7-6612cf508617)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3aea577c58f8047fcdcbc8c4bdaffe67a495c72c27eecbaac588961935a58b4f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-fn7z8" podUID="19c62e4a-5240-4d3f-8cb7-6612cf508617" Sep 12 17:24:45.550823 containerd[1876]: time="2025-09-12T17:24:45.550750767Z" level=error msg="Failed to destroy network for sandbox \"041cdc9abd1bf48abbeed3774d89795b7d219e92ba27040af7503d5c95540e00\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.556627 containerd[1876]: time="2025-09-12T17:24:45.556598235Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-774f5bcb5b-lmnmr,Uid:1b05511d-a6ed-42cc-83cb-70388a48965a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"041cdc9abd1bf48abbeed3774d89795b7d219e92ba27040af7503d5c95540e00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.557124 kubelet[3290]: E0912 17:24:45.556874 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"041cdc9abd1bf48abbeed3774d89795b7d219e92ba27040af7503d5c95540e00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.557124 kubelet[3290]: E0912 17:24:45.556917 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"041cdc9abd1bf48abbeed3774d89795b7d219e92ba27040af7503d5c95540e00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-774f5bcb5b-lmnmr" Sep 12 17:24:45.557124 kubelet[3290]: E0912 17:24:45.556929 3290 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"041cdc9abd1bf48abbeed3774d89795b7d219e92ba27040af7503d5c95540e00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-774f5bcb5b-lmnmr" Sep 12 17:24:45.557222 kubelet[3290]: E0912 17:24:45.556953 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-774f5bcb5b-lmnmr_calico-system(1b05511d-a6ed-42cc-83cb-70388a48965a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-774f5bcb5b-lmnmr_calico-system(1b05511d-a6ed-42cc-83cb-70388a48965a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"041cdc9abd1bf48abbeed3774d89795b7d219e92ba27040af7503d5c95540e00\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-774f5bcb5b-lmnmr" podUID="1b05511d-a6ed-42cc-83cb-70388a48965a" Sep 12 17:24:45.580107 containerd[1876]: 
time="2025-09-12T17:24:45.580035019Z" level=error msg="Failed to destroy network for sandbox \"ad73fcc7e7853dc04d1243865863bc9a204a10d509d9e6e73c3e9b7cdcbb0200\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.583756 containerd[1876]: time="2025-09-12T17:24:45.583722488Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64d4864b98-lq477,Uid:6063725a-5f6c-46c2-a715-57fd6390cb34,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad73fcc7e7853dc04d1243865863bc9a204a10d509d9e6e73c3e9b7cdcbb0200\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.584317 kubelet[3290]: E0912 17:24:45.584289 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad73fcc7e7853dc04d1243865863bc9a204a10d509d9e6e73c3e9b7cdcbb0200\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.584609 kubelet[3290]: E0912 17:24:45.584505 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad73fcc7e7853dc04d1243865863bc9a204a10d509d9e6e73c3e9b7cdcbb0200\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64d4864b98-lq477" Sep 12 17:24:45.584609 kubelet[3290]: E0912 17:24:45.584574 3290 kuberuntime_manager.go:1170] "CreatePodSandbox 
for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad73fcc7e7853dc04d1243865863bc9a204a10d509d9e6e73c3e9b7cdcbb0200\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64d4864b98-lq477" Sep 12 17:24:45.584744 kubelet[3290]: E0912 17:24:45.584723 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64d4864b98-lq477_calico-apiserver(6063725a-5f6c-46c2-a715-57fd6390cb34)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64d4864b98-lq477_calico-apiserver(6063725a-5f6c-46c2-a715-57fd6390cb34)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ad73fcc7e7853dc04d1243865863bc9a204a10d509d9e6e73c3e9b7cdcbb0200\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64d4864b98-lq477" podUID="6063725a-5f6c-46c2-a715-57fd6390cb34" Sep 12 17:24:45.588472 containerd[1876]: time="2025-09-12T17:24:45.587794390Z" level=error msg="Failed to destroy network for sandbox \"96ce7e18829c19b49f4c35d0dd14d795a43f13c1800e47a14f7eb2e0a5739d71\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.590587 containerd[1876]: time="2025-09-12T17:24:45.590556210Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-qnr8m,Uid:3cad8a41-7137-4312-9228-3e6bff3d94f5,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"96ce7e18829c19b49f4c35d0dd14d795a43f13c1800e47a14f7eb2e0a5739d71\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.590798 kubelet[3290]: E0912 17:24:45.590727 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"96ce7e18829c19b49f4c35d0dd14d795a43f13c1800e47a14f7eb2e0a5739d71\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.590798 kubelet[3290]: E0912 17:24:45.590768 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"96ce7e18829c19b49f4c35d0dd14d795a43f13c1800e47a14f7eb2e0a5739d71\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-qnr8m" Sep 12 17:24:45.590798 kubelet[3290]: E0912 17:24:45.590780 3290 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"96ce7e18829c19b49f4c35d0dd14d795a43f13c1800e47a14f7eb2e0a5739d71\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-qnr8m" Sep 12 17:24:45.591165 kubelet[3290]: E0912 17:24:45.590917 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-qnr8m_kube-system(3cad8a41-7137-4312-9228-3e6bff3d94f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-7c65d6cfc9-qnr8m_kube-system(3cad8a41-7137-4312-9228-3e6bff3d94f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"96ce7e18829c19b49f4c35d0dd14d795a43f13c1800e47a14f7eb2e0a5739d71\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-qnr8m" podUID="3cad8a41-7137-4312-9228-3e6bff3d94f5" Sep 12 17:24:45.596024 containerd[1876]: time="2025-09-12T17:24:45.595999943Z" level=error msg="Failed to destroy network for sandbox \"7517e988e6ecea528d5048da5aa0d640f75cbb70b0e8df07cc2824d1f7f1641e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.596170 containerd[1876]: time="2025-09-12T17:24:45.596029061Z" level=error msg="Failed to destroy network for sandbox \"e4f2a16ea1cbf98a447ab1982401f04bc6f324e287639be42b83483b4b654786\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.599099 containerd[1876]: time="2025-09-12T17:24:45.599077464Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-x2xqz,Uid:dc6ee434-cb48-4791-b62b-b85df19a6240,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7517e988e6ecea528d5048da5aa0d640f75cbb70b0e8df07cc2824d1f7f1641e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.599383 kubelet[3290]: E0912 17:24:45.599361 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"7517e988e6ecea528d5048da5aa0d640f75cbb70b0e8df07cc2824d1f7f1641e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.599480 kubelet[3290]: E0912 17:24:45.599466 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7517e988e6ecea528d5048da5aa0d640f75cbb70b0e8df07cc2824d1f7f1641e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-x2xqz" Sep 12 17:24:45.599650 kubelet[3290]: E0912 17:24:45.599634 3290 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7517e988e6ecea528d5048da5aa0d640f75cbb70b0e8df07cc2824d1f7f1641e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-x2xqz" Sep 12 17:24:45.599768 kubelet[3290]: E0912 17:24:45.599752 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-x2xqz_calico-system(dc6ee434-cb48-4791-b62b-b85df19a6240)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-x2xqz_calico-system(dc6ee434-cb48-4791-b62b-b85df19a6240)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7517e988e6ecea528d5048da5aa0d640f75cbb70b0e8df07cc2824d1f7f1641e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-x2xqz" podUID="dc6ee434-cb48-4791-b62b-b85df19a6240" Sep 12 17:24:45.603995 containerd[1876]: time="2025-09-12T17:24:45.603971333Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64d4864b98-zbsdh,Uid:a4c4cdec-52cb-4465-af8e-2767ba38a041,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4f2a16ea1cbf98a447ab1982401f04bc6f324e287639be42b83483b4b654786\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.604264 kubelet[3290]: E0912 17:24:45.604199 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4f2a16ea1cbf98a447ab1982401f04bc6f324e287639be42b83483b4b654786\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.604264 kubelet[3290]: E0912 17:24:45.604233 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4f2a16ea1cbf98a447ab1982401f04bc6f324e287639be42b83483b4b654786\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64d4864b98-zbsdh" Sep 12 17:24:45.604264 kubelet[3290]: E0912 17:24:45.604244 3290 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4f2a16ea1cbf98a447ab1982401f04bc6f324e287639be42b83483b4b654786\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64d4864b98-zbsdh" Sep 12 17:24:45.604387 kubelet[3290]: E0912 17:24:45.604370 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64d4864b98-zbsdh_calico-apiserver(a4c4cdec-52cb-4465-af8e-2767ba38a041)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64d4864b98-zbsdh_calico-apiserver(a4c4cdec-52cb-4465-af8e-2767ba38a041)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e4f2a16ea1cbf98a447ab1982401f04bc6f324e287639be42b83483b4b654786\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64d4864b98-zbsdh" podUID="a4c4cdec-52cb-4465-af8e-2767ba38a041" Sep 12 17:24:45.945549 systemd[1]: Created slice kubepods-besteffort-podc4043ee3_d7ce_4761_b52a_0afce638c51a.slice - libcontainer container kubepods-besteffort-podc4043ee3_d7ce_4761_b52a_0afce638c51a.slice. Sep 12 17:24:45.947548 containerd[1876]: time="2025-09-12T17:24:45.947386189Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9s9cm,Uid:c4043ee3-d7ce-4761-b52a-0afce638c51a,Namespace:calico-system,Attempt:0,}" Sep 12 17:24:45.981582 containerd[1876]: time="2025-09-12T17:24:45.981545664Z" level=error msg="Failed to destroy network for sandbox \"fdc0ec01c51fdfecab5a1dd0965ebc933149339bc58c97c57b564b417f149d42\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.982787 systemd[1]: run-netns-cni\x2d0341510e\x2d5c8a\x2dceb4\x2d1cce\x2d21bd7f425c35.mount: Deactivated successfully. 
Sep 12 17:24:45.986749 containerd[1876]: time="2025-09-12T17:24:45.986719372Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9s9cm,Uid:c4043ee3-d7ce-4761-b52a-0afce638c51a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fdc0ec01c51fdfecab5a1dd0965ebc933149339bc58c97c57b564b417f149d42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.986930 kubelet[3290]: E0912 17:24:45.986901 3290 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fdc0ec01c51fdfecab5a1dd0965ebc933149339bc58c97c57b564b417f149d42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:24:45.986970 kubelet[3290]: E0912 17:24:45.986943 3290 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fdc0ec01c51fdfecab5a1dd0965ebc933149339bc58c97c57b564b417f149d42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9s9cm" Sep 12 17:24:45.986970 kubelet[3290]: E0912 17:24:45.986960 3290 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fdc0ec01c51fdfecab5a1dd0965ebc933149339bc58c97c57b564b417f149d42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9s9cm" 
Sep 12 17:24:45.987032 kubelet[3290]: E0912 17:24:45.987005 3290 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9s9cm_calico-system(c4043ee3-d7ce-4761-b52a-0afce638c51a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9s9cm_calico-system(c4043ee3-d7ce-4761-b52a-0afce638c51a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fdc0ec01c51fdfecab5a1dd0965ebc933149339bc58c97c57b564b417f149d42\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9s9cm" podUID="c4043ee3-d7ce-4761-b52a-0afce638c51a" Sep 12 17:24:46.034281 containerd[1876]: time="2025-09-12T17:24:46.034262364Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 17:24:49.610655 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1048022443.mount: Deactivated successfully. 
Sep 12 17:24:50.163781 containerd[1876]: time="2025-09-12T17:24:50.163732186Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:50.171300 containerd[1876]: time="2025-09-12T17:24:50.171266507Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 12 17:24:50.174472 containerd[1876]: time="2025-09-12T17:24:50.173771862Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:50.196243 containerd[1876]: time="2025-09-12T17:24:50.196218568Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:50.196684 containerd[1876]: time="2025-09-12T17:24:50.196658654Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 4.161141261s" Sep 12 17:24:50.196735 containerd[1876]: time="2025-09-12T17:24:50.196685733Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 12 17:24:50.210056 containerd[1876]: time="2025-09-12T17:24:50.209951921Z" level=info msg="CreateContainer within sandbox \"a8756d1a491310fea37d51d7bdaefea1ce9d60c7867f9ab592288bb11bbd1827\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 17:24:50.261215 containerd[1876]: time="2025-09-12T17:24:50.261183653Z" level=info msg="Container 
a30465121f8e78ece5c0e0ef66b49e47a3230d1b34402535e01586d06c952c8a: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:24:50.276093 containerd[1876]: time="2025-09-12T17:24:50.276061186Z" level=info msg="CreateContainer within sandbox \"a8756d1a491310fea37d51d7bdaefea1ce9d60c7867f9ab592288bb11bbd1827\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"a30465121f8e78ece5c0e0ef66b49e47a3230d1b34402535e01586d06c952c8a\"" Sep 12 17:24:50.277381 containerd[1876]: time="2025-09-12T17:24:50.277357805Z" level=info msg="StartContainer for \"a30465121f8e78ece5c0e0ef66b49e47a3230d1b34402535e01586d06c952c8a\"" Sep 12 17:24:50.278486 containerd[1876]: time="2025-09-12T17:24:50.278459299Z" level=info msg="connecting to shim a30465121f8e78ece5c0e0ef66b49e47a3230d1b34402535e01586d06c952c8a" address="unix:///run/containerd/s/6f0b60d28eea817d8cc41ee87c0124c9684bb74e660add9c33a8b6925d2363de" protocol=ttrpc version=3 Sep 12 17:24:50.298638 systemd[1]: Started cri-containerd-a30465121f8e78ece5c0e0ef66b49e47a3230d1b34402535e01586d06c952c8a.scope - libcontainer container a30465121f8e78ece5c0e0ef66b49e47a3230d1b34402535e01586d06c952c8a. Sep 12 17:24:50.347384 containerd[1876]: time="2025-09-12T17:24:50.347349407Z" level=info msg="StartContainer for \"a30465121f8e78ece5c0e0ef66b49e47a3230d1b34402535e01586d06c952c8a\" returns successfully" Sep 12 17:24:50.640192 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 17:24:50.640321 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 12 17:24:50.864737 kubelet[3290]: I0912 17:24:50.864699 3290 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cf55d22c-6583-49a5-840e-385d6703a230-whisker-backend-key-pair\") pod \"cf55d22c-6583-49a5-840e-385d6703a230\" (UID: \"cf55d22c-6583-49a5-840e-385d6703a230\") " Sep 12 17:24:50.864737 kubelet[3290]: I0912 17:24:50.864739 3290 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf55d22c-6583-49a5-840e-385d6703a230-whisker-ca-bundle\") pod \"cf55d22c-6583-49a5-840e-385d6703a230\" (UID: \"cf55d22c-6583-49a5-840e-385d6703a230\") " Sep 12 17:24:50.865400 kubelet[3290]: I0912 17:24:50.864760 3290 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vshf\" (UniqueName: \"kubernetes.io/projected/cf55d22c-6583-49a5-840e-385d6703a230-kube-api-access-4vshf\") pod \"cf55d22c-6583-49a5-840e-385d6703a230\" (UID: \"cf55d22c-6583-49a5-840e-385d6703a230\") " Sep 12 17:24:50.867932 kubelet[3290]: I0912 17:24:50.867876 3290 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf55d22c-6583-49a5-840e-385d6703a230-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "cf55d22c-6583-49a5-840e-385d6703a230" (UID: "cf55d22c-6583-49a5-840e-385d6703a230"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 12 17:24:50.869608 systemd[1]: var-lib-kubelet-pods-cf55d22c\x2d6583\x2d49a5\x2d840e\x2d385d6703a230-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Sep 12 17:24:50.873676 kubelet[3290]: I0912 17:24:50.873635 3290 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf55d22c-6583-49a5-840e-385d6703a230-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "cf55d22c-6583-49a5-840e-385d6703a230" (UID: "cf55d22c-6583-49a5-840e-385d6703a230"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 12 17:24:50.874485 kubelet[3290]: I0912 17:24:50.874447 3290 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf55d22c-6583-49a5-840e-385d6703a230-kube-api-access-4vshf" (OuterVolumeSpecName: "kube-api-access-4vshf") pod "cf55d22c-6583-49a5-840e-385d6703a230" (UID: "cf55d22c-6583-49a5-840e-385d6703a230"). InnerVolumeSpecName "kube-api-access-4vshf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 12 17:24:50.874801 systemd[1]: var-lib-kubelet-pods-cf55d22c\x2d6583\x2d49a5\x2d840e\x2d385d6703a230-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d4vshf.mount: Deactivated successfully. Sep 12 17:24:50.950227 systemd[1]: Removed slice kubepods-besteffort-podcf55d22c_6583_49a5_840e_385d6703a230.slice - libcontainer container kubepods-besteffort-podcf55d22c_6583_49a5_840e_385d6703a230.slice. 
Sep 12 17:24:50.965972 kubelet[3290]: I0912 17:24:50.965947 3290 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vshf\" (UniqueName: \"kubernetes.io/projected/cf55d22c-6583-49a5-840e-385d6703a230-kube-api-access-4vshf\") on node \"ci-4426.1.0-a-2d28ed79c9\" DevicePath \"\"" Sep 12 17:24:50.965972 kubelet[3290]: I0912 17:24:50.965970 3290 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cf55d22c-6583-49a5-840e-385d6703a230-whisker-backend-key-pair\") on node \"ci-4426.1.0-a-2d28ed79c9\" DevicePath \"\"" Sep 12 17:24:50.965972 kubelet[3290]: I0912 17:24:50.965978 3290 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf55d22c-6583-49a5-840e-385d6703a230-whisker-ca-bundle\") on node \"ci-4426.1.0-a-2d28ed79c9\" DevicePath \"\"" Sep 12 17:24:51.097821 kubelet[3290]: I0912 17:24:51.097469 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-jcfnq" podStartSLOduration=1.725949316 podStartE2EDuration="16.09745636s" podCreationTimestamp="2025-09-12 17:24:35 +0000 UTC" firstStartedPulling="2025-09-12 17:24:35.825944035 +0000 UTC m=+17.029166542" lastFinishedPulling="2025-09-12 17:24:50.197451079 +0000 UTC m=+31.400673586" observedRunningTime="2025-09-12 17:24:51.096944023 +0000 UTC m=+32.300166530" watchObservedRunningTime="2025-09-12 17:24:51.09745636 +0000 UTC m=+32.300678875" Sep 12 17:24:51.121939 systemd[1]: Created slice kubepods-besteffort-pod4d0c4691_3a19_4c47_abfb_025cbe1e23ba.slice - libcontainer container kubepods-besteffort-pod4d0c4691_3a19_4c47_abfb_025cbe1e23ba.slice. 
Sep 12 17:24:51.167001 kubelet[3290]: I0912 17:24:51.166970 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d0c4691-3a19-4c47-abfb-025cbe1e23ba-whisker-ca-bundle\") pod \"whisker-6df9779856-mw5qb\" (UID: \"4d0c4691-3a19-4c47-abfb-025cbe1e23ba\") " pod="calico-system/whisker-6df9779856-mw5qb" Sep 12 17:24:51.167066 kubelet[3290]: I0912 17:24:51.167009 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4d0c4691-3a19-4c47-abfb-025cbe1e23ba-whisker-backend-key-pair\") pod \"whisker-6df9779856-mw5qb\" (UID: \"4d0c4691-3a19-4c47-abfb-025cbe1e23ba\") " pod="calico-system/whisker-6df9779856-mw5qb" Sep 12 17:24:51.167066 kubelet[3290]: I0912 17:24:51.167024 3290 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffzb9\" (UniqueName: \"kubernetes.io/projected/4d0c4691-3a19-4c47-abfb-025cbe1e23ba-kube-api-access-ffzb9\") pod \"whisker-6df9779856-mw5qb\" (UID: \"4d0c4691-3a19-4c47-abfb-025cbe1e23ba\") " pod="calico-system/whisker-6df9779856-mw5qb" Sep 12 17:24:51.425222 containerd[1876]: time="2025-09-12T17:24:51.425182988Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6df9779856-mw5qb,Uid:4d0c4691-3a19-4c47-abfb-025cbe1e23ba,Namespace:calico-system,Attempt:0,}" Sep 12 17:24:51.532130 systemd-networkd[1695]: calid9fd7d4fc96: Link UP Sep 12 17:24:51.532767 systemd-networkd[1695]: calid9fd7d4fc96: Gained carrier Sep 12 17:24:51.549926 containerd[1876]: 2025-09-12 17:24:51.444 [INFO][4430] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:24:51.549926 containerd[1876]: 2025-09-12 17:24:51.464 [INFO][4430] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4426.1.0--a--2d28ed79c9-k8s-whisker--6df9779856--mw5qb-eth0 whisker-6df9779856- calico-system 4d0c4691-3a19-4c47-abfb-025cbe1e23ba 859 0 2025-09-12 17:24:51 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6df9779856 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4426.1.0-a-2d28ed79c9 whisker-6df9779856-mw5qb eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calid9fd7d4fc96 [] [] }} ContainerID="cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" Namespace="calico-system" Pod="whisker-6df9779856-mw5qb" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-whisker--6df9779856--mw5qb-" Sep 12 17:24:51.549926 containerd[1876]: 2025-09-12 17:24:51.464 [INFO][4430] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" Namespace="calico-system" Pod="whisker-6df9779856-mw5qb" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-whisker--6df9779856--mw5qb-eth0" Sep 12 17:24:51.549926 containerd[1876]: 2025-09-12 17:24:51.482 [INFO][4441] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" HandleID="k8s-pod-network.cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-whisker--6df9779856--mw5qb-eth0" Sep 12 17:24:51.550092 containerd[1876]: 2025-09-12 17:24:51.482 [INFO][4441] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" HandleID="k8s-pod-network.cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-whisker--6df9779856--mw5qb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b950), Attrs:map[string]string{"namespace":"calico-system", 
"node":"ci-4426.1.0-a-2d28ed79c9", "pod":"whisker-6df9779856-mw5qb", "timestamp":"2025-09-12 17:24:51.48279151 +0000 UTC"}, Hostname:"ci-4426.1.0-a-2d28ed79c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:24:51.550092 containerd[1876]: 2025-09-12 17:24:51.482 [INFO][4441] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:24:51.550092 containerd[1876]: 2025-09-12 17:24:51.482 [INFO][4441] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:24:51.550092 containerd[1876]: 2025-09-12 17:24:51.483 [INFO][4441] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-2d28ed79c9' Sep 12 17:24:51.550092 containerd[1876]: 2025-09-12 17:24:51.488 [INFO][4441] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:51.550092 containerd[1876]: 2025-09-12 17:24:51.492 [INFO][4441] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:51.550092 containerd[1876]: 2025-09-12 17:24:51.495 [INFO][4441] ipam/ipam.go 511: Trying affinity for 192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:51.550092 containerd[1876]: 2025-09-12 17:24:51.496 [INFO][4441] ipam/ipam.go 158: Attempting to load block cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:51.550092 containerd[1876]: 2025-09-12 17:24:51.498 [INFO][4441] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:51.550250 containerd[1876]: 2025-09-12 17:24:51.498 [INFO][4441] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.112.64/26 
handle="k8s-pod-network.cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:51.550250 containerd[1876]: 2025-09-12 17:24:51.499 [INFO][4441] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1 Sep 12 17:24:51.550250 containerd[1876]: 2025-09-12 17:24:51.504 [INFO][4441] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.112.64/26 handle="k8s-pod-network.cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:51.550250 containerd[1876]: 2025-09-12 17:24:51.509 [INFO][4441] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.112.65/26] block=192.168.112.64/26 handle="k8s-pod-network.cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:51.550250 containerd[1876]: 2025-09-12 17:24:51.509 [INFO][4441] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.112.65/26] handle="k8s-pod-network.cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:51.550250 containerd[1876]: 2025-09-12 17:24:51.509 [INFO][4441] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:24:51.550250 containerd[1876]: 2025-09-12 17:24:51.509 [INFO][4441] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.112.65/26] IPv6=[] ContainerID="cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" HandleID="k8s-pod-network.cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-whisker--6df9779856--mw5qb-eth0" Sep 12 17:24:51.550346 containerd[1876]: 2025-09-12 17:24:51.512 [INFO][4430] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" Namespace="calico-system" Pod="whisker-6df9779856-mw5qb" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-whisker--6df9779856--mw5qb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-whisker--6df9779856--mw5qb-eth0", GenerateName:"whisker-6df9779856-", Namespace:"calico-system", SelfLink:"", UID:"4d0c4691-3a19-4c47-abfb-025cbe1e23ba", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6df9779856", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"", Pod:"whisker-6df9779856-mw5qb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.112.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"calid9fd7d4fc96", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:51.550346 containerd[1876]: 2025-09-12 17:24:51.512 [INFO][4430] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.65/32] ContainerID="cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" Namespace="calico-system" Pod="whisker-6df9779856-mw5qb" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-whisker--6df9779856--mw5qb-eth0" Sep 12 17:24:51.550393 containerd[1876]: 2025-09-12 17:24:51.512 [INFO][4430] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid9fd7d4fc96 ContainerID="cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" Namespace="calico-system" Pod="whisker-6df9779856-mw5qb" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-whisker--6df9779856--mw5qb-eth0" Sep 12 17:24:51.550393 containerd[1876]: 2025-09-12 17:24:51.533 [INFO][4430] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" Namespace="calico-system" Pod="whisker-6df9779856-mw5qb" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-whisker--6df9779856--mw5qb-eth0" Sep 12 17:24:51.550422 containerd[1876]: 2025-09-12 17:24:51.533 [INFO][4430] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" Namespace="calico-system" Pod="whisker-6df9779856-mw5qb" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-whisker--6df9779856--mw5qb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-whisker--6df9779856--mw5qb-eth0", GenerateName:"whisker-6df9779856-", Namespace:"calico-system", SelfLink:"", 
UID:"4d0c4691-3a19-4c47-abfb-025cbe1e23ba", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6df9779856", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1", Pod:"whisker-6df9779856-mw5qb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.112.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calid9fd7d4fc96", MAC:"fa:55:a0:b7:51:c3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:51.550453 containerd[1876]: 2025-09-12 17:24:51.545 [INFO][4430] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" Namespace="calico-system" Pod="whisker-6df9779856-mw5qb" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-whisker--6df9779856--mw5qb-eth0" Sep 12 17:24:51.596549 containerd[1876]: time="2025-09-12T17:24:51.595685612Z" level=info msg="connecting to shim cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1" address="unix:///run/containerd/s/ae335400ac38ff5d4aa0c26c1374d582bca2c079d59f720557893bf63fda2632" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:24:51.631622 systemd[1]: Started 
cri-containerd-cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1.scope - libcontainer container cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1. Sep 12 17:24:51.659564 containerd[1876]: time="2025-09-12T17:24:51.659502421Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6df9779856-mw5qb,Uid:4d0c4691-3a19-4c47-abfb-025cbe1e23ba,Namespace:calico-system,Attempt:0,} returns sandbox id \"cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1\"" Sep 12 17:24:51.661545 containerd[1876]: time="2025-09-12T17:24:51.661491998Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 17:24:52.436921 systemd-networkd[1695]: vxlan.calico: Link UP Sep 12 17:24:52.436988 systemd-networkd[1695]: vxlan.calico: Gained carrier Sep 12 17:24:52.942867 kubelet[3290]: I0912 17:24:52.942830 3290 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf55d22c-6583-49a5-840e-385d6703a230" path="/var/lib/kubelet/pods/cf55d22c-6583-49a5-840e-385d6703a230/volumes" Sep 12 17:24:53.041666 systemd-networkd[1695]: calid9fd7d4fc96: Gained IPv6LL Sep 12 17:24:53.601676 containerd[1876]: time="2025-09-12T17:24:53.601630200Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:53.604651 containerd[1876]: time="2025-09-12T17:24:53.604629985Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 12 17:24:53.609033 containerd[1876]: time="2025-09-12T17:24:53.608996699Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:53.612883 containerd[1876]: time="2025-09-12T17:24:53.612483192Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:53.612883 containerd[1876]: time="2025-09-12T17:24:53.612784494Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.951264233s" Sep 12 17:24:53.612883 containerd[1876]: time="2025-09-12T17:24:53.612806101Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 12 17:24:53.614631 containerd[1876]: time="2025-09-12T17:24:53.614600316Z" level=info msg="CreateContainer within sandbox \"cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 17:24:53.634143 containerd[1876]: time="2025-09-12T17:24:53.633655757Z" level=info msg="Container ff40218d14f2ba06edef3e797467a2269efb06d6260b47a5b9cfd9c0d7a446e0: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:24:53.636146 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3930147659.mount: Deactivated successfully. 
Sep 12 17:24:53.653896 containerd[1876]: time="2025-09-12T17:24:53.653859476Z" level=info msg="CreateContainer within sandbox \"cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"ff40218d14f2ba06edef3e797467a2269efb06d6260b47a5b9cfd9c0d7a446e0\"" Sep 12 17:24:53.654331 containerd[1876]: time="2025-09-12T17:24:53.654275339Z" level=info msg="StartContainer for \"ff40218d14f2ba06edef3e797467a2269efb06d6260b47a5b9cfd9c0d7a446e0\"" Sep 12 17:24:53.655012 containerd[1876]: time="2025-09-12T17:24:53.654989450Z" level=info msg="connecting to shim ff40218d14f2ba06edef3e797467a2269efb06d6260b47a5b9cfd9c0d7a446e0" address="unix:///run/containerd/s/ae335400ac38ff5d4aa0c26c1374d582bca2c079d59f720557893bf63fda2632" protocol=ttrpc version=3 Sep 12 17:24:53.672621 systemd[1]: Started cri-containerd-ff40218d14f2ba06edef3e797467a2269efb06d6260b47a5b9cfd9c0d7a446e0.scope - libcontainer container ff40218d14f2ba06edef3e797467a2269efb06d6260b47a5b9cfd9c0d7a446e0. Sep 12 17:24:53.701031 containerd[1876]: time="2025-09-12T17:24:53.701007783Z" level=info msg="StartContainer for \"ff40218d14f2ba06edef3e797467a2269efb06d6260b47a5b9cfd9c0d7a446e0\" returns successfully" Sep 12 17:24:53.702171 containerd[1876]: time="2025-09-12T17:24:53.702149765Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 17:24:53.873650 systemd-networkd[1695]: vxlan.calico: Gained IPv6LL Sep 12 17:24:55.237171 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2365325665.mount: Deactivated successfully. 
Sep 12 17:24:55.384653 containerd[1876]: time="2025-09-12T17:24:55.384608886Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:55.387551 containerd[1876]: time="2025-09-12T17:24:55.387511245Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 12 17:24:55.392552 containerd[1876]: time="2025-09-12T17:24:55.392529745Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:55.397628 containerd[1876]: time="2025-09-12T17:24:55.397587962Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:55.398150 containerd[1876]: time="2025-09-12T17:24:55.397814093Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 1.695640937s" Sep 12 17:24:55.398150 containerd[1876]: time="2025-09-12T17:24:55.397837771Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 12 17:24:55.400097 containerd[1876]: time="2025-09-12T17:24:55.400015028Z" level=info msg="CreateContainer within sandbox \"cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 17:24:55.420783 
containerd[1876]: time="2025-09-12T17:24:55.420762315Z" level=info msg="Container 4b37cba62e6645334039621ed2cf182f6ecff751dbf77384236fe133803ca284: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:24:55.439079 containerd[1876]: time="2025-09-12T17:24:55.439041505Z" level=info msg="CreateContainer within sandbox \"cfd9220d0ec576d319a22087f841021d37c3f90559ed5a72fcc9970485a3e1a1\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"4b37cba62e6645334039621ed2cf182f6ecff751dbf77384236fe133803ca284\"" Sep 12 17:24:55.439659 containerd[1876]: time="2025-09-12T17:24:55.439581618Z" level=info msg="StartContainer for \"4b37cba62e6645334039621ed2cf182f6ecff751dbf77384236fe133803ca284\"" Sep 12 17:24:55.440371 containerd[1876]: time="2025-09-12T17:24:55.440246003Z" level=info msg="connecting to shim 4b37cba62e6645334039621ed2cf182f6ecff751dbf77384236fe133803ca284" address="unix:///run/containerd/s/ae335400ac38ff5d4aa0c26c1374d582bca2c079d59f720557893bf63fda2632" protocol=ttrpc version=3 Sep 12 17:24:55.464629 systemd[1]: Started cri-containerd-4b37cba62e6645334039621ed2cf182f6ecff751dbf77384236fe133803ca284.scope - libcontainer container 4b37cba62e6645334039621ed2cf182f6ecff751dbf77384236fe133803ca284. 
Sep 12 17:24:55.496373 containerd[1876]: time="2025-09-12T17:24:55.496131234Z" level=info msg="StartContainer for \"4b37cba62e6645334039621ed2cf182f6ecff751dbf77384236fe133803ca284\" returns successfully" Sep 12 17:24:56.083295 kubelet[3290]: I0912 17:24:56.083242 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6df9779856-mw5qb" podStartSLOduration=1.345478362 podStartE2EDuration="5.083226828s" podCreationTimestamp="2025-09-12 17:24:51 +0000 UTC" firstStartedPulling="2025-09-12 17:24:51.660575861 +0000 UTC m=+32.863798368" lastFinishedPulling="2025-09-12 17:24:55.398324327 +0000 UTC m=+36.601546834" observedRunningTime="2025-09-12 17:24:56.081870499 +0000 UTC m=+37.285093046" watchObservedRunningTime="2025-09-12 17:24:56.083226828 +0000 UTC m=+37.286449335" Sep 12 17:24:57.941707 containerd[1876]: time="2025-09-12T17:24:57.941440153Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-qnr8m,Uid:3cad8a41-7137-4312-9228-3e6bff3d94f5,Namespace:kube-system,Attempt:0,}" Sep 12 17:24:58.030096 systemd-networkd[1695]: cali97baa773eb2: Link UP Sep 12 17:24:58.030499 systemd-networkd[1695]: cali97baa773eb2: Gained carrier Sep 12 17:24:58.046307 containerd[1876]: 2025-09-12 17:24:57.979 [INFO][4783] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--qnr8m-eth0 coredns-7c65d6cfc9- kube-system 3cad8a41-7137-4312-9228-3e6bff3d94f5 797 0 2025-09-12 17:24:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426.1.0-a-2d28ed79c9 coredns-7c65d6cfc9-qnr8m eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali97baa773eb2 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qnr8m" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--qnr8m-" Sep 12 17:24:58.046307 containerd[1876]: 2025-09-12 17:24:57.979 [INFO][4783] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qnr8m" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--qnr8m-eth0" Sep 12 17:24:58.046307 containerd[1876]: 2025-09-12 17:24:57.996 [INFO][4795] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" HandleID="k8s-pod-network.c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--qnr8m-eth0" Sep 12 17:24:58.047209 containerd[1876]: 2025-09-12 17:24:57.996 [INFO][4795] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" HandleID="k8s-pod-network.c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--qnr8m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb7c0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426.1.0-a-2d28ed79c9", "pod":"coredns-7c65d6cfc9-qnr8m", "timestamp":"2025-09-12 17:24:57.996431723 +0000 UTC"}, Hostname:"ci-4426.1.0-a-2d28ed79c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:24:58.047209 containerd[1876]: 2025-09-12 17:24:57.996 [INFO][4795] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 17:24:58.047209 containerd[1876]: 2025-09-12 17:24:57.996 [INFO][4795] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:24:58.047209 containerd[1876]: 2025-09-12 17:24:57.996 [INFO][4795] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-2d28ed79c9' Sep 12 17:24:58.047209 containerd[1876]: 2025-09-12 17:24:58.002 [INFO][4795] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:58.047209 containerd[1876]: 2025-09-12 17:24:58.005 [INFO][4795] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:58.047209 containerd[1876]: 2025-09-12 17:24:58.008 [INFO][4795] ipam/ipam.go 511: Trying affinity for 192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:58.047209 containerd[1876]: 2025-09-12 17:24:58.010 [INFO][4795] ipam/ipam.go 158: Attempting to load block cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:58.047209 containerd[1876]: 2025-09-12 17:24:58.012 [INFO][4795] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:58.048337 containerd[1876]: 2025-09-12 17:24:58.012 [INFO][4795] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.112.64/26 handle="k8s-pod-network.c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:58.048337 containerd[1876]: 2025-09-12 17:24:58.013 [INFO][4795] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541 Sep 12 17:24:58.048337 containerd[1876]: 2025-09-12 17:24:58.021 [INFO][4795] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.112.64/26 handle="k8s-pod-network.c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" 
host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:58.048337 containerd[1876]: 2025-09-12 17:24:58.025 [INFO][4795] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.112.66/26] block=192.168.112.64/26 handle="k8s-pod-network.c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:58.048337 containerd[1876]: 2025-09-12 17:24:58.025 [INFO][4795] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.112.66/26] handle="k8s-pod-network.c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:58.048337 containerd[1876]: 2025-09-12 17:24:58.025 [INFO][4795] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:24:58.048337 containerd[1876]: 2025-09-12 17:24:58.025 [INFO][4795] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.112.66/26] IPv6=[] ContainerID="c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" HandleID="k8s-pod-network.c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--qnr8m-eth0" Sep 12 17:24:58.048449 containerd[1876]: 2025-09-12 17:24:58.027 [INFO][4783] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qnr8m" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--qnr8m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--qnr8m-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"3cad8a41-7137-4312-9228-3e6bff3d94f5", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"", Pod:"coredns-7c65d6cfc9-qnr8m", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.112.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali97baa773eb2", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:58.048449 containerd[1876]: 2025-09-12 17:24:58.027 [INFO][4783] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.66/32] ContainerID="c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qnr8m" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--qnr8m-eth0" Sep 12 17:24:58.048449 containerd[1876]: 2025-09-12 17:24:58.027 [INFO][4783] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali97baa773eb2 ContainerID="c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qnr8m" 
WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--qnr8m-eth0" Sep 12 17:24:58.048449 containerd[1876]: 2025-09-12 17:24:58.030 [INFO][4783] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qnr8m" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--qnr8m-eth0" Sep 12 17:24:58.048449 containerd[1876]: 2025-09-12 17:24:58.031 [INFO][4783] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qnr8m" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--qnr8m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--qnr8m-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"3cad8a41-7137-4312-9228-3e6bff3d94f5", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541", Pod:"coredns-7c65d6cfc9-qnr8m", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.112.66/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali97baa773eb2", MAC:"d6:12:23:af:31:f2", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:58.048449 containerd[1876]: 2025-09-12 17:24:58.042 [INFO][4783] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" Namespace="kube-system" Pod="coredns-7c65d6cfc9-qnr8m" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--qnr8m-eth0" Sep 12 17:24:58.099699 containerd[1876]: time="2025-09-12T17:24:58.099644339Z" level=info msg="connecting to shim c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541" address="unix:///run/containerd/s/e07a72e848f48ed42f3f8c5eef26c733e72ddc5448954b86981a556a3d3a0a5f" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:24:58.119764 systemd[1]: Started cri-containerd-c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541.scope - libcontainer container c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541. 
Sep 12 17:24:58.157893 containerd[1876]: time="2025-09-12T17:24:58.157870697Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-qnr8m,Uid:3cad8a41-7137-4312-9228-3e6bff3d94f5,Namespace:kube-system,Attempt:0,} returns sandbox id \"c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541\"" Sep 12 17:24:58.160408 containerd[1876]: time="2025-09-12T17:24:58.160383135Z" level=info msg="CreateContainer within sandbox \"c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:24:58.179296 containerd[1876]: time="2025-09-12T17:24:58.179164608Z" level=info msg="Container f9ce66ab8a7f46ba6cc3afb81e641c44d349ef21af312e79aca2d66442f2fb8d: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:24:58.180609 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1744648216.mount: Deactivated successfully. Sep 12 17:24:58.189545 containerd[1876]: time="2025-09-12T17:24:58.189488726Z" level=info msg="CreateContainer within sandbox \"c11347d6a4ccbdbf6524b6ea7b2199bd3351f2dadf09aacef46a276619044541\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f9ce66ab8a7f46ba6cc3afb81e641c44d349ef21af312e79aca2d66442f2fb8d\"" Sep 12 17:24:58.190027 containerd[1876]: time="2025-09-12T17:24:58.190004656Z" level=info msg="StartContainer for \"f9ce66ab8a7f46ba6cc3afb81e641c44d349ef21af312e79aca2d66442f2fb8d\"" Sep 12 17:24:58.190840 containerd[1876]: time="2025-09-12T17:24:58.190811169Z" level=info msg="connecting to shim f9ce66ab8a7f46ba6cc3afb81e641c44d349ef21af312e79aca2d66442f2fb8d" address="unix:///run/containerd/s/e07a72e848f48ed42f3f8c5eef26c733e72ddc5448954b86981a556a3d3a0a5f" protocol=ttrpc version=3 Sep 12 17:24:58.210632 systemd[1]: Started cri-containerd-f9ce66ab8a7f46ba6cc3afb81e641c44d349ef21af312e79aca2d66442f2fb8d.scope - libcontainer container f9ce66ab8a7f46ba6cc3afb81e641c44d349ef21af312e79aca2d66442f2fb8d. 
Sep 12 17:24:58.235536 containerd[1876]: time="2025-09-12T17:24:58.234896511Z" level=info msg="StartContainer for \"f9ce66ab8a7f46ba6cc3afb81e641c44d349ef21af312e79aca2d66442f2fb8d\" returns successfully" Sep 12 17:24:58.941697 containerd[1876]: time="2025-09-12T17:24:58.941604109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-774f5bcb5b-lmnmr,Uid:1b05511d-a6ed-42cc-83cb-70388a48965a,Namespace:calico-system,Attempt:0,}" Sep 12 17:24:59.036483 systemd-networkd[1695]: calib96b75b0b45: Link UP Sep 12 17:24:59.037500 systemd-networkd[1695]: calib96b75b0b45: Gained carrier Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:58.976 [INFO][4892] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--2d28ed79c9-k8s-calico--kube--controllers--774f5bcb5b--lmnmr-eth0 calico-kube-controllers-774f5bcb5b- calico-system 1b05511d-a6ed-42cc-83cb-70388a48965a 794 0 2025-09-12 17:24:35 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:774f5bcb5b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4426.1.0-a-2d28ed79c9 calico-kube-controllers-774f5bcb5b-lmnmr eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calib96b75b0b45 [] [] }} ContainerID="8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" Namespace="calico-system" Pod="calico-kube-controllers-774f5bcb5b-lmnmr" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--kube--controllers--774f5bcb5b--lmnmr-" Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:58.976 [INFO][4892] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" Namespace="calico-system" Pod="calico-kube-controllers-774f5bcb5b-lmnmr" 
WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--kube--controllers--774f5bcb5b--lmnmr-eth0" Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:58.999 [INFO][4904] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" HandleID="k8s-pod-network.8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-calico--kube--controllers--774f5bcb5b--lmnmr-eth0" Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:58.999 [INFO][4904] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" HandleID="k8s-pod-network.8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-calico--kube--controllers--774f5bcb5b--lmnmr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b0d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.1.0-a-2d28ed79c9", "pod":"calico-kube-controllers-774f5bcb5b-lmnmr", "timestamp":"2025-09-12 17:24:58.99907144 +0000 UTC"}, Hostname:"ci-4426.1.0-a-2d28ed79c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:58.999 [INFO][4904] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:58.999 [INFO][4904] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:58.999 [INFO][4904] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-2d28ed79c9' Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:59.006 [INFO][4904] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:59.011 [INFO][4904] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:59.014 [INFO][4904] ipam/ipam.go 511: Trying affinity for 192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:59.016 [INFO][4904] ipam/ipam.go 158: Attempting to load block cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:59.018 [INFO][4904] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:59.018 [INFO][4904] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.112.64/26 handle="k8s-pod-network.8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:59.019 [INFO][4904] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286 Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:59.023 [INFO][4904] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.112.64/26 handle="k8s-pod-network.8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:59.031 [INFO][4904] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.112.67/26] block=192.168.112.64/26 handle="k8s-pod-network.8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:59.031 [INFO][4904] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.112.67/26] handle="k8s-pod-network.8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:59.032 [INFO][4904] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:24:59.054043 containerd[1876]: 2025-09-12 17:24:59.032 [INFO][4904] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.112.67/26] IPv6=[] ContainerID="8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" HandleID="k8s-pod-network.8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-calico--kube--controllers--774f5bcb5b--lmnmr-eth0" Sep 12 17:24:59.055223 containerd[1876]: 2025-09-12 17:24:59.033 [INFO][4892] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" Namespace="calico-system" Pod="calico-kube-controllers-774f5bcb5b-lmnmr" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--kube--controllers--774f5bcb5b--lmnmr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-calico--kube--controllers--774f5bcb5b--lmnmr-eth0", GenerateName:"calico-kube-controllers-774f5bcb5b-", Namespace:"calico-system", SelfLink:"", UID:"1b05511d-a6ed-42cc-83cb-70388a48965a", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"774f5bcb5b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"", Pod:"calico-kube-controllers-774f5bcb5b-lmnmr", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.112.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib96b75b0b45", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:59.055223 containerd[1876]: 2025-09-12 17:24:59.033 [INFO][4892] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.67/32] ContainerID="8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" Namespace="calico-system" Pod="calico-kube-controllers-774f5bcb5b-lmnmr" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--kube--controllers--774f5bcb5b--lmnmr-eth0" Sep 12 17:24:59.055223 containerd[1876]: 2025-09-12 17:24:59.033 [INFO][4892] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib96b75b0b45 ContainerID="8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" Namespace="calico-system" Pod="calico-kube-controllers-774f5bcb5b-lmnmr" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--kube--controllers--774f5bcb5b--lmnmr-eth0" Sep 12 17:24:59.055223 containerd[1876]: 2025-09-12 17:24:59.037 [INFO][4892] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" Namespace="calico-system" Pod="calico-kube-controllers-774f5bcb5b-lmnmr" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--kube--controllers--774f5bcb5b--lmnmr-eth0" Sep 12 17:24:59.055223 containerd[1876]: 2025-09-12 17:24:59.038 [INFO][4892] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" Namespace="calico-system" Pod="calico-kube-controllers-774f5bcb5b-lmnmr" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--kube--controllers--774f5bcb5b--lmnmr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-calico--kube--controllers--774f5bcb5b--lmnmr-eth0", GenerateName:"calico-kube-controllers-774f5bcb5b-", Namespace:"calico-system", SelfLink:"", UID:"1b05511d-a6ed-42cc-83cb-70388a48965a", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"774f5bcb5b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286", Pod:"calico-kube-controllers-774f5bcb5b-lmnmr", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.112.67/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib96b75b0b45", MAC:"26:aa:99:b0:f6:1a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:59.055223 containerd[1876]: 2025-09-12 17:24:59.051 [INFO][4892] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" Namespace="calico-system" Pod="calico-kube-controllers-774f5bcb5b-lmnmr" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--kube--controllers--774f5bcb5b--lmnmr-eth0" Sep 12 17:24:59.091269 kubelet[3290]: I0912 17:24:59.091226 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-qnr8m" podStartSLOduration=36.091212861 podStartE2EDuration="36.091212861s" podCreationTimestamp="2025-09-12 17:24:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:24:59.09009019 +0000 UTC m=+40.293312697" watchObservedRunningTime="2025-09-12 17:24:59.091212861 +0000 UTC m=+40.294435368" Sep 12 17:24:59.113264 containerd[1876]: time="2025-09-12T17:24:59.113207571Z" level=info msg="connecting to shim 8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286" address="unix:///run/containerd/s/3f3f1533a006b02c1babfc06ad08f4be3d4a6db6817ffb92d8f3bd8ea12c0405" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:24:59.137678 systemd[1]: Started cri-containerd-8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286.scope - libcontainer container 8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286. 
Sep 12 17:24:59.170916 containerd[1876]: time="2025-09-12T17:24:59.170872145Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-774f5bcb5b-lmnmr,Uid:1b05511d-a6ed-42cc-83cb-70388a48965a,Namespace:calico-system,Attempt:0,} returns sandbox id \"8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286\"" Sep 12 17:24:59.172279 containerd[1876]: time="2025-09-12T17:24:59.172257081Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 17:24:59.761665 systemd-networkd[1695]: cali97baa773eb2: Gained IPv6LL Sep 12 17:24:59.941716 containerd[1876]: time="2025-09-12T17:24:59.941685142Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64d4864b98-lq477,Uid:6063725a-5f6c-46c2-a715-57fd6390cb34,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:24:59.941845 containerd[1876]: time="2025-09-12T17:24:59.941828158Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64d4864b98-zbsdh,Uid:a4c4cdec-52cb-4465-af8e-2767ba38a041,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:24:59.941989 containerd[1876]: time="2025-09-12T17:24:59.941685702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-fn7z8,Uid:19c62e4a-5240-4d3f-8cb7-6612cf508617,Namespace:kube-system,Attempt:0,}" Sep 12 17:24:59.942263 containerd[1876]: time="2025-09-12T17:24:59.942197401Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9s9cm,Uid:c4043ee3-d7ce-4761-b52a-0afce638c51a,Namespace:calico-system,Attempt:0,}" Sep 12 17:25:00.113203 systemd-networkd[1695]: calie0c64e51ee3: Link UP Sep 12 17:25:00.116385 systemd-networkd[1695]: calie0c64e51ee3: Gained carrier Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:24:59.999 [INFO][4975] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--lq477-eth0 
calico-apiserver-64d4864b98- calico-apiserver 6063725a-5f6c-46c2-a715-57fd6390cb34 796 0 2025-09-12 17:24:32 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:64d4864b98 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.1.0-a-2d28ed79c9 calico-apiserver-64d4864b98-lq477 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie0c64e51ee3 [] [] }} ContainerID="4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" Namespace="calico-apiserver" Pod="calico-apiserver-64d4864b98-lq477" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--lq477-" Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.000 [INFO][4975] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" Namespace="calico-apiserver" Pod="calico-apiserver-64d4864b98-lq477" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--lq477-eth0" Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.046 [INFO][5020] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" HandleID="k8s-pod-network.4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--lq477-eth0" Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.046 [INFO][5020] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" HandleID="k8s-pod-network.4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--lq477-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0x400024aff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.1.0-a-2d28ed79c9", "pod":"calico-apiserver-64d4864b98-lq477", "timestamp":"2025-09-12 17:25:00.046714828 +0000 UTC"}, Hostname:"ci-4426.1.0-a-2d28ed79c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.046 [INFO][5020] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.046 [INFO][5020] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.046 [INFO][5020] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-2d28ed79c9' Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.055 [INFO][5020] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.063 [INFO][5020] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.068 [INFO][5020] ipam/ipam.go 511: Trying affinity for 192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.070 [INFO][5020] ipam/ipam.go 158: Attempting to load block cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.073 [INFO][5020] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.073 [INFO][5020] ipam/ipam.go 1220: Attempting to assign 1 addresses 
from block block=192.168.112.64/26 handle="k8s-pod-network.4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.079 [INFO][5020] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78 Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.086 [INFO][5020] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.112.64/26 handle="k8s-pod-network.4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.102 [INFO][5020] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.112.68/26] block=192.168.112.64/26 handle="k8s-pod-network.4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.102 [INFO][5020] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.112.68/26] handle="k8s-pod-network.4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.102 [INFO][5020] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:25:00.213260 containerd[1876]: 2025-09-12 17:25:00.102 [INFO][5020] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.112.68/26] IPv6=[] ContainerID="4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" HandleID="k8s-pod-network.4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--lq477-eth0" Sep 12 17:25:00.215696 containerd[1876]: 2025-09-12 17:25:00.110 [INFO][4975] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" Namespace="calico-apiserver" Pod="calico-apiserver-64d4864b98-lq477" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--lq477-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--lq477-eth0", GenerateName:"calico-apiserver-64d4864b98-", Namespace:"calico-apiserver", SelfLink:"", UID:"6063725a-5f6c-46c2-a715-57fd6390cb34", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64d4864b98", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"", Pod:"calico-apiserver-64d4864b98-lq477", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.112.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie0c64e51ee3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:25:00.215696 containerd[1876]: 2025-09-12 17:25:00.111 [INFO][4975] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.68/32] ContainerID="4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" Namespace="calico-apiserver" Pod="calico-apiserver-64d4864b98-lq477" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--lq477-eth0" Sep 12 17:25:00.215696 containerd[1876]: 2025-09-12 17:25:00.111 [INFO][4975] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie0c64e51ee3 ContainerID="4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" Namespace="calico-apiserver" Pod="calico-apiserver-64d4864b98-lq477" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--lq477-eth0" Sep 12 17:25:00.215696 containerd[1876]: 2025-09-12 17:25:00.118 [INFO][4975] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" Namespace="calico-apiserver" Pod="calico-apiserver-64d4864b98-lq477" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--lq477-eth0" Sep 12 17:25:00.215696 containerd[1876]: 2025-09-12 17:25:00.118 [INFO][4975] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" Namespace="calico-apiserver" Pod="calico-apiserver-64d4864b98-lq477" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--lq477-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--lq477-eth0", GenerateName:"calico-apiserver-64d4864b98-", Namespace:"calico-apiserver", SelfLink:"", UID:"6063725a-5f6c-46c2-a715-57fd6390cb34", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64d4864b98", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78", Pod:"calico-apiserver-64d4864b98-lq477", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.112.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie0c64e51ee3", MAC:"0a:c1:63:90:e8:0d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:25:00.215696 containerd[1876]: 2025-09-12 17:25:00.210 [INFO][4975] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" Namespace="calico-apiserver" Pod="calico-apiserver-64d4864b98-lq477" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--lq477-eth0" Sep 12 17:25:00.257404 systemd-networkd[1695]: cali95b956c72bc: Link UP Sep 12 
17:25:00.258388 systemd-networkd[1695]: cali95b956c72bc: Gained carrier Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.035 [INFO][4987] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--zbsdh-eth0 calico-apiserver-64d4864b98- calico-apiserver a4c4cdec-52cb-4465-af8e-2767ba38a041 795 0 2025-09-12 17:24:32 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:64d4864b98 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.1.0-a-2d28ed79c9 calico-apiserver-64d4864b98-zbsdh eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali95b956c72bc [] [] }} ContainerID="83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" Namespace="calico-apiserver" Pod="calico-apiserver-64d4864b98-zbsdh" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--zbsdh-" Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.035 [INFO][4987] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" Namespace="calico-apiserver" Pod="calico-apiserver-64d4864b98-zbsdh" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--zbsdh-eth0" Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.082 [INFO][5030] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" HandleID="k8s-pod-network.83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--zbsdh-eth0" Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.084 [INFO][5030] 
ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" HandleID="k8s-pod-network.83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--zbsdh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3760), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.1.0-a-2d28ed79c9", "pod":"calico-apiserver-64d4864b98-zbsdh", "timestamp":"2025-09-12 17:25:00.082493719 +0000 UTC"}, Hostname:"ci-4426.1.0-a-2d28ed79c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.084 [INFO][5030] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.102 [INFO][5030] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.102 [INFO][5030] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-2d28ed79c9' Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.208 [INFO][5030] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.215 [INFO][5030] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.221 [INFO][5030] ipam/ipam.go 511: Trying affinity for 192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.226 [INFO][5030] ipam/ipam.go 158: Attempting to load block cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.229 [INFO][5030] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.229 [INFO][5030] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.112.64/26 handle="k8s-pod-network.83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.230 [INFO][5030] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8 Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.239 [INFO][5030] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.112.64/26 handle="k8s-pod-network.83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.249 [INFO][5030] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.112.69/26] block=192.168.112.64/26 handle="k8s-pod-network.83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.249 [INFO][5030] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.112.69/26] handle="k8s-pod-network.83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.249 [INFO][5030] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:25:00.277256 containerd[1876]: 2025-09-12 17:25:00.249 [INFO][5030] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.112.69/26] IPv6=[] ContainerID="83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" HandleID="k8s-pod-network.83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--zbsdh-eth0" Sep 12 17:25:00.278293 containerd[1876]: 2025-09-12 17:25:00.254 [INFO][4987] cni-plugin/k8s.go 418: Populated endpoint ContainerID="83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" Namespace="calico-apiserver" Pod="calico-apiserver-64d4864b98-zbsdh" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--zbsdh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--zbsdh-eth0", GenerateName:"calico-apiserver-64d4864b98-", Namespace:"calico-apiserver", SelfLink:"", UID:"a4c4cdec-52cb-4465-af8e-2767ba38a041", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64d4864b98", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"", Pod:"calico-apiserver-64d4864b98-zbsdh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.112.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali95b956c72bc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:25:00.278293 containerd[1876]: 2025-09-12 17:25:00.254 [INFO][4987] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.69/32] ContainerID="83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" Namespace="calico-apiserver" Pod="calico-apiserver-64d4864b98-zbsdh" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--zbsdh-eth0" Sep 12 17:25:00.278293 containerd[1876]: 2025-09-12 17:25:00.254 [INFO][4987] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali95b956c72bc ContainerID="83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" Namespace="calico-apiserver" Pod="calico-apiserver-64d4864b98-zbsdh" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--zbsdh-eth0" Sep 12 17:25:00.278293 containerd[1876]: 2025-09-12 17:25:00.260 [INFO][4987] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" Namespace="calico-apiserver" 
Pod="calico-apiserver-64d4864b98-zbsdh" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--zbsdh-eth0" Sep 12 17:25:00.278293 containerd[1876]: 2025-09-12 17:25:00.262 [INFO][4987] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" Namespace="calico-apiserver" Pod="calico-apiserver-64d4864b98-zbsdh" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--zbsdh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--zbsdh-eth0", GenerateName:"calico-apiserver-64d4864b98-", Namespace:"calico-apiserver", SelfLink:"", UID:"a4c4cdec-52cb-4465-af8e-2767ba38a041", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64d4864b98", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8", Pod:"calico-apiserver-64d4864b98-zbsdh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.112.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali95b956c72bc", MAC:"3a:ee:26:ba:e3:17", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:25:00.278293 containerd[1876]: 2025-09-12 17:25:00.275 [INFO][4987] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" Namespace="calico-apiserver" Pod="calico-apiserver-64d4864b98-zbsdh" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-calico--apiserver--64d4864b98--zbsdh-eth0" Sep 12 17:25:00.300981 containerd[1876]: time="2025-09-12T17:25:00.300376251Z" level=info msg="connecting to shim 4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78" address="unix:///run/containerd/s/843fddde434f4f81bef67faa8404d63e60e55ac8c25e871b8ac4ddc42a44213a" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:25:00.321660 systemd[1]: Started cri-containerd-4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78.scope - libcontainer container 4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78. 
Sep 12 17:25:00.358226 systemd-networkd[1695]: cali07fefbd0c6c: Link UP Sep 12 17:25:00.359078 systemd-networkd[1695]: cali07fefbd0c6c: Gained carrier Sep 12 17:25:00.376902 containerd[1876]: time="2025-09-12T17:25:00.376848089Z" level=info msg="connecting to shim 83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8" address="unix:///run/containerd/s/ce027a8d72118f3074050855513981fefc6719628833c99c6617eb02cc04d03f" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.045 [INFO][4998] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--fn7z8-eth0 coredns-7c65d6cfc9- kube-system 19c62e4a-5240-4d3f-8cb7-6612cf508617 787 0 2025-09-12 17:24:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426.1.0-a-2d28ed79c9 coredns-7c65d6cfc9-fn7z8 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali07fefbd0c6c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-fn7z8" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--fn7z8-" Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.047 [INFO][4998] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-fn7z8" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--fn7z8-eth0" Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.097 [INFO][5037] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" 
HandleID="k8s-pod-network.41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--fn7z8-eth0" Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.097 [INFO][5037] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" HandleID="k8s-pod-network.41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--fn7z8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d36c0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426.1.0-a-2d28ed79c9", "pod":"coredns-7c65d6cfc9-fn7z8", "timestamp":"2025-09-12 17:25:00.097589183 +0000 UTC"}, Hostname:"ci-4426.1.0-a-2d28ed79c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.097 [INFO][5037] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.249 [INFO][5037] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.249 [INFO][5037] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-2d28ed79c9' Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.266 [INFO][5037] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.319 [INFO][5037] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.326 [INFO][5037] ipam/ipam.go 511: Trying affinity for 192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.328 [INFO][5037] ipam/ipam.go 158: Attempting to load block cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.330 [INFO][5037] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.330 [INFO][5037] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.112.64/26 handle="k8s-pod-network.41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.331 [INFO][5037] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7 Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.340 [INFO][5037] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.112.64/26 handle="k8s-pod-network.41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.351 [INFO][5037] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.112.70/26] block=192.168.112.64/26 handle="k8s-pod-network.41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.351 [INFO][5037] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.112.70/26] handle="k8s-pod-network.41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.351 [INFO][5037] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:25:00.385109 containerd[1876]: 2025-09-12 17:25:00.352 [INFO][5037] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.112.70/26] IPv6=[] ContainerID="41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" HandleID="k8s-pod-network.41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--fn7z8-eth0" Sep 12 17:25:00.385459 containerd[1876]: 2025-09-12 17:25:00.355 [INFO][4998] cni-plugin/k8s.go 418: Populated endpoint ContainerID="41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-fn7z8" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--fn7z8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--fn7z8-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"19c62e4a-5240-4d3f-8cb7-6612cf508617", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"", Pod:"coredns-7c65d6cfc9-fn7z8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.112.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali07fefbd0c6c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:25:00.385459 containerd[1876]: 2025-09-12 17:25:00.355 [INFO][4998] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.70/32] ContainerID="41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-fn7z8" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--fn7z8-eth0" Sep 12 17:25:00.385459 containerd[1876]: 2025-09-12 17:25:00.355 [INFO][4998] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali07fefbd0c6c ContainerID="41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-fn7z8" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--fn7z8-eth0" Sep 12 17:25:00.385459 containerd[1876]: 2025-09-12 17:25:00.359 [INFO][4998] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-fn7z8" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--fn7z8-eth0" Sep 12 17:25:00.385459 containerd[1876]: 2025-09-12 17:25:00.360 [INFO][4998] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-fn7z8" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--fn7z8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--fn7z8-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"19c62e4a-5240-4d3f-8cb7-6612cf508617", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7", Pod:"coredns-7c65d6cfc9-fn7z8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.112.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali07fefbd0c6c", 
MAC:"ce:b2:83:d9:05:57", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:25:00.385459 containerd[1876]: 2025-09-12 17:25:00.382 [INFO][4998] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-fn7z8" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-coredns--7c65d6cfc9--fn7z8-eth0" Sep 12 17:25:00.398740 systemd[1]: Started cri-containerd-83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8.scope - libcontainer container 83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8. 
Sep 12 17:25:00.405747 containerd[1876]: time="2025-09-12T17:25:00.405710695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64d4864b98-lq477,Uid:6063725a-5f6c-46c2-a715-57fd6390cb34,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78\"" Sep 12 17:25:00.457927 systemd-networkd[1695]: cali26a02b1d498: Link UP Sep 12 17:25:00.458077 systemd-networkd[1695]: cali26a02b1d498: Gained carrier Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.073 [INFO][5010] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--2d28ed79c9-k8s-csi--node--driver--9s9cm-eth0 csi-node-driver- calico-system c4043ee3-d7ce-4761-b52a-0afce638c51a 681 0 2025-09-12 17:24:35 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4426.1.0-a-2d28ed79c9 csi-node-driver-9s9cm eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali26a02b1d498 [] [] }} ContainerID="c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" Namespace="calico-system" Pod="csi-node-driver-9s9cm" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-csi--node--driver--9s9cm-" Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.074 [INFO][5010] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" Namespace="calico-system" Pod="csi-node-driver-9s9cm" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-csi--node--driver--9s9cm-eth0" Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.101 [INFO][5045] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" HandleID="k8s-pod-network.c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-csi--node--driver--9s9cm-eth0" Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.101 [INFO][5045] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" HandleID="k8s-pod-network.c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-csi--node--driver--9s9cm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000330230), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.1.0-a-2d28ed79c9", "pod":"csi-node-driver-9s9cm", "timestamp":"2025-09-12 17:25:00.100507821 +0000 UTC"}, Hostname:"ci-4426.1.0-a-2d28ed79c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.101 [INFO][5045] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.352 [INFO][5045] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.352 [INFO][5045] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-2d28ed79c9' Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.364 [INFO][5045] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.416 [INFO][5045] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.430 [INFO][5045] ipam/ipam.go 511: Trying affinity for 192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.433 [INFO][5045] ipam/ipam.go 158: Attempting to load block cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.435 [INFO][5045] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.435 [INFO][5045] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.112.64/26 handle="k8s-pod-network.c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.436 [INFO][5045] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2 Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.443 [INFO][5045] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.112.64/26 handle="k8s-pod-network.c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.452 [INFO][5045] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.112.71/26] block=192.168.112.64/26 handle="k8s-pod-network.c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.452 [INFO][5045] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.112.71/26] handle="k8s-pod-network.c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" host="ci-4426.1.0-a-2d28ed79c9" Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.452 [INFO][5045] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:25:00.476889 containerd[1876]: 2025-09-12 17:25:00.452 [INFO][5045] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.112.71/26] IPv6=[] ContainerID="c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" HandleID="k8s-pod-network.c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-csi--node--driver--9s9cm-eth0" Sep 12 17:25:00.477500 containerd[1876]: 2025-09-12 17:25:00.454 [INFO][5010] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" Namespace="calico-system" Pod="csi-node-driver-9s9cm" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-csi--node--driver--9s9cm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-csi--node--driver--9s9cm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c4043ee3-d7ce-4761-b52a-0afce638c51a", ResourceVersion:"681", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"", Pod:"csi-node-driver-9s9cm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.112.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali26a02b1d498", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:25:00.477500 containerd[1876]: 2025-09-12 17:25:00.454 [INFO][5010] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.71/32] ContainerID="c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" Namespace="calico-system" Pod="csi-node-driver-9s9cm" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-csi--node--driver--9s9cm-eth0" Sep 12 17:25:00.477500 containerd[1876]: 2025-09-12 17:25:00.454 [INFO][5010] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali26a02b1d498 ContainerID="c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" Namespace="calico-system" Pod="csi-node-driver-9s9cm" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-csi--node--driver--9s9cm-eth0" Sep 12 17:25:00.477500 containerd[1876]: 2025-09-12 17:25:00.458 [INFO][5010] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" Namespace="calico-system" Pod="csi-node-driver-9s9cm" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-csi--node--driver--9s9cm-eth0" Sep 12 17:25:00.477500 
containerd[1876]: 2025-09-12 17:25:00.458 [INFO][5010] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" Namespace="calico-system" Pod="csi-node-driver-9s9cm" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-csi--node--driver--9s9cm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-csi--node--driver--9s9cm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c4043ee3-d7ce-4761-b52a-0afce638c51a", ResourceVersion:"681", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2", Pod:"csi-node-driver-9s9cm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.112.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali26a02b1d498", MAC:"9e:1b:a8:41:1e:b1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:25:00.477500 containerd[1876]: 
2025-09-12 17:25:00.474 [INFO][5010] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" Namespace="calico-system" Pod="csi-node-driver-9s9cm" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-csi--node--driver--9s9cm-eth0" Sep 12 17:25:00.594111 containerd[1876]: time="2025-09-12T17:25:00.594016774Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64d4864b98-zbsdh,Uid:a4c4cdec-52cb-4465-af8e-2767ba38a041,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8\"" Sep 12 17:25:00.651320 containerd[1876]: time="2025-09-12T17:25:00.651206289Z" level=info msg="connecting to shim c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2" address="unix:///run/containerd/s/2957fc945c488f4fc50bb6496b23b2cae30e9e579f6e54835ae1b328304e7013" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:25:00.651623 containerd[1876]: time="2025-09-12T17:25:00.651218584Z" level=info msg="connecting to shim 41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7" address="unix:///run/containerd/s/c6c5c6a12885c46d623bf03d0050061666133a8c9b346470f82d1506e5de1cf3" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:25:00.666651 systemd[1]: Started cri-containerd-41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7.scope - libcontainer container 41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7. Sep 12 17:25:00.670136 systemd[1]: Started cri-containerd-c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2.scope - libcontainer container c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2. 
Sep 12 17:25:00.722357 containerd[1876]: time="2025-09-12T17:25:00.722263483Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-fn7z8,Uid:19c62e4a-5240-4d3f-8cb7-6612cf508617,Namespace:kube-system,Attempt:0,} returns sandbox id \"41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7\"" Sep 12 17:25:00.725820 containerd[1876]: time="2025-09-12T17:25:00.724748354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9s9cm,Uid:c4043ee3-d7ce-4761-b52a-0afce638c51a,Namespace:calico-system,Attempt:0,} returns sandbox id \"c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2\"" Sep 12 17:25:00.726949 containerd[1876]: time="2025-09-12T17:25:00.726908836Z" level=info msg="CreateContainer within sandbox \"41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:25:00.769659 containerd[1876]: time="2025-09-12T17:25:00.769630506Z" level=info msg="Container dcb55af795be111b1212f8bd0f4c1823df8bdce04008571437d37b75d28be503: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:25:00.780844 containerd[1876]: time="2025-09-12T17:25:00.780781400Z" level=info msg="CreateContainer within sandbox \"41d573d4aa3a0fc398d3a34e2c4e7a54f0494ba621af725ac0f336b9a77552a7\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"dcb55af795be111b1212f8bd0f4c1823df8bdce04008571437d37b75d28be503\"" Sep 12 17:25:00.781705 containerd[1876]: time="2025-09-12T17:25:00.781310089Z" level=info msg="StartContainer for \"dcb55af795be111b1212f8bd0f4c1823df8bdce04008571437d37b75d28be503\"" Sep 12 17:25:00.782415 containerd[1876]: time="2025-09-12T17:25:00.782383915Z" level=info msg="connecting to shim dcb55af795be111b1212f8bd0f4c1823df8bdce04008571437d37b75d28be503" address="unix:///run/containerd/s/c6c5c6a12885c46d623bf03d0050061666133a8c9b346470f82d1506e5de1cf3" protocol=ttrpc version=3 Sep 12 17:25:00.802646 systemd[1]: Started 
cri-containerd-dcb55af795be111b1212f8bd0f4c1823df8bdce04008571437d37b75d28be503.scope - libcontainer container dcb55af795be111b1212f8bd0f4c1823df8bdce04008571437d37b75d28be503. Sep 12 17:25:00.834761 containerd[1876]: time="2025-09-12T17:25:00.834735207Z" level=info msg="StartContainer for \"dcb55af795be111b1212f8bd0f4c1823df8bdce04008571437d37b75d28be503\" returns successfully" Sep 12 17:25:00.942945 containerd[1876]: time="2025-09-12T17:25:00.942560400Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-x2xqz,Uid:dc6ee434-cb48-4791-b62b-b85df19a6240,Namespace:calico-system,Attempt:0,}" Sep 12 17:25:01.042651 systemd-networkd[1695]: calib96b75b0b45: Gained IPv6LL Sep 12 17:25:01.050173 systemd-networkd[1695]: cali153f808d4bc: Link UP Sep 12 17:25:01.050981 systemd-networkd[1695]: cali153f808d4bc: Gained carrier Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:00.983 [INFO][5304] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--2d28ed79c9-k8s-goldmane--7988f88666--x2xqz-eth0 goldmane-7988f88666- calico-system dc6ee434-cb48-4791-b62b-b85df19a6240 798 0 2025-09-12 17:24:35 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4426.1.0-a-2d28ed79c9 goldmane-7988f88666-x2xqz eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali153f808d4bc [] [] }} ContainerID="6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" Namespace="calico-system" Pod="goldmane-7988f88666-x2xqz" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-goldmane--7988f88666--x2xqz-" Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:00.983 [INFO][5304] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" Namespace="calico-system" Pod="goldmane-7988f88666-x2xqz" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-goldmane--7988f88666--x2xqz-eth0" Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.007 [INFO][5316] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" HandleID="k8s-pod-network.6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-goldmane--7988f88666--x2xqz-eth0" Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.007 [INFO][5316] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" HandleID="k8s-pod-network.6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-goldmane--7988f88666--x2xqz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000255640), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.1.0-a-2d28ed79c9", "pod":"goldmane-7988f88666-x2xqz", "timestamp":"2025-09-12 17:25:01.007654675 +0000 UTC"}, Hostname:"ci-4426.1.0-a-2d28ed79c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.007 [INFO][5316] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.007 [INFO][5316] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.007 [INFO][5316] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-2d28ed79c9'
Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.013 [INFO][5316] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" host="ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.018 [INFO][5316] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.022 [INFO][5316] ipam/ipam.go 511: Trying affinity for 192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.024 [INFO][5316] ipam/ipam.go 158: Attempting to load block cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.027 [INFO][5316] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.027 [INFO][5316] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.112.64/26 handle="k8s-pod-network.6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" host="ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.029 [INFO][5316] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4
Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.034 [INFO][5316] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.112.64/26 handle="k8s-pod-network.6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" host="ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.044 [INFO][5316] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.112.72/26] block=192.168.112.64/26 handle="k8s-pod-network.6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" host="ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.044 [INFO][5316] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.112.72/26] handle="k8s-pod-network.6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" host="ci-4426.1.0-a-2d28ed79c9"
Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.044 [INFO][5316] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 12 17:25:01.067562 containerd[1876]: 2025-09-12 17:25:01.044 [INFO][5316] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.112.72/26] IPv6=[] ContainerID="6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" HandleID="k8s-pod-network.6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" Workload="ci--4426.1.0--a--2d28ed79c9-k8s-goldmane--7988f88666--x2xqz-eth0"
Sep 12 17:25:01.068045 containerd[1876]: 2025-09-12 17:25:01.046 [INFO][5304] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" Namespace="calico-system" Pod="goldmane-7988f88666-x2xqz" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-goldmane--7988f88666--x2xqz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-goldmane--7988f88666--x2xqz-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"dc6ee434-cb48-4791-b62b-b85df19a6240", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 35, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"", Pod:"goldmane-7988f88666-x2xqz", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.112.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali153f808d4bc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 12 17:25:01.068045 containerd[1876]: 2025-09-12 17:25:01.046 [INFO][5304] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.72/32] ContainerID="6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" Namespace="calico-system" Pod="goldmane-7988f88666-x2xqz" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-goldmane--7988f88666--x2xqz-eth0"
Sep 12 17:25:01.068045 containerd[1876]: 2025-09-12 17:25:01.046 [INFO][5304] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali153f808d4bc ContainerID="6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" Namespace="calico-system" Pod="goldmane-7988f88666-x2xqz" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-goldmane--7988f88666--x2xqz-eth0"
Sep 12 17:25:01.068045 containerd[1876]: 2025-09-12 17:25:01.051 [INFO][5304] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" Namespace="calico-system" Pod="goldmane-7988f88666-x2xqz" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-goldmane--7988f88666--x2xqz-eth0"
Sep 12 17:25:01.068045 containerd[1876]: 2025-09-12 17:25:01.051 [INFO][5304] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" Namespace="calico-system" Pod="goldmane-7988f88666-x2xqz" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-goldmane--7988f88666--x2xqz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--2d28ed79c9-k8s-goldmane--7988f88666--x2xqz-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"dc6ee434-cb48-4791-b62b-b85df19a6240", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 35, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-2d28ed79c9", ContainerID:"6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4", Pod:"goldmane-7988f88666-x2xqz", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.112.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali153f808d4bc", MAC:"06:3b:4a:12:ea:29", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 12 17:25:01.068045 containerd[1876]: 2025-09-12 17:25:01.063 [INFO][5304] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" Namespace="calico-system" Pod="goldmane-7988f88666-x2xqz" WorkloadEndpoint="ci--4426.1.0--a--2d28ed79c9-k8s-goldmane--7988f88666--x2xqz-eth0"
Sep 12 17:25:01.126606 containerd[1876]: time="2025-09-12T17:25:01.126559386Z" level=info msg="connecting to shim 6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4" address="unix:///run/containerd/s/b4e431ed2c36cf8a4466a8fbd9fd40edc51141ecec92e16d347e1f596e31b831" namespace=k8s.io protocol=ttrpc version=3
Sep 12 17:25:01.155753 systemd[1]: Started cri-containerd-6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4.scope - libcontainer container 6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4.
Sep 12 17:25:01.173305 kubelet[3290]: I0912 17:25:01.173259 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-fn7z8" podStartSLOduration=38.173214844 podStartE2EDuration="38.173214844s" podCreationTimestamp="2025-09-12 17:24:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:25:01.141500742 +0000 UTC m=+42.344723249" watchObservedRunningTime="2025-09-12 17:25:01.173214844 +0000 UTC m=+42.376437351"
Sep 12 17:25:01.218235 containerd[1876]: time="2025-09-12T17:25:01.218002954Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-x2xqz,Uid:dc6ee434-cb48-4791-b62b-b85df19a6240,Namespace:calico-system,Attempt:0,} returns sandbox id \"6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4\""
Sep 12 17:25:01.617822 systemd-networkd[1695]: cali07fefbd0c6c: Gained IPv6LL
Sep 12 17:25:01.745774 systemd-networkd[1695]: cali26a02b1d498: Gained IPv6LL
Sep 12 17:25:01.873800 systemd-networkd[1695]: calie0c64e51ee3: Gained IPv6LL
Sep 12 17:25:02.031433 containerd[1876]: time="2025-09-12T17:25:02.031390057Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:02.034588 containerd[1876]: time="2025-09-12T17:25:02.034561539Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957"
Sep 12 17:25:02.036896 containerd[1876]: time="2025-09-12T17:25:02.036852060Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:02.040811 containerd[1876]: time="2025-09-12T17:25:02.040769360Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:02.041293 containerd[1876]: time="2025-09-12T17:25:02.041141769Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 2.868859737s"
Sep 12 17:25:02.041293 containerd[1876]: time="2025-09-12T17:25:02.041165335Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\""
Sep 12 17:25:02.042304 containerd[1876]: time="2025-09-12T17:25:02.042281441Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 12 17:25:02.054054 containerd[1876]: time="2025-09-12T17:25:02.054027485Z" level=info msg="CreateContainer within sandbox \"8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Sep 12 17:25:02.077206 containerd[1876]: time="2025-09-12T17:25:02.076684023Z" level=info msg="Container 4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:25:02.095662 containerd[1876]: time="2025-09-12T17:25:02.095635545Z" level=info msg="CreateContainer within sandbox \"8a863a3fad20abeb9f7e5ce334078c1a54724ef97a2d30e4158135d17ce06286\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f\""
Sep 12 17:25:02.096602 containerd[1876]: time="2025-09-12T17:25:02.095901385Z" level=info msg="StartContainer for \"4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f\""
Sep 12 17:25:02.096785 containerd[1876]: time="2025-09-12T17:25:02.096765083Z" level=info msg="connecting to shim 4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f" address="unix:///run/containerd/s/3f3f1533a006b02c1babfc06ad08f4be3d4a6db6817ffb92d8f3bd8ea12c0405" protocol=ttrpc version=3
Sep 12 17:25:02.116631 systemd[1]: Started cri-containerd-4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f.scope - libcontainer container 4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f.
Sep 12 17:25:02.150397 containerd[1876]: time="2025-09-12T17:25:02.150324086Z" level=info msg="StartContainer for \"4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f\" returns successfully"
Sep 12 17:25:02.193670 systemd-networkd[1695]: cali95b956c72bc: Gained IPv6LL
Sep 12 17:25:02.257606 systemd-networkd[1695]: cali153f808d4bc: Gained IPv6LL
Sep 12 17:25:03.155086 kubelet[3290]: I0912 17:25:03.153494 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-774f5bcb5b-lmnmr" podStartSLOduration=25.283394737 podStartE2EDuration="28.153479239s" podCreationTimestamp="2025-09-12 17:24:35 +0000 UTC" firstStartedPulling="2025-09-12 17:24:59.171875543 +0000 UTC m=+40.375098050" lastFinishedPulling="2025-09-12 17:25:02.041960037 +0000 UTC m=+43.245182552" observedRunningTime="2025-09-12 17:25:03.151688279 +0000 UTC m=+44.354910794" watchObservedRunningTime="2025-09-12 17:25:03.153479239 +0000 UTC m=+44.356701762"
Sep 12 17:25:03.199005 containerd[1876]: time="2025-09-12T17:25:03.198971441Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f\" id:\"e866784efd9a935686df7f4fe003b82ccc0595523e3646bd70ec10be87b6a34d\" pid:5447 exited_at:{seconds:1757697903 nanos:194127951}"
Sep 12 17:25:05.085234 containerd[1876]: time="2025-09-12T17:25:05.084786360Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:05.087929 containerd[1876]: time="2025-09-12T17:25:05.087907597Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807"
Sep 12 17:25:05.090825 containerd[1876]: time="2025-09-12T17:25:05.090795233Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:05.106968 containerd[1876]: time="2025-09-12T17:25:05.106923331Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:05.107529 containerd[1876]: time="2025-09-12T17:25:05.107298164Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 3.064971637s"
Sep 12 17:25:05.107529 containerd[1876]: time="2025-09-12T17:25:05.107321674Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 12 17:25:05.108507 containerd[1876]: time="2025-09-12T17:25:05.108451644Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 12 17:25:05.109649 containerd[1876]: time="2025-09-12T17:25:05.109622115Z" level=info msg="CreateContainer within sandbox \"4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 12 17:25:05.134155 containerd[1876]: time="2025-09-12T17:25:05.133690941Z" level=info msg="Container e7c902f442f9f63fe4b77f62aa518285b188310b9007634a42d719ecd4b42bb5: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:25:05.155349 containerd[1876]: time="2025-09-12T17:25:05.155324832Z" level=info msg="CreateContainer within sandbox \"4a7242c9628a245328eb40862ddfea4764c144ea9454b21ea92a02f078b4fd78\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e7c902f442f9f63fe4b77f62aa518285b188310b9007634a42d719ecd4b42bb5\""
Sep 12 17:25:05.156649 containerd[1876]: time="2025-09-12T17:25:05.156625319Z" level=info msg="StartContainer for \"e7c902f442f9f63fe4b77f62aa518285b188310b9007634a42d719ecd4b42bb5\""
Sep 12 17:25:05.157331 containerd[1876]: time="2025-09-12T17:25:05.157308876Z" level=info msg="connecting to shim e7c902f442f9f63fe4b77f62aa518285b188310b9007634a42d719ecd4b42bb5" address="unix:///run/containerd/s/843fddde434f4f81bef67faa8404d63e60e55ac8c25e871b8ac4ddc42a44213a" protocol=ttrpc version=3
Sep 12 17:25:05.176621 systemd[1]: Started cri-containerd-e7c902f442f9f63fe4b77f62aa518285b188310b9007634a42d719ecd4b42bb5.scope - libcontainer container e7c902f442f9f63fe4b77f62aa518285b188310b9007634a42d719ecd4b42bb5.
Sep 12 17:25:05.206707 containerd[1876]: time="2025-09-12T17:25:05.206677988Z" level=info msg="StartContainer for \"e7c902f442f9f63fe4b77f62aa518285b188310b9007634a42d719ecd4b42bb5\" returns successfully"
Sep 12 17:25:05.476623 containerd[1876]: time="2025-09-12T17:25:05.476588304Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:05.480422 containerd[1876]: time="2025-09-12T17:25:05.480403322Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77"
Sep 12 17:25:05.481184 containerd[1876]: time="2025-09-12T17:25:05.481164371Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 372.544986ms"
Sep 12 17:25:05.481220 containerd[1876]: time="2025-09-12T17:25:05.481189217Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 12 17:25:05.483331 containerd[1876]: time="2025-09-12T17:25:05.482398990Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\""
Sep 12 17:25:05.484023 containerd[1876]: time="2025-09-12T17:25:05.483999842Z" level=info msg="CreateContainer within sandbox \"83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 12 17:25:05.515383 containerd[1876]: time="2025-09-12T17:25:05.515353030Z" level=info msg="Container 33f263465ef5cbdf10f2568c4f70b4215f87dc63355bfb4d8fc493209aef5764: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:25:05.537654 containerd[1876]: time="2025-09-12T17:25:05.537623393Z" level=info msg="CreateContainer within sandbox \"83f41a0568ea0dd45feae21d28d135bca9b47015ef615810bfa6779ceaae0dc8\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"33f263465ef5cbdf10f2568c4f70b4215f87dc63355bfb4d8fc493209aef5764\""
Sep 12 17:25:05.539588 containerd[1876]: time="2025-09-12T17:25:05.538075933Z" level=info msg="StartContainer for \"33f263465ef5cbdf10f2568c4f70b4215f87dc63355bfb4d8fc493209aef5764\""
Sep 12 17:25:05.539981 containerd[1876]: time="2025-09-12T17:25:05.539957263Z" level=info msg="connecting to shim 33f263465ef5cbdf10f2568c4f70b4215f87dc63355bfb4d8fc493209aef5764" address="unix:///run/containerd/s/ce027a8d72118f3074050855513981fefc6719628833c99c6617eb02cc04d03f" protocol=ttrpc version=3
Sep 12 17:25:05.555790 systemd[1]: Started cri-containerd-33f263465ef5cbdf10f2568c4f70b4215f87dc63355bfb4d8fc493209aef5764.scope - libcontainer container 33f263465ef5cbdf10f2568c4f70b4215f87dc63355bfb4d8fc493209aef5764.
Sep 12 17:25:05.621375 containerd[1876]: time="2025-09-12T17:25:05.621348122Z" level=info msg="StartContainer for \"33f263465ef5cbdf10f2568c4f70b4215f87dc63355bfb4d8fc493209aef5764\" returns successfully"
Sep 12 17:25:06.157842 kubelet[3290]: I0912 17:25:06.157798 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-64d4864b98-zbsdh" podStartSLOduration=29.271099809 podStartE2EDuration="34.157783805s" podCreationTimestamp="2025-09-12 17:24:32 +0000 UTC" firstStartedPulling="2025-09-12 17:25:00.595143141 +0000 UTC m=+41.798365648" lastFinishedPulling="2025-09-12 17:25:05.481827137 +0000 UTC m=+46.685049644" observedRunningTime="2025-09-12 17:25:06.157480624 +0000 UTC m=+47.360703131" watchObservedRunningTime="2025-09-12 17:25:06.157783805 +0000 UTC m=+47.361006320"
Sep 12 17:25:07.147894 kubelet[3290]: I0912 17:25:07.147854 3290 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 17:25:07.242947 containerd[1876]: time="2025-09-12T17:25:07.242907061Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:07.245601 containerd[1876]: time="2025-09-12T17:25:07.245571023Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489"
Sep 12 17:25:07.249550 containerd[1876]: time="2025-09-12T17:25:07.249414783Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:07.257399 containerd[1876]: time="2025-09-12T17:25:07.257278541Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:07.257896 containerd[1876]: time="2025-09-12T17:25:07.257854201Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.774510495s"
Sep 12 17:25:07.257896 containerd[1876]: time="2025-09-12T17:25:07.257889175Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\""
Sep 12 17:25:07.264647 containerd[1876]: time="2025-09-12T17:25:07.264610700Z" level=info msg="CreateContainer within sandbox \"c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Sep 12 17:25:07.265028 containerd[1876]: time="2025-09-12T17:25:07.264616867Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\""
Sep 12 17:25:07.297979 containerd[1876]: time="2025-09-12T17:25:07.297935461Z" level=info msg="Container d3443e891da99cc7fa5983dc490aeff54858be4fdcdfdb13da7ce3cd5e02eb5d: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:25:07.313721 containerd[1876]: time="2025-09-12T17:25:07.313642401Z" level=info msg="CreateContainer within sandbox \"c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"d3443e891da99cc7fa5983dc490aeff54858be4fdcdfdb13da7ce3cd5e02eb5d\""
Sep 12 17:25:07.314236 containerd[1876]: time="2025-09-12T17:25:07.314155281Z" level=info msg="StartContainer for \"d3443e891da99cc7fa5983dc490aeff54858be4fdcdfdb13da7ce3cd5e02eb5d\""
Sep 12 17:25:07.315783 containerd[1876]: time="2025-09-12T17:25:07.315733135Z" level=info msg="connecting to shim d3443e891da99cc7fa5983dc490aeff54858be4fdcdfdb13da7ce3cd5e02eb5d" address="unix:///run/containerd/s/2957fc945c488f4fc50bb6496b23b2cae30e9e579f6e54835ae1b328304e7013" protocol=ttrpc version=3
Sep 12 17:25:07.337001 systemd[1]: Started cri-containerd-d3443e891da99cc7fa5983dc490aeff54858be4fdcdfdb13da7ce3cd5e02eb5d.scope - libcontainer container d3443e891da99cc7fa5983dc490aeff54858be4fdcdfdb13da7ce3cd5e02eb5d.
Sep 12 17:25:07.390981 containerd[1876]: time="2025-09-12T17:25:07.390950019Z" level=info msg="StartContainer for \"d3443e891da99cc7fa5983dc490aeff54858be4fdcdfdb13da7ce3cd5e02eb5d\" returns successfully"
Sep 12 17:25:07.593247 kubelet[3290]: I0912 17:25:07.593063 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-64d4864b98-lq477" podStartSLOduration=30.898326203 podStartE2EDuration="35.593047845s" podCreationTimestamp="2025-09-12 17:24:32 +0000 UTC" firstStartedPulling="2025-09-12 17:25:00.413358825 +0000 UTC m=+41.616581340" lastFinishedPulling="2025-09-12 17:25:05.108080459 +0000 UTC m=+46.311302982" observedRunningTime="2025-09-12 17:25:06.175615388 +0000 UTC m=+47.378837903" watchObservedRunningTime="2025-09-12 17:25:07.593047845 +0000 UTC m=+48.796270352"
Sep 12 17:25:09.892394 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2628140019.mount: Deactivated successfully.
Sep 12 17:25:10.396514 containerd[1876]: time="2025-09-12T17:25:10.396477746Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:10.401592 containerd[1876]: time="2025-09-12T17:25:10.401568172Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332"
Sep 12 17:25:10.404659 containerd[1876]: time="2025-09-12T17:25:10.404623615Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:10.408259 containerd[1876]: time="2025-09-12T17:25:10.408219937Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:10.408795 containerd[1876]: time="2025-09-12T17:25:10.408601514Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 3.143506533s"
Sep 12 17:25:10.408795 containerd[1876]: time="2025-09-12T17:25:10.408625049Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\""
Sep 12 17:25:10.411088 containerd[1876]: time="2025-09-12T17:25:10.411068376Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 12 17:25:10.420065 containerd[1876]: time="2025-09-12T17:25:10.419915842Z" level=info msg="CreateContainer within sandbox \"6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Sep 12 17:25:10.441567 containerd[1876]: time="2025-09-12T17:25:10.441545861Z" level=info msg="Container 08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:25:10.473979 containerd[1876]: time="2025-09-12T17:25:10.473946025Z" level=info msg="CreateContainer within sandbox \"6ce6dcd15a58870258be8359cf2d7e125242b11aa32502049f771ce9459025f4\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7\""
Sep 12 17:25:10.475541 containerd[1876]: time="2025-09-12T17:25:10.474818813Z" level=info msg="StartContainer for \"08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7\""
Sep 12 17:25:10.475541 containerd[1876]: time="2025-09-12T17:25:10.475488637Z" level=info msg="connecting to shim 08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7" address="unix:///run/containerd/s/b4e431ed2c36cf8a4466a8fbd9fd40edc51141ecec92e16d347e1f596e31b831" protocol=ttrpc version=3
Sep 12 17:25:10.492627 systemd[1]: Started cri-containerd-08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7.scope - libcontainer container 08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7.
Sep 12 17:25:10.527956 containerd[1876]: time="2025-09-12T17:25:10.527920507Z" level=info msg="StartContainer for \"08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7\" returns successfully"
Sep 12 17:25:11.181633 kubelet[3290]: I0912 17:25:11.181539 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-x2xqz" podStartSLOduration=26.991076107 podStartE2EDuration="36.18150626s" podCreationTimestamp="2025-09-12 17:24:35 +0000 UTC" firstStartedPulling="2025-09-12 17:25:01.220412092 +0000 UTC m=+42.423634599" lastFinishedPulling="2025-09-12 17:25:10.410842245 +0000 UTC m=+51.614064752" observedRunningTime="2025-09-12 17:25:11.179081108 +0000 UTC m=+52.382303615" watchObservedRunningTime="2025-09-12 17:25:11.18150626 +0000 UTC m=+52.384728767"
Sep 12 17:25:11.223355 containerd[1876]: time="2025-09-12T17:25:11.223321504Z" level=info msg="TaskExit event in podsandbox handler container_id:\"08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7\" id:\"bba637c1d65b9f992f566d9614cb523d96c0b4240597c9f669fcd720ba8666a3\" pid:5633 exit_status:1 exited_at:{seconds:1757697911 nanos:223069279}"
Sep 12 17:25:12.228540 containerd[1876]: time="2025-09-12T17:25:12.228496772Z" level=info msg="TaskExit event in podsandbox handler container_id:\"08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7\" id:\"d277107aa31a1b43755d59379e0edae911d09fdf65724a04bdff36279000ccd8\" pid:5662 exit_status:1 exited_at:{seconds:1757697912 nanos:227991874}"
Sep 12 17:25:12.353991 containerd[1876]: time="2025-09-12T17:25:12.353943809Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:12.356582 containerd[1876]: time="2025-09-12T17:25:12.356556078Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208"
Sep 12 17:25:12.359087 containerd[1876]: time="2025-09-12T17:25:12.359046850Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:12.362577 containerd[1876]: time="2025-09-12T17:25:12.362532035Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:25:12.362944 containerd[1876]: time="2025-09-12T17:25:12.362921604Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.951701661s"
Sep 12 17:25:12.363022 containerd[1876]: time="2025-09-12T17:25:12.363009638Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\""
Sep 12 17:25:12.365493 containerd[1876]: time="2025-09-12T17:25:12.365473076Z" level=info msg="CreateContainer within sandbox \"c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 12 17:25:12.379011 containerd[1876]: time="2025-09-12T17:25:12.378652661Z" level=info msg="Container afa8f8b8a153fa7e5c81395a9930e84f8e73be6299793dca7cf8f5d1fac7b497: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:25:12.383841 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2071006937.mount: Deactivated successfully.
Sep 12 17:25:12.395537 containerd[1876]: time="2025-09-12T17:25:12.395480134Z" level=info msg="CreateContainer within sandbox \"c196d6c8fc04e796b3a931aa11aa85623707c3a586ddbebfdccd1ef9987d91a2\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"afa8f8b8a153fa7e5c81395a9930e84f8e73be6299793dca7cf8f5d1fac7b497\""
Sep 12 17:25:12.396002 containerd[1876]: time="2025-09-12T17:25:12.395936562Z" level=info msg="StartContainer for \"afa8f8b8a153fa7e5c81395a9930e84f8e73be6299793dca7cf8f5d1fac7b497\""
Sep 12 17:25:12.397732 containerd[1876]: time="2025-09-12T17:25:12.397703178Z" level=info msg="connecting to shim afa8f8b8a153fa7e5c81395a9930e84f8e73be6299793dca7cf8f5d1fac7b497" address="unix:///run/containerd/s/2957fc945c488f4fc50bb6496b23b2cae30e9e579f6e54835ae1b328304e7013" protocol=ttrpc version=3
Sep 12 17:25:12.415618 systemd[1]: Started cri-containerd-afa8f8b8a153fa7e5c81395a9930e84f8e73be6299793dca7cf8f5d1fac7b497.scope - libcontainer container afa8f8b8a153fa7e5c81395a9930e84f8e73be6299793dca7cf8f5d1fac7b497.
Sep 12 17:25:12.447457 containerd[1876]: time="2025-09-12T17:25:12.447427728Z" level=info msg="StartContainer for \"afa8f8b8a153fa7e5c81395a9930e84f8e73be6299793dca7cf8f5d1fac7b497\" returns successfully" 
Sep 12 17:25:13.042752 kubelet[3290]: I0912 17:25:13.042714 3290 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 
Sep 12 17:25:13.046719 containerd[1876]: time="2025-09-12T17:25:13.046689636Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a30465121f8e78ece5c0e0ef66b49e47a3230d1b34402535e01586d06c952c8a\" id:\"87eecbb6dddf7db276a376a35dc4ba8f71915dca62665f0918d2b69c02b36c1f\" pid:5723 exited_at:{seconds:1757697913 nanos:46226143}" 
Sep 12 17:25:13.047791 kubelet[3290]: I0912 17:25:13.047639 3290 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock 
Sep 12 17:25:13.130413 containerd[1876]: time="2025-09-12T17:25:13.130385408Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a30465121f8e78ece5c0e0ef66b49e47a3230d1b34402535e01586d06c952c8a\" id:\"4d2c2750ed784146b899ef6d25ec325c4561adb44db42b4aad412a43d6ca839d\" pid:5746 exited_at:{seconds:1757697913 nanos:130000271}" 
Sep 12 17:25:13.221241 containerd[1876]: time="2025-09-12T17:25:13.221210293Z" level=info msg="TaskExit event in podsandbox handler container_id:\"08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7\" id:\"2d7e6263f068d0da8d5dbd1b598efa3db72e86d17ef402b540486835880016fb\" pid:5771 exit_status:1 exited_at:{seconds:1757697913 nanos:220905415}" 
Sep 12 17:25:15.789998 containerd[1876]: time="2025-09-12T17:25:15.789950389Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f\" id:\"2425436bc1c83cf419b87a2c269d6a70d77a9aa95d389a14f4101b8756ab081c\" pid:5794 exited_at:{seconds:1757697915 nanos:789215721}" 
Sep 12 17:25:18.628135 containerd[1876]: time="2025-09-12T17:25:18.628055420Z" level=info msg="TaskExit event in podsandbox handler container_id:\"08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7\" id:\"96e129ada9fbff8a188c488958123f8d84dbe14b7419291f93c14f642039a68d\" pid:5816 exited_at:{seconds:1757697918 nanos:627786990}" 
Sep 12 17:25:19.063735 containerd[1876]: time="2025-09-12T17:25:19.063016052Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f\" id:\"75f9f7de2546cae2d8cd6f19c6cdb8b8976a5a6ff85e4c2fc04b506f8a2f6165\" pid:5840 exited_at:{seconds:1757697919 nanos:62752998}" 
Sep 12 17:25:21.048513 containerd[1876]: time="2025-09-12T17:25:21.048460338Z" level=info msg="TaskExit event in podsandbox handler container_id:\"08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7\" id:\"3736c3252305303dde5205a75664e98882a9508025351d399d18d2fd1bc6d46e\" pid:5863 exited_at:{seconds:1757697921 nanos:47950437}" 
Sep 12 17:25:21.063330 kubelet[3290]: I0912 17:25:21.063189 3290 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-9s9cm" podStartSLOduration=34.426629178 podStartE2EDuration="46.06317557s" podCreationTimestamp="2025-09-12 17:24:35 +0000 UTC" firstStartedPulling="2025-09-12 17:25:00.726997255 +0000 UTC m=+41.930219762" lastFinishedPulling="2025-09-12 17:25:12.363543647 +0000 UTC m=+53.566766154" observedRunningTime="2025-09-12 17:25:13.187363984 +0000 UTC m=+54.390586547" watchObservedRunningTime="2025-09-12 17:25:21.06317557 +0000 UTC m=+62.266398077" 
Sep 12 17:25:28.175340 kubelet[3290]: I0912 17:25:28.175108 3290 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" 
Sep 12 17:25:43.063896 containerd[1876]: time="2025-09-12T17:25:43.063854057Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a30465121f8e78ece5c0e0ef66b49e47a3230d1b34402535e01586d06c952c8a\" id:\"d612c5af0ed6416296623ee3f7fac9ac2c6817b3d6ffd14d6331aaa07aa290f0\" pid:5907 exited_at:{seconds:1757697943 nanos:61177713}" 
Sep 12 17:25:45.774211 containerd[1876]: time="2025-09-12T17:25:45.774158732Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f\" id:\"dd0467d5c29cd48a047014a75b2bbc26076078a9969a32461a5c50c00494f173\" pid:5931 exited_at:{seconds:1757697945 nanos:773899133}" 
Sep 12 17:25:51.070451 containerd[1876]: time="2025-09-12T17:25:51.070408159Z" level=info msg="TaskExit event in podsandbox handler container_id:\"08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7\" id:\"42d4d9925c033f220f4f76bbbdb0b8c5330eb2e2571f925eb934a24a64920893\" pid:5952 exited_at:{seconds:1757697951 nanos:70131104}" 
Sep 12 17:26:13.036314 containerd[1876]: time="2025-09-12T17:26:13.036270210Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a30465121f8e78ece5c0e0ef66b49e47a3230d1b34402535e01586d06c952c8a\" id:\"6cc76b0af35d3f7fecb8129604f6a732207471945496ca70d4e590684b9da0cc\" pid:5980 exited_at:{seconds:1757697973 nanos:36029856}" 
Sep 12 17:26:15.774221 containerd[1876]: time="2025-09-12T17:26:15.774181045Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f\" id:\"e24228432b600f44e66dcfe0d1f3d35c11e044823f3012d81b65dc7c2bc55cd7\" pid:6006 exited_at:{seconds:1757697975 nanos:773910389}" 
Sep 12 17:26:18.620984 containerd[1876]: time="2025-09-12T17:26:18.620943013Z" level=info msg="TaskExit event in podsandbox handler container_id:\"08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7\" id:\"06dc4b6f2302eec7e6df8c9f92b0cd00735c6495290a75fb9adc8167352ccdc8\" pid:6027 exited_at:{seconds:1757697978 nanos:620723754}" 
Sep 12 17:26:19.011748 containerd[1876]: time="2025-09-12T17:26:19.011714145Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f\" id:\"d4b57b27b383e15179eabac2399d323772b8103d442f2250d11f540d78c15db7\" pid:6051 exited_at:{seconds:1757697979 nanos:11588352}" 
Sep 12 17:26:21.045512 containerd[1876]: time="2025-09-12T17:26:21.045472772Z" level=info msg="TaskExit event in podsandbox handler container_id:\"08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7\" id:\"ab2f89351c6f0db44adc4e729651faa56e9406983bebe74386aa977b362c9612\" pid:6071 exited_at:{seconds:1757697981 nanos:45160400}" 
Sep 12 17:26:43.032280 containerd[1876]: time="2025-09-12T17:26:43.032022128Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a30465121f8e78ece5c0e0ef66b49e47a3230d1b34402535e01586d06c952c8a\" id:\"173ac26be0670b5f432cf3932f1e74fd7ca19ed948291894b22fc4449b0c3b4a\" pid:6122 exited_at:{seconds:1757698003 nanos:31778431}" 
Sep 12 17:26:45.773679 containerd[1876]: time="2025-09-12T17:26:45.773642230Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f\" id:\"7aa6c050bcea5ba84a6dc8a3b000987c6b4da20f8ee696ec3ac32d54c088b513\" pid:6144 exited_at:{seconds:1757698005 nanos:773390654}" 
Sep 12 17:26:51.048332 containerd[1876]: time="2025-09-12T17:26:51.048267292Z" level=info msg="TaskExit event in podsandbox handler container_id:\"08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7\" id:\"8e4e666e6090cb341c4b5485b997bcba26cb886bc34ea03581bcc82c146db29a\" pid:6164 exited_at:{seconds:1757698011 nanos:47915658}" 
Sep 12 17:27:05.667181 systemd[1]: Started sshd@7-10.200.20.21:22-10.200.16.10:39066.service - OpenSSH per-connection server daemon (10.200.16.10:39066). 
Sep 12 17:27:06.084601 sshd[6183]: Accepted publickey for core from 10.200.16.10 port 39066 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:27:06.086496 sshd-session[6183]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:27:06.090315 systemd-logind[1855]: New session 10 of user core. 
Sep 12 17:27:06.098873 systemd[1]: Started session-10.scope - Session 10 of User core. 
Sep 12 17:27:06.441768 sshd[6186]: Connection closed by 10.200.16.10 port 39066 
Sep 12 17:27:06.442252 sshd-session[6183]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:27:06.445130 systemd[1]: sshd@7-10.200.20.21:22-10.200.16.10:39066.service: Deactivated successfully. 
Sep 12 17:27:06.448318 systemd[1]: session-10.scope: Deactivated successfully. 
Sep 12 17:27:06.448993 systemd-logind[1855]: Session 10 logged out. Waiting for processes to exit. 
Sep 12 17:27:06.450405 systemd-logind[1855]: Removed session 10. 
Sep 12 17:27:11.516729 systemd[1]: Started sshd@8-10.200.20.21:22-10.200.16.10:56238.service - OpenSSH per-connection server daemon (10.200.16.10:56238). 
Sep 12 17:27:11.932759 sshd[6198]: Accepted publickey for core from 10.200.16.10 port 56238 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:27:11.934447 sshd-session[6198]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:27:11.940980 systemd-logind[1855]: New session 11 of user core. 
Sep 12 17:27:11.946641 systemd[1]: Started session-11.scope - Session 11 of User core. 
Sep 12 17:27:12.304920 sshd[6201]: Connection closed by 10.200.16.10 port 56238 
Sep 12 17:27:12.304832 sshd-session[6198]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:27:12.309858 systemd-logind[1855]: Session 11 logged out. Waiting for processes to exit. 
Sep 12 17:27:12.310832 systemd[1]: sshd@8-10.200.20.21:22-10.200.16.10:56238.service: Deactivated successfully. 
Sep 12 17:27:12.314467 systemd[1]: session-11.scope: Deactivated successfully. 
Sep 12 17:27:12.316121 systemd-logind[1855]: Removed session 11. 
Sep 12 17:27:13.034387 containerd[1876]: time="2025-09-12T17:27:13.034349367Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a30465121f8e78ece5c0e0ef66b49e47a3230d1b34402535e01586d06c952c8a\" id:\"7b4c2b027fd6d994290fc8987b5c37bec508819e7a39dc01bebc2eab594e3838\" pid:6226 exited_at:{seconds:1757698033 nanos:34058753}" 
Sep 12 17:27:15.778197 containerd[1876]: time="2025-09-12T17:27:15.778157453Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f\" id:\"4deb7c0ae75082065b39936dc34c50d1ce58bea6473cc06c236cb5a5c6fefd9d\" pid:6249 exited_at:{seconds:1757698035 nanos:777141220}" 
Sep 12 17:27:17.398698 systemd[1]: Started sshd@9-10.200.20.21:22-10.200.16.10:56246.service - OpenSSH per-connection server daemon (10.200.16.10:56246). 
Sep 12 17:27:17.848099 sshd[6259]: Accepted publickey for core from 10.200.16.10 port 56246 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:27:17.848922 sshd-session[6259]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:27:17.852356 systemd-logind[1855]: New session 12 of user core. 
Sep 12 17:27:17.857620 systemd[1]: Started session-12.scope - Session 12 of User core. 
Sep 12 17:27:18.219349 sshd[6262]: Connection closed by 10.200.16.10 port 56246 
Sep 12 17:27:18.219668 sshd-session[6259]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:27:18.223002 systemd[1]: sshd@9-10.200.20.21:22-10.200.16.10:56246.service: Deactivated successfully. 
Sep 12 17:27:18.224860 systemd[1]: session-12.scope: Deactivated successfully. 
Sep 12 17:27:18.226208 systemd-logind[1855]: Session 12 logged out. Waiting for processes to exit. 
Sep 12 17:27:18.227364 systemd-logind[1855]: Removed session 12. 
Sep 12 17:27:18.294001 systemd[1]: Started sshd@10-10.200.20.21:22-10.200.16.10:56248.service - OpenSSH per-connection server daemon (10.200.16.10:56248). 
Sep 12 17:27:18.617884 containerd[1876]: time="2025-09-12T17:27:18.617784643Z" level=info msg="TaskExit event in podsandbox handler container_id:\"08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7\" id:\"3b0913fb90349034e7b894340d3752b36bea0dc048766cba958c8fe386daad1a\" pid:6290 exited_at:{seconds:1757698038 nanos:617412347}" 
Sep 12 17:27:18.704848 sshd[6275]: Accepted publickey for core from 10.200.16.10 port 56248 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:27:18.705813 sshd-session[6275]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:27:18.709134 systemd-logind[1855]: New session 13 of user core. 
Sep 12 17:27:18.716639 systemd[1]: Started session-13.scope - Session 13 of User core. 
Sep 12 17:27:19.019009 containerd[1876]: time="2025-09-12T17:27:19.018921332Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f\" id:\"358287365ba0fd52f25f14bf1140eff472c5c0ca599508eeb0e49fe861a9bcba\" pid:6319 exited_at:{seconds:1757698039 nanos:18509734}" 
Sep 12 17:27:19.077525 sshd[6299]: Connection closed by 10.200.16.10 port 56248 
Sep 12 17:27:19.077911 sshd-session[6275]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:27:19.081887 systemd[1]: sshd@10-10.200.20.21:22-10.200.16.10:56248.service: Deactivated successfully. 
Sep 12 17:27:19.084217 systemd[1]: session-13.scope: Deactivated successfully. 
Sep 12 17:27:19.086082 systemd-logind[1855]: Session 13 logged out. Waiting for processes to exit. 
Sep 12 17:27:19.087182 systemd-logind[1855]: Removed session 13. 
Sep 12 17:27:19.157601 systemd[1]: Started sshd@11-10.200.20.21:22-10.200.16.10:56260.service - OpenSSH per-connection server daemon (10.200.16.10:56260). 
Sep 12 17:27:19.577865 sshd[6334]: Accepted publickey for core from 10.200.16.10 port 56260 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:27:19.579186 sshd-session[6334]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:27:19.583684 systemd-logind[1855]: New session 14 of user core. 
Sep 12 17:27:19.586622 systemd[1]: Started session-14.scope - Session 14 of User core. 
Sep 12 17:27:19.930136 sshd[6337]: Connection closed by 10.200.16.10 port 56260 
Sep 12 17:27:19.930447 sshd-session[6334]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:27:19.934073 systemd-logind[1855]: Session 14 logged out. Waiting for processes to exit. 
Sep 12 17:27:19.934220 systemd[1]: sshd@11-10.200.20.21:22-10.200.16.10:56260.service: Deactivated successfully. 
Sep 12 17:27:19.937267 systemd[1]: session-14.scope: Deactivated successfully. 
Sep 12 17:27:19.939189 systemd-logind[1855]: Removed session 14. 
Sep 12 17:27:21.046380 containerd[1876]: time="2025-09-12T17:27:21.046343219Z" level=info msg="TaskExit event in podsandbox handler container_id:\"08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7\" id:\"b63e411ce5a9ba0fa94d6de09645249997ebec1aaf6f845aa3929b47adb72362\" pid:6363 exited_at:{seconds:1757698041 nanos:46185980}" 
Sep 12 17:27:25.013603 systemd[1]: Started sshd@12-10.200.20.21:22-10.200.16.10:52690.service - OpenSSH per-connection server daemon (10.200.16.10:52690). 
Sep 12 17:27:25.465089 sshd[6374]: Accepted publickey for core from 10.200.16.10 port 52690 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:27:25.466088 sshd-session[6374]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:27:25.469772 systemd-logind[1855]: New session 15 of user core. 
Sep 12 17:27:25.474635 systemd[1]: Started session-15.scope - Session 15 of User core. 
Sep 12 17:27:25.834188 sshd[6379]: Connection closed by 10.200.16.10 port 52690 
Sep 12 17:27:25.834634 sshd-session[6374]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:27:25.837428 systemd-logind[1855]: Session 15 logged out. Waiting for processes to exit. 
Sep 12 17:27:25.838029 systemd[1]: sshd@12-10.200.20.21:22-10.200.16.10:52690.service: Deactivated successfully. 
Sep 12 17:27:25.839513 systemd[1]: session-15.scope: Deactivated successfully. 
Sep 12 17:27:25.841444 systemd-logind[1855]: Removed session 15. 
Sep 12 17:27:30.910496 systemd[1]: Started sshd@13-10.200.20.21:22-10.200.16.10:41614.service - OpenSSH per-connection server daemon (10.200.16.10:41614). 
Sep 12 17:27:31.324307 sshd[6392]: Accepted publickey for core from 10.200.16.10 port 41614 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:27:31.325562 sshd-session[6392]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:27:31.328858 systemd-logind[1855]: New session 16 of user core. 
Sep 12 17:27:31.340791 systemd[1]: Started session-16.scope - Session 16 of User core. 
Sep 12 17:27:31.684246 sshd[6395]: Connection closed by 10.200.16.10 port 41614 
Sep 12 17:27:31.684728 sshd-session[6392]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:27:31.688089 systemd[1]: sshd@13-10.200.20.21:22-10.200.16.10:41614.service: Deactivated successfully. 
Sep 12 17:27:31.689573 systemd[1]: session-16.scope: Deactivated successfully. 
Sep 12 17:27:31.691591 systemd-logind[1855]: Session 16 logged out. Waiting for processes to exit. 
Sep 12 17:27:31.692745 systemd-logind[1855]: Removed session 16. 
Sep 12 17:27:36.772293 systemd[1]: Started sshd@14-10.200.20.21:22-10.200.16.10:41628.service - OpenSSH per-connection server daemon (10.200.16.10:41628). 
Sep 12 17:27:37.225699 sshd[6412]: Accepted publickey for core from 10.200.16.10 port 41628 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:27:37.226654 sshd-session[6412]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:27:37.230477 systemd-logind[1855]: New session 17 of user core. 
Sep 12 17:27:37.235623 systemd[1]: Started session-17.scope - Session 17 of User core. 
Sep 12 17:27:37.591140 sshd[6415]: Connection closed by 10.200.16.10 port 41628 
Sep 12 17:27:37.591492 sshd-session[6412]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:27:37.594635 systemd[1]: sshd@14-10.200.20.21:22-10.200.16.10:41628.service: Deactivated successfully. 
Sep 12 17:27:37.596471 systemd[1]: session-17.scope: Deactivated successfully. 
Sep 12 17:27:37.597234 systemd-logind[1855]: Session 17 logged out. Waiting for processes to exit. 
Sep 12 17:27:37.599127 systemd-logind[1855]: Removed session 17. 
Sep 12 17:27:37.676412 systemd[1]: Started sshd@15-10.200.20.21:22-10.200.16.10:41634.service - OpenSSH per-connection server daemon (10.200.16.10:41634). 
Sep 12 17:27:38.138493 sshd[6427]: Accepted publickey for core from 10.200.16.10 port 41634 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:27:38.139833 sshd-session[6427]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:27:38.144567 systemd-logind[1855]: New session 18 of user core. 
Sep 12 17:27:38.154346 systemd[1]: Started session-18.scope - Session 18 of User core. 
Sep 12 17:27:38.649545 sshd[6430]: Connection closed by 10.200.16.10 port 41634 
Sep 12 17:27:38.649973 sshd-session[6427]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:27:38.652833 systemd[1]: sshd@15-10.200.20.21:22-10.200.16.10:41634.service: Deactivated successfully. 
Sep 12 17:27:38.655416 systemd[1]: session-18.scope: Deactivated successfully. 
Sep 12 17:27:38.656339 systemd-logind[1855]: Session 18 logged out. Waiting for processes to exit. 
Sep 12 17:27:38.658161 systemd-logind[1855]: Removed session 18. 
Sep 12 17:27:38.730554 systemd[1]: Started sshd@16-10.200.20.21:22-10.200.16.10:41644.service - OpenSSH per-connection server daemon (10.200.16.10:41644). 
Sep 12 17:27:39.178649 sshd[6440]: Accepted publickey for core from 10.200.16.10 port 41644 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:27:39.179684 sshd-session[6440]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:27:39.183046 systemd-logind[1855]: New session 19 of user core. 
Sep 12 17:27:39.189613 systemd[1]: Started session-19.scope - Session 19 of User core. 
Sep 12 17:27:40.733224 sshd[6443]: Connection closed by 10.200.16.10 port 41644 
Sep 12 17:27:40.734127 sshd-session[6440]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:27:40.737632 systemd[1]: sshd@16-10.200.20.21:22-10.200.16.10:41644.service: Deactivated successfully. 
Sep 12 17:27:40.739077 systemd[1]: session-19.scope: Deactivated successfully. 
Sep 12 17:27:40.739215 systemd[1]: session-19.scope: Consumed 305ms CPU time, 78.8M memory peak. 
Sep 12 17:27:40.740408 systemd-logind[1855]: Session 19 logged out. Waiting for processes to exit. 
Sep 12 17:27:40.741233 systemd-logind[1855]: Removed session 19. 
Sep 12 17:27:40.827785 systemd[1]: Started sshd@17-10.200.20.21:22-10.200.16.10:43452.service - OpenSSH per-connection server daemon (10.200.16.10:43452). 
Sep 12 17:27:41.322023 sshd[6460]: Accepted publickey for core from 10.200.16.10 port 43452 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:27:41.323141 sshd-session[6460]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:27:41.326454 systemd-logind[1855]: New session 20 of user core. 
Sep 12 17:27:41.331643 systemd[1]: Started session-20.scope - Session 20 of User core. 
Sep 12 17:27:41.838806 sshd[6465]: Connection closed by 10.200.16.10 port 43452 
Sep 12 17:27:41.839119 sshd-session[6460]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:27:41.843859 systemd[1]: sshd@17-10.200.20.21:22-10.200.16.10:43452.service: Deactivated successfully. 
Sep 12 17:27:41.847874 systemd[1]: session-20.scope: Deactivated successfully. 
Sep 12 17:27:41.849139 systemd-logind[1855]: Session 20 logged out. Waiting for processes to exit. 
Sep 12 17:27:41.851307 systemd-logind[1855]: Removed session 20. 
Sep 12 17:27:41.916920 systemd[1]: Started sshd@18-10.200.20.21:22-10.200.16.10:43464.service - OpenSSH per-connection server daemon (10.200.16.10:43464). 
Sep 12 17:27:42.390405 sshd[6475]: Accepted publickey for core from 10.200.16.10 port 43464 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:27:42.392539 sshd-session[6475]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:27:42.396507 systemd-logind[1855]: New session 21 of user core. 
Sep 12 17:27:42.402613 systemd[1]: Started session-21.scope - Session 21 of User core. 
Sep 12 17:27:42.796113 sshd[6478]: Connection closed by 10.200.16.10 port 43464 
Sep 12 17:27:42.795230 sshd-session[6475]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:27:42.799027 systemd-logind[1855]: Session 21 logged out. Waiting for processes to exit. 
Sep 12 17:27:42.799557 systemd[1]: sshd@18-10.200.20.21:22-10.200.16.10:43464.service: Deactivated successfully. 
Sep 12 17:27:42.802203 systemd[1]: session-21.scope: Deactivated successfully. 
Sep 12 17:27:42.804054 systemd-logind[1855]: Removed session 21. 
Sep 12 17:27:43.035730 containerd[1876]: time="2025-09-12T17:27:43.035697723Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a30465121f8e78ece5c0e0ef66b49e47a3230d1b34402535e01586d06c952c8a\" id:\"56f00d764f4e09562e977ebedbe2651c36f584388464d48fbf254b82b963c9ad\" pid:6498 exited_at:{seconds:1757698063 nanos:35432356}" 
Sep 12 17:27:45.790608 containerd[1876]: time="2025-09-12T17:27:45.790493895Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f\" id:\"38ffd952fa42ed25f123ebfab1ff5d0dfdc3cc9326777d6b98854608c9f35d8f\" pid:6526 exited_at:{seconds:1757698065 nanos:789805467}" 
Sep 12 17:27:47.873135 systemd[1]: Started sshd@19-10.200.20.21:22-10.200.16.10:43468.service - OpenSSH per-connection server daemon (10.200.16.10:43468). 
Sep 12 17:27:48.294480 sshd[6537]: Accepted publickey for core from 10.200.16.10 port 43468 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:27:48.295638 sshd-session[6537]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:27:48.299275 systemd-logind[1855]: New session 22 of user core. 
Sep 12 17:27:48.306623 systemd[1]: Started session-22.scope - Session 22 of User core. 
Sep 12 17:27:48.656056 sshd[6540]: Connection closed by 10.200.16.10 port 43468 
Sep 12 17:27:48.656704 sshd-session[6537]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:27:48.659890 systemd-logind[1855]: Session 22 logged out. Waiting for processes to exit. 
Sep 12 17:27:48.660093 systemd[1]: sshd@19-10.200.20.21:22-10.200.16.10:43468.service: Deactivated successfully. 
Sep 12 17:27:48.661597 systemd[1]: session-22.scope: Deactivated successfully. 
Sep 12 17:27:48.662824 systemd-logind[1855]: Removed session 22. 
Sep 12 17:27:51.046728 containerd[1876]: time="2025-09-12T17:27:51.046609148Z" level=info msg="TaskExit event in podsandbox handler container_id:\"08cd58fcf7f0208c0083fe528897deead74f4d35f139adfc9bb95f0ad38371a7\" id:\"feb437d3c3f0eb1c49f46830e56212dc587b918ff51afae7c35e150864f7d6c9\" pid:6562 exited_at:{seconds:1757698071 nanos:46244163}" 
Sep 12 17:27:53.755690 systemd[1]: Started sshd@20-10.200.20.21:22-10.200.16.10:48338.service - OpenSSH per-connection server daemon (10.200.16.10:48338). 
Sep 12 17:27:54.246678 sshd[6573]: Accepted publickey for core from 10.200.16.10 port 48338 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:27:54.247695 sshd-session[6573]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:27:54.252455 systemd-logind[1855]: New session 23 of user core. 
Sep 12 17:27:54.255736 systemd[1]: Started session-23.scope - Session 23 of User core. 
Sep 12 17:27:54.637551 sshd[6576]: Connection closed by 10.200.16.10 port 48338 
Sep 12 17:27:54.637527 sshd-session[6573]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:27:54.639909 systemd[1]: sshd@20-10.200.20.21:22-10.200.16.10:48338.service: Deactivated successfully. 
Sep 12 17:27:54.641655 systemd[1]: session-23.scope: Deactivated successfully. 
Sep 12 17:27:54.642772 systemd-logind[1855]: Session 23 logged out. Waiting for processes to exit. 
Sep 12 17:27:54.644382 systemd-logind[1855]: Removed session 23. 
Sep 12 17:27:59.717148 systemd[1]: Started sshd@21-10.200.20.21:22-10.200.16.10:48350.service - OpenSSH per-connection server daemon (10.200.16.10:48350). 
Sep 12 17:28:00.168351 sshd[6589]: Accepted publickey for core from 10.200.16.10 port 48350 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:28:00.169363 sshd-session[6589]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:28:00.172815 systemd-logind[1855]: New session 24 of user core. 
Sep 12 17:28:00.186637 systemd[1]: Started session-24.scope - Session 24 of User core. 
Sep 12 17:28:00.532069 sshd[6592]: Connection closed by 10.200.16.10 port 48350 
Sep 12 17:28:00.532672 sshd-session[6589]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:28:00.535892 systemd[1]: sshd@21-10.200.20.21:22-10.200.16.10:48350.service: Deactivated successfully. 
Sep 12 17:28:00.537400 systemd[1]: session-24.scope: Deactivated successfully. 
Sep 12 17:28:00.538008 systemd-logind[1855]: Session 24 logged out. Waiting for processes to exit. 
Sep 12 17:28:00.539897 systemd-logind[1855]: Removed session 24. 
Sep 12 17:28:05.619289 systemd[1]: Started sshd@22-10.200.20.21:22-10.200.16.10:42108.service - OpenSSH per-connection server daemon (10.200.16.10:42108). 
Sep 12 17:28:06.077244 sshd[6624]: Accepted publickey for core from 10.200.16.10 port 42108 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:28:06.078545 sshd-session[6624]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:28:06.081955 systemd-logind[1855]: New session 25 of user core. 
Sep 12 17:28:06.090709 systemd[1]: Started session-25.scope - Session 25 of User core. 
Sep 12 17:28:06.444883 sshd[6627]: Connection closed by 10.200.16.10 port 42108 
Sep 12 17:28:06.444355 sshd-session[6624]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:28:06.446774 systemd-logind[1855]: Session 25 logged out. Waiting for processes to exit. 
Sep 12 17:28:06.447977 systemd[1]: sshd@22-10.200.20.21:22-10.200.16.10:42108.service: Deactivated successfully. 
Sep 12 17:28:06.451126 systemd[1]: session-25.scope: Deactivated successfully. 
Sep 12 17:28:06.453162 systemd-logind[1855]: Removed session 25. 
Sep 12 17:28:11.525703 systemd[1]: Started sshd@23-10.200.20.21:22-10.200.16.10:57510.service - OpenSSH per-connection server daemon (10.200.16.10:57510). 
Sep 12 17:28:11.979264 sshd[6639]: Accepted publickey for core from 10.200.16.10 port 57510 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E 
Sep 12 17:28:11.980964 sshd-session[6639]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Sep 12 17:28:11.985652 systemd-logind[1855]: New session 26 of user core. 
Sep 12 17:28:11.992631 systemd[1]: Started session-26.scope - Session 26 of User core. 
Sep 12 17:28:12.375846 sshd[6642]: Connection closed by 10.200.16.10 port 57510 
Sep 12 17:28:12.376330 sshd-session[6639]: pam_unix(sshd:session): session closed for user core 
Sep 12 17:28:12.380174 systemd-logind[1855]: Session 26 logged out. Waiting for processes to exit. 
Sep 12 17:28:12.380371 systemd[1]: sshd@23-10.200.20.21:22-10.200.16.10:57510.service: Deactivated successfully. 
Sep 12 17:28:12.382300 systemd[1]: session-26.scope: Deactivated successfully. 
Sep 12 17:28:12.384199 systemd-logind[1855]: Removed session 26. 
Sep 12 17:28:13.033250 containerd[1876]: time="2025-09-12T17:28:13.033189814Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a30465121f8e78ece5c0e0ef66b49e47a3230d1b34402535e01586d06c952c8a\" id:\"d461d05b0dd24409c4be22afe29e2c8be47f95e55cd67ce70bada4ddf9c52baf\" pid:6665 exited_at:{seconds:1757698093 nanos:32817152}" 
Sep 12 17:28:15.776311 containerd[1876]: time="2025-09-12T17:28:15.776267987Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b7e12d9a2c03246faf0f73fe3919aa0813e365a6656d9c248a6b9ec3053846f\" id:\"9848fbf12d1bf5efe5fc3927b5e762e156bea1cc95d48981bff7be6631f6a58e\" pid:6688 exited_at:{seconds:1757698095 nanos:776102664}"