Sep 3 23:25:07.071409 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
Sep 3 23:25:07.071427 kernel: Linux version 6.12.44-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Wed Sep 3 22:04:24 -00 2025
Sep 3 23:25:07.071433 kernel: KASLR enabled
Sep 3 23:25:07.071437 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Sep 3 23:25:07.071442 kernel: printk: legacy bootconsole [pl11] enabled
Sep 3 23:25:07.071446 kernel: efi: EFI v2.7 by EDK II
Sep 3 23:25:07.071451 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3f20e018 RNG=0x3fd5f998 MEMRESERVE=0x3e477598
Sep 3 23:25:07.071454 kernel: random: crng init done
Sep 3 23:25:07.071458 kernel: secureboot: Secure boot disabled
Sep 3 23:25:07.071462 kernel: ACPI: Early table checksum verification disabled
Sep 3 23:25:07.071466 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL)
Sep 3 23:25:07.071470 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:07.071474 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:07.071478 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Sep 3 23:25:07.071483 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:07.071488 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:07.071492 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:07.071497 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:07.071501 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:07.071505 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:07.071509 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Sep 3 23:25:07.071513 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 3 23:25:07.071518 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Sep 3 23:25:07.071522 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 3 23:25:07.071526 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Sep 3 23:25:07.071530 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
Sep 3 23:25:07.071534 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
Sep 3 23:25:07.071539 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Sep 3 23:25:07.071543 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Sep 3 23:25:07.071548 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Sep 3 23:25:07.071552 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Sep 3 23:25:07.071556 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Sep 3 23:25:07.071560 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Sep 3 23:25:07.071564 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Sep 3 23:25:07.071569 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Sep 3 23:25:07.071573 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Sep 3 23:25:07.071577 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
Sep 3 23:25:07.071581 kernel: NODE_DATA(0) allocated [mem 0x1bf7fca00-0x1bf803fff]
Sep 3 23:25:07.071585 kernel: Zone ranges:
Sep 3 23:25:07.071589 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Sep 3 23:25:07.071596 kernel: DMA32 empty
Sep 3 23:25:07.071600 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Sep 3 23:25:07.071605 kernel: Device empty
Sep 3 23:25:07.071609 kernel: Movable zone start for each node
Sep 3 23:25:07.071614 kernel: Early memory node ranges
Sep 3 23:25:07.071619 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Sep 3 23:25:07.071623 kernel: node 0: [mem 0x0000000000824000-0x000000003e45ffff]
Sep 3 23:25:07.071627 kernel: node 0: [mem 0x000000003e460000-0x000000003e46ffff]
Sep 3 23:25:07.071632 kernel: node 0: [mem 0x000000003e470000-0x000000003e54ffff]
Sep 3 23:25:07.071636 kernel: node 0: [mem 0x000000003e550000-0x000000003e87ffff]
Sep 3 23:25:07.071640 kernel: node 0: [mem 0x000000003e880000-0x000000003fc7ffff]
Sep 3 23:25:07.071644 kernel: node 0: [mem 0x000000003fc80000-0x000000003fcfffff]
Sep 3 23:25:07.071649 kernel: node 0: [mem 0x000000003fd00000-0x000000003fffffff]
Sep 3 23:25:07.071653 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Sep 3 23:25:07.071657 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Sep 3 23:25:07.071662 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Sep 3 23:25:07.071666 kernel: cma: Reserved 16 MiB at 0x000000003d400000 on node -1
Sep 3 23:25:07.071671 kernel: psci: probing for conduit method from ACPI.
Sep 3 23:25:07.071675 kernel: psci: PSCIv1.1 detected in firmware.
Sep 3 23:25:07.071680 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 3 23:25:07.071684 kernel: psci: MIGRATE_INFO_TYPE not supported.
Sep 3 23:25:07.071688 kernel: psci: SMC Calling Convention v1.4
Sep 3 23:25:07.071693 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Sep 3 23:25:07.071697 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Sep 3 23:25:07.071701 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 3 23:25:07.071705 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 3 23:25:07.071710 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 3 23:25:07.071714 kernel: Detected PIPT I-cache on CPU0
Sep 3 23:25:07.071719 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
Sep 3 23:25:07.071724 kernel: CPU features: detected: GIC system register CPU interface
Sep 3 23:25:07.071728 kernel: CPU features: detected: Spectre-v4
Sep 3 23:25:07.071733 kernel: CPU features: detected: Spectre-BHB
Sep 3 23:25:07.071737 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 3 23:25:07.071741 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 3 23:25:07.071746 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
Sep 3 23:25:07.071750 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 3 23:25:07.071754 kernel: alternatives: applying boot alternatives
Sep 3 23:25:07.071759 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=cb633bb0c889435b58a5c40c9c9bc9d5899ece5018569c9fa08f911265d3f18e
Sep 3 23:25:07.071764 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 3 23:25:07.071769 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 3 23:25:07.071774 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 3 23:25:07.071778 kernel: Fallback order for Node 0: 0
Sep 3 23:25:07.071782 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
Sep 3 23:25:07.071787 kernel: Policy zone: Normal
Sep 3 23:25:07.071791 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 3 23:25:07.071795 kernel: software IO TLB: area num 2.
Sep 3 23:25:07.071800 kernel: software IO TLB: mapped [mem 0x0000000036280000-0x000000003a280000] (64MB)
Sep 3 23:25:07.071804 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 3 23:25:07.071808 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 3 23:25:07.071813 kernel: rcu: RCU event tracing is enabled.
Sep 3 23:25:07.071819 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 3 23:25:07.071823 kernel: Trampoline variant of Tasks RCU enabled.
Sep 3 23:25:07.071828 kernel: Tracing variant of Tasks RCU enabled.
Sep 3 23:25:07.071832 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 3 23:25:07.071836 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 3 23:25:07.071841 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 3 23:25:07.071845 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 3 23:25:07.071849 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 3 23:25:07.071854 kernel: GICv3: 960 SPIs implemented
Sep 3 23:25:07.071858 kernel: GICv3: 0 Extended SPIs implemented
Sep 3 23:25:07.071862 kernel: Root IRQ handler: gic_handle_irq
Sep 3 23:25:07.071866 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Sep 3 23:25:07.071872 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
Sep 3 23:25:07.071887 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Sep 3 23:25:07.071892 kernel: ITS: No ITS available, not enabling LPIs
Sep 3 23:25:07.071896 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 3 23:25:07.071901 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
Sep 3 23:25:07.071905 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 3 23:25:07.071910 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
Sep 3 23:25:07.071914 kernel: Console: colour dummy device 80x25
Sep 3 23:25:07.071919 kernel: printk: legacy console [tty1] enabled
Sep 3 23:25:07.071923 kernel: ACPI: Core revision 20240827
Sep 3 23:25:07.071928 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
Sep 3 23:25:07.071934 kernel: pid_max: default: 32768 minimum: 301
Sep 3 23:25:07.071938 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 3 23:25:07.071943 kernel: landlock: Up and running.
Sep 3 23:25:07.071947 kernel: SELinux: Initializing.
Sep 3 23:25:07.071952 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 3 23:25:07.071959 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 3 23:25:07.071965 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0x1a0000e, misc 0x31e1
Sep 3 23:25:07.071970 kernel: Hyper-V: Host Build 10.0.26100.1261-1-0
Sep 3 23:25:07.071974 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Sep 3 23:25:07.071979 kernel: rcu: Hierarchical SRCU implementation.
Sep 3 23:25:07.071984 kernel: rcu: Max phase no-delay instances is 400.
Sep 3 23:25:07.071989 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 3 23:25:07.071994 kernel: Remapping and enabling EFI services.
Sep 3 23:25:07.071999 kernel: smp: Bringing up secondary CPUs ...
Sep 3 23:25:07.072003 kernel: Detected PIPT I-cache on CPU1
Sep 3 23:25:07.072008 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Sep 3 23:25:07.072014 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
Sep 3 23:25:07.072018 kernel: smp: Brought up 1 node, 2 CPUs
Sep 3 23:25:07.072023 kernel: SMP: Total of 2 processors activated.
Sep 3 23:25:07.072028 kernel: CPU: All CPU(s) started at EL1
Sep 3 23:25:07.072032 kernel: CPU features: detected: 32-bit EL0 Support
Sep 3 23:25:07.072037 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Sep 3 23:25:07.072042 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 3 23:25:07.072047 kernel: CPU features: detected: Common not Private translations
Sep 3 23:25:07.072051 kernel: CPU features: detected: CRC32 instructions
Sep 3 23:25:07.072057 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
Sep 3 23:25:07.072061 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 3 23:25:07.072066 kernel: CPU features: detected: LSE atomic instructions
Sep 3 23:25:07.072071 kernel: CPU features: detected: Privileged Access Never
Sep 3 23:25:07.072075 kernel: CPU features: detected: Speculation barrier (SB)
Sep 3 23:25:07.072080 kernel: CPU features: detected: TLB range maintenance instructions
Sep 3 23:25:07.072085 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 3 23:25:07.072090 kernel: CPU features: detected: Scalable Vector Extension
Sep 3 23:25:07.072094 kernel: alternatives: applying system-wide alternatives
Sep 3 23:25:07.072100 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Sep 3 23:25:07.072105 kernel: SVE: maximum available vector length 16 bytes per vector
Sep 3 23:25:07.072109 kernel: SVE: default vector length 16 bytes per vector
Sep 3 23:25:07.072114 kernel: Memory: 3959600K/4194160K available (11136K kernel code, 2436K rwdata, 9076K rodata, 38976K init, 1038K bss, 213372K reserved, 16384K cma-reserved)
Sep 3 23:25:07.072119 kernel: devtmpfs: initialized
Sep 3 23:25:07.072124 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 3 23:25:07.072128 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 3 23:25:07.072133 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 3 23:25:07.072138 kernel: 0 pages in range for non-PLT usage
Sep 3 23:25:07.072143 kernel: 508560 pages in range for PLT usage
Sep 3 23:25:07.072148 kernel: pinctrl core: initialized pinctrl subsystem
Sep 3 23:25:07.072152 kernel: SMBIOS 3.1.0 present.
Sep 3 23:25:07.072157 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024
Sep 3 23:25:07.072162 kernel: DMI: Memory slots populated: 2/2
Sep 3 23:25:07.072167 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 3 23:25:07.072171 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 3 23:25:07.072176 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 3 23:25:07.072181 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 3 23:25:07.072186 kernel: audit: initializing netlink subsys (disabled)
Sep 3 23:25:07.072191 kernel: audit: type=2000 audit(0.059:1): state=initialized audit_enabled=0 res=1
Sep 3 23:25:07.072196 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 3 23:25:07.072200 kernel: cpuidle: using governor menu
Sep 3 23:25:07.072205 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 3 23:25:07.072210 kernel: ASID allocator initialised with 32768 entries
Sep 3 23:25:07.072214 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 3 23:25:07.072219 kernel: Serial: AMBA PL011 UART driver
Sep 3 23:25:07.072224 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 3 23:25:07.072229 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 3 23:25:07.072234 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 3 23:25:07.072238 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 3 23:25:07.072243 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 3 23:25:07.072248 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 3 23:25:07.072252 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 3 23:25:07.072257 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 3 23:25:07.072262 kernel: ACPI: Added _OSI(Module Device)
Sep 3 23:25:07.072266 kernel: ACPI: Added _OSI(Processor Device)
Sep 3 23:25:07.072272 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 3 23:25:07.072277 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 3 23:25:07.072281 kernel: ACPI: Interpreter enabled
Sep 3 23:25:07.072286 kernel: ACPI: Using GIC for interrupt routing
Sep 3 23:25:07.072291 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Sep 3 23:25:07.072295 kernel: printk: legacy console [ttyAMA0] enabled
Sep 3 23:25:07.072300 kernel: printk: legacy bootconsole [pl11] disabled
Sep 3 23:25:07.072305 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Sep 3 23:25:07.072309 kernel: ACPI: CPU0 has been hot-added
Sep 3 23:25:07.072315 kernel: ACPI: CPU1 has been hot-added
Sep 3 23:25:07.072319 kernel: iommu: Default domain type: Translated
Sep 3 23:25:07.072324 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 3 23:25:07.072329 kernel: efivars: Registered efivars operations
Sep 3 23:25:07.072333 kernel: vgaarb: loaded
Sep 3 23:25:07.072338 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 3 23:25:07.072343 kernel: VFS: Disk quotas dquot_6.6.0
Sep 3 23:25:07.072348 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 3 23:25:07.072352 kernel: pnp: PnP ACPI init
Sep 3 23:25:07.072358 kernel: pnp: PnP ACPI: found 0 devices
Sep 3 23:25:07.072362 kernel: NET: Registered PF_INET protocol family
Sep 3 23:25:07.072367 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 3 23:25:07.072372 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 3 23:25:07.072376 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 3 23:25:07.072381 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 3 23:25:07.072386 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 3 23:25:07.072390 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 3 23:25:07.072395 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 3 23:25:07.072401 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 3 23:25:07.072405 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 3 23:25:07.072410 kernel: PCI: CLS 0 bytes, default 64
Sep 3 23:25:07.072415 kernel: kvm [1]: HYP mode not available
Sep 3 23:25:07.072419 kernel: Initialise system trusted keyrings
Sep 3 23:25:07.072424 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 3 23:25:07.072429 kernel: Key type asymmetric registered
Sep 3 23:25:07.072433 kernel: Asymmetric key parser 'x509' registered
Sep 3 23:25:07.072438 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Sep 3 23:25:07.072443 kernel: io scheduler mq-deadline registered
Sep 3 23:25:07.072448 kernel: io scheduler kyber registered
Sep 3 23:25:07.072453 kernel: io scheduler bfq registered
Sep 3 23:25:07.072457 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 3 23:25:07.072462 kernel: thunder_xcv, ver 1.0
Sep 3 23:25:07.072467 kernel: thunder_bgx, ver 1.0
Sep 3 23:25:07.072471 kernel: nicpf, ver 1.0
Sep 3 23:25:07.072476 kernel: nicvf, ver 1.0
Sep 3 23:25:07.072581 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 3 23:25:07.072632 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-03T23:25:06 UTC (1756941906)
Sep 3 23:25:07.072638 kernel: efifb: probing for efifb
Sep 3 23:25:07.072643 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Sep 3 23:25:07.072648 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Sep 3 23:25:07.072653 kernel: efifb: scrolling: redraw
Sep 3 23:25:07.072657 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Sep 3 23:25:07.072662 kernel: Console: switching to colour frame buffer device 128x48
Sep 3 23:25:07.072667 kernel: fb0: EFI VGA frame buffer device
Sep 3 23:25:07.072673 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Sep 3 23:25:07.072677 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 3 23:25:07.072682 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Sep 3 23:25:07.072687 kernel: watchdog: NMI not fully supported
Sep 3 23:25:07.072691 kernel: watchdog: Hard watchdog permanently disabled
Sep 3 23:25:07.072696 kernel: NET: Registered PF_INET6 protocol family
Sep 3 23:25:07.072701 kernel: Segment Routing with IPv6
Sep 3 23:25:07.072706 kernel: In-situ OAM (IOAM) with IPv6
Sep 3 23:25:07.072710 kernel: NET: Registered PF_PACKET protocol family
Sep 3 23:25:07.072716 kernel: Key type dns_resolver registered
Sep 3 23:25:07.072720 kernel: registered taskstats version 1
Sep 3 23:25:07.072725 kernel: Loading compiled-in X.509 certificates
Sep 3 23:25:07.072730 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.44-flatcar: 08fc774dab168e64ce30c382a4517d40e72c4744'
Sep 3 23:25:07.072735 kernel: Demotion targets for Node 0: null
Sep 3 23:25:07.072739 kernel: Key type .fscrypt registered
Sep 3 23:25:07.072744 kernel: Key type fscrypt-provisioning registered
Sep 3 23:25:07.072749 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 3 23:25:07.072753 kernel: ima: Allocated hash algorithm: sha1
Sep 3 23:25:07.072759 kernel: ima: No architecture policies found
Sep 3 23:25:07.072763 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 3 23:25:07.072768 kernel: clk: Disabling unused clocks
Sep 3 23:25:07.072773 kernel: PM: genpd: Disabling unused power domains
Sep 3 23:25:07.072777 kernel: Warning: unable to open an initial console.
Sep 3 23:25:07.072782 kernel: Freeing unused kernel memory: 38976K
Sep 3 23:25:07.072787 kernel: Run /init as init process
Sep 3 23:25:07.072792 kernel: with arguments:
Sep 3 23:25:07.072796 kernel: /init
Sep 3 23:25:07.072802 kernel: with environment:
Sep 3 23:25:07.072806 kernel: HOME=/
Sep 3 23:25:07.072811 kernel: TERM=linux
Sep 3 23:25:07.072815 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 3 23:25:07.072821 systemd[1]: Successfully made /usr/ read-only.
Sep 3 23:25:07.072828 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 3 23:25:07.072833 systemd[1]: Detected virtualization microsoft.
Sep 3 23:25:07.072839 systemd[1]: Detected architecture arm64.
Sep 3 23:25:07.072844 systemd[1]: Running in initrd.
Sep 3 23:25:07.072849 systemd[1]: No hostname configured, using default hostname.
Sep 3 23:25:07.072854 systemd[1]: Hostname set to .
Sep 3 23:25:07.072859 systemd[1]: Initializing machine ID from random generator.
Sep 3 23:25:07.072864 systemd[1]: Queued start job for default target initrd.target.
Sep 3 23:25:07.072869 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 3 23:25:07.072882 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 3 23:25:07.072888 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 3 23:25:07.072894 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 3 23:25:07.072899 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 3 23:25:07.072905 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 3 23:25:07.072911 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 3 23:25:07.072916 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 3 23:25:07.072921 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 3 23:25:07.072927 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 3 23:25:07.072932 systemd[1]: Reached target paths.target - Path Units.
Sep 3 23:25:07.072937 systemd[1]: Reached target slices.target - Slice Units.
Sep 3 23:25:07.072942 systemd[1]: Reached target swap.target - Swaps.
Sep 3 23:25:07.072947 systemd[1]: Reached target timers.target - Timer Units.
Sep 3 23:25:07.072952 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 3 23:25:07.072958 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 3 23:25:07.072963 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 3 23:25:07.072968 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 3 23:25:07.072974 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 3 23:25:07.072979 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 3 23:25:07.072984 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 3 23:25:07.072989 systemd[1]: Reached target sockets.target - Socket Units.
Sep 3 23:25:07.072994 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 3 23:25:07.072999 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 3 23:25:07.073004 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 3 23:25:07.073009 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 3 23:25:07.073016 systemd[1]: Starting systemd-fsck-usr.service...
Sep 3 23:25:07.073021 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 3 23:25:07.073026 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 3 23:25:07.073041 systemd-journald[224]: Collecting audit messages is disabled.
Sep 3 23:25:07.073056 systemd-journald[224]: Journal started
Sep 3 23:25:07.073070 systemd-journald[224]: Runtime Journal (/run/log/journal/73b98167a33e46488a9d4f428cf2687d) is 8M, max 78.5M, 70.5M free.
Sep 3 23:25:07.076913 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 3 23:25:07.081617 systemd-modules-load[225]: Inserted module 'overlay'
Sep 3 23:25:07.099890 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 3 23:25:07.099927 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 3 23:25:07.106003 kernel: Bridge firewalling registered
Sep 3 23:25:07.106089 systemd-modules-load[225]: Inserted module 'br_netfilter'
Sep 3 23:25:07.111086 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 3 23:25:07.124518 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 3 23:25:07.129271 systemd[1]: Finished systemd-fsck-usr.service.
Sep 3 23:25:07.136467 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 3 23:25:07.144020 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 3 23:25:07.154308 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 3 23:25:07.178614 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 3 23:25:07.189241 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 3 23:25:07.200693 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 3 23:25:07.209317 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 3 23:25:07.224954 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 3 23:25:07.229246 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 3 23:25:07.238224 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 3 23:25:07.238553 systemd-tmpfiles[252]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 3 23:25:07.253728 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 3 23:25:07.267354 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 3 23:25:07.283816 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 3 23:25:07.290746 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 3 23:25:07.306031 dracut-cmdline[260]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=cb633bb0c889435b58a5c40c9c9bc9d5899ece5018569c9fa08f911265d3f18e
Sep 3 23:25:07.350501 systemd-resolved[272]: Positive Trust Anchors:
Sep 3 23:25:07.350515 systemd-resolved[272]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 3 23:25:07.350535 systemd-resolved[272]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 3 23:25:07.352732 systemd-resolved[272]: Defaulting to hostname 'linux'.
Sep 3 23:25:07.354077 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 3 23:25:07.360304 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 3 23:25:07.436896 kernel: SCSI subsystem initialized
Sep 3 23:25:07.441886 kernel: Loading iSCSI transport class v2.0-870.
Sep 3 23:25:07.448908 kernel: iscsi: registered transport (tcp)
Sep 3 23:25:07.461829 kernel: iscsi: registered transport (qla4xxx)
Sep 3 23:25:07.461870 kernel: QLogic iSCSI HBA Driver
Sep 3 23:25:07.474537 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 3 23:25:07.494289 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 3 23:25:07.500332 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 3 23:25:07.544196 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 3 23:25:07.549936 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 3 23:25:07.619893 kernel: raid6: neonx8 gen() 18553 MB/s
Sep 3 23:25:07.638880 kernel: raid6: neonx4 gen() 18568 MB/s
Sep 3 23:25:07.657880 kernel: raid6: neonx2 gen() 17078 MB/s
Sep 3 23:25:07.677881 kernel: raid6: neonx1 gen() 15013 MB/s
Sep 3 23:25:07.696879 kernel: raid6: int64x8 gen() 10519 MB/s
Sep 3 23:25:07.715887 kernel: raid6: int64x4 gen() 10606 MB/s
Sep 3 23:25:07.735882 kernel: raid6: int64x2 gen() 8973 MB/s
Sep 3 23:25:07.757183 kernel: raid6: int64x1 gen() 7015 MB/s
Sep 3 23:25:07.757190 kernel: raid6: using algorithm neonx4 gen() 18568 MB/s
Sep 3 23:25:07.779164 kernel: raid6: .... xor() 15131 MB/s, rmw enabled
Sep 3 23:25:07.779170 kernel: raid6: using neon recovery algorithm
Sep 3 23:25:07.787235 kernel: xor: measuring software checksum speed
Sep 3 23:25:07.787243 kernel: 8regs : 28605 MB/sec
Sep 3 23:25:07.790439 kernel: 32regs : 28769 MB/sec
Sep 3 23:25:07.792968 kernel: arm64_neon : 37504 MB/sec
Sep 3 23:25:07.795764 kernel: xor: using function: arm64_neon (37504 MB/sec)
Sep 3 23:25:07.833900 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 3 23:25:07.839274 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 3 23:25:07.848779 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 3 23:25:07.875556 systemd-udevd[473]: Using default interface naming scheme 'v255'.
Sep 3 23:25:07.882616 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 3 23:25:07.895048 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 3 23:25:07.920227 dracut-pre-trigger[482]: rd.md=0: removing MD RAID activation
Sep 3 23:25:07.939391 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 3 23:25:07.944967 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 3 23:25:07.997561 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 3 23:25:08.005672 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 3 23:25:08.070909 kernel: hv_vmbus: Vmbus version:5.3
Sep 3 23:25:08.082029 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 3 23:25:08.090305 kernel: hv_vmbus: registering driver hid_hyperv
Sep 3 23:25:08.090329 kernel: hv_vmbus: registering driver hyperv_keyboard
Sep 3 23:25:08.085765 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 3 23:25:08.114491 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input0
Sep 3 23:25:08.114545 kernel: pps_core: LinuxPPS API ver. 1 registered
Sep 3 23:25:08.114554 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
Sep 3 23:25:08.114958 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 3 23:25:08.128195 kernel: hv_vmbus: registering driver hv_netvsc
Sep 3 23:25:08.128212 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input1
Sep 3 23:25:08.141708 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Sep 3 23:25:08.144430 kernel: hv_vmbus: registering driver hv_storvsc
Sep 3 23:25:08.145362 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 3 23:25:08.152622 kernel: scsi host0: storvsc_host_t
Sep 3 23:25:08.159607 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 3 23:25:08.169429 kernel: scsi host1: storvsc_host_t
Sep 3 23:25:08.169581 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5
Sep 3 23:25:08.176758 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5
Sep 3 23:25:08.184926 kernel: PTP clock support registered
Sep 3 23:25:08.200812 kernel: hv_utils: Registering HyperV Utility Driver
Sep 3 23:25:08.200844 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Sep 3 23:25:08.200993 kernel: hv_vmbus: registering driver hv_utils
Sep 3 23:25:08.201001 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Sep 3 23:25:08.198696 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 3 23:25:08.067802 kernel: hv_utils: Heartbeat IC version 3.0
Sep 3 23:25:08.075914 kernel: hv_utils: Shutdown IC version 3.2
Sep 3 23:25:08.075929 kernel: hv_utils: TimeSync IC version 4.0
Sep 3 23:25:08.075935 kernel: hv_netvsc 000d3afb-7590-000d-3afb-7590000d3afb eth0: VF slot 1 added
Sep 3 23:25:08.076045 kernel: sd 0:0:0:0: [sda] Write Protect is off
Sep 3 23:25:08.076120 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Sep 3 23:25:08.076183 systemd-journald[224]: Time jumped backwards, rotating.
Sep 3 23:25:08.076210 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Sep 3 23:25:08.067858 systemd-resolved[272]: Clock change detected. Flushing caches.
Sep 3 23:25:08.087626 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#198 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 3 23:25:08.095521 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#205 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 3 23:25:08.111810 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 3 23:25:08.111839 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Sep 3 23:25:08.121464 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Sep 3 23:25:08.121632 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 3 23:25:08.121641 kernel: hv_vmbus: registering driver hv_pci
Sep 3 23:25:08.123964 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Sep 3 23:25:08.130721 kernel: hv_pci 95d0e115-a21a-46b4-be6d-06f40d0f3f75: PCI VMBus probing: Using version 0x10004
Sep 3 23:25:08.141350 kernel: hv_pci 95d0e115-a21a-46b4-be6d-06f40d0f3f75: PCI host bridge to bus a21a:00
Sep 3 23:25:08.141485 kernel: pci_bus a21a:00: root bus resource [mem 0xfc0000000-0xfc00fffff window]
Sep 3 23:25:08.141588 kernel: pci_bus a21a:00: No busn resource found for root bus, will use [bus 00-ff]
Sep 3 23:25:08.151534 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#239 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Sep 3 23:25:08.153587 kernel: pci a21a:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint
Sep 3 23:25:08.163774 kernel: pci a21a:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]
Sep 3 23:25:08.168532 kernel: pci a21a:00:02.0: enabling Extended Tags
Sep 3 23:25:08.184521 kernel: pci a21a:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at a21a:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link)
Sep 3 23:25:08.184570 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#214 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Sep 3 23:25:08.197594 kernel: pci_bus a21a:00: busn_res: [bus 00-ff] end is updated to 00
Sep 3 23:25:08.197740 kernel: pci a21a:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned
Sep 3 23:25:08.258516 kernel: mlx5_core a21a:00:02.0: enabling device (0000 -> 0002)
Sep 3 23:25:08.266034 kernel: mlx5_core a21a:00:02.0: PTM is not supported by PCIe
Sep 3 23:25:08.266171 kernel: mlx5_core a21a:00:02.0: firmware version: 16.30.5006
Sep 3 23:25:08.433715 kernel: hv_netvsc 000d3afb-7590-000d-3afb-7590000d3afb eth0: VF registering: eth1
Sep 3 23:25:08.433908 kernel: mlx5_core a21a:00:02.0 eth1: joined to eth0
Sep 3 23:25:08.438590 kernel: mlx5_core a21a:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic)
Sep 3 23:25:08.448533 kernel: mlx5_core a21a:00:02.0 enP41498s1: renamed from eth1
Sep 3 23:25:08.752482 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Sep 3 23:25:08.794659 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Sep 3 23:25:08.800155 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Sep 3 23:25:08.810392 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 3 23:25:08.845211 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Sep 3 23:25:08.856191 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Sep 3 23:25:08.990115 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 3 23:25:08.998676 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 3 23:25:09.008216 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 3 23:25:09.017667 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 3 23:25:09.026642 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 3 23:25:09.043209 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 3 23:25:09.845896 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#205 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 3 23:25:09.858551 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 3 23:25:09.859592 disk-uuid[640]: The operation has completed successfully.
Sep 3 23:25:09.932044 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 3 23:25:09.932133 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 3 23:25:09.958911 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 3 23:25:09.982802 sh[820]: Success
Sep 3 23:25:10.015772 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 3 23:25:10.015830 kernel: device-mapper: uevent: version 1.0.3
Sep 3 23:25:10.021514 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 3 23:25:10.030531 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Sep 3 23:25:10.381768 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 3 23:25:10.389402 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 3 23:25:10.404249 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 3 23:25:10.431503 kernel: BTRFS: device fsid e8b97e78-d30f-4a41-b431-d82f3afef949 devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (838)
Sep 3 23:25:10.432564 kernel: BTRFS info (device dm-0): first mount of filesystem e8b97e78-d30f-4a41-b431-d82f3afef949
Sep 3 23:25:10.432574 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 3 23:25:10.855767 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 3 23:25:10.855844 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 3 23:25:10.890702 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 3 23:25:10.894526 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 3 23:25:10.902212 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 3 23:25:10.902924 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 3 23:25:10.925097 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 3 23:25:10.953759 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (861)
Sep 3 23:25:10.962687 kernel: BTRFS info (device sda6): first mount of filesystem f1885725-917a-44ef-9d71-3c4c588cc4f4
Sep 3 23:25:10.962718 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 3 23:25:11.022233 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 3 23:25:11.032023 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 3 23:25:11.061462 systemd-networkd[1001]: lo: Link UP
Sep 3 23:25:11.061472 systemd-networkd[1001]: lo: Gained carrier
Sep 3 23:25:11.062196 systemd-networkd[1001]: Enumeration completed
Sep 3 23:25:11.064177 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 3 23:25:11.069049 systemd-networkd[1001]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 3 23:25:11.069052 systemd-networkd[1001]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 3 23:25:11.069475 systemd[1]: Reached target network.target - Network.
Sep 3 23:25:11.109209 kernel: BTRFS info (device sda6): turning on async discard
Sep 3 23:25:11.109238 kernel: BTRFS info (device sda6): enabling free space tree
Sep 3 23:25:11.117523 kernel: BTRFS info (device sda6): last unmount of filesystem f1885725-917a-44ef-9d71-3c4c588cc4f4
Sep 3 23:25:11.117576 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 3 23:25:11.126360 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 3 23:25:11.169533 kernel: mlx5_core a21a:00:02.0 enP41498s1: Link up
Sep 3 23:25:11.201536 kernel: hv_netvsc 000d3afb-7590-000d-3afb-7590000d3afb eth0: Data path switched to VF: enP41498s1
Sep 3 23:25:11.201673 systemd-networkd[1001]: enP41498s1: Link UP
Sep 3 23:25:11.201731 systemd-networkd[1001]: eth0: Link UP
Sep 3 23:25:11.201801 systemd-networkd[1001]: eth0: Gained carrier
Sep 3 23:25:11.201815 systemd-networkd[1001]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 3 23:25:11.218660 systemd-networkd[1001]: enP41498s1: Gained carrier
Sep 3 23:25:11.229547 systemd-networkd[1001]: eth0: DHCPv4 address 10.200.20.15/24, gateway 10.200.20.1 acquired from 168.63.129.16
Sep 3 23:25:12.178913 ignition[1009]: Ignition 2.21.0
Sep 3 23:25:12.178927 ignition[1009]: Stage: fetch-offline
Sep 3 23:25:12.182890 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 3 23:25:12.179015 ignition[1009]: no configs at "/usr/lib/ignition/base.d"
Sep 3 23:25:12.189900 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Sep 3 23:25:12.179025 ignition[1009]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 3 23:25:12.179102 ignition[1009]: parsed url from cmdline: ""
Sep 3 23:25:12.179104 ignition[1009]: no config URL provided
Sep 3 23:25:12.179107 ignition[1009]: reading system config file "/usr/lib/ignition/user.ign"
Sep 3 23:25:12.179112 ignition[1009]: no config at "/usr/lib/ignition/user.ign"
Sep 3 23:25:12.179115 ignition[1009]: failed to fetch config: resource requires networking
Sep 3 23:25:12.179247 ignition[1009]: Ignition finished successfully
Sep 3 23:25:12.218347 ignition[1017]: Ignition 2.21.0
Sep 3 23:25:12.218352 ignition[1017]: Stage: fetch
Sep 3 23:25:12.218534 ignition[1017]: no configs at "/usr/lib/ignition/base.d"
Sep 3 23:25:12.218542 ignition[1017]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 3 23:25:12.218613 ignition[1017]: parsed url from cmdline: ""
Sep 3 23:25:12.218615 ignition[1017]: no config URL provided
Sep 3 23:25:12.218619 ignition[1017]: reading system config file "/usr/lib/ignition/user.ign"
Sep 3 23:25:12.218624 ignition[1017]: no config at "/usr/lib/ignition/user.ign"
Sep 3 23:25:12.218655 ignition[1017]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Sep 3 23:25:12.309318 ignition[1017]: GET result: OK
Sep 3 23:25:12.309396 ignition[1017]: config has been read from IMDS userdata
Sep 3 23:25:12.309427 ignition[1017]: parsing config with SHA512: 7b2686895681bd989af9b33658c765d9630cba203e5cb91df8158b2a53ccf158e7b529985dd037e04b44119fcae2c028970d7689e73725a3728e84f936da1c31
Sep 3 23:25:12.312191 unknown[1017]: fetched base config from "system"
Sep 3 23:25:12.312504 ignition[1017]: fetch: fetch complete
Sep 3 23:25:12.312205 unknown[1017]: fetched base config from "system"
Sep 3 23:25:12.312526 ignition[1017]: fetch: fetch passed
Sep 3 23:25:12.312208 unknown[1017]: fetched user config from "azure"
Sep 3 23:25:12.312580 ignition[1017]: Ignition finished successfully
Sep 3 23:25:12.314555 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 3 23:25:12.321465 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 3 23:25:12.353338 ignition[1024]: Ignition 2.21.0
Sep 3 23:25:12.353353 ignition[1024]: Stage: kargs
Sep 3 23:25:12.353491 ignition[1024]: no configs at "/usr/lib/ignition/base.d"
Sep 3 23:25:12.353498 ignition[1024]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 3 23:25:12.358034 ignition[1024]: kargs: kargs passed
Sep 3 23:25:12.362668 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 3 23:25:12.358081 ignition[1024]: Ignition finished successfully
Sep 3 23:25:12.370665 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 3 23:25:12.396813 ignition[1030]: Ignition 2.21.0
Sep 3 23:25:12.398961 ignition[1030]: Stage: disks
Sep 3 23:25:12.399430 ignition[1030]: no configs at "/usr/lib/ignition/base.d"
Sep 3 23:25:12.401709 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 3 23:25:12.399441 ignition[1030]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 3 23:25:12.406560 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 3 23:25:12.400218 ignition[1030]: disks: disks passed
Sep 3 23:25:12.414042 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 3 23:25:12.400263 ignition[1030]: Ignition finished successfully
Sep 3 23:25:12.421942 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 3 23:25:12.429187 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 3 23:25:12.436833 systemd[1]: Reached target basic.target - Basic System.
Sep 3 23:25:12.443022 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 3 23:25:12.528709 systemd-fsck[1038]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks
Sep 3 23:25:12.537878 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 3 23:25:12.544446 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 3 23:25:12.995759 systemd-networkd[1001]: eth0: Gained IPv6LL
Sep 3 23:25:14.699526 kernel: EXT4-fs (sda9): mounted filesystem d953e3b7-a0cb-45f7-b3a7-216a9a578dda r/w with ordered data mode. Quota mode: none.
Sep 3 23:25:14.699753 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 3 23:25:14.703352 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 3 23:25:14.740470 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 3 23:25:14.758968 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 3 23:25:14.763608 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Sep 3 23:25:14.776100 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 3 23:25:14.797998 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1052)
Sep 3 23:25:14.776137 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 3 23:25:14.798622 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 3 23:25:14.818598 kernel: BTRFS info (device sda6): first mount of filesystem f1885725-917a-44ef-9d71-3c4c588cc4f4
Sep 3 23:25:14.818621 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 3 23:25:14.815044 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 3 23:25:14.829548 kernel: BTRFS info (device sda6): turning on async discard
Sep 3 23:25:14.829570 kernel: BTRFS info (device sda6): enabling free space tree
Sep 3 23:25:14.835133 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 3 23:25:15.623442 coreos-metadata[1054]: Sep 03 23:25:15.623 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Sep 3 23:25:15.632027 coreos-metadata[1054]: Sep 03 23:25:15.632 INFO Fetch successful
Sep 3 23:25:15.635828 coreos-metadata[1054]: Sep 03 23:25:15.635 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Sep 3 23:25:15.644834 coreos-metadata[1054]: Sep 03 23:25:15.644 INFO Fetch successful
Sep 3 23:25:15.659083 coreos-metadata[1054]: Sep 03 23:25:15.659 INFO wrote hostname ci-4372.1.0-n-71c6c07a75 to /sysroot/etc/hostname
Sep 3 23:25:15.666242 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 3 23:25:15.869068 initrd-setup-root[1082]: cut: /sysroot/etc/passwd: No such file or directory
Sep 3 23:25:15.923272 initrd-setup-root[1089]: cut: /sysroot/etc/group: No such file or directory
Sep 3 23:25:15.928460 initrd-setup-root[1096]: cut: /sysroot/etc/shadow: No such file or directory
Sep 3 23:25:15.952068 initrd-setup-root[1103]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 3 23:25:17.070085 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 3 23:25:17.076072 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 3 23:25:17.099179 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 3 23:25:17.109216 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 3 23:25:17.120523 kernel: BTRFS info (device sda6): last unmount of filesystem f1885725-917a-44ef-9d71-3c4c588cc4f4
Sep 3 23:25:17.138040 ignition[1171]: INFO : Ignition 2.21.0
Sep 3 23:25:17.138040 ignition[1171]: INFO : Stage: mount
Sep 3 23:25:17.145750 ignition[1171]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 3 23:25:17.145750 ignition[1171]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 3 23:25:17.145750 ignition[1171]: INFO : mount: mount passed
Sep 3 23:25:17.145750 ignition[1171]: INFO : Ignition finished successfully
Sep 3 23:25:17.147539 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 3 23:25:17.159081 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 3 23:25:17.167307 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 3 23:25:17.189418 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 3 23:25:17.212528 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1182)
Sep 3 23:25:17.212574 kernel: BTRFS info (device sda6): first mount of filesystem f1885725-917a-44ef-9d71-3c4c588cc4f4
Sep 3 23:25:17.222013 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 3 23:25:17.230931 kernel: BTRFS info (device sda6): turning on async discard
Sep 3 23:25:17.230972 kernel: BTRFS info (device sda6): enabling free space tree
Sep 3 23:25:17.232547 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 3 23:25:17.261599 ignition[1199]: INFO : Ignition 2.21.0
Sep 3 23:25:17.261599 ignition[1199]: INFO : Stage: files
Sep 3 23:25:17.268576 ignition[1199]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 3 23:25:17.268576 ignition[1199]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 3 23:25:17.268576 ignition[1199]: DEBUG : files: compiled without relabeling support, skipping
Sep 3 23:25:17.297373 ignition[1199]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 3 23:25:17.297373 ignition[1199]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 3 23:25:17.364425 ignition[1199]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 3 23:25:17.369725 ignition[1199]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 3 23:25:17.369725 ignition[1199]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 3 23:25:17.364805 unknown[1199]: wrote ssh authorized keys file for user: core
Sep 3 23:25:17.433722 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Sep 3 23:25:17.441906 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Sep 3 23:25:17.482901 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 3 23:25:17.933482 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Sep 3 23:25:17.933482 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 3 23:25:17.933482 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 3 23:25:17.933482 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 3 23:25:17.933482 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 3 23:25:17.933482 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 3 23:25:17.933482 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 3 23:25:17.933482 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 3 23:25:17.933482 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 3 23:25:17.995424 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 3 23:25:17.995424 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 3 23:25:17.995424 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 3 23:25:17.995424 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 3 23:25:17.995424 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 3 23:25:17.995424 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt #1
Sep 3 23:25:18.303379 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 3 23:25:18.551884 ignition[1199]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 3 23:25:18.551884 ignition[1199]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 3 23:25:18.604784 ignition[1199]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 3 23:25:18.618809 ignition[1199]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 3 23:25:18.626401 ignition[1199]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 3 23:25:18.626401 ignition[1199]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Sep 3 23:25:18.626401 ignition[1199]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Sep 3 23:25:18.626401 ignition[1199]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 3 23:25:18.626401 ignition[1199]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 3 23:25:18.626401 ignition[1199]: INFO : files: files passed
Sep 3 23:25:18.626401 ignition[1199]: INFO : Ignition finished successfully
Sep 3 23:25:18.620457 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 3 23:25:18.632129 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 3 23:25:18.653026 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 3 23:25:18.665768 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 3 23:25:18.665863 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 3 23:25:18.712534 initrd-setup-root-after-ignition[1228]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 3 23:25:18.712534 initrd-setup-root-after-ignition[1228]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 3 23:25:18.726165 initrd-setup-root-after-ignition[1232]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 3 23:25:18.720031 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 3 23:25:18.731110 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 3 23:25:18.742096 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 3 23:25:18.782026 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 3 23:25:18.782112 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 3 23:25:18.791237 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 3 23:25:18.800119 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 3 23:25:18.807503 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 3 23:25:18.808318 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 3 23:25:18.842018 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 3 23:25:18.848806 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 3 23:25:18.880967 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 3 23:25:18.885586 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 3 23:25:18.894540 systemd[1]: Stopped target timers.target - Timer Units.
Sep 3 23:25:18.902366 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 3 23:25:18.902474 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 3 23:25:18.913531 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 3 23:25:18.917442 systemd[1]: Stopped target basic.target - Basic System.
Sep 3 23:25:18.925066 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 3 23:25:18.932384 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 3 23:25:18.939916 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 3 23:25:18.947665 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 3 23:25:18.956383 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 3 23:25:18.963955 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 3 23:25:18.973346 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 3 23:25:18.980695 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 3 23:25:18.989351 systemd[1]: Stopped target swap.target - Swaps.
Sep 3 23:25:18.996483 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 3 23:25:18.996597 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 3 23:25:19.006698 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 3 23:25:19.010993 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 3 23:25:19.018939 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 3 23:25:19.022865 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 3 23:25:19.027877 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 3 23:25:19.027974 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 3 23:25:19.040740 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 3 23:25:19.040829 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 3 23:25:19.045733 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 3 23:25:19.045807 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 3 23:25:19.053609 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Sep 3 23:25:19.053676 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 3 23:25:19.123836 ignition[1252]: INFO : Ignition 2.21.0
Sep 3 23:25:19.123836 ignition[1252]: INFO : Stage: umount
Sep 3 23:25:19.123836 ignition[1252]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 3 23:25:19.123836 ignition[1252]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 3 23:25:19.123836 ignition[1252]: INFO : umount: umount passed
Sep 3 23:25:19.123836 ignition[1252]: INFO : Ignition finished successfully
Sep 3 23:25:19.064268 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 3 23:25:19.077071 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 3 23:25:19.077194 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 3 23:25:19.092309 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 3 23:25:19.105751 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 3 23:25:19.105903 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 3 23:25:19.116317 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 3 23:25:19.116401 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 3 23:25:19.134461 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 3 23:25:19.134566 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 3 23:25:19.140119 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 3 23:25:19.140558 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 3 23:25:19.140650 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 3 23:25:19.149191 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 3 23:25:19.149239 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 3 23:25:19.153186 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 3 23:25:19.153219 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 3 23:25:19.160936 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 3 23:25:19.160974 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 3 23:25:19.171149 systemd[1]: Stopped target network.target - Network.
Sep 3 23:25:19.177213 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 3 23:25:19.177282 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 3 23:25:19.185009 systemd[1]: Stopped target paths.target - Path Units.
Sep 3 23:25:19.191942 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 3 23:25:19.200538 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 3 23:25:19.209059 systemd[1]: Stopped target slices.target - Slice Units.
Sep 3 23:25:19.216797 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 3 23:25:19.224416 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 3 23:25:19.224465 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 3 23:25:19.232015 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 3 23:25:19.232038 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 3 23:25:19.240084 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 3 23:25:19.240134 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 3 23:25:19.248275 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 3 23:25:19.248304 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 3 23:25:19.255989 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 3 23:25:19.263801 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 3 23:25:19.272710 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 3 23:25:19.272809 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 3 23:25:19.282572 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 3 23:25:19.482652 kernel: hv_netvsc 000d3afb-7590-000d-3afb-7590000d3afb eth0: Data path switched from VF: enP41498s1
Sep 3 23:25:19.282672 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 3 23:25:19.295145 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 3 23:25:19.295325 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 3 23:25:19.295418 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 3 23:25:19.306813 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 3 23:25:19.308166 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 3 23:25:19.315156 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 3 23:25:19.315205 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 3 23:25:19.324138 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 3 23:25:19.324239 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 3 23:25:19.332806 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 3 23:25:19.346021 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 3 23:25:19.346098 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 3 23:25:19.354752 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 3 23:25:19.354804 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 3 23:25:19.363541 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 3 23:25:19.363581 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 3 23:25:19.368593 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 3 23:25:19.368636 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 3 23:25:19.384291 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 3 23:25:19.397213 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 3 23:25:19.397332 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 3 23:25:19.408464 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 3 23:25:19.412482 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 3 23:25:19.421936 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 3 23:25:19.421974 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 3 23:25:19.430333 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 3 23:25:19.430361 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 3 23:25:19.439116 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 3 23:25:19.439171 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 3 23:25:19.452561 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 3 23:25:19.452613 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 3 23:25:19.464497 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 3 23:25:19.464545 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 3 23:25:19.487658 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 3 23:25:19.503560 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 3 23:25:19.503743 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 3 23:25:19.512543 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 3 23:25:19.512591 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 3 23:25:19.522366 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 3 23:25:19.522447 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 3 23:25:19.534498 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 3 23:25:19.708527 systemd-journald[224]: Received SIGTERM from PID 1 (systemd).
Sep 3 23:25:19.534560 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 3 23:25:19.534590 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 3 23:25:19.534827 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 3 23:25:19.534943 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 3 23:25:19.547660 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 3 23:25:19.547818 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 3 23:25:19.555383 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 3 23:25:19.564392 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 3 23:25:19.608997 systemd[1]: Switching root.
Sep 3 23:25:19.747781 systemd-journald[224]: Journal stopped
Sep 3 23:25:27.811165 kernel: SELinux: policy capability network_peer_controls=1
Sep 3 23:25:27.811185 kernel: SELinux: policy capability open_perms=1
Sep 3 23:25:27.811193 kernel: SELinux: policy capability extended_socket_class=1
Sep 3 23:25:27.811203 kernel: SELinux: policy capability always_check_network=0
Sep 3 23:25:27.811210 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 3 23:25:27.811215 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 3 23:25:27.811221 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 3 23:25:27.811227 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 3 23:25:27.811232 kernel: SELinux: policy capability userspace_initial_context=0
Sep 3 23:25:27.811238 kernel: audit: type=1403 audit(1756941921.399:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 3 23:25:27.811245 systemd[1]: Successfully loaded SELinux policy in 147.051ms.
Sep 3 23:25:27.811254 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.954ms.
Sep 3 23:25:27.811261 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 3 23:25:27.811267 systemd[1]: Detected virtualization microsoft.
Sep 3 23:25:27.811273 systemd[1]: Detected architecture arm64.
Sep 3 23:25:27.811280 systemd[1]: Detected first boot.
Sep 3 23:25:27.811286 systemd[1]: Hostname set to .
Sep 3 23:25:27.811292 systemd[1]: Initializing machine ID from random generator.
Sep 3 23:25:27.811298 zram_generator::config[1295]: No configuration found.
Sep 3 23:25:27.811305 kernel: NET: Registered PF_VSOCK protocol family
Sep 3 23:25:27.811310 systemd[1]: Populated /etc with preset unit settings.
Sep 3 23:25:27.811316 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 3 23:25:27.811323 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 3 23:25:27.811329 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 3 23:25:27.811335 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 3 23:25:27.811341 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 3 23:25:27.811347 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 3 23:25:27.811353 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 3 23:25:27.811359 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 3 23:25:27.811366 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 3 23:25:27.811372 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 3 23:25:27.811377 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 3 23:25:27.811384 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 3 23:25:27.811390 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 3 23:25:27.811396 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 3 23:25:27.811402 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 3 23:25:27.811408 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 3 23:25:27.811414 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 3 23:25:27.811421 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 3 23:25:27.811427 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Sep 3 23:25:27.811435 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 3 23:25:27.811441 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 3 23:25:27.811447 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 3 23:25:27.811453 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 3 23:25:27.811459 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 3 23:25:27.811466 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 3 23:25:27.811472 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 3 23:25:27.811478 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 3 23:25:27.811484 systemd[1]: Reached target slices.target - Slice Units.
Sep 3 23:25:27.811490 systemd[1]: Reached target swap.target - Swaps.
Sep 3 23:25:27.811496 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 3 23:25:27.811502 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 3 23:25:27.816347 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 3 23:25:27.816371 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 3 23:25:27.816379 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 3 23:25:27.816387 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 3 23:25:27.816394 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 3 23:25:27.816400 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 3 23:25:27.816410 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 3 23:25:27.816421 systemd[1]: Mounting media.mount - External Media Directory...
Sep 3 23:25:27.816427 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 3 23:25:27.816434 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 3 23:25:27.816449 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 3 23:25:27.816456 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 3 23:25:27.816463 systemd[1]: Reached target machines.target - Containers.
Sep 3 23:25:27.816469 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 3 23:25:27.816477 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 3 23:25:27.816483 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 3 23:25:27.816489 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 3 23:25:27.816496 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 3 23:25:27.816502 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 3 23:25:27.816577 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 3 23:25:27.816586 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 3 23:25:27.816593 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 3 23:25:27.816599 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 3 23:25:27.816607 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 3 23:25:27.816614 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 3 23:25:27.816620 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 3 23:25:27.816626 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 3 23:25:27.816633 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 3 23:25:27.816639 kernel: fuse: init (API version 7.41)
Sep 3 23:25:27.816647 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 3 23:25:27.816653 kernel: loop: module loaded
Sep 3 23:25:27.816660 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 3 23:25:27.816666 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 3 23:25:27.816672 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 3 23:25:27.816678 kernel: ACPI: bus type drm_connector registered
Sep 3 23:25:27.816684 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 3 23:25:27.816719 systemd-journald[1392]: Collecting audit messages is disabled.
Sep 3 23:25:27.816735 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 3 23:25:27.816742 systemd-journald[1392]: Journal started
Sep 3 23:25:27.816757 systemd-journald[1392]: Runtime Journal (/run/log/journal/1f441faff0c643399ec3e5c1ba1f9916) is 8M, max 78.5M, 70.5M free.
Sep 3 23:25:26.957994 systemd[1]: Queued start job for default target multi-user.target.
Sep 3 23:25:26.965126 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 3 23:25:26.965530 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 3 23:25:26.966734 systemd[1]: systemd-journald.service: Consumed 2.261s CPU time.
Sep 3 23:25:27.831825 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 3 23:25:27.831880 systemd[1]: Stopped verity-setup.service.
Sep 3 23:25:27.848636 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 3 23:25:27.849412 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 3 23:25:27.855368 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 3 23:25:27.862976 systemd[1]: Mounted media.mount - External Media Directory.
Sep 3 23:25:27.867496 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 3 23:25:27.873308 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 3 23:25:27.878662 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 3 23:25:27.882621 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 3 23:25:27.888601 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 3 23:25:27.894967 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 3 23:25:27.895097 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 3 23:25:27.901235 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 3 23:25:27.902606 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 3 23:25:27.907758 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 3 23:25:27.907880 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 3 23:25:27.913020 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 3 23:25:27.913140 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 3 23:25:27.919093 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 3 23:25:27.919210 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 3 23:25:27.924461 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 3 23:25:27.924587 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 3 23:25:27.929722 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 3 23:25:27.935983 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 3 23:25:27.942090 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 3 23:25:27.948075 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 3 23:25:27.955203 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 3 23:25:27.970774 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 3 23:25:27.976830 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 3 23:25:27.987600 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 3 23:25:27.992813 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 3 23:25:27.992898 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 3 23:25:27.998664 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 3 23:25:28.006636 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 3 23:25:28.011826 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 3 23:25:28.036603 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 3 23:25:28.051190 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 3 23:25:28.056151 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 3 23:25:28.056996 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 3 23:25:28.064868 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 3 23:25:28.067648 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 3 23:25:28.074634 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 3 23:25:28.084737 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 3 23:25:28.092505 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 3 23:25:28.099383 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 3 23:25:28.147350 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 3 23:25:28.154915 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 3 23:25:28.160905 systemd-journald[1392]: Time spent on flushing to /var/log/journal/1f441faff0c643399ec3e5c1ba1f9916 is 43.360ms for 936 entries.
Sep 3 23:25:28.160905 systemd-journald[1392]: System Journal (/var/log/journal/1f441faff0c643399ec3e5c1ba1f9916) is 11.8M, max 2.6G, 2.6G free.
Sep 3 23:25:28.283601 systemd-journald[1392]: Received client request to flush runtime journal.
Sep 3 23:25:28.283652 systemd-journald[1392]: /var/log/journal/1f441faff0c643399ec3e5c1ba1f9916/system.journal: Realtime clock jumped backwards relative to last journal entry, rotating.
Sep 3 23:25:28.283676 systemd-journald[1392]: Rotating system journal.
Sep 3 23:25:28.283692 kernel: loop0: detected capacity change from 0 to 138376
Sep 3 23:25:28.168694 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 3 23:25:28.241247 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 3 23:25:28.285262 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 3 23:25:28.293925 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 3 23:25:28.294553 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 3 23:25:28.779530 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 3 23:25:28.914725 kernel: loop1: detected capacity change from 0 to 107312
Sep 3 23:25:28.915882 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 3 23:25:28.921622 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 3 23:25:29.086215 systemd-tmpfiles[1452]: ACLs are not supported, ignoring.
Sep 3 23:25:29.086230 systemd-tmpfiles[1452]: ACLs are not supported, ignoring.
Sep 3 23:25:29.116029 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 3 23:25:29.427540 kernel: loop2: detected capacity change from 0 to 203944
Sep 3 23:25:29.471762 kernel: loop3: detected capacity change from 0 to 28936
Sep 3 23:25:30.077540 kernel: loop4: detected capacity change from 0 to 138376
Sep 3 23:25:30.095527 kernel: loop5: detected capacity change from 0 to 107312
Sep 3 23:25:30.107526 kernel: loop6: detected capacity change from 0 to 203944
Sep 3 23:25:30.123530 kernel: loop7: detected capacity change from 0 to 28936
Sep 3 23:25:30.132087 (sd-merge)[1458]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Sep 3 23:25:30.133563 (sd-merge)[1458]: Merged extensions into '/usr'.
Sep 3 23:25:30.138652 systemd[1]: Reload requested from client PID 1434 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 3 23:25:30.138664 systemd[1]: Reloading...
Sep 3 23:25:30.205566 zram_generator::config[1483]: No configuration found.
Sep 3 23:25:30.296378 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 3 23:25:30.375318 systemd[1]: Reloading finished in 236 ms.
Sep 3 23:25:30.400504 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 3 23:25:30.405639 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 3 23:25:30.421584 systemd[1]: Starting ensure-sysext.service...
Sep 3 23:25:30.425652 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 3 23:25:30.431683 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 3 23:25:30.459441 systemd-udevd[1542]: Using default interface naming scheme 'v255'.
Sep 3 23:25:30.463914 systemd[1]: Reload requested from client PID 1540 ('systemctl') (unit ensure-sysext.service)...
Sep 3 23:25:30.463929 systemd[1]: Reloading...
Sep 3 23:25:30.489226 systemd-tmpfiles[1541]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 3 23:25:30.489254 systemd-tmpfiles[1541]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 3 23:25:30.489470 systemd-tmpfiles[1541]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 3 23:25:30.490723 systemd-tmpfiles[1541]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 3 23:25:30.493608 systemd-tmpfiles[1541]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 3 23:25:30.493896 systemd-tmpfiles[1541]: ACLs are not supported, ignoring.
Sep 3 23:25:30.494026 systemd-tmpfiles[1541]: ACLs are not supported, ignoring.
Sep 3 23:25:30.522551 zram_generator::config[1573]: No configuration found.
Sep 3 23:25:30.563821 systemd-tmpfiles[1541]: Detected autofs mount point /boot during canonicalization of boot.
Sep 3 23:25:30.563832 systemd-tmpfiles[1541]: Skipping /boot
Sep 3 23:25:30.572431 systemd-tmpfiles[1541]: Detected autofs mount point /boot during canonicalization of boot.
Sep 3 23:25:30.572442 systemd-tmpfiles[1541]: Skipping /boot
Sep 3 23:25:30.587951 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 3 23:25:30.648698 systemd[1]: Reloading finished in 184 ms.
Sep 3 23:25:30.668686 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 3 23:25:30.681807 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 3 23:25:30.732379 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 3 23:25:30.738904 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 3 23:25:30.742458 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 3 23:25:30.748701 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 3 23:25:30.754715 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 3 23:25:30.760677 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 3 23:25:30.760931 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 3 23:25:30.762112 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 3 23:25:30.770709 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 3 23:25:30.776479 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 3 23:25:30.782481 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 3 23:25:30.787677 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 3 23:25:30.793348 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 3 23:25:30.793495 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 3 23:25:30.799054 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 3 23:25:30.799190 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 3 23:25:30.810032 systemd[1]: Expecting device dev-ptp_hyperv.device - /dev/ptp_hyperv...
Sep 3 23:25:30.814198 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 3 23:25:30.815376 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 3 23:25:30.892654 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 3 23:25:30.904731 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 3 23:25:30.911792 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 3 23:25:30.917085 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 3 23:25:30.917206 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 3 23:25:30.917305 systemd[1]: Reached target time-set.target - System Time Set.
Sep 3 23:25:30.922369 systemd[1]: Finished ensure-sysext.service.
Sep 3 23:25:30.926322 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 3 23:25:30.932559 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 3 23:25:30.932824 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 3 23:25:30.938015 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 3 23:25:30.938254 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 3 23:25:30.943317 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 3 23:25:30.943615 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 3 23:25:30.949439 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 3 23:25:30.951573 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 3 23:25:30.962270 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 3 23:25:30.962441 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 3 23:25:30.964143 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 3 23:25:31.009408 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 3 23:25:31.117332 systemd-resolved[1635]: Positive Trust Anchors:
Sep 3 23:25:31.117348 systemd-resolved[1635]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 3 23:25:31.117367 systemd-resolved[1635]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 3 23:25:31.134771 systemd-resolved[1635]: Using system hostname 'ci-4372.1.0-n-71c6c07a75'.
Sep 3 23:25:31.135976 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 3 23:25:31.140930 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 3 23:25:31.179302 augenrules[1674]: No rules
Sep 3 23:25:31.180653 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 3 23:25:31.182693 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 3 23:25:31.222004 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 3 23:25:31.404442 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 3 23:25:31.414610 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 3 23:25:31.507719 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Sep 3 23:25:31.588539 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#270 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Sep 3 23:25:31.598420 systemd-networkd[1698]: lo: Link UP
Sep 3 23:25:31.598430 systemd-networkd[1698]: lo: Gained carrier
Sep 3 23:25:31.601153 systemd-networkd[1698]: Enumeration completed
Sep 3 23:25:31.601250 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 3 23:25:31.607092 systemd-networkd[1698]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 3 23:25:31.607098 systemd-networkd[1698]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 3 23:25:31.609958 systemd[1]: Reached target network.target - Network.
Sep 3 23:25:31.619132 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 3 23:25:31.626208 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 3 23:25:31.663937 kernel: mousedev: PS/2 mouse device common for all mice
Sep 3 23:25:31.664045 kernel: hv_vmbus: registering driver hv_balloon
Sep 3 23:25:31.664056 kernel: hv_vmbus: registering driver hyperv_fb
Sep 3 23:25:31.683991 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 3 23:25:31.684542 kernel: mlx5_core a21a:00:02.0 enP41498s1: Link up
Sep 3 23:25:31.693190 systemd[1]: Condition check resulted in dev-ptp_hyperv.device - /dev/ptp_hyperv being skipped.
Sep 3 23:25:31.711225 kernel: hv_netvsc 000d3afb-7590-000d-3afb-7590000d3afb eth0: Data path switched to VF: enP41498s1
Sep 3 23:25:31.710705 systemd-networkd[1698]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 3 23:25:31.711526 systemd-networkd[1698]: enP41498s1: Link UP
Sep 3 23:25:31.711667 systemd-networkd[1698]: eth0: Link UP
Sep 3 23:25:31.711670 systemd-networkd[1698]: eth0: Gained carrier
Sep 3 23:25:31.711683 systemd-networkd[1698]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 3 23:25:31.722584 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Sep 3 23:25:31.722655 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Sep 3 23:25:31.728009 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Sep 3 23:25:31.730792 kernel: Console: switching to colour dummy device 80x25
Sep 3 23:25:31.730818 kernel: hv_balloon: Memory hot add disabled on ARM64
Sep 3 23:25:31.725541 systemd-networkd[1698]: enP41498s1: Gained carrier
Sep 3 23:25:31.738597 kernel: Console: switching to colour frame buffer device 128x48
Sep 3 23:25:31.734992 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 3 23:25:31.735165 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 3 23:25:31.745623 systemd-networkd[1698]: eth0: DHCPv4 address 10.200.20.15/24, gateway 10.200.20.1 acquired from 168.63.129.16
Sep 3 23:25:31.747351 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 3 23:25:31.757828 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 3 23:25:31.766256 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 3 23:25:31.766431 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 3 23:25:31.773783 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 3 23:25:31.853414 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Sep 3 23:25:31.859461 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 3 23:25:31.906578 kernel: MACsec IEEE 802.1AE
Sep 3 23:25:31.988439 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 3 23:25:33.197554 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 3 23:25:33.369424 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 3 23:25:33.374753 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 3 23:25:33.411606 systemd-networkd[1698]: eth0: Gained IPv6LL
Sep 3 23:25:33.413535 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 3 23:25:33.418614 systemd[1]: Reached target network-online.target - Network is Online.
Sep 3 23:25:37.847522 ldconfig[1429]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 3 23:25:37.860809 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 3 23:25:37.868346 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 3 23:25:37.896727 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 3 23:25:37.901434 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 3 23:25:37.905636 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 3 23:25:37.910379 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 3 23:25:37.915808 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 3 23:25:37.919907 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 3 23:25:37.924530 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 3 23:25:37.929109 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 3 23:25:37.929131 systemd[1]: Reached target paths.target - Path Units.
Sep 3 23:25:37.932738 systemd[1]: Reached target timers.target - Timer Units.
Sep 3 23:25:37.965562 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 3 23:25:37.971462 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 3 23:25:37.977267 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 3 23:25:37.982950 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 3 23:25:37.988188 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 3 23:25:37.993964 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 3 23:25:37.998650 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 3 23:25:38.004020 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 3 23:25:38.008972 systemd[1]: Reached target sockets.target - Socket Units.
Sep 3 23:25:38.013544 systemd[1]: Reached target basic.target - Basic System.
Sep 3 23:25:38.017877 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 3 23:25:38.017901 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 3 23:25:38.050311 systemd[1]: Starting chronyd.service - NTP client/server...
Sep 3 23:25:38.064616 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 3 23:25:38.076724 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 3 23:25:38.083785 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 3 23:25:38.088795 (chronyd)[1826]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS
Sep 3 23:25:38.096210 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 3 23:25:38.102633 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 3 23:25:38.107931 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 3 23:25:38.112134 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 3 23:25:38.116481 chronyd[1838]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)
Sep 3 23:25:38.114144 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
Sep 3 23:25:38.119683 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
Sep 3 23:25:38.120112 KVP[1836]: KVP starting; pid is:1836
Sep 3 23:25:38.120692 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 3 23:25:38.123446 jq[1834]: false
Sep 3 23:25:38.126623 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 3 23:25:38.128126 KVP[1836]: KVP LIC Version: 3.1
Sep 3 23:25:38.128522 kernel: hv_utils: KVP IC version 4.0
Sep 3 23:25:38.137384 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 3 23:25:38.143491 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 3 23:25:38.149221 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 3 23:25:38.155884 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 3 23:25:38.164036 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 3 23:25:38.168780 chronyd[1838]: Timezone right/UTC failed leap second check, ignoring
Sep 3 23:25:38.169529 chronyd[1838]: Loaded seccomp filter (level 2)
Sep 3 23:25:38.170214 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 3 23:25:38.174773 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 3 23:25:38.177456 systemd[1]: Starting update-engine.service - Update Engine...
Sep 3 23:25:38.184624 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 3 23:25:38.192006 extend-filesystems[1835]: Found /dev/sda6
Sep 3 23:25:38.193450 systemd[1]: Started chronyd.service - NTP client/server.
Sep 3 23:25:38.204158 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 3 23:25:38.209141 jq[1857]: true
Sep 3 23:25:38.212115 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 3 23:25:38.212313 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 3 23:25:38.217962 systemd[1]: motdgen.service: Deactivated successfully.
Sep 3 23:25:38.218153 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 3 23:25:38.223966 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 3 23:25:38.225768 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 3 23:25:38.232367 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 3 23:25:38.236619 extend-filesystems[1835]: Found /dev/sda9
Sep 3 23:25:38.251240 extend-filesystems[1835]: Checking size of /dev/sda9
Sep 3 23:25:38.256403 jq[1870]: true
Sep 3 23:25:38.257873 (ntainerd)[1871]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 3 23:25:38.266946 update_engine[1853]: I20250903 23:25:38.266856 1853 main.cc:92] Flatcar Update Engine starting
Sep 3 23:25:38.286268 extend-filesystems[1835]: Old size kept for /dev/sda9
Sep 3 23:25:38.289124 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 3 23:25:38.289352 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 3 23:25:38.315284 systemd-logind[1851]: New seat seat0.
Sep 3 23:25:38.317470 systemd-logind[1851]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Sep 3 23:25:38.317694 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 3 23:25:38.327174 tar[1865]: linux-arm64/helm
Sep 3 23:25:38.391603 bash[1901]: Updated "/home/core/.ssh/authorized_keys"
Sep 3 23:25:38.394643 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 3 23:25:38.403706 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 3 23:25:38.498516 sshd_keygen[1875]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 3 23:25:38.539539 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 3 23:25:38.552076 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 3 23:25:38.559621 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
Sep 3 23:25:38.580960 systemd[1]: issuegen.service: Deactivated successfully.
Sep 3 23:25:38.581245 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 3 23:25:38.587406 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 3 23:25:38.600469 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
Sep 3 23:25:38.632635 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 3 23:25:38.642122 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 3 23:25:38.650074 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Sep 3 23:25:38.657269 systemd[1]: Reached target getty.target - Login Prompts.
Sep 3 23:25:38.679174 dbus-daemon[1829]: [system] SELinux support is enabled
Sep 3 23:25:38.679622 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 3 23:25:38.689741 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 3 23:25:38.689768 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 3 23:25:38.697657 update_engine[1853]: I20250903 23:25:38.697598 1853 update_check_scheduler.cc:74] Next update check in 8m6s
Sep 3 23:25:38.699873 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 3 23:25:38.699894 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 3 23:25:38.711247 dbus-daemon[1829]: [system] Successfully activated service 'org.freedesktop.systemd1'
Sep 3 23:25:38.711351 systemd[1]: Started update-engine.service - Update Engine.
Sep 3 23:25:38.722025 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 3 23:25:38.785435 coreos-metadata[1828]: Sep 03 23:25:38.785 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Sep 3 23:25:38.792308 coreos-metadata[1828]: Sep 03 23:25:38.792 INFO Fetch successful
Sep 3 23:25:38.792308 coreos-metadata[1828]: Sep 03 23:25:38.792 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
Sep 3 23:25:38.837406 coreos-metadata[1828]: Sep 03 23:25:38.837 INFO Fetch successful
Sep 3 23:25:38.837406 coreos-metadata[1828]: Sep 03 23:25:38.837 INFO Fetching http://168.63.129.16/machine/2247412e-8dfd-4032-975a-1ff2246ba8ee/ff70b649%2Dc20d%2D40dd%2D8d28%2D49f08ea71131.%5Fci%2D4372.1.0%2Dn%2D71c6c07a75?comp=config&type=sharedConfig&incarnation=1: Attempt #1
Sep 3 23:25:38.841045 coreos-metadata[1828]: Sep 03 23:25:38.840 INFO Fetch successful
Sep 3 23:25:38.841045 coreos-metadata[1828]: Sep 03 23:25:38.840 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
Sep 3 23:25:38.850402 coreos-metadata[1828]: Sep 03 23:25:38.850 INFO Fetch successful
Sep 3 23:25:38.895269 tar[1865]: linux-arm64/LICENSE
Sep 3 23:25:38.895357 tar[1865]: linux-arm64/README.md
Sep 3 23:25:38.909148 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 3 23:25:38.920168 locksmithd[1994]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 3 23:25:39.047530 containerd[1871]: time="2025-09-03T23:25:39Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 3 23:25:39.050536 containerd[1871]: time="2025-09-03T23:25:39.049819928Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Sep 3 23:25:39.055999 containerd[1871]: time="2025-09-03T23:25:39.055964616Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.008µs" Sep 3 23:25:39.055999 containerd[1871]: time="2025-09-03T23:25:39.055994008Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 3 23:25:39.056078 containerd[1871]: time="2025-09-03T23:25:39.056008672Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 3 23:25:39.056161 containerd[1871]: time="2025-09-03T23:25:39.056145664Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 3 23:25:39.056179 containerd[1871]: time="2025-09-03T23:25:39.056161448Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 3 23:25:39.056197 containerd[1871]: time="2025-09-03T23:25:39.056179640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 3 23:25:39.056227 containerd[1871]: time="2025-09-03T23:25:39.056216128Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 3 23:25:39.056227 containerd[1871]: time="2025-09-03T23:25:39.056225504Z" level=info 
msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 3 23:25:39.056452 containerd[1871]: time="2025-09-03T23:25:39.056424520Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 3 23:25:39.056465 containerd[1871]: time="2025-09-03T23:25:39.056450880Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 3 23:25:39.056465 containerd[1871]: time="2025-09-03T23:25:39.056459000Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 3 23:25:39.056465 containerd[1871]: time="2025-09-03T23:25:39.056464096Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 3 23:25:39.056571 containerd[1871]: time="2025-09-03T23:25:39.056556904Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 3 23:25:39.056715 containerd[1871]: time="2025-09-03T23:25:39.056700568Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 3 23:25:39.056738 containerd[1871]: time="2025-09-03T23:25:39.056724496Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 3 23:25:39.056738 containerd[1871]: time="2025-09-03T23:25:39.056731104Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 3 23:25:39.056769 containerd[1871]: time="2025-09-03T23:25:39.056755544Z" level=info msg="loading plugin" 
id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 3 23:25:39.056914 containerd[1871]: time="2025-09-03T23:25:39.056901808Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 3 23:25:39.056970 containerd[1871]: time="2025-09-03T23:25:39.056958080Z" level=info msg="metadata content store policy set" policy=shared Sep 3 23:25:39.060744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 3 23:25:39.078470 containerd[1871]: time="2025-09-03T23:25:39.078429712Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 3 23:25:39.078549 containerd[1871]: time="2025-09-03T23:25:39.078495768Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 3 23:25:39.078549 containerd[1871]: time="2025-09-03T23:25:39.078520984Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 3 23:25:39.078549 containerd[1871]: time="2025-09-03T23:25:39.078530672Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 3 23:25:39.078549 containerd[1871]: time="2025-09-03T23:25:39.078538608Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 3 23:25:39.078549 containerd[1871]: time="2025-09-03T23:25:39.078546472Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 3 23:25:39.078633 containerd[1871]: time="2025-09-03T23:25:39.078554432Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 3 23:25:39.078633 containerd[1871]: time="2025-09-03T23:25:39.078562176Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 3 23:25:39.078633 containerd[1871]: 
time="2025-09-03T23:25:39.078573600Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 3 23:25:39.078633 containerd[1871]: time="2025-09-03T23:25:39.078581208Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 3 23:25:39.078633 containerd[1871]: time="2025-09-03T23:25:39.078587272Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 3 23:25:39.078633 containerd[1871]: time="2025-09-03T23:25:39.078596184Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 3 23:25:39.078745 containerd[1871]: time="2025-09-03T23:25:39.078725728Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 3 23:25:39.078763 containerd[1871]: time="2025-09-03T23:25:39.078745288Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 3 23:25:39.078763 containerd[1871]: time="2025-09-03T23:25:39.078756904Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 3 23:25:39.078789 containerd[1871]: time="2025-09-03T23:25:39.078763904Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 3 23:25:39.078789 containerd[1871]: time="2025-09-03T23:25:39.078770808Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 3 23:25:39.078789 containerd[1871]: time="2025-09-03T23:25:39.078777416Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 3 23:25:39.078789 containerd[1871]: time="2025-09-03T23:25:39.078784112Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 3 23:25:39.078893 containerd[1871]: time="2025-09-03T23:25:39.078790392Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 3 23:25:39.078893 containerd[1871]: time="2025-09-03T23:25:39.078797448Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 3 23:25:39.078893 containerd[1871]: time="2025-09-03T23:25:39.078803608Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 3 23:25:39.078893 containerd[1871]: time="2025-09-03T23:25:39.078809872Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 3 23:25:39.078893 containerd[1871]: time="2025-09-03T23:25:39.078868840Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 3 23:25:39.078893 containerd[1871]: time="2025-09-03T23:25:39.078879480Z" level=info msg="Start snapshots syncer" Sep 3 23:25:39.078893 containerd[1871]: time="2025-09-03T23:25:39.078903288Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 3 23:25:39.079111 containerd[1871]: time="2025-09-03T23:25:39.079067416Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 3 23:25:39.079228 containerd[1871]: time="2025-09-03T23:25:39.079124816Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 3 23:25:39.079228 containerd[1871]: time="2025-09-03T23:25:39.079178112Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 3 23:25:39.079291 containerd[1871]: time="2025-09-03T23:25:39.079273184Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 3 23:25:39.079307 containerd[1871]: time="2025-09-03T23:25:39.079292112Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 3 23:25:39.079307 containerd[1871]: time="2025-09-03T23:25:39.079299088Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 3 23:25:39.079307 containerd[1871]: time="2025-09-03T23:25:39.079306928Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 3 23:25:39.079351 containerd[1871]: time="2025-09-03T23:25:39.079314288Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 3 23:25:39.079351 containerd[1871]: time="2025-09-03T23:25:39.079320984Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 3 23:25:39.079351 containerd[1871]: time="2025-09-03T23:25:39.079332032Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 3 23:25:39.079389 containerd[1871]: time="2025-09-03T23:25:39.079353184Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 3 23:25:39.079389 containerd[1871]: time="2025-09-03T23:25:39.079360904Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 3 23:25:39.079389 containerd[1871]: time="2025-09-03T23:25:39.079367392Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 3 23:25:39.079389 containerd[1871]: time="2025-09-03T23:25:39.079395592Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 3 23:25:39.079452 containerd[1871]: time="2025-09-03T23:25:39.079405304Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 3 23:25:39.079452 containerd[1871]: time="2025-09-03T23:25:39.079410968Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 3 23:25:39.079452 containerd[1871]: time="2025-09-03T23:25:39.079416416Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 3 23:25:39.079452 containerd[1871]: time="2025-09-03T23:25:39.079420888Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 3 23:25:39.079452 containerd[1871]: time="2025-09-03T23:25:39.079426632Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Sep 3 23:25:39.079452 containerd[1871]: time="2025-09-03T23:25:39.079433128Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Sep 3 23:25:39.079452 containerd[1871]: time="2025-09-03T23:25:39.079445016Z" level=info msg="runtime interface created"
Sep 3 23:25:39.079452 containerd[1871]: time="2025-09-03T23:25:39.079448016Z" level=info msg="created NRI interface"
Sep 3 23:25:39.079452 containerd[1871]: time="2025-09-03T23:25:39.079453504Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Sep 3 23:25:39.079452 containerd[1871]: time="2025-09-03T23:25:39.079462344Z" level=info msg="Connect containerd service"
Sep 3 23:25:39.079452 containerd[1871]: time="2025-09-03T23:25:39.079481680Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 3 23:25:39.080539 containerd[1871]: time="2025-09-03T23:25:39.080485080Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 3 23:25:39.257929 (kubelet)[2015]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 3 23:25:39.275848 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Sep 3 23:25:39.280989 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 3 23:25:39.679292 kubelet[2015]: E0903 23:25:39.679178 2015 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 3 23:25:39.681766 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 3 23:25:39.681872 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 3 23:25:39.682370 systemd[1]: kubelet.service: Consumed 543ms CPU time, 254.8M memory peak.
Sep 3 23:25:39.993013 containerd[1871]: time="2025-09-03T23:25:39.992909008Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 3 23:25:39.993013 containerd[1871]: time="2025-09-03T23:25:39.992969184Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 3 23:25:39.993013 containerd[1871]: time="2025-09-03T23:25:39.992981600Z" level=info msg="Start subscribing containerd event"
Sep 3 23:25:39.993013 containerd[1871]: time="2025-09-03T23:25:39.993002672Z" level=info msg="Start recovering state"
Sep 3 23:25:39.993147 containerd[1871]: time="2025-09-03T23:25:39.993071488Z" level=info msg="Start event monitor"
Sep 3 23:25:39.993147 containerd[1871]: time="2025-09-03T23:25:39.993085272Z" level=info msg="Start cni network conf syncer for default"
Sep 3 23:25:39.993147 containerd[1871]: time="2025-09-03T23:25:39.993092424Z" level=info msg="Start streaming server"
Sep 3 23:25:39.993147 containerd[1871]: time="2025-09-03T23:25:39.993098944Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 3 23:25:39.993147 containerd[1871]: time="2025-09-03T23:25:39.993103928Z" level=info msg="runtime interface starting up..."
Sep 3 23:25:39.993147 containerd[1871]: time="2025-09-03T23:25:39.993107736Z" level=info msg="starting plugins..."
Sep 3 23:25:39.993147 containerd[1871]: time="2025-09-03T23:25:39.993118352Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Sep 3 23:25:39.993229 containerd[1871]: time="2025-09-03T23:25:39.993210096Z" level=info msg="containerd successfully booted in 0.946287s"
Sep 3 23:25:39.993653 systemd[1]: Started containerd.service - containerd container runtime.
Sep 3 23:25:40.001232 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 3 23:25:40.010607 systemd[1]: Startup finished in 1.623s (kernel) + 14.793s (initrd) + 18.756s (userspace) = 35.174s.
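The kubelet failure above ("failed to read kubelet config file \"/var/lib/kubelet/config.yaml\": no such file or directory") is the expected state on a node where `kubeadm init` or `kubeadm join` has not yet written the kubelet configuration; systemd then restarts the unit on a timer, as seen later in this log. A minimal, hypothetical pre-flight sketch of that check (the path comes from the log; the helper name is illustrative, not part of any real tool):

```shell
# Hypothetical helper mirroring the failure above: reports whether the
# kubelet config that kubeadm writes exists yet. Default path is taken
# from the log entry.
kubelet_config_ready() {
    # $1: optional path override (for testing); default from the log
    [ -f "${1:-/var/lib/kubelet/config.yaml}" ]
}

if kubelet_config_ready; then
    echo "kubelet config present"
else
    echo "kubelet config missing: run 'kubeadm init' or 'kubeadm join' first"
fi
```

Until the file exists, each restart attempt will exit with status 1 exactly as the journal shows.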
Sep 3 23:25:40.817264 waagent[1988]: 2025-09-03T23:25:40.817187Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4
Sep 3 23:25:40.821279 waagent[1988]: 2025-09-03T23:25:40.821238Z INFO Daemon Daemon OS: flatcar 4372.1.0
Sep 3 23:25:40.824303 waagent[1988]: 2025-09-03T23:25:40.824275Z INFO Daemon Daemon Python: 3.11.12
Sep 3 23:25:40.829507 waagent[1988]: 2025-09-03T23:25:40.827391Z INFO Daemon Daemon Run daemon
Sep 3 23:25:40.830191 waagent[1988]: 2025-09-03T23:25:40.830158Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4372.1.0'
Sep 3 23:25:40.836687 waagent[1988]: 2025-09-03T23:25:40.836648Z INFO Daemon Daemon Using waagent for provisioning
Sep 3 23:25:40.842528 waagent[1988]: 2025-09-03T23:25:40.840935Z INFO Daemon Daemon Activate resource disk
Sep 3 23:25:40.841375 login[1990]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:25:40.845810 waagent[1988]: 2025-09-03T23:25:40.845740Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb
Sep 3 23:25:40.846909 login[1991]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:25:40.849146 waagent[1988]: 2025-09-03T23:25:40.849078Z INFO Daemon Daemon Found device: None
Sep 3 23:25:40.849390 waagent[1988]: 2025-09-03T23:25:40.849356Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology
Sep 3 23:25:40.850551 waagent[1988]: 2025-09-03T23:25:40.849700Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0
Sep 3 23:25:40.850551 waagent[1988]: 2025-09-03T23:25:40.850381Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Sep 3 23:25:40.850679 waagent[1988]: 2025-09-03T23:25:40.850648Z INFO Daemon Daemon Running default provisioning handler
Sep 3 23:25:40.855343 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 3 23:25:40.856500 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 3 23:25:40.859097 systemd-logind[1851]: New session 1 of user core.
Sep 3 23:25:40.862332 waagent[1988]: 2025-09-03T23:25:40.861908Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4.
Sep 3 23:25:40.863225 waagent[1988]: 2025-09-03T23:25:40.863189Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service'
Sep 3 23:25:40.863610 waagent[1988]: 2025-09-03T23:25:40.863575Z INFO Daemon Daemon cloud-init is enabled: False
Sep 3 23:25:40.864692 waagent[1988]: 2025-09-03T23:25:40.863886Z INFO Daemon Daemon Copying ovf-env.xml
Sep 3 23:25:40.865575 systemd-logind[1851]: New session 2 of user core.
Sep 3 23:25:40.933547 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 3 23:25:40.938760 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 3 23:25:40.966986 (systemd)[2050]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 3 23:25:40.969232 systemd-logind[1851]: New session c1 of user core.
Sep 3 23:25:40.981849 waagent[1988]: 2025-09-03T23:25:40.979592Z INFO Daemon Daemon Successfully mounted dvd
Sep 3 23:25:41.005676 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully.
Sep 3 23:25:41.011732 waagent[1988]: 2025-09-03T23:25:41.008315Z INFO Daemon Daemon Detect protocol endpoint
Sep 3 23:25:41.011978 waagent[1988]: 2025-09-03T23:25:41.011941Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Sep 3 23:25:41.016562 waagent[1988]: 2025-09-03T23:25:41.016480Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler
Sep 3 23:25:41.021527 waagent[1988]: 2025-09-03T23:25:41.021483Z INFO Daemon Daemon Test for route to 168.63.129.16
Sep 3 23:25:41.025640 waagent[1988]: 2025-09-03T23:25:41.025602Z INFO Daemon Daemon Route to 168.63.129.16 exists
Sep 3 23:25:41.030173 waagent[1988]: 2025-09-03T23:25:41.030141Z INFO Daemon Daemon Wire server endpoint:168.63.129.16
Sep 3 23:25:41.080789 waagent[1988]: 2025-09-03T23:25:41.080692Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05
Sep 3 23:25:41.086668 waagent[1988]: 2025-09-03T23:25:41.086646Z INFO Daemon Daemon Wire protocol version:2012-11-30
Sep 3 23:25:41.091212 waagent[1988]: 2025-09-03T23:25:41.091179Z INFO Daemon Daemon Server preferred version:2015-04-05
Sep 3 23:25:41.331241 waagent[1988]: 2025-09-03T23:25:41.331109Z INFO Daemon Daemon Initializing goal state during protocol detection
Sep 3 23:25:41.337514 waagent[1988]: 2025-09-03T23:25:41.335980Z INFO Daemon Daemon Forcing an update of the goal state.
Sep 3 23:25:41.343181 waagent[1988]: 2025-09-03T23:25:41.343146Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1]
Sep 3 23:25:41.364332 systemd[2050]: Queued start job for default target default.target.
Sep 3 23:25:41.371199 systemd[2050]: Created slice app.slice - User Application Slice.
Sep 3 23:25:41.371221 systemd[2050]: Reached target paths.target - Paths.
Sep 3 23:25:41.371249 systemd[2050]: Reached target timers.target - Timers.
Sep 3 23:25:41.372226 systemd[2050]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 3 23:25:41.379309 systemd[2050]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 3 23:25:41.379356 systemd[2050]: Reached target sockets.target - Sockets.
Sep 3 23:25:41.379393 systemd[2050]: Reached target basic.target - Basic System.
Sep 3 23:25:41.379414 systemd[2050]: Reached target default.target - Main User Target.
Sep 3 23:25:41.379434 systemd[2050]: Startup finished in 403ms.
Sep 3 23:25:41.379483 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 3 23:25:41.381003 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 3 23:25:41.382231 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 3 23:25:41.445135 waagent[1988]: 2025-09-03T23:25:41.445099Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.175
Sep 3 23:25:41.449487 waagent[1988]: 2025-09-03T23:25:41.449451Z INFO Daemon
Sep 3 23:25:41.451577 waagent[1988]: 2025-09-03T23:25:41.451543Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 89eae6bb-e86c-4c8a-943e-9b8cc68b152b eTag: 12023061835436433215 source: Fabric]
Sep 3 23:25:41.459635 waagent[1988]: 2025-09-03T23:25:41.459603Z INFO Daemon The vmSettings originated via Fabric; will ignore them.
Sep 3 23:25:41.465792 waagent[1988]: 2025-09-03T23:25:41.464080Z INFO Daemon
Sep 3 23:25:41.466011 waagent[1988]: 2025-09-03T23:25:41.465981Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1]
Sep 3 23:25:41.473944 waagent[1988]: 2025-09-03T23:25:41.473914Z INFO Daemon Daemon Downloading artifacts profile blob
Sep 3 23:25:41.602026 waagent[1988]: 2025-09-03T23:25:41.601930Z INFO Daemon Downloaded certificate {'thumbprint': '14D5058C7703BE9772610DFE3DD8C32E48DE5450', 'hasPrivateKey': True}
Sep 3 23:25:41.608790 waagent[1988]: 2025-09-03T23:25:41.608752Z INFO Daemon Fetch goal state completed
Sep 3 23:25:41.617839 waagent[1988]: 2025-09-03T23:25:41.617809Z INFO Daemon Daemon Starting provisioning
Sep 3 23:25:41.621743 waagent[1988]: 2025-09-03T23:25:41.621700Z INFO Daemon Daemon Handle ovf-env.xml.
Sep 3 23:25:41.625420 waagent[1988]: 2025-09-03T23:25:41.625386Z INFO Daemon Daemon Set hostname [ci-4372.1.0-n-71c6c07a75]
Sep 3 23:25:41.660703 waagent[1988]: 2025-09-03T23:25:41.660667Z INFO Daemon Daemon Publish hostname [ci-4372.1.0-n-71c6c07a75]
Sep 3 23:25:41.665640 waagent[1988]: 2025-09-03T23:25:41.665601Z INFO Daemon Daemon Examine /proc/net/route for primary interface
Sep 3 23:25:41.670388 waagent[1988]: 2025-09-03T23:25:41.670352Z INFO Daemon Daemon Primary interface is [eth0]
Sep 3 23:25:41.680084 systemd-networkd[1698]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 3 23:25:41.680308 systemd-networkd[1698]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 3 23:25:41.680403 systemd-networkd[1698]: eth0: DHCP lease lost
Sep 3 23:25:41.680721 waagent[1988]: 2025-09-03T23:25:41.680683Z INFO Daemon Daemon Create user account if not exists
Sep 3 23:25:41.684606 waagent[1988]: 2025-09-03T23:25:41.684573Z INFO Daemon Daemon User core already exists, skip useradd
Sep 3 23:25:41.688719 waagent[1988]: 2025-09-03T23:25:41.688685Z INFO Daemon Daemon Configure sudoer
Sep 3 23:25:41.695914 waagent[1988]: 2025-09-03T23:25:41.695873Z INFO Daemon Daemon Configure sshd
Sep 3 23:25:41.703077 waagent[1988]: 2025-09-03T23:25:41.703038Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive.
Sep 3 23:25:41.712318 waagent[1988]: 2025-09-03T23:25:41.712281Z INFO Daemon Daemon Deploy ssh public key.
Sep 3 23:25:41.718557 systemd-networkd[1698]: eth0: DHCPv4 address 10.200.20.15/24, gateway 10.200.20.1 acquired from 168.63.129.16
Sep 3 23:25:42.864054 waagent[1988]: 2025-09-03T23:25:42.859926Z INFO Daemon Daemon Provisioning complete
Sep 3 23:25:42.873217 waagent[1988]: 2025-09-03T23:25:42.873185Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping
Sep 3 23:25:42.878583 waagent[1988]: 2025-09-03T23:25:42.878552Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions.
Sep 3 23:25:42.886333 waagent[1988]: 2025-09-03T23:25:42.886306Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent
Sep 3 23:25:42.981060 waagent[2094]: 2025-09-03T23:25:42.980646Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4)
Sep 3 23:25:42.981060 waagent[2094]: 2025-09-03T23:25:42.980762Z INFO ExtHandler ExtHandler OS: flatcar 4372.1.0
Sep 3 23:25:42.981060 waagent[2094]: 2025-09-03T23:25:42.980797Z INFO ExtHandler ExtHandler Python: 3.11.12
Sep 3 23:25:42.981060 waagent[2094]: 2025-09-03T23:25:42.980829Z INFO ExtHandler ExtHandler CPU Arch: aarch64
Sep 3 23:25:43.047410 waagent[2094]: 2025-09-03T23:25:43.047355Z INFO ExtHandler ExtHandler Distro: flatcar-4372.1.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.12; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0;
Sep 3 23:25:43.047710 waagent[2094]: 2025-09-03T23:25:43.047680Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Sep 3 23:25:43.047864 waagent[2094]: 2025-09-03T23:25:43.047836Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16
Sep 3 23:25:43.053955 waagent[2094]: 2025-09-03T23:25:43.053909Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1]
Sep 3 23:25:43.058765 waagent[2094]: 2025-09-03T23:25:43.058735Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.175
Sep 3 23:25:43.059193 waagent[2094]: 2025-09-03T23:25:43.059159Z INFO ExtHandler
Sep 3 23:25:43.059316 waagent[2094]: 2025-09-03T23:25:43.059292Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: fe2ea948-e052-4fda-aed8-9b2b09ed65f9 eTag: 12023061835436433215 source: Fabric]
Sep 3 23:25:43.059661 waagent[2094]: 2025-09-03T23:25:43.059626Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them.
Sep 3 23:25:43.060158 waagent[2094]: 2025-09-03T23:25:43.060125Z INFO ExtHandler
Sep 3 23:25:43.060264 waagent[2094]: 2025-09-03T23:25:43.060242Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1]
Sep 3 23:25:43.063723 waagent[2094]: 2025-09-03T23:25:43.063699Z INFO ExtHandler ExtHandler Downloading artifacts profile blob
Sep 3 23:25:43.124441 waagent[2094]: 2025-09-03T23:25:43.124368Z INFO ExtHandler Downloaded certificate {'thumbprint': '14D5058C7703BE9772610DFE3DD8C32E48DE5450', 'hasPrivateKey': True}
Sep 3 23:25:43.124894 waagent[2094]: 2025-09-03T23:25:43.124862Z INFO ExtHandler Fetch goal state completed
Sep 3 23:25:43.135712 waagent[2094]: 2025-09-03T23:25:43.135679Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.3.3 11 Feb 2025 (Library: OpenSSL 3.3.3 11 Feb 2025)
Sep 3 23:25:43.138985 waagent[2094]: 2025-09-03T23:25:43.138947Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2094
Sep 3 23:25:43.139179 waagent[2094]: 2025-09-03T23:25:43.139149Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ********
Sep 3 23:25:43.139482 waagent[2094]: 2025-09-03T23:25:43.139454Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ********
Sep 3 23:25:43.140642 waagent[2094]: 2025-09-03T23:25:43.140606Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4372.1.0', '', 'Flatcar Container Linux by Kinvolk']
Sep 3 23:25:43.141024 waagent[2094]: 2025-09-03T23:25:43.140993Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4372.1.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported
Sep 3 23:25:43.141234 waagent[2094]: 2025-09-03T23:25:43.141206Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False
Sep 3 23:25:43.141907 waagent[2094]: 2025-09-03T23:25:43.141733Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules
Sep 3 23:25:43.384627 waagent[2094]: 2025-09-03T23:25:43.384200Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service
Sep 3 23:25:43.384627 waagent[2094]: 2025-09-03T23:25:43.384383Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup
Sep 3 23:25:43.389118 waagent[2094]: 2025-09-03T23:25:43.389095Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now
Sep 3 23:25:43.393961 systemd[1]: Reload requested from client PID 2109 ('systemctl') (unit waagent.service)...
Sep 3 23:25:43.393974 systemd[1]: Reloading...
Sep 3 23:25:43.478548 zram_generator::config[2153]: No configuration found.
Sep 3 23:25:43.534858 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 3 23:25:43.613877 systemd[1]: Reloading finished in 219 ms.
Sep 3 23:25:43.625304 waagent[2094]: 2025-09-03T23:25:43.623683Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service
Sep 3 23:25:43.625304 waagent[2094]: 2025-09-03T23:25:43.623822Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully
Sep 3 23:25:44.805441 waagent[2094]: 2025-09-03T23:25:44.805364Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up.
Sep 3 23:25:44.805809 waagent[2094]: 2025-09-03T23:25:44.805699Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True]
Sep 3 23:25:44.806353 waagent[2094]: 2025-09-03T23:25:44.806315Z INFO ExtHandler ExtHandler Starting env monitor service.
Sep 3 23:25:44.806663 waagent[2094]: 2025-09-03T23:25:44.806605Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service.
Sep 3 23:25:44.807053 waagent[2094]: 2025-09-03T23:25:44.807014Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread
Sep 3 23:25:44.807148 waagent[2094]: 2025-09-03T23:25:44.807124Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Sep 3 23:25:44.807183 waagent[2094]: 2025-09-03T23:25:44.807154Z INFO ExtHandler ExtHandler Start Extension Telemetry service.
Sep 3 23:25:44.807429 waagent[2094]: 2025-09-03T23:25:44.807364Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True
Sep 3 23:25:44.807530 waagent[2094]: 2025-09-03T23:25:44.807425Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status.
Sep 3 23:25:44.807737 waagent[2094]: 2025-09-03T23:25:44.807697Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Sep 3 23:25:44.808529 waagent[2094]: 2025-09-03T23:25:44.808099Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16
Sep 3 23:25:44.808529 waagent[2094]: 2025-09-03T23:25:44.808273Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled.
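The log-collection check quoted above is a three-way conjunction: configuration enabled AND (cgroups v1 enabled OR cgroups v2 resource limiting enabled) AND python supported. With the bracketed values this log reports, the gate evaluates to False; a minimal sketch of the same boolean, with the values taken from the log entry:

```shell
# Values in brackets above: configuration enabled [True],
# cgroups v1 [False], v2 resource limiting [False], python supported [True].
conf_enabled=true
cgroups_v1=false
cgroups_v2_limit=false
python_ok=true

if $conf_enabled && { $cgroups_v1 || $cgroups_v2_limit; } && $python_ok; then
    echo "log collection allowed: True"
else
    echo "log collection allowed: False"
fi
```

Condition 2 is the one failing here, because the agent reported "[CGI] Agent cgroups enabled: False" earlier in the log.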
Sep 3 23:25:44.808529 waagent[2094]: 2025-09-03T23:25:44.808400Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route:
Sep 3 23:25:44.808529 waagent[2094]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT
Sep 3 23:25:44.808529 waagent[2094]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0
Sep 3 23:25:44.808529 waagent[2094]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0
Sep 3 23:25:44.808529 waagent[2094]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0
Sep 3 23:25:44.808529 waagent[2094]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Sep 3 23:25:44.808529 waagent[2094]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Sep 3 23:25:44.808794 waagent[2094]: 2025-09-03T23:25:44.808760Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16
Sep 3 23:25:44.808985 waagent[2094]: 2025-09-03T23:25:44.808953Z INFO EnvHandler ExtHandler Configure routes
Sep 3 23:25:44.809096 waagent[2094]: 2025-09-03T23:25:44.809074Z INFO EnvHandler ExtHandler Gateway:None
Sep 3 23:25:44.809181 waagent[2094]: 2025-09-03T23:25:44.809159Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread
Sep 3 23:25:44.809799 waagent[2094]: 2025-09-03T23:25:44.809772Z INFO EnvHandler ExtHandler Routes:None
Sep 3 23:25:44.814048 waagent[2094]: 2025-09-03T23:25:44.814013Z INFO ExtHandler ExtHandler
Sep 3 23:25:44.814350 waagent[2094]: 2025-09-03T23:25:44.814325Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 42c56de9-85b5-4d0e-af0d-1d48cbd56c37 correlation 8674c8fa-8c68-467a-913e-28936fcc509b created: 2025-09-03T23:24:16.155945Z]
Sep 3 23:25:44.814942 waagent[2094]: 2025-09-03T23:25:44.814797Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything.
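The Destination and Gateway columns in the /proc/net/route dump above are little-endian hex IPv4 addresses: 0114C80A is 10.200.20.1 (the DHCP gateway reported earlier in this log) and 10813FA8 is 168.63.129.16, the wire server. A small decoding sketch of that byte swap (the helper name is illustrative):

```shell
# Decode a little-endian /proc/net/route hex address into dotted-quad form
# by reading the four byte pairs in reverse order.
decode_route_addr() {
    hex="$1"
    printf '%d.%d.%d.%d' \
        "0x$(echo "$hex" | cut -c7-8)" \
        "0x$(echo "$hex" | cut -c5-6)" \
        "0x$(echo "$hex" | cut -c3-4)" \
        "0x$(echo "$hex" | cut -c1-2)"
}

decode_route_addr 0114C80A   # gateway of the default route above
echo
decode_route_addr 0014C80A   # the on-link subnet route
echo
```

The first row (Destination 00000000, Mask 00000000) is therefore the default route via 10.200.20.1, matching "DHCPv4 address 10.200.20.15/24, gateway 10.200.20.1" logged by systemd-networkd.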
Sep 3 23:25:44.815460 waagent[2094]: 2025-09-03T23:25:44.815431Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 1 ms]
Sep 3 23:25:44.843431 waagent[2094]: 2025-09-03T23:25:44.843382Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command
Sep 3 23:25:44.843431 waagent[2094]: Try `iptables -h' or 'iptables --help' for more information.)
Sep 3 23:25:44.843743 waagent[2094]: 2025-09-03T23:25:44.843710Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: C3D2CD7D-671D-493F-9501-BAE4CD08B224;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;]
Sep 3 23:25:44.919783 waagent[2094]: 2025-09-03T23:25:44.919719Z INFO MonitorHandler ExtHandler Network interfaces:
Sep 3 23:25:44.919783 waagent[2094]: Executing ['ip', '-a', '-o', 'link']:
Sep 3 23:25:44.919783 waagent[2094]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
Sep 3 23:25:44.919783 waagent[2094]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:fb:75:90 brd ff:ff:ff:ff:ff:ff
Sep 3 23:25:44.919783 waagent[2094]: 3: enP41498s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:fb:75:90 brd ff:ff:ff:ff:ff:ff\ altname enP41498p0s2
Sep 3 23:25:44.919783 waagent[2094]: Executing ['ip', '-4', '-a', '-o', 'address']:
Sep 3 23:25:44.919783 waagent[2094]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever
Sep 3 23:25:44.919783 waagent[2094]: 2: eth0 inet 10.200.20.15/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever
Sep 3 23:25:44.919783 waagent[2094]: Executing ['ip', '-6', '-a', '-o', 'address']:
Sep 3 23:25:44.919783 waagent[2094]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever
Sep 3 23:25:44.919783 waagent[2094]: 2: eth0 inet6 fe80::20d:3aff:fefb:7590/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
Sep 3 23:25:44.975402 waagent[2094]: 2025-09-03T23:25:44.975351Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric:
Sep 3 23:25:44.975402 waagent[2094]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Sep 3 23:25:44.975402 waagent[2094]: pkts bytes target prot opt in out source destination
Sep 3 23:25:44.975402 waagent[2094]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Sep 3 23:25:44.975402 waagent[2094]: pkts bytes target prot opt in out source destination
Sep 3 23:25:44.975402 waagent[2094]: Chain OUTPUT (policy ACCEPT 2 packets, 236 bytes)
Sep 3 23:25:44.975402 waagent[2094]: pkts bytes target prot opt in out source destination
Sep 3 23:25:44.975402 waagent[2094]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Sep 3 23:25:44.975402 waagent[2094]: 5 647 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Sep 3 23:25:44.975402 waagent[2094]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Sep 3 23:25:44.977655 waagent[2094]: 2025-09-03T23:25:44.977612Z INFO EnvHandler ExtHandler Current Firewall rules:
Sep 3 23:25:44.977655 waagent[2094]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Sep 3 23:25:44.977655 waagent[2094]: pkts bytes target prot opt in out source destination
Sep 3 23:25:44.977655 waagent[2094]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Sep 3 23:25:44.977655 waagent[2094]: pkts bytes target prot opt in out source destination
Sep 3 23:25:44.977655 waagent[2094]: Chain OUTPUT (policy ACCEPT 3 packets, 296 bytes)
Sep 3 23:25:44.977655 waagent[2094]: pkts bytes target prot opt in out source destination
Sep 3 23:25:44.977655 waagent[2094]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Sep 3 23:25:44.977655 waagent[2094]: 6 699 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Sep 3 23:25:44.977655 waagent[2094]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Sep 3 23:25:44.977847 waagent[2094]: 2025-09-03T23:25:44.977822Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300
Sep 3 23:25:49.934590 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 3 23:25:49.936335 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 3 23:25:50.040425 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:25:50.043141 (kubelet)[2243]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 3 23:25:50.151114 kubelet[2243]: E0903 23:25:50.151041 2243 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 3 23:25:50.153700 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 3 23:25:50.153812 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 3 23:25:50.154234 systemd[1]: kubelet.service: Consumed 109ms CPU time, 106.1M memory peak.
Sep 3 23:26:00.317000 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 3 23:26:00.318384 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 3 23:26:00.420249 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:26:00.422569 (kubelet)[2258]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 3 23:26:00.545677 kubelet[2258]: E0903 23:26:00.545604 2258 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 3 23:26:00.547888 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 3 23:26:00.548093 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 3 23:26:00.548599 systemd[1]: kubelet.service: Consumed 105ms CPU time, 107.4M memory peak. Sep 3 23:26:02.005044 chronyd[1838]: Selected source PHC0 Sep 3 23:26:03.010375 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 3 23:26:03.011988 systemd[1]: Started sshd@0-10.200.20.15:22-10.200.16.10:58306.service - OpenSSH per-connection server daemon (10.200.16.10:58306). Sep 3 23:26:03.665486 sshd[2266]: Accepted publickey for core from 10.200.16.10 port 58306 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw Sep 3 23:26:03.666573 sshd-session[2266]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 3 23:26:03.670572 systemd-logind[1851]: New session 3 of user core. Sep 3 23:26:03.676614 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 3 23:26:04.114979 systemd[1]: Started sshd@1-10.200.20.15:22-10.200.16.10:58318.service - OpenSSH per-connection server daemon (10.200.16.10:58318). 
Sep 3 23:26:04.604378 sshd[2271]: Accepted publickey for core from 10.200.16.10 port 58318 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw Sep 3 23:26:04.605457 sshd-session[2271]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 3 23:26:04.609039 systemd-logind[1851]: New session 4 of user core. Sep 3 23:26:04.619718 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 3 23:26:04.954497 sshd[2273]: Connection closed by 10.200.16.10 port 58318 Sep 3 23:26:04.954342 sshd-session[2271]: pam_unix(sshd:session): session closed for user core Sep 3 23:26:04.957451 systemd[1]: sshd@1-10.200.20.15:22-10.200.16.10:58318.service: Deactivated successfully. Sep 3 23:26:04.959032 systemd[1]: session-4.scope: Deactivated successfully. Sep 3 23:26:04.959891 systemd-logind[1851]: Session 4 logged out. Waiting for processes to exit. Sep 3 23:26:04.961233 systemd-logind[1851]: Removed session 4. Sep 3 23:26:05.045810 systemd[1]: Started sshd@2-10.200.20.15:22-10.200.16.10:58328.service - OpenSSH per-connection server daemon (10.200.16.10:58328). Sep 3 23:26:05.545425 sshd[2279]: Accepted publickey for core from 10.200.16.10 port 58328 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw Sep 3 23:26:05.546502 sshd-session[2279]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 3 23:26:05.549820 systemd-logind[1851]: New session 5 of user core. Sep 3 23:26:05.556767 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 3 23:26:05.905384 sshd[2281]: Connection closed by 10.200.16.10 port 58328 Sep 3 23:26:05.905896 sshd-session[2279]: pam_unix(sshd:session): session closed for user core Sep 3 23:26:05.908767 systemd[1]: sshd@2-10.200.20.15:22-10.200.16.10:58328.service: Deactivated successfully. Sep 3 23:26:05.910060 systemd[1]: session-5.scope: Deactivated successfully. Sep 3 23:26:05.910621 systemd-logind[1851]: Session 5 logged out. Waiting for processes to exit. 
Sep 3 23:26:05.911668 systemd-logind[1851]: Removed session 5. Sep 3 23:26:05.993711 systemd[1]: Started sshd@3-10.200.20.15:22-10.200.16.10:58340.service - OpenSSH per-connection server daemon (10.200.16.10:58340). Sep 3 23:26:06.485655 sshd[2287]: Accepted publickey for core from 10.200.16.10 port 58340 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw Sep 3 23:26:06.486786 sshd-session[2287]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 3 23:26:06.490427 systemd-logind[1851]: New session 6 of user core. Sep 3 23:26:06.497776 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 3 23:26:06.850038 sshd[2289]: Connection closed by 10.200.16.10 port 58340 Sep 3 23:26:06.849434 sshd-session[2287]: pam_unix(sshd:session): session closed for user core Sep 3 23:26:06.852401 systemd[1]: sshd@3-10.200.20.15:22-10.200.16.10:58340.service: Deactivated successfully. Sep 3 23:26:06.853776 systemd[1]: session-6.scope: Deactivated successfully. Sep 3 23:26:06.854322 systemd-logind[1851]: Session 6 logged out. Waiting for processes to exit. Sep 3 23:26:06.855948 systemd-logind[1851]: Removed session 6. Sep 3 23:26:06.937707 systemd[1]: Started sshd@4-10.200.20.15:22-10.200.16.10:58342.service - OpenSSH per-connection server daemon (10.200.16.10:58342). Sep 3 23:26:07.436795 sshd[2295]: Accepted publickey for core from 10.200.16.10 port 58342 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw Sep 3 23:26:07.437887 sshd-session[2295]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 3 23:26:07.441484 systemd-logind[1851]: New session 7 of user core. Sep 3 23:26:07.448824 systemd[1]: Started session-7.scope - Session 7 of User core. 
Sep 3 23:26:07.931077 sudo[2298]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 3 23:26:07.931297 sudo[2298]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 3 23:26:07.958125 sudo[2298]: pam_unix(sudo:session): session closed for user root
Sep 3 23:26:08.046754 sshd[2297]: Connection closed by 10.200.16.10 port 58342
Sep 3 23:26:08.047393 sshd-session[2295]: pam_unix(sshd:session): session closed for user core
Sep 3 23:26:08.050744 systemd[1]: sshd@4-10.200.20.15:22-10.200.16.10:58342.service: Deactivated successfully.
Sep 3 23:26:08.052013 systemd[1]: session-7.scope: Deactivated successfully.
Sep 3 23:26:08.052598 systemd-logind[1851]: Session 7 logged out. Waiting for processes to exit.
Sep 3 23:26:08.053742 systemd-logind[1851]: Removed session 7.
Sep 3 23:26:08.137867 systemd[1]: Started sshd@5-10.200.20.15:22-10.200.16.10:58348.service - OpenSSH per-connection server daemon (10.200.16.10:58348).
Sep 3 23:26:08.637355 sshd[2304]: Accepted publickey for core from 10.200.16.10 port 58348 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:26:08.638433 sshd-session[2304]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:26:08.642060 systemd-logind[1851]: New session 8 of user core.
Sep 3 23:26:08.652624 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 3 23:26:08.912161 sudo[2308]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 3 23:26:08.912369 sudo[2308]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 3 23:26:08.918987 sudo[2308]: pam_unix(sudo:session): session closed for user root
Sep 3 23:26:08.922329 sudo[2307]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 3 23:26:08.922554 sudo[2307]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 3 23:26:08.929682 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 3 23:26:08.963206 augenrules[2330]: No rules
Sep 3 23:26:08.964309 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 3 23:26:08.964464 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 3 23:26:08.967101 sudo[2307]: pam_unix(sudo:session): session closed for user root
Sep 3 23:26:09.044448 sshd[2306]: Connection closed by 10.200.16.10 port 58348
Sep 3 23:26:09.044764 sshd-session[2304]: pam_unix(sshd:session): session closed for user core
Sep 3 23:26:09.048040 systemd-logind[1851]: Session 8 logged out. Waiting for processes to exit.
Sep 3 23:26:09.048246 systemd[1]: sshd@5-10.200.20.15:22-10.200.16.10:58348.service: Deactivated successfully.
Sep 3 23:26:09.049521 systemd[1]: session-8.scope: Deactivated successfully.
Sep 3 23:26:09.051768 systemd-logind[1851]: Removed session 8.
Sep 3 23:26:09.129782 systemd[1]: Started sshd@6-10.200.20.15:22-10.200.16.10:58352.service - OpenSSH per-connection server daemon (10.200.16.10:58352).
Sep 3 23:26:09.588159 sshd[2339]: Accepted publickey for core from 10.200.16.10 port 58352 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:26:09.589222 sshd-session[2339]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:26:09.592676 systemd-logind[1851]: New session 9 of user core.
Sep 3 23:26:09.603680 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 3 23:26:09.846479 sudo[2342]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 3 23:26:09.846713 sudo[2342]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 3 23:26:10.566754 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Sep 3 23:26:10.568322 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 3 23:26:10.748325 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:26:10.750641 (kubelet)[2363]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 3 23:26:10.791838 kubelet[2363]: E0903 23:26:10.791787 2363 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 3 23:26:10.793816 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 3 23:26:10.794011 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 3 23:26:10.794567 systemd[1]: kubelet.service: Consumed 103ms CPU time, 104.8M memory peak.
Sep 3 23:26:11.530579 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 3 23:26:11.538743 (dockerd)[2374]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 3 23:26:12.346483 dockerd[2374]: time="2025-09-03T23:26:12.346429279Z" level=info msg="Starting up"
Sep 3 23:26:12.349051 dockerd[2374]: time="2025-09-03T23:26:12.349027723Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 3 23:26:12.464608 systemd[1]: var-lib-docker-metacopy\x2dcheck210233404-merged.mount: Deactivated successfully.
Sep 3 23:26:12.485597 dockerd[2374]: time="2025-09-03T23:26:12.485539123Z" level=info msg="Loading containers: start."
Sep 3 23:26:12.563545 kernel: Initializing XFRM netlink socket
Sep 3 23:26:13.027687 systemd-networkd[1698]: docker0: Link UP
Sep 3 23:26:13.044136 dockerd[2374]: time="2025-09-03T23:26:13.043568138Z" level=info msg="Loading containers: done."
Sep 3 23:26:13.059075 dockerd[2374]: time="2025-09-03T23:26:13.059044437Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 3 23:26:13.059298 dockerd[2374]: time="2025-09-03T23:26:13.059278096Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1
Sep 3 23:26:13.059456 dockerd[2374]: time="2025-09-03T23:26:13.059440106Z" level=info msg="Initializing buildkit"
Sep 3 23:26:13.110176 dockerd[2374]: time="2025-09-03T23:26:13.110110933Z" level=info msg="Completed buildkit initialization"
Sep 3 23:26:13.115555 dockerd[2374]: time="2025-09-03T23:26:13.115111065Z" level=info msg="Daemon has completed initialization"
Sep 3 23:26:13.115611 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 3 23:26:13.115784 dockerd[2374]: time="2025-09-03T23:26:13.115758898Z" level=info msg="API listen on /run/docker.sock"
Sep 3 23:26:13.724923 containerd[1871]: time="2025-09-03T23:26:13.724887218Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\""
Sep 3 23:26:14.489577 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2224224338.mount: Deactivated successfully.
Sep 3 23:26:15.472527 containerd[1871]: time="2025-09-03T23:26:15.472478107Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:15.474898 containerd[1871]: time="2025-09-03T23:26:15.474873227Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=25652441"
Sep 3 23:26:15.477949 containerd[1871]: time="2025-09-03T23:26:15.477926013Z" level=info msg="ImageCreate event name:\"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:15.481941 containerd[1871]: time="2025-09-03T23:26:15.481917187Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:15.482477 containerd[1871]: time="2025-09-03T23:26:15.482243960Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id \"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"25649241\" in 1.757323253s"
Sep 3 23:26:15.482477 containerd[1871]: time="2025-09-03T23:26:15.482269936Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\""
Sep 3 23:26:15.483373 containerd[1871]: time="2025-09-03T23:26:15.483351655Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\""
Sep 3 23:26:16.743224 containerd[1871]: time="2025-09-03T23:26:16.742638174Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:16.746242 containerd[1871]: time="2025-09-03T23:26:16.746217223Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=22460309"
Sep 3 23:26:16.749174 containerd[1871]: time="2025-09-03T23:26:16.749150951Z" level=info msg="ImageCreate event name:\"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:16.754560 containerd[1871]: time="2025-09-03T23:26:16.754532656Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:16.755044 containerd[1871]: time="2025-09-03T23:26:16.754904349Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"23997423\" in 1.27152867s"
Sep 3 23:26:16.755044 containerd[1871]: time="2025-09-03T23:26:16.754931357Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\""
Sep 3 23:26:16.755373 containerd[1871]: time="2025-09-03T23:26:16.755330819Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\""
Sep 3 23:26:17.771677 containerd[1871]: time="2025-09-03T23:26:17.771621690Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:17.774322 containerd[1871]: time="2025-09-03T23:26:17.774148645Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=17125903"
Sep 3 23:26:17.777273 containerd[1871]: time="2025-09-03T23:26:17.777245415Z" level=info msg="ImageCreate event name:\"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:17.781866 containerd[1871]: time="2025-09-03T23:26:17.781827165Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:17.782524 containerd[1871]: time="2025-09-03T23:26:17.782396597Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id \"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"18663035\" in 1.027042434s"
Sep 3 23:26:17.782524 containerd[1871]: time="2025-09-03T23:26:17.782423277Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\""
Sep 3 23:26:17.782893 containerd[1871]: time="2025-09-03T23:26:17.782861571Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\""
Sep 3 23:26:18.815217 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2212511563.mount: Deactivated successfully.
Sep 3 23:26:19.070539 containerd[1871]: time="2025-09-03T23:26:19.070189758Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:19.073839 containerd[1871]: time="2025-09-03T23:26:19.073812846Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=26916095"
Sep 3 23:26:19.096303 containerd[1871]: time="2025-09-03T23:26:19.095806643Z" level=info msg="ImageCreate event name:\"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:19.101672 containerd[1871]: time="2025-09-03T23:26:19.101646456Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:19.103954 containerd[1871]: time="2025-09-03T23:26:19.103931216Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\", repo tag \"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"26915114\" in 1.320962043s"
Sep 3 23:26:19.104050 containerd[1871]: time="2025-09-03T23:26:19.104037572Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\""
Sep 3 23:26:19.104617 containerd[1871]: time="2025-09-03T23:26:19.104600528Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 3 23:26:19.857447 kernel: hv_balloon: Max. dynamic memory size: 4096 MB
Sep 3 23:26:20.394468 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount376770601.mount: Deactivated successfully.
Sep 3 23:26:20.816701 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Sep 3 23:26:20.818928 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 3 23:26:21.456803 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:26:21.464737 (kubelet)[2680]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 3 23:26:21.490059 kubelet[2680]: E0903 23:26:21.490005 2680 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 3 23:26:21.492118 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 3 23:26:21.492212 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 3 23:26:21.492423 systemd[1]: kubelet.service: Consumed 102ms CPU time, 106.4M memory peak.
Sep 3 23:26:21.837144 containerd[1871]: time="2025-09-03T23:26:21.837104658Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:21.839640 containerd[1871]: time="2025-09-03T23:26:21.839616514Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951622"
Sep 3 23:26:21.842291 containerd[1871]: time="2025-09-03T23:26:21.842269712Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:21.845988 containerd[1871]: time="2025-09-03T23:26:21.845961249Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:21.846528 containerd[1871]: time="2025-09-03T23:26:21.846320446Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 2.741594346s"
Sep 3 23:26:21.846528 containerd[1871]: time="2025-09-03T23:26:21.846343887Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
Sep 3 23:26:21.846748 containerd[1871]: time="2025-09-03T23:26:21.846729124Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 3 23:26:22.385563 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3685294855.mount: Deactivated successfully.
Sep 3 23:26:22.411252 containerd[1871]: time="2025-09-03T23:26:22.410784643Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 3 23:26:22.413616 containerd[1871]: time="2025-09-03T23:26:22.413593141Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703"
Sep 3 23:26:22.416302 containerd[1871]: time="2025-09-03T23:26:22.416281868Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 3 23:26:22.419646 containerd[1871]: time="2025-09-03T23:26:22.419624257Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 3 23:26:22.419942 containerd[1871]: time="2025-09-03T23:26:22.419916556Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 573.164214ms"
Sep 3 23:26:22.419942 containerd[1871]: time="2025-09-03T23:26:22.419943717Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 3 23:26:22.420490 containerd[1871]: time="2025-09-03T23:26:22.420462015Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Sep 3 23:26:23.069222 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1164385585.mount: Deactivated successfully.
Sep 3 23:26:23.847889 update_engine[1853]: I20250903 23:26:23.847815 1853 update_attempter.cc:509] Updating boot flags...
Sep 3 23:26:24.961921 containerd[1871]: time="2025-09-03T23:26:24.961876819Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:24.964362 containerd[1871]: time="2025-09-03T23:26:24.964337331Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66537161"
Sep 3 23:26:24.967162 containerd[1871]: time="2025-09-03T23:26:24.967121881Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:24.970972 containerd[1871]: time="2025-09-03T23:26:24.970935528Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:26:24.973042 containerd[1871]: time="2025-09-03T23:26:24.971557442Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 2.551068906s"
Sep 3 23:26:24.973042 containerd[1871]: time="2025-09-03T23:26:24.971583459Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\""
Sep 3 23:26:27.754782 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:26:27.754903 systemd[1]: kubelet.service: Consumed 102ms CPU time, 106.4M memory peak.
Sep 3 23:26:27.756614 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 3 23:26:27.776669 systemd[1]: Reload requested from client PID 2964 ('systemctl') (unit session-9.scope)...
Sep 3 23:26:27.776764 systemd[1]: Reloading...
Sep 3 23:26:27.873542 zram_generator::config[3019]: No configuration found.
Sep 3 23:26:27.935018 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 3 23:26:28.017470 systemd[1]: Reloading finished in 240 ms.
Sep 3 23:26:28.053929 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 3 23:26:28.054117 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 3 23:26:28.054416 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:26:28.055569 systemd[1]: kubelet.service: Consumed 71ms CPU time, 95M memory peak.
Sep 3 23:26:28.057239 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 3 23:26:28.331703 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 3 23:26:28.343718 (kubelet)[3078]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 3 23:26:28.368981 kubelet[3078]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 3 23:26:28.369583 kubelet[3078]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 3 23:26:28.369719 kubelet[3078]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 3 23:26:28.369834 kubelet[3078]: I0903 23:26:28.369807 3078 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 3 23:26:28.697742 kubelet[3078]: I0903 23:26:28.697483 3078 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 3 23:26:28.697742 kubelet[3078]: I0903 23:26:28.697550 3078 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 3 23:26:28.697860 kubelet[3078]: I0903 23:26:28.697764 3078 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 3 23:26:28.718777 kubelet[3078]: E0903 23:26:28.718632 3078 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.20.15:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.15:6443: connect: connection refused" logger="UnhandledError"
Sep 3 23:26:28.719488 kubelet[3078]: I0903 23:26:28.719192 3078 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 3 23:26:28.725115 kubelet[3078]: I0903 23:26:28.725080 3078 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 3 23:26:28.728074 kubelet[3078]: I0903 23:26:28.728054 3078 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 3 23:26:28.728586 kubelet[3078]: I0903 23:26:28.728566 3078 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 3 23:26:28.728693 kubelet[3078]: I0903 23:26:28.728674 3078 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 3 23:26:28.728811 kubelet[3078]: I0903 23:26:28.728693 3078 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372.1.0-n-71c6c07a75","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 3 23:26:28.728886 kubelet[3078]: I0903 23:26:28.728819 3078 topology_manager.go:138] "Creating topology manager with none policy"
Sep 3 23:26:28.728886 kubelet[3078]: I0903 23:26:28.728826 3078 container_manager_linux.go:300] "Creating device plugin manager"
Sep 3 23:26:28.728930 kubelet[3078]: I0903 23:26:28.728919 3078 state_mem.go:36] "Initialized new in-memory state store"
Sep 3 23:26:28.730894 kubelet[3078]: I0903 23:26:28.730875 3078 kubelet.go:408] "Attempting to sync node with API server"
Sep 3 23:26:28.730912 kubelet[3078]: I0903 23:26:28.730902 3078 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 3 23:26:28.730927 kubelet[3078]: I0903 23:26:28.730920 3078 kubelet.go:314] "Adding apiserver pod source"
Sep 3 23:26:28.730945 kubelet[3078]: I0903 23:26:28.730932 3078 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 3 23:26:28.735444 kubelet[3078]: W0903 23:26:28.735385 3078 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.20.15:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372.1.0-n-71c6c07a75&limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Sep 3 23:26:28.735444 kubelet[3078]: E0903 23:26:28.735434 3078 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.20.15:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372.1.0-n-71c6c07a75&limit=500&resourceVersion=0\": dial tcp 10.200.20.15:6443: connect: connection refused" logger="UnhandledError"
Sep 3 23:26:28.735526 kubelet[3078]: I0903 23:26:28.735490 3078 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Sep 3 23:26:28.736621 kubelet[3078]: I0903 23:26:28.736604 3078 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 3 23:26:28.736658 kubelet[3078]: W0903 23:26:28.736646 3078 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 3 23:26:28.738876 kubelet[3078]: I0903 23:26:28.738857 3078 server.go:1274] "Started kubelet"
Sep 3 23:26:28.744072 kubelet[3078]: I0903 23:26:28.743944 3078 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 3 23:26:28.745580 kubelet[3078]: E0903 23:26:28.744743 3078 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.15:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.15:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4372.1.0-n-71c6c07a75.1861e96e19bf3410 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372.1.0-n-71c6c07a75,UID:ci-4372.1.0-n-71c6c07a75,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4372.1.0-n-71c6c07a75,},FirstTimestamp:2025-09-03 23:26:28.738839568 +0000 UTC m=+0.392516770,LastTimestamp:2025-09-03 23:26:28.738839568 +0000 UTC m=+0.392516770,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372.1.0-n-71c6c07a75,}"
Sep 3 23:26:28.746846 kubelet[3078]: I0903 23:26:28.746257 3078 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 3 23:26:28.746846 kubelet[3078]: W0903 23:26:28.746390 3078 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.20.15:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Sep 3 23:26:28.746846 kubelet[3078]: E0903 23:26:28.746427 3078 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.20.15:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.15:6443: connect: connection refused" logger="UnhandledError"
Sep 3 23:26:28.747043 kubelet[3078]: I0903 23:26:28.747024 3078 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 3 23:26:28.747174 kubelet[3078]: E0903 23:26:28.747155 3078 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4372.1.0-n-71c6c07a75\" not found"
Sep 3 23:26:28.747498 kubelet[3078]: I0903 23:26:28.747473 3078 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 3 23:26:28.747561 kubelet[3078]: I0903 23:26:28.747543 3078 reconciler.go:26] "Reconciler: start to sync state"
Sep 3 23:26:28.748274 kubelet[3078]: W0903 23:26:28.748028 3078 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.20.15:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused
Sep 3 23:26:28.748274 kubelet[3078]: E0903 23:26:28.748063 3078 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.20.15:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.15:6443: connect: connection refused" logger="UnhandledError"
Sep 3 23:26:28.748274 kubelet[3078]: E0903 23:26:28.748099 3078 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.15:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-n-71c6c07a75?timeout=10s\": dial tcp 10.200.20.15:6443: connect: connection refused" interval="200ms"
Sep 3 23:26:28.748363 kubelet[3078]: I0903 23:26:28.748349 3078 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 3 23:26:28.748474 kubelet[3078]: I0903 23:26:28.748437 3078 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 3 23:26:28.748970 kubelet[3078]: I0903 23:26:28.748951 3078 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 3 23:26:28.749157 kubelet[3078]: I0903 23:26:28.749136 3078 factory.go:221] Registration of the systemd container factory successfully
Sep 3 23:26:28.749250 kubelet[3078]: I0903 23:26:28.748956 3078 server.go:449] "Adding debug handlers to kubelet server"
Sep 3 23:26:28.749981 kubelet[3078]: I0903 23:26:28.749836 3078 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 3 23:26:28.750257 kubelet[3078]: E0903 23:26:28.750109 3078 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 3 23:26:28.750986 kubelet[3078]: I0903 23:26:28.750967 3078 factory.go:221] Registration of the containerd container factory successfully
Sep 3 23:26:28.769086 kubelet[3078]: I0903 23:26:28.769069 3078 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 3 23:26:28.769086 kubelet[3078]: I0903 23:26:28.769080 3078 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 3 23:26:28.769193 kubelet[3078]: I0903 23:26:28.769095 3078 state_mem.go:36] "Initialized new in-memory state store"
Sep 3 23:26:28.847540 kubelet[3078]: E0903 23:26:28.847434 3078 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4372.1.0-n-71c6c07a75\" not found"
Sep 3 23:26:28.866768 kubelet[3078]: I0903 23:26:28.866742 3078 policy_none.go:49] "None policy: Start"
Sep 3 23:26:28.867643 kubelet[3078]: I0903 23:26:28.867630 3078 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 3 23:26:28.867809 kubelet[3078]: I0903 23:26:28.867750 3078 state_mem.go:35] "Initializing new in-memory state store"
Sep 3 23:26:28.879192 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 3 23:26:28.889725 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 3 23:26:28.892883 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 3 23:26:28.900147 kubelet[3078]: I0903 23:26:28.900119 3078 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 3 23:26:28.900369 kubelet[3078]: I0903 23:26:28.900278 3078 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 3 23:26:28.900369 kubelet[3078]: I0903 23:26:28.900292 3078 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 3 23:26:28.900735 kubelet[3078]: I0903 23:26:28.900719 3078 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 3 23:26:28.901788 kubelet[3078]: E0903 23:26:28.901773 3078 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4372.1.0-n-71c6c07a75\" not found" Sep 3 23:26:28.931317 kubelet[3078]: I0903 23:26:28.931276 3078 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 3 23:26:28.932335 kubelet[3078]: I0903 23:26:28.932219 3078 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 3 23:26:28.932335 kubelet[3078]: I0903 23:26:28.932240 3078 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 3 23:26:28.932335 kubelet[3078]: I0903 23:26:28.932258 3078 kubelet.go:2321] "Starting kubelet main sync loop" Sep 3 23:26:28.932876 kubelet[3078]: E0903 23:26:28.932855 3078 kubelet.go:2345] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Sep 3 23:26:28.934280 kubelet[3078]: W0903 23:26:28.934221 3078 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.20.15:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.15:6443: connect: connection refused Sep 3 23:26:28.934370 kubelet[3078]: E0903 23:26:28.934298 3078 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.20.15:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.15:6443: connect: connection refused" logger="UnhandledError" Sep 3 23:26:28.948506 kubelet[3078]: E0903 23:26:28.948414 3078 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.15:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-n-71c6c07a75?timeout=10s\": dial tcp 10.200.20.15:6443: connect: connection refused" interval="400ms" Sep 3 23:26:29.001994 kubelet[3078]: I0903 23:26:29.001962 3078 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:29.002304 kubelet[3078]: E0903 23:26:29.002267 3078 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.20.15:6443/api/v1/nodes\": dial tcp 10.200.20.15:6443: connect: connection refused" node="ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:29.042238 systemd[1]: Created slice 
kubepods-burstable-pod97bb94e5cc66c20b7f6dd1311acc9c59.slice - libcontainer container kubepods-burstable-pod97bb94e5cc66c20b7f6dd1311acc9c59.slice. Sep 3 23:26:29.048470 kubelet[3078]: I0903 23:26:29.048408 3078 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4e3df20c77dc69a552cec0996ea365d0-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372.1.0-n-71c6c07a75\" (UID: \"4e3df20c77dc69a552cec0996ea365d0\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:29.048470 kubelet[3078]: I0903 23:26:29.048432 3078 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/97bb94e5cc66c20b7f6dd1311acc9c59-k8s-certs\") pod \"kube-apiserver-ci-4372.1.0-n-71c6c07a75\" (UID: \"97bb94e5cc66c20b7f6dd1311acc9c59\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:29.048470 kubelet[3078]: I0903 23:26:29.048444 3078 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/97bb94e5cc66c20b7f6dd1311acc9c59-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372.1.0-n-71c6c07a75\" (UID: \"97bb94e5cc66c20b7f6dd1311acc9c59\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:29.048470 kubelet[3078]: I0903 23:26:29.048455 3078 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4e3df20c77dc69a552cec0996ea365d0-ca-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-71c6c07a75\" (UID: \"4e3df20c77dc69a552cec0996ea365d0\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:29.048727 kubelet[3078]: I0903 23:26:29.048667 3078 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4e3df20c77dc69a552cec0996ea365d0-flexvolume-dir\") pod \"kube-controller-manager-ci-4372.1.0-n-71c6c07a75\" (UID: \"4e3df20c77dc69a552cec0996ea365d0\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:29.048727 kubelet[3078]: I0903 23:26:29.048690 3078 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4e3df20c77dc69a552cec0996ea365d0-k8s-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-71c6c07a75\" (UID: \"4e3df20c77dc69a552cec0996ea365d0\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:29.048889 kubelet[3078]: I0903 23:26:29.048821 3078 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4e3df20c77dc69a552cec0996ea365d0-kubeconfig\") pod \"kube-controller-manager-ci-4372.1.0-n-71c6c07a75\" (UID: \"4e3df20c77dc69a552cec0996ea365d0\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:29.048889 kubelet[3078]: I0903 23:26:29.048839 3078 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/60515c2e9aa4017177c71026ed950d01-kubeconfig\") pod \"kube-scheduler-ci-4372.1.0-n-71c6c07a75\" (UID: \"60515c2e9aa4017177c71026ed950d01\") " pod="kube-system/kube-scheduler-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:29.048974 kubelet[3078]: I0903 23:26:29.048850 3078 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/97bb94e5cc66c20b7f6dd1311acc9c59-ca-certs\") pod \"kube-apiserver-ci-4372.1.0-n-71c6c07a75\" (UID: \"97bb94e5cc66c20b7f6dd1311acc9c59\") " 
pod="kube-system/kube-apiserver-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:29.051756 systemd[1]: Created slice kubepods-burstable-pod4e3df20c77dc69a552cec0996ea365d0.slice - libcontainer container kubepods-burstable-pod4e3df20c77dc69a552cec0996ea365d0.slice. Sep 3 23:26:29.075289 systemd[1]: Created slice kubepods-burstable-pod60515c2e9aa4017177c71026ed950d01.slice - libcontainer container kubepods-burstable-pod60515c2e9aa4017177c71026ed950d01.slice. Sep 3 23:26:29.204710 kubelet[3078]: I0903 23:26:29.204616 3078 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:29.204934 kubelet[3078]: E0903 23:26:29.204912 3078 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.20.15:6443/api/v1/nodes\": dial tcp 10.200.20.15:6443: connect: connection refused" node="ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:29.349250 kubelet[3078]: E0903 23:26:29.349188 3078 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.15:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372.1.0-n-71c6c07a75?timeout=10s\": dial tcp 10.200.20.15:6443: connect: connection refused" interval="800ms" Sep 3 23:26:29.351395 containerd[1871]: time="2025-09-03T23:26:29.351164673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372.1.0-n-71c6c07a75,Uid:97bb94e5cc66c20b7f6dd1311acc9c59,Namespace:kube-system,Attempt:0,}" Sep 3 23:26:29.374712 containerd[1871]: time="2025-09-03T23:26:29.374677713Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372.1.0-n-71c6c07a75,Uid:4e3df20c77dc69a552cec0996ea365d0,Namespace:kube-system,Attempt:0,}" Sep 3 23:26:29.378350 containerd[1871]: time="2025-09-03T23:26:29.378321973Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372.1.0-n-71c6c07a75,Uid:60515c2e9aa4017177c71026ed950d01,Namespace:kube-system,Attempt:0,}" Sep 3 23:26:29.451289 
containerd[1871]: time="2025-09-03T23:26:29.451238742Z" level=info msg="connecting to shim c2df8f76afd428c89fe047edf35884c29e0f138440a760569a390d0b68a89071" address="unix:///run/containerd/s/c02a344389b47e8986187a2375a4f5a54d4c2c0a1974c5fb6caa1ffc998128b7" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:26:29.452170 containerd[1871]: time="2025-09-03T23:26:29.452106422Z" level=info msg="connecting to shim ed8767fae36172bd2a42c272cdedb9cb566c7fe91cfa1ab9ed39842e06a6d798" address="unix:///run/containerd/s/ff1b47d78d2ecd0534736226b05bd0975a2df8095434589fe374175ec3e842ae" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:26:29.479089 containerd[1871]: time="2025-09-03T23:26:29.478532630Z" level=info msg="connecting to shim 95e4d0015117930d9eaa195a2f25ff81e503764a1576c366759add0ba48b968b" address="unix:///run/containerd/s/61bcf8b9def7bc6580a85c5c3d696e8acf30609e75a337c750b510c5c7cf964a" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:26:29.481673 systemd[1]: Started cri-containerd-c2df8f76afd428c89fe047edf35884c29e0f138440a760569a390d0b68a89071.scope - libcontainer container c2df8f76afd428c89fe047edf35884c29e0f138440a760569a390d0b68a89071. Sep 3 23:26:29.485044 systemd[1]: Started cri-containerd-ed8767fae36172bd2a42c272cdedb9cb566c7fe91cfa1ab9ed39842e06a6d798.scope - libcontainer container ed8767fae36172bd2a42c272cdedb9cb566c7fe91cfa1ab9ed39842e06a6d798. Sep 3 23:26:29.512008 systemd[1]: Started cri-containerd-95e4d0015117930d9eaa195a2f25ff81e503764a1576c366759add0ba48b968b.scope - libcontainer container 95e4d0015117930d9eaa195a2f25ff81e503764a1576c366759add0ba48b968b. 
Sep 3 23:26:29.529912 containerd[1871]: time="2025-09-03T23:26:29.529881122Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372.1.0-n-71c6c07a75,Uid:97bb94e5cc66c20b7f6dd1311acc9c59,Namespace:kube-system,Attempt:0,} returns sandbox id \"c2df8f76afd428c89fe047edf35884c29e0f138440a760569a390d0b68a89071\"" Sep 3 23:26:29.536730 containerd[1871]: time="2025-09-03T23:26:29.536700378Z" level=info msg="CreateContainer within sandbox \"c2df8f76afd428c89fe047edf35884c29e0f138440a760569a390d0b68a89071\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 3 23:26:29.539114 containerd[1871]: time="2025-09-03T23:26:29.539082719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372.1.0-n-71c6c07a75,Uid:4e3df20c77dc69a552cec0996ea365d0,Namespace:kube-system,Attempt:0,} returns sandbox id \"ed8767fae36172bd2a42c272cdedb9cb566c7fe91cfa1ab9ed39842e06a6d798\"" Sep 3 23:26:29.542550 containerd[1871]: time="2025-09-03T23:26:29.542088760Z" level=info msg="CreateContainer within sandbox \"ed8767fae36172bd2a42c272cdedb9cb566c7fe91cfa1ab9ed39842e06a6d798\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 3 23:26:29.558597 containerd[1871]: time="2025-09-03T23:26:29.558569157Z" level=info msg="Container 64fe7945b64bc991b90b385a935de8766b858e63fc7a09a745364f5c3a2ad5da: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:26:29.561378 containerd[1871]: time="2025-09-03T23:26:29.561349721Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372.1.0-n-71c6c07a75,Uid:60515c2e9aa4017177c71026ed950d01,Namespace:kube-system,Attempt:0,} returns sandbox id \"95e4d0015117930d9eaa195a2f25ff81e503764a1576c366759add0ba48b968b\"" Sep 3 23:26:29.562959 containerd[1871]: time="2025-09-03T23:26:29.562936199Z" level=info msg="CreateContainer within sandbox \"95e4d0015117930d9eaa195a2f25ff81e503764a1576c366759add0ba48b968b\" for container 
&ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 3 23:26:29.569184 containerd[1871]: time="2025-09-03T23:26:29.569157188Z" level=info msg="Container 93b422c6f1a6baac32f35d6d42b4940d9195760e26b3d5d01c2804091053f0df: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:26:29.580408 containerd[1871]: time="2025-09-03T23:26:29.580378214Z" level=info msg="CreateContainer within sandbox \"c2df8f76afd428c89fe047edf35884c29e0f138440a760569a390d0b68a89071\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"64fe7945b64bc991b90b385a935de8766b858e63fc7a09a745364f5c3a2ad5da\"" Sep 3 23:26:29.581541 containerd[1871]: time="2025-09-03T23:26:29.580863144Z" level=info msg="StartContainer for \"64fe7945b64bc991b90b385a935de8766b858e63fc7a09a745364f5c3a2ad5da\"" Sep 3 23:26:29.581933 containerd[1871]: time="2025-09-03T23:26:29.581871426Z" level=info msg="connecting to shim 64fe7945b64bc991b90b385a935de8766b858e63fc7a09a745364f5c3a2ad5da" address="unix:///run/containerd/s/c02a344389b47e8986187a2375a4f5a54d4c2c0a1974c5fb6caa1ffc998128b7" protocol=ttrpc version=3 Sep 3 23:26:29.595625 systemd[1]: Started cri-containerd-64fe7945b64bc991b90b385a935de8766b858e63fc7a09a745364f5c3a2ad5da.scope - libcontainer container 64fe7945b64bc991b90b385a935de8766b858e63fc7a09a745364f5c3a2ad5da. 
Sep 3 23:26:29.599684 containerd[1871]: time="2025-09-03T23:26:29.599641904Z" level=info msg="CreateContainer within sandbox \"ed8767fae36172bd2a42c272cdedb9cb566c7fe91cfa1ab9ed39842e06a6d798\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"93b422c6f1a6baac32f35d6d42b4940d9195760e26b3d5d01c2804091053f0df\"" Sep 3 23:26:29.600313 containerd[1871]: time="2025-09-03T23:26:29.600075072Z" level=info msg="StartContainer for \"93b422c6f1a6baac32f35d6d42b4940d9195760e26b3d5d01c2804091053f0df\"" Sep 3 23:26:29.601263 containerd[1871]: time="2025-09-03T23:26:29.601227086Z" level=info msg="connecting to shim 93b422c6f1a6baac32f35d6d42b4940d9195760e26b3d5d01c2804091053f0df" address="unix:///run/containerd/s/ff1b47d78d2ecd0534736226b05bd0975a2df8095434589fe374175ec3e842ae" protocol=ttrpc version=3 Sep 3 23:26:29.606352 containerd[1871]: time="2025-09-03T23:26:29.606318277Z" level=info msg="Container 5b849eb6a584e756af885f5cc2d632297bd7a8becf6663d0d4113d08b208d700: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:26:29.607342 kubelet[3078]: I0903 23:26:29.607306 3078 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:29.607905 kubelet[3078]: E0903 23:26:29.607858 3078 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.20.15:6443/api/v1/nodes\": dial tcp 10.200.20.15:6443: connect: connection refused" node="ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:29.623683 systemd[1]: Started cri-containerd-93b422c6f1a6baac32f35d6d42b4940d9195760e26b3d5d01c2804091053f0df.scope - libcontainer container 93b422c6f1a6baac32f35d6d42b4940d9195760e26b3d5d01c2804091053f0df. 
Sep 3 23:26:29.625471 containerd[1871]: time="2025-09-03T23:26:29.625436172Z" level=info msg="CreateContainer within sandbox \"95e4d0015117930d9eaa195a2f25ff81e503764a1576c366759add0ba48b968b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"5b849eb6a584e756af885f5cc2d632297bd7a8becf6663d0d4113d08b208d700\"" Sep 3 23:26:29.627164 containerd[1871]: time="2025-09-03T23:26:29.627013338Z" level=info msg="StartContainer for \"5b849eb6a584e756af885f5cc2d632297bd7a8becf6663d0d4113d08b208d700\"" Sep 3 23:26:29.629029 containerd[1871]: time="2025-09-03T23:26:29.628880445Z" level=info msg="connecting to shim 5b849eb6a584e756af885f5cc2d632297bd7a8becf6663d0d4113d08b208d700" address="unix:///run/containerd/s/61bcf8b9def7bc6580a85c5c3d696e8acf30609e75a337c750b510c5c7cf964a" protocol=ttrpc version=3 Sep 3 23:26:29.644785 containerd[1871]: time="2025-09-03T23:26:29.644663549Z" level=info msg="StartContainer for \"64fe7945b64bc991b90b385a935de8766b858e63fc7a09a745364f5c3a2ad5da\" returns successfully" Sep 3 23:26:29.650644 systemd[1]: Started cri-containerd-5b849eb6a584e756af885f5cc2d632297bd7a8becf6663d0d4113d08b208d700.scope - libcontainer container 5b849eb6a584e756af885f5cc2d632297bd7a8becf6663d0d4113d08b208d700. 
Sep 3 23:26:29.687356 containerd[1871]: time="2025-09-03T23:26:29.687315822Z" level=info msg="StartContainer for \"93b422c6f1a6baac32f35d6d42b4940d9195760e26b3d5d01c2804091053f0df\" returns successfully" Sep 3 23:26:29.709125 containerd[1871]: time="2025-09-03T23:26:29.709091807Z" level=info msg="StartContainer for \"5b849eb6a584e756af885f5cc2d632297bd7a8becf6663d0d4113d08b208d700\" returns successfully" Sep 3 23:26:30.410251 kubelet[3078]: I0903 23:26:30.410189 3078 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:30.979030 kubelet[3078]: E0903 23:26:30.978987 3078 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4372.1.0-n-71c6c07a75\" not found" node="ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:31.034306 kubelet[3078]: I0903 23:26:31.034236 3078 kubelet_node_status.go:75] "Successfully registered node" node="ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:31.741669 kubelet[3078]: I0903 23:26:31.741629 3078 apiserver.go:52] "Watching apiserver" Sep 3 23:26:31.747657 kubelet[3078]: I0903 23:26:31.747620 3078 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 3 23:26:32.188945 kubelet[3078]: W0903 23:26:32.188909 3078 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 3 23:26:33.339481 systemd[1]: Reload requested from client PID 3346 ('systemctl') (unit session-9.scope)... Sep 3 23:26:33.339497 systemd[1]: Reloading... Sep 3 23:26:33.423550 zram_generator::config[3392]: No configuration found. Sep 3 23:26:33.495600 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 3 23:26:33.587230 systemd[1]: Reloading finished in 247 ms. 
Sep 3 23:26:33.617050 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 3 23:26:33.629819 systemd[1]: kubelet.service: Deactivated successfully. Sep 3 23:26:33.629979 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 3 23:26:33.630014 systemd[1]: kubelet.service: Consumed 635ms CPU time, 126M memory peak. Sep 3 23:26:33.633014 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 3 23:26:33.728950 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 3 23:26:33.735818 (kubelet)[3456]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 3 23:26:33.861059 kubelet[3456]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 3 23:26:33.861408 kubelet[3456]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 3 23:26:33.861453 kubelet[3456]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 3 23:26:33.861676 kubelet[3456]: I0903 23:26:33.861641 3456 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 3 23:26:33.868756 kubelet[3456]: I0903 23:26:33.868677 3456 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 3 23:26:33.868756 kubelet[3456]: I0903 23:26:33.868701 3456 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 3 23:26:33.869646 kubelet[3456]: I0903 23:26:33.869624 3456 server.go:934] "Client rotation is on, will bootstrap in background" Sep 3 23:26:33.871400 kubelet[3456]: I0903 23:26:33.870853 3456 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 3 23:26:33.873444 kubelet[3456]: I0903 23:26:33.873329 3456 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 3 23:26:33.878135 kubelet[3456]: I0903 23:26:33.878122 3456 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 3 23:26:33.880753 kubelet[3456]: I0903 23:26:33.880674 3456 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 3 23:26:33.882576 kubelet[3456]: I0903 23:26:33.881822 3456 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 3 23:26:33.882576 kubelet[3456]: I0903 23:26:33.881960 3456 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 3 23:26:33.882576 kubelet[3456]: I0903 23:26:33.881981 3456 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372.1.0-n-71c6c07a75","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyMa
nagerPolicyOptions":null,"CgroupVersion":2} Sep 3 23:26:33.882576 kubelet[3456]: I0903 23:26:33.882148 3456 topology_manager.go:138] "Creating topology manager with none policy" Sep 3 23:26:33.882729 kubelet[3456]: I0903 23:26:33.882157 3456 container_manager_linux.go:300] "Creating device plugin manager" Sep 3 23:26:33.882729 kubelet[3456]: I0903 23:26:33.882188 3456 state_mem.go:36] "Initialized new in-memory state store" Sep 3 23:26:33.882729 kubelet[3456]: I0903 23:26:33.882267 3456 kubelet.go:408] "Attempting to sync node with API server" Sep 3 23:26:33.882729 kubelet[3456]: I0903 23:26:33.882276 3456 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 3 23:26:33.882729 kubelet[3456]: I0903 23:26:33.882290 3456 kubelet.go:314] "Adding apiserver pod source" Sep 3 23:26:33.882729 kubelet[3456]: I0903 23:26:33.882300 3456 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 3 23:26:33.892480 kubelet[3456]: I0903 23:26:33.892455 3456 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Sep 3 23:26:33.892776 kubelet[3456]: I0903 23:26:33.892752 3456 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 3 23:26:33.893056 kubelet[3456]: I0903 23:26:33.893037 3456 server.go:1274] "Started kubelet" Sep 3 23:26:33.894149 kubelet[3456]: I0903 23:26:33.894131 3456 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 3 23:26:33.896242 kubelet[3456]: I0903 23:26:33.896208 3456 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 3 23:26:33.897120 kubelet[3456]: I0903 23:26:33.896882 3456 server.go:449] "Adding debug handlers to kubelet server" Sep 3 23:26:33.897880 kubelet[3456]: I0903 23:26:33.897835 3456 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 3 23:26:33.898015 kubelet[3456]: I0903 23:26:33.898000 3456 server.go:236] 
"Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 3 23:26:33.903459 kubelet[3456]: I0903 23:26:33.901726 3456 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 3 23:26:33.903459 kubelet[3456]: I0903 23:26:33.903075 3456 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 3 23:26:33.905715 kubelet[3456]: I0903 23:26:33.905691 3456 factory.go:221] Registration of the systemd container factory successfully Sep 3 23:26:33.905915 kubelet[3456]: I0903 23:26:33.905779 3456 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 3 23:26:33.906678 kubelet[3456]: I0903 23:26:33.906622 3456 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 3 23:26:33.907295 kubelet[3456]: I0903 23:26:33.906854 3456 reconciler.go:26] "Reconciler: start to sync state" Sep 3 23:26:33.907630 kubelet[3456]: I0903 23:26:33.907614 3456 factory.go:221] Registration of the containerd container factory successfully Sep 3 23:26:33.908316 kubelet[3456]: I0903 23:26:33.908282 3456 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 3 23:26:33.909143 kubelet[3456]: I0903 23:26:33.909115 3456 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 3 23:26:33.909143 kubelet[3456]: I0903 23:26:33.909137 3456 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 3 23:26:33.909225 kubelet[3456]: I0903 23:26:33.909150 3456 kubelet.go:2321] "Starting kubelet main sync loop" Sep 3 23:26:33.909225 kubelet[3456]: E0903 23:26:33.909181 3456 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 3 23:26:33.952054 kubelet[3456]: I0903 23:26:33.952037 3456 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 3 23:26:33.952177 kubelet[3456]: I0903 23:26:33.952166 3456 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 3 23:26:33.952232 kubelet[3456]: I0903 23:26:33.952225 3456 state_mem.go:36] "Initialized new in-memory state store" Sep 3 23:26:33.952382 kubelet[3456]: I0903 23:26:33.952370 3456 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 3 23:26:33.952457 kubelet[3456]: I0903 23:26:33.952430 3456 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 3 23:26:33.952493 kubelet[3456]: I0903 23:26:33.952488 3456 policy_none.go:49] "None policy: Start" Sep 3 23:26:33.953027 kubelet[3456]: I0903 23:26:33.953015 3456 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 3 23:26:33.953314 kubelet[3456]: I0903 23:26:33.953145 3456 state_mem.go:35] "Initializing new in-memory state store" Sep 3 23:26:33.953314 kubelet[3456]: I0903 23:26:33.953259 3456 state_mem.go:75] "Updated machine memory state" Sep 3 23:26:33.957099 kubelet[3456]: I0903 23:26:33.957081 3456 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 3 23:26:33.957228 kubelet[3456]: I0903 23:26:33.957212 3456 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 3 23:26:33.957265 kubelet[3456]: I0903 23:26:33.957225 3456 container_log_manager.go:189] "Initializing container 
log rotate workers" workers=1 monitorPeriod="10s" Sep 3 23:26:33.957610 kubelet[3456]: I0903 23:26:33.957599 3456 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 3 23:26:34.018163 kubelet[3456]: W0903 23:26:34.018141 3456 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 3 23:26:34.024112 kubelet[3456]: W0903 23:26:34.023931 3456 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 3 23:26:34.024489 kubelet[3456]: W0903 23:26:34.024308 3456 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 3 23:26:34.024489 kubelet[3456]: E0903 23:26:34.024435 3456 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-4372.1.0-n-71c6c07a75\" already exists" pod="kube-system/kube-scheduler-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:34.063840 kubelet[3456]: I0903 23:26:34.063822 3456 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:34.085538 kubelet[3456]: I0903 23:26:34.085491 3456 kubelet_node_status.go:111] "Node was previously registered" node="ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:34.085632 kubelet[3456]: I0903 23:26:34.085607 3456 kubelet_node_status.go:75] "Successfully registered node" node="ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:34.108758 kubelet[3456]: I0903 23:26:34.108587 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/97bb94e5cc66c20b7f6dd1311acc9c59-k8s-certs\") pod \"kube-apiserver-ci-4372.1.0-n-71c6c07a75\" (UID: \"97bb94e5cc66c20b7f6dd1311acc9c59\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:34.108758 kubelet[3456]: 
I0903 23:26:34.108616 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4e3df20c77dc69a552cec0996ea365d0-k8s-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-71c6c07a75\" (UID: \"4e3df20c77dc69a552cec0996ea365d0\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:34.108758 kubelet[3456]: I0903 23:26:34.108631 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4e3df20c77dc69a552cec0996ea365d0-kubeconfig\") pod \"kube-controller-manager-ci-4372.1.0-n-71c6c07a75\" (UID: \"4e3df20c77dc69a552cec0996ea365d0\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:34.108758 kubelet[3456]: I0903 23:26:34.108643 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/97bb94e5cc66c20b7f6dd1311acc9c59-ca-certs\") pod \"kube-apiserver-ci-4372.1.0-n-71c6c07a75\" (UID: \"97bb94e5cc66c20b7f6dd1311acc9c59\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:34.108758 kubelet[3456]: I0903 23:26:34.108654 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/97bb94e5cc66c20b7f6dd1311acc9c59-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372.1.0-n-71c6c07a75\" (UID: \"97bb94e5cc66c20b7f6dd1311acc9c59\") " pod="kube-system/kube-apiserver-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:34.108947 kubelet[3456]: I0903 23:26:34.108667 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4e3df20c77dc69a552cec0996ea365d0-ca-certs\") pod \"kube-controller-manager-ci-4372.1.0-n-71c6c07a75\" (UID: 
\"4e3df20c77dc69a552cec0996ea365d0\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:34.108947 kubelet[3456]: I0903 23:26:34.108678 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4e3df20c77dc69a552cec0996ea365d0-flexvolume-dir\") pod \"kube-controller-manager-ci-4372.1.0-n-71c6c07a75\" (UID: \"4e3df20c77dc69a552cec0996ea365d0\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:34.108947 kubelet[3456]: I0903 23:26:34.108689 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4e3df20c77dc69a552cec0996ea365d0-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372.1.0-n-71c6c07a75\" (UID: \"4e3df20c77dc69a552cec0996ea365d0\") " pod="kube-system/kube-controller-manager-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:34.108947 kubelet[3456]: I0903 23:26:34.108699 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/60515c2e9aa4017177c71026ed950d01-kubeconfig\") pod \"kube-scheduler-ci-4372.1.0-n-71c6c07a75\" (UID: \"60515c2e9aa4017177c71026ed950d01\") " pod="kube-system/kube-scheduler-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:34.885748 kubelet[3456]: I0903 23:26:34.885613 3456 apiserver.go:52] "Watching apiserver" Sep 3 23:26:34.907025 kubelet[3456]: I0903 23:26:34.906945 3456 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 3 23:26:34.951918 kubelet[3456]: W0903 23:26:34.951879 3456 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 3 23:26:34.952045 kubelet[3456]: E0903 23:26:34.951943 3456 kubelet.go:1915] "Failed creating 
a mirror pod for" err="pods \"kube-apiserver-ci-4372.1.0-n-71c6c07a75\" already exists" pod="kube-system/kube-apiserver-ci-4372.1.0-n-71c6c07a75" Sep 3 23:26:34.959146 kubelet[3456]: I0903 23:26:34.959073 3456 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4372.1.0-n-71c6c07a75" podStartSLOduration=0.959060497 podStartE2EDuration="959.060497ms" podCreationTimestamp="2025-09-03 23:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-03 23:26:34.958269827 +0000 UTC m=+1.219826897" watchObservedRunningTime="2025-09-03 23:26:34.959060497 +0000 UTC m=+1.220617566" Sep 3 23:26:34.979840 kubelet[3456]: I0903 23:26:34.979788 3456 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4372.1.0-n-71c6c07a75" podStartSLOduration=0.979779058 podStartE2EDuration="979.779058ms" podCreationTimestamp="2025-09-03 23:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-03 23:26:34.979632383 +0000 UTC m=+1.241189452" watchObservedRunningTime="2025-09-03 23:26:34.979779058 +0000 UTC m=+1.241336119" Sep 3 23:26:34.979951 kubelet[3456]: I0903 23:26:34.979856 3456 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4372.1.0-n-71c6c07a75" podStartSLOduration=2.9798531390000003 podStartE2EDuration="2.979853139s" podCreationTimestamp="2025-09-03 23:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-03 23:26:34.968827135 +0000 UTC m=+1.230384228" watchObservedRunningTime="2025-09-03 23:26:34.979853139 +0000 UTC m=+1.241410208" Sep 3 23:26:39.323169 kubelet[3456]: I0903 23:26:39.323127 3456 kuberuntime_manager.go:1635] "Updating runtime config 
through cri with podcidr" CIDR="192.168.0.0/24" Sep 3 23:26:39.323781 containerd[1871]: time="2025-09-03T23:26:39.323690822Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 3 23:26:39.324080 kubelet[3456]: I0903 23:26:39.323885 3456 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 3 23:26:40.278865 systemd[1]: Created slice kubepods-besteffort-pod91b46e3e_cfaf_491e_9f8f_175ac3358db3.slice - libcontainer container kubepods-besteffort-pod91b46e3e_cfaf_491e_9f8f_175ac3358db3.slice. Sep 3 23:26:40.338742 kubelet[3456]: I0903 23:26:40.338702 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/91b46e3e-cfaf-491e-9f8f-175ac3358db3-kube-proxy\") pod \"kube-proxy-jp8dn\" (UID: \"91b46e3e-cfaf-491e-9f8f-175ac3358db3\") " pod="kube-system/kube-proxy-jp8dn" Sep 3 23:26:40.338742 kubelet[3456]: I0903 23:26:40.338738 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91b46e3e-cfaf-491e-9f8f-175ac3358db3-lib-modules\") pod \"kube-proxy-jp8dn\" (UID: \"91b46e3e-cfaf-491e-9f8f-175ac3358db3\") " pod="kube-system/kube-proxy-jp8dn" Sep 3 23:26:40.339069 kubelet[3456]: I0903 23:26:40.338753 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mt6t\" (UniqueName: \"kubernetes.io/projected/91b46e3e-cfaf-491e-9f8f-175ac3358db3-kube-api-access-8mt6t\") pod \"kube-proxy-jp8dn\" (UID: \"91b46e3e-cfaf-491e-9f8f-175ac3358db3\") " pod="kube-system/kube-proxy-jp8dn" Sep 3 23:26:40.339069 kubelet[3456]: I0903 23:26:40.338769 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: 
\"kubernetes.io/host-path/91b46e3e-cfaf-491e-9f8f-175ac3358db3-xtables-lock\") pod \"kube-proxy-jp8dn\" (UID: \"91b46e3e-cfaf-491e-9f8f-175ac3358db3\") " pod="kube-system/kube-proxy-jp8dn" Sep 3 23:26:40.503394 systemd[1]: Created slice kubepods-besteffort-podd0e0a4be_6312_49ea_8baf_cbd3228d3c0f.slice - libcontainer container kubepods-besteffort-podd0e0a4be_6312_49ea_8baf_cbd3228d3c0f.slice. Sep 3 23:26:40.540929 kubelet[3456]: I0903 23:26:40.540820 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d0e0a4be-6312-49ea-8baf-cbd3228d3c0f-var-lib-calico\") pod \"tigera-operator-58fc44c59b-g6db4\" (UID: \"d0e0a4be-6312-49ea-8baf-cbd3228d3c0f\") " pod="tigera-operator/tigera-operator-58fc44c59b-g6db4" Sep 3 23:26:40.540929 kubelet[3456]: I0903 23:26:40.540858 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg4hn\" (UniqueName: \"kubernetes.io/projected/d0e0a4be-6312-49ea-8baf-cbd3228d3c0f-kube-api-access-kg4hn\") pod \"tigera-operator-58fc44c59b-g6db4\" (UID: \"d0e0a4be-6312-49ea-8baf-cbd3228d3c0f\") " pod="tigera-operator/tigera-operator-58fc44c59b-g6db4" Sep 3 23:26:40.586820 containerd[1871]: time="2025-09-03T23:26:40.586782556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jp8dn,Uid:91b46e3e-cfaf-491e-9f8f-175ac3358db3,Namespace:kube-system,Attempt:0,}" Sep 3 23:26:40.629929 containerd[1871]: time="2025-09-03T23:26:40.629889290Z" level=info msg="connecting to shim 23b9e774176728076b702d9c0296ef1925b526c08e3a2613a1ffdaf3d3ecfa18" address="unix:///run/containerd/s/7260ef5b588a75b208033e80bb5761964fb617fe26315edae35cb6b091f89083" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:26:40.658634 systemd[1]: Started cri-containerd-23b9e774176728076b702d9c0296ef1925b526c08e3a2613a1ffdaf3d3ecfa18.scope - libcontainer container 
23b9e774176728076b702d9c0296ef1925b526c08e3a2613a1ffdaf3d3ecfa18. Sep 3 23:26:40.679161 containerd[1871]: time="2025-09-03T23:26:40.679118456Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jp8dn,Uid:91b46e3e-cfaf-491e-9f8f-175ac3358db3,Namespace:kube-system,Attempt:0,} returns sandbox id \"23b9e774176728076b702d9c0296ef1925b526c08e3a2613a1ffdaf3d3ecfa18\"" Sep 3 23:26:40.681772 containerd[1871]: time="2025-09-03T23:26:40.681679388Z" level=info msg="CreateContainer within sandbox \"23b9e774176728076b702d9c0296ef1925b526c08e3a2613a1ffdaf3d3ecfa18\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 3 23:26:40.704257 containerd[1871]: time="2025-09-03T23:26:40.703721355Z" level=info msg="Container 7f9a22839c1fed837dfdf6fddac3d6b9b0ffd66b8f9dcb636a3d7358857a8c63: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:26:40.720024 containerd[1871]: time="2025-09-03T23:26:40.720000352Z" level=info msg="CreateContainer within sandbox \"23b9e774176728076b702d9c0296ef1925b526c08e3a2613a1ffdaf3d3ecfa18\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"7f9a22839c1fed837dfdf6fddac3d6b9b0ffd66b8f9dcb636a3d7358857a8c63\"" Sep 3 23:26:40.720758 containerd[1871]: time="2025-09-03T23:26:40.720723812Z" level=info msg="StartContainer for \"7f9a22839c1fed837dfdf6fddac3d6b9b0ffd66b8f9dcb636a3d7358857a8c63\"" Sep 3 23:26:40.726029 containerd[1871]: time="2025-09-03T23:26:40.726004598Z" level=info msg="connecting to shim 7f9a22839c1fed837dfdf6fddac3d6b9b0ffd66b8f9dcb636a3d7358857a8c63" address="unix:///run/containerd/s/7260ef5b588a75b208033e80bb5761964fb617fe26315edae35cb6b091f89083" protocol=ttrpc version=3 Sep 3 23:26:40.742635 systemd[1]: Started cri-containerd-7f9a22839c1fed837dfdf6fddac3d6b9b0ffd66b8f9dcb636a3d7358857a8c63.scope - libcontainer container 7f9a22839c1fed837dfdf6fddac3d6b9b0ffd66b8f9dcb636a3d7358857a8c63. 
Sep 3 23:26:40.775651 containerd[1871]: time="2025-09-03T23:26:40.775619483Z" level=info msg="StartContainer for \"7f9a22839c1fed837dfdf6fddac3d6b9b0ffd66b8f9dcb636a3d7358857a8c63\" returns successfully" Sep 3 23:26:40.807075 containerd[1871]: time="2025-09-03T23:26:40.806873152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-g6db4,Uid:d0e0a4be-6312-49ea-8baf-cbd3228d3c0f,Namespace:tigera-operator,Attempt:0,}" Sep 3 23:26:40.872161 containerd[1871]: time="2025-09-03T23:26:40.872112407Z" level=info msg="connecting to shim 7fbddb4a9c2891ad947c88ca411f7fc286031eb7c309e793d75b761ffa316848" address="unix:///run/containerd/s/470a982556c58803aed455d72f258f6a43a2a793f68e87e2cf504e362e7ed500" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:26:40.889754 systemd[1]: Started cri-containerd-7fbddb4a9c2891ad947c88ca411f7fc286031eb7c309e793d75b761ffa316848.scope - libcontainer container 7fbddb4a9c2891ad947c88ca411f7fc286031eb7c309e793d75b761ffa316848. Sep 3 23:26:40.924017 containerd[1871]: time="2025-09-03T23:26:40.923990810Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-g6db4,Uid:d0e0a4be-6312-49ea-8baf-cbd3228d3c0f,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"7fbddb4a9c2891ad947c88ca411f7fc286031eb7c309e793d75b761ffa316848\"" Sep 3 23:26:40.927278 containerd[1871]: time="2025-09-03T23:26:40.927081511Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 3 23:26:42.470113 kubelet[3456]: I0903 23:26:42.469729 3456 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-jp8dn" podStartSLOduration=2.469704547 podStartE2EDuration="2.469704547s" podCreationTimestamp="2025-09-03 23:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-03 23:26:40.985301102 +0000 UTC m=+7.246858163" watchObservedRunningTime="2025-09-03 23:26:42.469704547 
+0000 UTC m=+8.731261608" Sep 3 23:26:42.479245 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3014844444.mount: Deactivated successfully. Sep 3 23:26:42.784634 containerd[1871]: time="2025-09-03T23:26:42.784128976Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:26:42.786853 containerd[1871]: time="2025-09-03T23:26:42.786830694Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 3 23:26:42.789489 containerd[1871]: time="2025-09-03T23:26:42.789463955Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:26:42.796703 containerd[1871]: time="2025-09-03T23:26:42.796360200Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:26:42.796839 containerd[1871]: time="2025-09-03T23:26:42.796819104Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 1.869684808s" Sep 3 23:26:42.796918 containerd[1871]: time="2025-09-03T23:26:42.796905017Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 3 23:26:42.800029 containerd[1871]: time="2025-09-03T23:26:42.799973005Z" level=info msg="CreateContainer within sandbox \"7fbddb4a9c2891ad947c88ca411f7fc286031eb7c309e793d75b761ffa316848\" for container 
&ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 3 23:26:42.819653 containerd[1871]: time="2025-09-03T23:26:42.819626699Z" level=info msg="Container f73ecdd38bb6b908e03166b4e733dbab00032c10798129eb50cbb98bbfc6826b: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:26:42.833825 containerd[1871]: time="2025-09-03T23:26:42.833796179Z" level=info msg="CreateContainer within sandbox \"7fbddb4a9c2891ad947c88ca411f7fc286031eb7c309e793d75b761ffa316848\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"f73ecdd38bb6b908e03166b4e733dbab00032c10798129eb50cbb98bbfc6826b\"" Sep 3 23:26:42.834333 containerd[1871]: time="2025-09-03T23:26:42.834192226Z" level=info msg="StartContainer for \"f73ecdd38bb6b908e03166b4e733dbab00032c10798129eb50cbb98bbfc6826b\"" Sep 3 23:26:42.834932 containerd[1871]: time="2025-09-03T23:26:42.834903486Z" level=info msg="connecting to shim f73ecdd38bb6b908e03166b4e733dbab00032c10798129eb50cbb98bbfc6826b" address="unix:///run/containerd/s/470a982556c58803aed455d72f258f6a43a2a793f68e87e2cf504e362e7ed500" protocol=ttrpc version=3 Sep 3 23:26:42.853630 systemd[1]: Started cri-containerd-f73ecdd38bb6b908e03166b4e733dbab00032c10798129eb50cbb98bbfc6826b.scope - libcontainer container f73ecdd38bb6b908e03166b4e733dbab00032c10798129eb50cbb98bbfc6826b. 
Sep 3 23:26:42.877687 containerd[1871]: time="2025-09-03T23:26:42.877650987Z" level=info msg="StartContainer for \"f73ecdd38bb6b908e03166b4e733dbab00032c10798129eb50cbb98bbfc6826b\" returns successfully" Sep 3 23:26:42.982991 kubelet[3456]: I0903 23:26:42.982685 3456 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-g6db4" podStartSLOduration=1.109783915 podStartE2EDuration="2.982650305s" podCreationTimestamp="2025-09-03 23:26:40 +0000 UTC" firstStartedPulling="2025-09-03 23:26:40.925331777 +0000 UTC m=+7.186888838" lastFinishedPulling="2025-09-03 23:26:42.798198167 +0000 UTC m=+9.059755228" observedRunningTime="2025-09-03 23:26:42.982369132 +0000 UTC m=+9.243926201" watchObservedRunningTime="2025-09-03 23:26:42.982650305 +0000 UTC m=+9.244207374" Sep 3 23:26:48.015703 sudo[2342]: pam_unix(sudo:session): session closed for user root Sep 3 23:26:48.101371 sshd[2341]: Connection closed by 10.200.16.10 port 58352 Sep 3 23:26:48.101958 sshd-session[2339]: pam_unix(sshd:session): session closed for user core Sep 3 23:26:48.107158 systemd[1]: sshd@6-10.200.20.15:22-10.200.16.10:58352.service: Deactivated successfully. Sep 3 23:26:48.107469 systemd-logind[1851]: Session 9 logged out. Waiting for processes to exit. Sep 3 23:26:48.111310 systemd[1]: session-9.scope: Deactivated successfully. Sep 3 23:26:48.111496 systemd[1]: session-9.scope: Consumed 3.550s CPU time, 226.7M memory peak. Sep 3 23:26:48.114459 systemd-logind[1851]: Removed session 9. Sep 3 23:26:52.453863 systemd[1]: Created slice kubepods-besteffort-pod00078e0b_34d6_49c7_ac39_c578496cc6ba.slice - libcontainer container kubepods-besteffort-pod00078e0b_34d6_49c7_ac39_c578496cc6ba.slice. 
Sep 3 23:26:52.506773 kubelet[3456]: I0903 23:26:52.506725 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00078e0b-34d6-49c7-ac39-c578496cc6ba-tigera-ca-bundle\") pod \"calico-typha-6475fc57d7-62nz4\" (UID: \"00078e0b-34d6-49c7-ac39-c578496cc6ba\") " pod="calico-system/calico-typha-6475fc57d7-62nz4" Sep 3 23:26:52.507424 kubelet[3456]: I0903 23:26:52.506756 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/00078e0b-34d6-49c7-ac39-c578496cc6ba-typha-certs\") pod \"calico-typha-6475fc57d7-62nz4\" (UID: \"00078e0b-34d6-49c7-ac39-c578496cc6ba\") " pod="calico-system/calico-typha-6475fc57d7-62nz4" Sep 3 23:26:52.507424 kubelet[3456]: I0903 23:26:52.506912 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hwfr\" (UniqueName: \"kubernetes.io/projected/00078e0b-34d6-49c7-ac39-c578496cc6ba-kube-api-access-5hwfr\") pod \"calico-typha-6475fc57d7-62nz4\" (UID: \"00078e0b-34d6-49c7-ac39-c578496cc6ba\") " pod="calico-system/calico-typha-6475fc57d7-62nz4" Sep 3 23:26:52.718046 systemd[1]: Created slice kubepods-besteffort-pod05bf84d3_e41b_41f3_87ad_cb3bd6e5f93e.slice - libcontainer container kubepods-besteffort-pod05bf84d3_e41b_41f3_87ad_cb3bd6e5f93e.slice. 
Sep 3 23:26:52.759257 containerd[1871]: time="2025-09-03T23:26:52.758851987Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6475fc57d7-62nz4,Uid:00078e0b-34d6-49c7-ac39-c578496cc6ba,Namespace:calico-system,Attempt:0,}" Sep 3 23:26:52.800679 containerd[1871]: time="2025-09-03T23:26:52.800577195Z" level=info msg="connecting to shim 8573641ef590f11f5d89ceee13ca1c1d86ccb99a9433180a872d433f9e7583b0" address="unix:///run/containerd/s/cdd9187bb6e9180f92b9c013e64213beb88d6ec71d965793802d1d3effefca2f" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:26:52.809391 kubelet[3456]: I0903 23:26:52.809299 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e-cni-bin-dir\") pod \"calico-node-6476q\" (UID: \"05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e\") " pod="calico-system/calico-node-6476q" Sep 3 23:26:52.809391 kubelet[3456]: I0903 23:26:52.809343 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e-cni-log-dir\") pod \"calico-node-6476q\" (UID: \"05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e\") " pod="calico-system/calico-node-6476q" Sep 3 23:26:52.809391 kubelet[3456]: I0903 23:26:52.809363 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e-lib-modules\") pod \"calico-node-6476q\" (UID: \"05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e\") " pod="calico-system/calico-node-6476q" Sep 3 23:26:52.809391 kubelet[3456]: I0903 23:26:52.809395 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e-var-run-calico\") pod 
\"calico-node-6476q\" (UID: \"05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e\") " pod="calico-system/calico-node-6476q" Sep 3 23:26:52.809737 kubelet[3456]: I0903 23:26:52.809408 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e-node-certs\") pod \"calico-node-6476q\" (UID: \"05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e\") " pod="calico-system/calico-node-6476q" Sep 3 23:26:52.809737 kubelet[3456]: I0903 23:26:52.809419 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e-policysync\") pod \"calico-node-6476q\" (UID: \"05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e\") " pod="calico-system/calico-node-6476q" Sep 3 23:26:52.809737 kubelet[3456]: I0903 23:26:52.809430 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e-tigera-ca-bundle\") pod \"calico-node-6476q\" (UID: \"05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e\") " pod="calico-system/calico-node-6476q" Sep 3 23:26:52.809737 kubelet[3456]: I0903 23:26:52.809440 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e-xtables-lock\") pod \"calico-node-6476q\" (UID: \"05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e\") " pod="calico-system/calico-node-6476q" Sep 3 23:26:52.809737 kubelet[3456]: I0903 23:26:52.809481 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e-cni-net-dir\") pod \"calico-node-6476q\" (UID: \"05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e\") " 
pod="calico-system/calico-node-6476q" Sep 3 23:26:52.810023 kubelet[3456]: I0903 23:26:52.809490 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e-flexvol-driver-host\") pod \"calico-node-6476q\" (UID: \"05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e\") " pod="calico-system/calico-node-6476q" Sep 3 23:26:52.810023 kubelet[3456]: I0903 23:26:52.809501 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e-var-lib-calico\") pod \"calico-node-6476q\" (UID: \"05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e\") " pod="calico-system/calico-node-6476q" Sep 3 23:26:52.810023 kubelet[3456]: I0903 23:26:52.809549 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrthd\" (UniqueName: \"kubernetes.io/projected/05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e-kube-api-access-mrthd\") pod \"calico-node-6476q\" (UID: \"05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e\") " pod="calico-system/calico-node-6476q" Sep 3 23:26:52.821638 systemd[1]: Started cri-containerd-8573641ef590f11f5d89ceee13ca1c1d86ccb99a9433180a872d433f9e7583b0.scope - libcontainer container 8573641ef590f11f5d89ceee13ca1c1d86ccb99a9433180a872d433f9e7583b0. 
Sep 3 23:26:52.855428 containerd[1871]: time="2025-09-03T23:26:52.855000429Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6475fc57d7-62nz4,Uid:00078e0b-34d6-49c7-ac39-c578496cc6ba,Namespace:calico-system,Attempt:0,} returns sandbox id \"8573641ef590f11f5d89ceee13ca1c1d86ccb99a9433180a872d433f9e7583b0\"" Sep 3 23:26:52.858713 containerd[1871]: time="2025-09-03T23:26:52.858680154Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 3 23:26:52.911634 kubelet[3456]: E0903 23:26:52.911610 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.911850 kubelet[3456]: W0903 23:26:52.911744 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.911850 kubelet[3456]: E0903 23:26:52.911767 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:52.912080 kubelet[3456]: E0903 23:26:52.912001 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.912080 kubelet[3456]: W0903 23:26:52.912012 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.912080 kubelet[3456]: E0903 23:26:52.912023 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:52.912342 kubelet[3456]: E0903 23:26:52.912236 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.912342 kubelet[3456]: W0903 23:26:52.912246 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.912342 kubelet[3456]: E0903 23:26:52.912257 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:52.912506 kubelet[3456]: E0903 23:26:52.912492 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.912592 kubelet[3456]: W0903 23:26:52.912580 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.912647 kubelet[3456]: E0903 23:26:52.912638 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:52.913299 kubelet[3456]: E0903 23:26:52.913186 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.913299 kubelet[3456]: W0903 23:26:52.913296 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.913446 kubelet[3456]: E0903 23:26:52.913314 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:52.916194 kubelet[3456]: E0903 23:26:52.916169 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.916194 kubelet[3456]: W0903 23:26:52.916185 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.916437 kubelet[3456]: E0903 23:26:52.916419 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:52.916748 kubelet[3456]: E0903 23:26:52.916585 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.916748 kubelet[3456]: W0903 23:26:52.916596 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.917530 kubelet[3456]: E0903 23:26:52.916887 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:52.917592 kubelet[3456]: E0903 23:26:52.917558 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.917592 kubelet[3456]: W0903 23:26:52.917568 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.917626 kubelet[3456]: E0903 23:26:52.917607 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:52.917847 kubelet[3456]: E0903 23:26:52.917834 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.917847 kubelet[3456]: W0903 23:26:52.917843 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.917930 kubelet[3456]: E0903 23:26:52.917915 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:52.918003 kubelet[3456]: E0903 23:26:52.917992 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.918003 kubelet[3456]: W0903 23:26:52.918000 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.918088 kubelet[3456]: E0903 23:26:52.918021 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:52.919817 kubelet[3456]: E0903 23:26:52.919791 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.919817 kubelet[3456]: W0903 23:26:52.919808 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.919966 kubelet[3456]: E0903 23:26:52.919922 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:52.920215 kubelet[3456]: E0903 23:26:52.920201 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.920215 kubelet[3456]: W0903 23:26:52.920213 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.920479 kubelet[3456]: E0903 23:26:52.920459 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:52.920808 kubelet[3456]: E0903 23:26:52.920791 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.920808 kubelet[3456]: W0903 23:26:52.920803 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.921033 kubelet[3456]: E0903 23:26:52.920954 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:52.921242 kubelet[3456]: E0903 23:26:52.921211 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.921242 kubelet[3456]: W0903 23:26:52.921224 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.921242 kubelet[3456]: E0903 23:26:52.921237 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:52.921673 kubelet[3456]: E0903 23:26:52.921656 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.921673 kubelet[3456]: W0903 23:26:52.921667 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.921822 kubelet[3456]: E0903 23:26:52.921683 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:52.922879 kubelet[3456]: E0903 23:26:52.922770 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.922879 kubelet[3456]: W0903 23:26:52.922786 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.922879 kubelet[3456]: E0903 23:26:52.922798 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:52.923239 kubelet[3456]: E0903 23:26:52.923212 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.923239 kubelet[3456]: W0903 23:26:52.923224 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.923239 kubelet[3456]: E0903 23:26:52.923233 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:52.923766 kubelet[3456]: E0903 23:26:52.923655 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.923766 kubelet[3456]: W0903 23:26:52.923667 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.923766 kubelet[3456]: E0903 23:26:52.923676 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:52.932316 kubelet[3456]: E0903 23:26:52.932288 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:52.932316 kubelet[3456]: W0903 23:26:52.932310 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:52.932671 kubelet[3456]: E0903 23:26:52.932589 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:52.947887 kubelet[3456]: E0903 23:26:52.947423 3456 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6rww2" podUID="43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9" Sep 3 23:26:53.005254 kubelet[3456]: E0903 23:26:53.005160 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.005545 kubelet[3456]: W0903 23:26:53.005416 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.005545 kubelet[3456]: E0903 23:26:53.005444 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.005784 kubelet[3456]: E0903 23:26:53.005760 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.005992 kubelet[3456]: W0903 23:26:53.005859 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.005992 kubelet[3456]: E0903 23:26:53.005876 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.006199 kubelet[3456]: E0903 23:26:53.006185 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.006280 kubelet[3456]: W0903 23:26:53.006248 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.006349 kubelet[3456]: E0903 23:26:53.006336 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.006552 kubelet[3456]: E0903 23:26:53.006541 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.006687 kubelet[3456]: W0903 23:26:53.006601 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.006687 kubelet[3456]: E0903 23:26:53.006615 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.007237 kubelet[3456]: E0903 23:26:53.007224 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.007301 kubelet[3456]: W0903 23:26:53.007291 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.007357 kubelet[3456]: E0903 23:26:53.007346 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.007559 kubelet[3456]: E0903 23:26:53.007539 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.007721 kubelet[3456]: W0903 23:26:53.007631 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.007721 kubelet[3456]: E0903 23:26:53.007647 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.007861 kubelet[3456]: E0903 23:26:53.007850 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.008046 kubelet[3456]: W0903 23:26:53.007954 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.008046 kubelet[3456]: E0903 23:26:53.007970 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.008303 kubelet[3456]: E0903 23:26:53.008289 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.008434 kubelet[3456]: W0903 23:26:53.008357 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.008434 kubelet[3456]: E0903 23:26:53.008374 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.008848 kubelet[3456]: E0903 23:26:53.008836 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.009157 kubelet[3456]: W0903 23:26:53.009035 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.009157 kubelet[3456]: E0903 23:26:53.009066 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.009750 kubelet[3456]: E0903 23:26:53.009569 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.009750 kubelet[3456]: W0903 23:26:53.009582 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.009750 kubelet[3456]: E0903 23:26:53.009592 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.009962 kubelet[3456]: E0903 23:26:53.009949 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.010081 kubelet[3456]: W0903 23:26:53.010035 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.010149 kubelet[3456]: E0903 23:26:53.010137 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.010448 kubelet[3456]: E0903 23:26:53.010435 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.010641 kubelet[3456]: W0903 23:26:53.010539 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.010641 kubelet[3456]: E0903 23:26:53.010558 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.010808 kubelet[3456]: E0903 23:26:53.010797 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.010987 kubelet[3456]: W0903 23:26:53.010885 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.010987 kubelet[3456]: E0903 23:26:53.010901 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.011367 kubelet[3456]: E0903 23:26:53.011328 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.011460 kubelet[3456]: W0903 23:26:53.011421 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.011554 kubelet[3456]: E0903 23:26:53.011528 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.011781 kubelet[3456]: E0903 23:26:53.011769 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.011984 kubelet[3456]: W0903 23:26:53.011868 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.011984 kubelet[3456]: E0903 23:26:53.011907 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.012439 kubelet[3456]: E0903 23:26:53.012223 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.012439 kubelet[3456]: W0903 23:26:53.012234 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.012439 kubelet[3456]: E0903 23:26:53.012244 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.012768 kubelet[3456]: E0903 23:26:53.012746 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.012828 kubelet[3456]: W0903 23:26:53.012761 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.012828 kubelet[3456]: E0903 23:26:53.012789 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.013080 kubelet[3456]: E0903 23:26:53.013063 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.013080 kubelet[3456]: W0903 23:26:53.013074 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.013153 kubelet[3456]: E0903 23:26:53.013087 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.013233 kubelet[3456]: E0903 23:26:53.013217 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.013262 kubelet[3456]: W0903 23:26:53.013243 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.013262 kubelet[3456]: E0903 23:26:53.013253 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.013464 kubelet[3456]: E0903 23:26:53.013441 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.013464 kubelet[3456]: W0903 23:26:53.013453 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.013464 kubelet[3456]: E0903 23:26:53.013462 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.014022 kubelet[3456]: E0903 23:26:53.013999 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.014022 kubelet[3456]: W0903 23:26:53.014015 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.014022 kubelet[3456]: E0903 23:26:53.014026 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.014182 kubelet[3456]: I0903 23:26:53.014045 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9-kubelet-dir\") pod \"csi-node-driver-6rww2\" (UID: \"43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9\") " pod="calico-system/csi-node-driver-6rww2" Sep 3 23:26:53.014429 kubelet[3456]: E0903 23:26:53.014409 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.014465 kubelet[3456]: W0903 23:26:53.014428 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.014465 kubelet[3456]: E0903 23:26:53.014442 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.014465 kubelet[3456]: I0903 23:26:53.014456 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9-registration-dir\") pod \"csi-node-driver-6rww2\" (UID: \"43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9\") " pod="calico-system/csi-node-driver-6rww2" Sep 3 23:26:53.014921 kubelet[3456]: E0903 23:26:53.014902 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.014921 kubelet[3456]: W0903 23:26:53.014918 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.014986 kubelet[3456]: E0903 23:26:53.014932 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.014986 kubelet[3456]: I0903 23:26:53.014949 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9-varrun\") pod \"csi-node-driver-6rww2\" (UID: \"43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9\") " pod="calico-system/csi-node-driver-6rww2" Sep 3 23:26:53.015394 kubelet[3456]: E0903 23:26:53.015374 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.015394 kubelet[3456]: W0903 23:26:53.015390 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.015467 kubelet[3456]: E0903 23:26:53.015413 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.015467 kubelet[3456]: I0903 23:26:53.015428 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9-socket-dir\") pod \"csi-node-driver-6rww2\" (UID: \"43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9\") " pod="calico-system/csi-node-driver-6rww2" Sep 3 23:26:53.015806 kubelet[3456]: E0903 23:26:53.015786 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.015806 kubelet[3456]: W0903 23:26:53.015800 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.015941 kubelet[3456]: E0903 23:26:53.015918 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.015985 kubelet[3456]: I0903 23:26:53.015941 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5wzp\" (UniqueName: \"kubernetes.io/projected/43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9-kube-api-access-x5wzp\") pod \"csi-node-driver-6rww2\" (UID: \"43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9\") " pod="calico-system/csi-node-driver-6rww2" Sep 3 23:26:53.016180 kubelet[3456]: E0903 23:26:53.016107 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.016180 kubelet[3456]: W0903 23:26:53.016118 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.016180 kubelet[3456]: E0903 23:26:53.016144 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.016388 kubelet[3456]: E0903 23:26:53.016371 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.016388 kubelet[3456]: W0903 23:26:53.016384 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.016504 kubelet[3456]: E0903 23:26:53.016453 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.016597 kubelet[3456]: E0903 23:26:53.016578 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.016597 kubelet[3456]: W0903 23:26:53.016585 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.016629 kubelet[3456]: E0903 23:26:53.016620 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.016737 kubelet[3456]: E0903 23:26:53.016725 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.016737 kubelet[3456]: W0903 23:26:53.016734 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.016792 kubelet[3456]: E0903 23:26:53.016775 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.016907 kubelet[3456]: E0903 23:26:53.016896 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.016907 kubelet[3456]: W0903 23:26:53.016904 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.016956 kubelet[3456]: E0903 23:26:53.016913 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.017094 kubelet[3456]: E0903 23:26:53.017070 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.017094 kubelet[3456]: W0903 23:26:53.017092 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.017136 kubelet[3456]: E0903 23:26:53.017100 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.017237 kubelet[3456]: E0903 23:26:53.017223 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.017237 kubelet[3456]: W0903 23:26:53.017232 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.017287 kubelet[3456]: E0903 23:26:53.017238 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.017397 kubelet[3456]: E0903 23:26:53.017386 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.017397 kubelet[3456]: W0903 23:26:53.017395 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.017442 kubelet[3456]: E0903 23:26:53.017401 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.017533 kubelet[3456]: E0903 23:26:53.017520 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.017533 kubelet[3456]: W0903 23:26:53.017527 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.017589 kubelet[3456]: E0903 23:26:53.017540 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.017705 kubelet[3456]: E0903 23:26:53.017692 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.017705 kubelet[3456]: W0903 23:26:53.017700 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.017742 kubelet[3456]: E0903 23:26:53.017706 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.023485 containerd[1871]: time="2025-09-03T23:26:53.023244013Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6476q,Uid:05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e,Namespace:calico-system,Attempt:0,}" Sep 3 23:26:53.074737 containerd[1871]: time="2025-09-03T23:26:53.074704526Z" level=info msg="connecting to shim 5c3c193a200ef529cac866ace813456a1ff66d96668416a0072a18650e19b690" address="unix:///run/containerd/s/1f231755159e60f1bf13ab9a43fd285762be97b022dacecfe74b70b2aeac1d23" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:26:53.097689 systemd[1]: Started cri-containerd-5c3c193a200ef529cac866ace813456a1ff66d96668416a0072a18650e19b690.scope - libcontainer container 5c3c193a200ef529cac866ace813456a1ff66d96668416a0072a18650e19b690. Sep 3 23:26:53.117530 kubelet[3456]: E0903 23:26:53.117323 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.117530 kubelet[3456]: W0903 23:26:53.117339 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.117530 kubelet[3456]: E0903 23:26:53.117356 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.117849 kubelet[3456]: E0903 23:26:53.117806 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.118038 kubelet[3456]: W0903 23:26:53.117922 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.118038 kubelet[3456]: E0903 23:26:53.117945 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.118476 kubelet[3456]: E0903 23:26:53.118454 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.118476 kubelet[3456]: W0903 23:26:53.118468 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.118476 kubelet[3456]: E0903 23:26:53.118482 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.118687 kubelet[3456]: E0903 23:26:53.118622 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.118687 kubelet[3456]: W0903 23:26:53.118636 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.118687 kubelet[3456]: E0903 23:26:53.118648 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.118932 kubelet[3456]: E0903 23:26:53.118759 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.118932 kubelet[3456]: W0903 23:26:53.118767 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.118999 kubelet[3456]: E0903 23:26:53.118975 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.119119 kubelet[3456]: E0903 23:26:53.119107 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.119371 kubelet[3456]: W0903 23:26:53.119258 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.119371 kubelet[3456]: E0903 23:26:53.119285 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.119746 kubelet[3456]: E0903 23:26:53.119630 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.119746 kubelet[3456]: W0903 23:26:53.119643 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.119746 kubelet[3456]: E0903 23:26:53.119667 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.121089 kubelet[3456]: E0903 23:26:53.120725 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.121089 kubelet[3456]: W0903 23:26:53.120741 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.121089 kubelet[3456]: E0903 23:26:53.120873 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.121213 kubelet[3456]: E0903 23:26:53.121192 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.121213 kubelet[3456]: W0903 23:26:53.121207 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.121627 kubelet[3456]: E0903 23:26:53.121266 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.121627 kubelet[3456]: E0903 23:26:53.121556 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.121627 kubelet[3456]: W0903 23:26:53.121579 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.121830 kubelet[3456]: E0903 23:26:53.121804 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.121912 kubelet[3456]: E0903 23:26:53.121898 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.121912 kubelet[3456]: W0903 23:26:53.121908 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.122017 kubelet[3456]: E0903 23:26:53.122003 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.122167 kubelet[3456]: E0903 23:26:53.122124 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.122167 kubelet[3456]: W0903 23:26:53.122135 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.122560 kubelet[3456]: E0903 23:26:53.122233 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.122631 kubelet[3456]: E0903 23:26:53.122622 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.122651 kubelet[3456]: W0903 23:26:53.122633 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.122929 kubelet[3456]: E0903 23:26:53.122909 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.123582 kubelet[3456]: E0903 23:26:53.123560 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.123582 kubelet[3456]: W0903 23:26:53.123576 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.123740 kubelet[3456]: E0903 23:26:53.123719 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.124458 kubelet[3456]: E0903 23:26:53.124331 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.124458 kubelet[3456]: W0903 23:26:53.124355 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.124458 kubelet[3456]: E0903 23:26:53.124432 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.124704 kubelet[3456]: E0903 23:26:53.124686 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.124704 kubelet[3456]: W0903 23:26:53.124698 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.124874 kubelet[3456]: E0903 23:26:53.124857 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.125014 kubelet[3456]: E0903 23:26:53.124984 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.125014 kubelet[3456]: W0903 23:26:53.124995 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.125183 kubelet[3456]: E0903 23:26:53.125139 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.125357 kubelet[3456]: E0903 23:26:53.125332 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.125357 kubelet[3456]: W0903 23:26:53.125351 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.125476 kubelet[3456]: E0903 23:26:53.125457 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.125689 kubelet[3456]: E0903 23:26:53.125670 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.125689 kubelet[3456]: W0903 23:26:53.125682 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.125889 kubelet[3456]: E0903 23:26:53.125762 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.125993 kubelet[3456]: E0903 23:26:53.125973 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.125993 kubelet[3456]: W0903 23:26:53.125987 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.126047 kubelet[3456]: E0903 23:26:53.125999 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.126262 kubelet[3456]: E0903 23:26:53.126247 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.126262 kubelet[3456]: W0903 23:26:53.126259 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.126333 kubelet[3456]: E0903 23:26:53.126321 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.126627 kubelet[3456]: E0903 23:26:53.126609 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.126627 kubelet[3456]: W0903 23:26:53.126624 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.126723 kubelet[3456]: E0903 23:26:53.126654 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.126865 kubelet[3456]: E0903 23:26:53.126850 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.126865 kubelet[3456]: W0903 23:26:53.126860 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.126958 kubelet[3456]: E0903 23:26:53.126942 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.127057 kubelet[3456]: E0903 23:26:53.127043 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.127057 kubelet[3456]: W0903 23:26:53.127054 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.127117 kubelet[3456]: E0903 23:26:53.127103 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:53.127236 kubelet[3456]: E0903 23:26:53.127221 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.127236 kubelet[3456]: W0903 23:26:53.127232 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.127278 kubelet[3456]: E0903 23:26:53.127240 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:53.129917 containerd[1871]: time="2025-09-03T23:26:53.129678457Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6476q,Uid:05bf84d3-e41b-41f3-87ad-cb3bd6e5f93e,Namespace:calico-system,Attempt:0,} returns sandbox id \"5c3c193a200ef529cac866ace813456a1ff66d96668416a0072a18650e19b690\"" Sep 3 23:26:53.136044 kubelet[3456]: E0903 23:26:53.136024 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:53.136044 kubelet[3456]: W0903 23:26:53.136040 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:53.136133 kubelet[3456]: E0903 23:26:53.136050 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:54.445773 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2241064406.mount: Deactivated successfully. 
Sep 3 23:26:54.829716 containerd[1871]: time="2025-09-03T23:26:54.829671787Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:26:54.832208 containerd[1871]: time="2025-09-03T23:26:54.832178044Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 3 23:26:54.835150 containerd[1871]: time="2025-09-03T23:26:54.835087340Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:26:54.842067 containerd[1871]: time="2025-09-03T23:26:54.842025655Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:26:54.842638 containerd[1871]: time="2025-09-03T23:26:54.842545391Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 1.981963014s" Sep 3 23:26:54.842638 containerd[1871]: time="2025-09-03T23:26:54.842573400Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 3 23:26:54.844700 containerd[1871]: time="2025-09-03T23:26:54.844623690Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 3 23:26:54.855478 containerd[1871]: time="2025-09-03T23:26:54.855352611Z" level=info msg="CreateContainer within sandbox \"8573641ef590f11f5d89ceee13ca1c1d86ccb99a9433180a872d433f9e7583b0\" for container 
&ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 3 23:26:54.873533 containerd[1871]: time="2025-09-03T23:26:54.871651112Z" level=info msg="Container c0294189e8343388d1dd1e2e550efd8e4e09f27cf6feca7eede0ff7437dd9a84: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:26:54.889607 containerd[1871]: time="2025-09-03T23:26:54.889570103Z" level=info msg="CreateContainer within sandbox \"8573641ef590f11f5d89ceee13ca1c1d86ccb99a9433180a872d433f9e7583b0\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"c0294189e8343388d1dd1e2e550efd8e4e09f27cf6feca7eede0ff7437dd9a84\"" Sep 3 23:26:54.890945 containerd[1871]: time="2025-09-03T23:26:54.890921470Z" level=info msg="StartContainer for \"c0294189e8343388d1dd1e2e550efd8e4e09f27cf6feca7eede0ff7437dd9a84\"" Sep 3 23:26:54.892268 containerd[1871]: time="2025-09-03T23:26:54.891951343Z" level=info msg="connecting to shim c0294189e8343388d1dd1e2e550efd8e4e09f27cf6feca7eede0ff7437dd9a84" address="unix:///run/containerd/s/cdd9187bb6e9180f92b9c013e64213beb88d6ec71d965793802d1d3effefca2f" protocol=ttrpc version=3 Sep 3 23:26:54.909803 kubelet[3456]: E0903 23:26:54.909762 3456 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6rww2" podUID="43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9" Sep 3 23:26:54.913681 systemd[1]: Started cri-containerd-c0294189e8343388d1dd1e2e550efd8e4e09f27cf6feca7eede0ff7437dd9a84.scope - libcontainer container c0294189e8343388d1dd1e2e550efd8e4e09f27cf6feca7eede0ff7437dd9a84. 
Sep 3 23:26:54.947583 containerd[1871]: time="2025-09-03T23:26:54.947552236Z" level=info msg="StartContainer for \"c0294189e8343388d1dd1e2e550efd8e4e09f27cf6feca7eede0ff7437dd9a84\" returns successfully" Sep 3 23:26:54.997433 kubelet[3456]: I0903 23:26:54.996733 3456 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6475fc57d7-62nz4" podStartSLOduration=1.010519403 podStartE2EDuration="2.996718199s" podCreationTimestamp="2025-09-03 23:26:52 +0000 UTC" firstStartedPulling="2025-09-03 23:26:52.857876661 +0000 UTC m=+19.119433722" lastFinishedPulling="2025-09-03 23:26:54.844075457 +0000 UTC m=+21.105632518" observedRunningTime="2025-09-03 23:26:54.996637398 +0000 UTC m=+21.258194459" watchObservedRunningTime="2025-09-03 23:26:54.996718199 +0000 UTC m=+21.258275268" Sep 3 23:26:55.027165 kubelet[3456]: E0903 23:26:55.027136 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:55.028084 kubelet[3456]: W0903 23:26:55.027966 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:55.028084 kubelet[3456]: E0903 23:26:55.028031 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:55.028350 kubelet[3456]: E0903 23:26:55.028332 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:55.028462 kubelet[3456]: W0903 23:26:55.028408 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:55.028462 kubelet[3456]: E0903 23:26:55.028426 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:55.028936 kubelet[3456]: E0903 23:26:55.028860 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:55.028936 kubelet[3456]: W0903 23:26:55.028879 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:55.028936 kubelet[3456]: E0903 23:26:55.028891 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:55.029449 kubelet[3456]: E0903 23:26:55.029362 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:55.029449 kubelet[3456]: W0903 23:26:55.029380 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:55.029449 kubelet[3456]: E0903 23:26:55.029392 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:55.030238 kubelet[3456]: E0903 23:26:55.030166 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:55.030238 kubelet[3456]: W0903 23:26:55.030187 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:55.030238 kubelet[3456]: E0903 23:26:55.030199 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:55.030638 kubelet[3456]: E0903 23:26:55.030573 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:55.030638 kubelet[3456]: W0903 23:26:55.030591 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:55.030638 kubelet[3456]: E0903 23:26:55.030601 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:55.031194 kubelet[3456]: E0903 23:26:55.031174 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:55.031308 kubelet[3456]: W0903 23:26:55.031254 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:55.031308 kubelet[3456]: E0903 23:26:55.031270 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:55.032109 kubelet[3456]: E0903 23:26:55.032013 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:55.032109 kubelet[3456]: W0903 23:26:55.032031 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:55.032109 kubelet[3456]: E0903 23:26:55.032045 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:55.032406 kubelet[3456]: E0903 23:26:55.032389 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:55.032547 kubelet[3456]: W0903 23:26:55.032470 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:55.032547 kubelet[3456]: E0903 23:26:55.032486 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:55.032851 kubelet[3456]: E0903 23:26:55.032763 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:55.032851 kubelet[3456]: W0903 23:26:55.032776 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:55.032851 kubelet[3456]: E0903 23:26:55.032785 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" 
[the preceding kubelet FlexVolume probe error sequence for nodeagent~uds repeated ~23 more times between Sep 3 23:26:55.033 and 23:26:55.045; duplicate entries elided]
Sep 3 23:26:55.980845 kubelet[3456]: I0903 23:26:55.980708 3456 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 3 23:26:56.013527 containerd[1871]: time="2025-09-03T23:26:56.013405511Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:26:56.016882 containerd[1871]: time="2025-09-03T23:26:56.016842391Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 3 23:26:56.020115 containerd[1871]: time="2025-09-03T23:26:56.020061476Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:26:56.024306 containerd[1871]: time="2025-09-03T23:26:56.024269482Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:26:56.024785 containerd[1871]: time="2025-09-03T23:26:56.024529910Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.179857948s" Sep 3 23:26:56.024785 containerd[1871]: time="2025-09-03T23:26:56.024557463Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 3 23:26:56.026900 containerd[1871]: time="2025-09-03T23:26:56.026870109Z" level=info 
msg="CreateContainer within sandbox \"5c3c193a200ef529cac866ace813456a1ff66d96668416a0072a18650e19b690\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 3 23:26:56.040732 kubelet[3456]: E0903 23:26:56.040656 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:56.040905 kubelet[3456]: W0903 23:26:56.040817 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:56.040905 kubelet[3456]: E0903 23:26:56.040841 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" 
[the same kubelet FlexVolume probe error sequence for nodeagent~uds repeated continuously between Sep 3 23:26:56.041 and 23:26:56.050; duplicate entries elided]
Sep 3 23:26:56.050642 kubelet[3456]: E0903 23:26:56.050630 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:56.050779 kubelet[3456]: W0903 23:26:56.050710 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:56.050779 kubelet[3456]: E0903 23:26:56.050735 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:56.050967 kubelet[3456]: E0903 23:26:56.050956 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:56.051083 kubelet[3456]: W0903 23:26:56.051040 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:56.051338 kubelet[3456]: E0903 23:26:56.051125 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:56.051468 containerd[1871]: time="2025-09-03T23:26:56.051440418Z" level=info msg="Container 94b74b1b77d109407fa1b63516396bcf53062d312884444200d71882580f644d: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:26:56.054110 kubelet[3456]: E0903 23:26:56.053943 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:56.054110 kubelet[3456]: W0903 23:26:56.053959 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:56.054110 kubelet[3456]: E0903 23:26:56.053980 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 3 23:26:56.054284 kubelet[3456]: E0903 23:26:56.054241 3456 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 3 23:26:56.054284 kubelet[3456]: W0903 23:26:56.054256 3456 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 3 23:26:56.054284 kubelet[3456]: E0903 23:26:56.054266 3456 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 3 23:26:56.070869 containerd[1871]: time="2025-09-03T23:26:56.070841722Z" level=info msg="CreateContainer within sandbox \"5c3c193a200ef529cac866ace813456a1ff66d96668416a0072a18650e19b690\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"94b74b1b77d109407fa1b63516396bcf53062d312884444200d71882580f644d\"" Sep 3 23:26:56.071429 containerd[1871]: time="2025-09-03T23:26:56.071369083Z" level=info msg="StartContainer for \"94b74b1b77d109407fa1b63516396bcf53062d312884444200d71882580f644d\"" Sep 3 23:26:56.072672 containerd[1871]: time="2025-09-03T23:26:56.072652392Z" level=info msg="connecting to shim 94b74b1b77d109407fa1b63516396bcf53062d312884444200d71882580f644d" address="unix:///run/containerd/s/1f231755159e60f1bf13ab9a43fd285762be97b022dacecfe74b70b2aeac1d23" protocol=ttrpc version=3 Sep 3 23:26:56.089626 systemd[1]: Started cri-containerd-94b74b1b77d109407fa1b63516396bcf53062d312884444200d71882580f644d.scope - libcontainer container 94b74b1b77d109407fa1b63516396bcf53062d312884444200d71882580f644d. 
Sep 3 23:26:56.119933 containerd[1871]: time="2025-09-03T23:26:56.119905660Z" level=info msg="StartContainer for \"94b74b1b77d109407fa1b63516396bcf53062d312884444200d71882580f644d\" returns successfully" Sep 3 23:26:56.124972 systemd[1]: cri-containerd-94b74b1b77d109407fa1b63516396bcf53062d312884444200d71882580f644d.scope: Deactivated successfully. Sep 3 23:26:56.127695 containerd[1871]: time="2025-09-03T23:26:56.127636715Z" level=info msg="received exit event container_id:\"94b74b1b77d109407fa1b63516396bcf53062d312884444200d71882580f644d\" id:\"94b74b1b77d109407fa1b63516396bcf53062d312884444200d71882580f644d\" pid:4169 exited_at:{seconds:1756942016 nanos:127109427}" Sep 3 23:26:56.127812 containerd[1871]: time="2025-09-03T23:26:56.127692780Z" level=info msg="TaskExit event in podsandbox handler container_id:\"94b74b1b77d109407fa1b63516396bcf53062d312884444200d71882580f644d\" id:\"94b74b1b77d109407fa1b63516396bcf53062d312884444200d71882580f644d\" pid:4169 exited_at:{seconds:1756942016 nanos:127109427}" Sep 3 23:26:56.141988 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-94b74b1b77d109407fa1b63516396bcf53062d312884444200d71882580f644d-rootfs.mount: Deactivated successfully. 
Sep 3 23:26:56.910266 kubelet[3456]: E0903 23:26:56.910224 3456 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6rww2" podUID="43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9" Sep 3 23:26:57.991068 containerd[1871]: time="2025-09-03T23:26:57.990978396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 3 23:26:58.909683 kubelet[3456]: E0903 23:26:58.909634 3456 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6rww2" podUID="43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9" Sep 3 23:27:00.356278 containerd[1871]: time="2025-09-03T23:27:00.356220865Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:00.359026 containerd[1871]: time="2025-09-03T23:27:00.358999386Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 3 23:27:00.362537 containerd[1871]: time="2025-09-03T23:27:00.362351098Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:00.366265 containerd[1871]: time="2025-09-03T23:27:00.366227786Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:00.366772 containerd[1871]: time="2025-09-03T23:27:00.366747674Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with 
image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 2.375638124s" Sep 3 23:27:00.366836 containerd[1871]: time="2025-09-03T23:27:00.366773330Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 3 23:27:00.370061 containerd[1871]: time="2025-09-03T23:27:00.370023897Z" level=info msg="CreateContainer within sandbox \"5c3c193a200ef529cac866ace813456a1ff66d96668416a0072a18650e19b690\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 3 23:27:00.391537 containerd[1871]: time="2025-09-03T23:27:00.391495745Z" level=info msg="Container b88815f8c762c7b53be9948f406c97ee0fe1d34af5c1abd595eaaac19054d5dd: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:00.394151 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2606781747.mount: Deactivated successfully. 
Sep 3 23:27:00.414725 containerd[1871]: time="2025-09-03T23:27:00.414686289Z" level=info msg="CreateContainer within sandbox \"5c3c193a200ef529cac866ace813456a1ff66d96668416a0072a18650e19b690\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"b88815f8c762c7b53be9948f406c97ee0fe1d34af5c1abd595eaaac19054d5dd\"" Sep 3 23:27:00.416043 containerd[1871]: time="2025-09-03T23:27:00.416012204Z" level=info msg="StartContainer for \"b88815f8c762c7b53be9948f406c97ee0fe1d34af5c1abd595eaaac19054d5dd\"" Sep 3 23:27:00.417355 containerd[1871]: time="2025-09-03T23:27:00.417328287Z" level=info msg="connecting to shim b88815f8c762c7b53be9948f406c97ee0fe1d34af5c1abd595eaaac19054d5dd" address="unix:///run/containerd/s/1f231755159e60f1bf13ab9a43fd285762be97b022dacecfe74b70b2aeac1d23" protocol=ttrpc version=3 Sep 3 23:27:00.437680 systemd[1]: Started cri-containerd-b88815f8c762c7b53be9948f406c97ee0fe1d34af5c1abd595eaaac19054d5dd.scope - libcontainer container b88815f8c762c7b53be9948f406c97ee0fe1d34af5c1abd595eaaac19054d5dd. 
Sep 3 23:27:00.469977 containerd[1871]: time="2025-09-03T23:27:00.469938954Z" level=info msg="StartContainer for \"b88815f8c762c7b53be9948f406c97ee0fe1d34af5c1abd595eaaac19054d5dd\" returns successfully" Sep 3 23:27:00.910454 kubelet[3456]: E0903 23:27:00.910405 3456 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6rww2" podUID="43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9" Sep 3 23:27:01.824018 containerd[1871]: time="2025-09-03T23:27:01.823940612Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 3 23:27:01.827138 systemd[1]: cri-containerd-b88815f8c762c7b53be9948f406c97ee0fe1d34af5c1abd595eaaac19054d5dd.scope: Deactivated successfully. Sep 3 23:27:01.827379 systemd[1]: cri-containerd-b88815f8c762c7b53be9948f406c97ee0fe1d34af5c1abd595eaaac19054d5dd.scope: Consumed 313ms CPU time, 189.3M memory peak, 165.8M written to disk. 
Sep 3 23:27:01.828679 containerd[1871]: time="2025-09-03T23:27:01.828629224Z" level=info msg="received exit event container_id:\"b88815f8c762c7b53be9948f406c97ee0fe1d34af5c1abd595eaaac19054d5dd\" id:\"b88815f8c762c7b53be9948f406c97ee0fe1d34af5c1abd595eaaac19054d5dd\" pid:4227 exited_at:{seconds:1756942021 nanos:828273107}" Sep 3 23:27:01.828886 containerd[1871]: time="2025-09-03T23:27:01.828829019Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b88815f8c762c7b53be9948f406c97ee0fe1d34af5c1abd595eaaac19054d5dd\" id:\"b88815f8c762c7b53be9948f406c97ee0fe1d34af5c1abd595eaaac19054d5dd\" pid:4227 exited_at:{seconds:1756942021 nanos:828273107}" Sep 3 23:27:01.847739 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b88815f8c762c7b53be9948f406c97ee0fe1d34af5c1abd595eaaac19054d5dd-rootfs.mount: Deactivated successfully. Sep 3 23:27:01.861235 kubelet[3456]: I0903 23:27:01.861209 3456 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 3 23:27:02.315314 kubelet[3456]: I0903 23:27:02.085718 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzckg\" (UniqueName: \"kubernetes.io/projected/0bb920d0-73e4-40ad-a06a-66374e99ced3-kube-api-access-jzckg\") pod \"coredns-7c65d6cfc9-nvtt7\" (UID: \"0bb920d0-73e4-40ad-a06a-66374e99ced3\") " pod="kube-system/coredns-7c65d6cfc9-nvtt7" Sep 3 23:27:02.315314 kubelet[3456]: I0903 23:27:02.085763 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw4r2\" (UniqueName: \"kubernetes.io/projected/443c995a-7c9b-4cf9-bf5b-9413159cbdf5-kube-api-access-mw4r2\") pod \"calico-apiserver-549b6b9dbd-c6twc\" (UID: \"443c995a-7c9b-4cf9-bf5b-9413159cbdf5\") " pod="calico-apiserver/calico-apiserver-549b6b9dbd-c6twc" Sep 3 23:27:02.315314 kubelet[3456]: I0903 23:27:02.085776 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b345f694-032e-4f26-8e13-4a3335092435-calico-apiserver-certs\") pod \"calico-apiserver-549b6b9dbd-h5cx7\" (UID: \"b345f694-032e-4f26-8e13-4a3335092435\") " pod="calico-apiserver/calico-apiserver-549b6b9dbd-h5cx7" Sep 3 23:27:02.315314 kubelet[3456]: I0903 23:27:02.085790 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49b58582-70ce-49c0-bf27-d782f23aaab2-tigera-ca-bundle\") pod \"calico-kube-controllers-55f546bb6c-svbsf\" (UID: \"49b58582-70ce-49c0-bf27-d782f23aaab2\") " pod="calico-system/calico-kube-controllers-55f546bb6c-svbsf" Sep 3 23:27:02.315314 kubelet[3456]: I0903 23:27:02.085801 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de303087-ad15-475d-bdd3-49d0d4d476d9-whisker-ca-bundle\") pod \"whisker-864f5cd4f6-clbjf\" (UID: \"de303087-ad15-475d-bdd3-49d0d4d476d9\") " pod="calico-system/whisker-864f5cd4f6-clbjf" Sep 3 23:27:01.901022 systemd[1]: Created slice kubepods-burstable-pod8b08a17c_b086_43b4_863d_441927f3822c.slice - libcontainer container kubepods-burstable-pod8b08a17c_b086_43b4_863d_441927f3822c.slice. 
Sep 3 23:27:02.315805 kubelet[3456]: I0903 23:27:02.085814 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4g2p\" (UniqueName: \"kubernetes.io/projected/b345f694-032e-4f26-8e13-4a3335092435-kube-api-access-k4g2p\") pod \"calico-apiserver-549b6b9dbd-h5cx7\" (UID: \"b345f694-032e-4f26-8e13-4a3335092435\") " pod="calico-apiserver/calico-apiserver-549b6b9dbd-h5cx7" Sep 3 23:27:02.315805 kubelet[3456]: I0903 23:27:02.085827 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbq8d\" (UniqueName: \"kubernetes.io/projected/de303087-ad15-475d-bdd3-49d0d4d476d9-kube-api-access-wbq8d\") pod \"whisker-864f5cd4f6-clbjf\" (UID: \"de303087-ad15-475d-bdd3-49d0d4d476d9\") " pod="calico-system/whisker-864f5cd4f6-clbjf" Sep 3 23:27:02.315805 kubelet[3456]: I0903 23:27:02.085837 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65a1f6b3-d706-4773-8e38-f93a88d10c00-config\") pod \"goldmane-7988f88666-4qkvb\" (UID: \"65a1f6b3-d706-4773-8e38-f93a88d10c00\") " pod="calico-system/goldmane-7988f88666-4qkvb" Sep 3 23:27:02.315805 kubelet[3456]: I0903 23:27:02.085853 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq4vm\" (UniqueName: \"kubernetes.io/projected/49b58582-70ce-49c0-bf27-d782f23aaab2-kube-api-access-mq4vm\") pod \"calico-kube-controllers-55f546bb6c-svbsf\" (UID: \"49b58582-70ce-49c0-bf27-d782f23aaab2\") " pod="calico-system/calico-kube-controllers-55f546bb6c-svbsf" Sep 3 23:27:02.315805 kubelet[3456]: I0903 23:27:02.085864 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/65a1f6b3-d706-4773-8e38-f93a88d10c00-goldmane-key-pair\") pod \"goldmane-7988f88666-4qkvb\" (UID: 
\"65a1f6b3-d706-4773-8e38-f93a88d10c00\") " pod="calico-system/goldmane-7988f88666-4qkvb" Sep 3 23:27:01.921166 systemd[1]: Created slice kubepods-besteffort-podb345f694_032e_4f26_8e13_4a3335092435.slice - libcontainer container kubepods-besteffort-podb345f694_032e_4f26_8e13_4a3335092435.slice. Sep 3 23:27:02.315920 kubelet[3456]: I0903 23:27:02.085876 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/443c995a-7c9b-4cf9-bf5b-9413159cbdf5-calico-apiserver-certs\") pod \"calico-apiserver-549b6b9dbd-c6twc\" (UID: \"443c995a-7c9b-4cf9-bf5b-9413159cbdf5\") " pod="calico-apiserver/calico-apiserver-549b6b9dbd-c6twc" Sep 3 23:27:02.315920 kubelet[3456]: I0903 23:27:02.085888 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bb920d0-73e4-40ad-a06a-66374e99ced3-config-volume\") pod \"coredns-7c65d6cfc9-nvtt7\" (UID: \"0bb920d0-73e4-40ad-a06a-66374e99ced3\") " pod="kube-system/coredns-7c65d6cfc9-nvtt7" Sep 3 23:27:02.315920 kubelet[3456]: I0903 23:27:02.085897 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8b08a17c-b086-43b4-863d-441927f3822c-config-volume\") pod \"coredns-7c65d6cfc9-xhw6w\" (UID: \"8b08a17c-b086-43b4-863d-441927f3822c\") " pod="kube-system/coredns-7c65d6cfc9-xhw6w" Sep 3 23:27:02.315920 kubelet[3456]: I0903 23:27:02.085907 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a1f6b3-d706-4773-8e38-f93a88d10c00-goldmane-ca-bundle\") pod \"goldmane-7988f88666-4qkvb\" (UID: \"65a1f6b3-d706-4773-8e38-f93a88d10c00\") " pod="calico-system/goldmane-7988f88666-4qkvb" Sep 3 23:27:02.315920 kubelet[3456]: I0903 23:27:02.085918 3456 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8tp8\" (UniqueName: \"kubernetes.io/projected/65a1f6b3-d706-4773-8e38-f93a88d10c00-kube-api-access-n8tp8\") pod \"goldmane-7988f88666-4qkvb\" (UID: \"65a1f6b3-d706-4773-8e38-f93a88d10c00\") " pod="calico-system/goldmane-7988f88666-4qkvb" Sep 3 23:27:01.927719 systemd[1]: Created slice kubepods-besteffort-pod443c995a_7c9b_4cf9_bf5b_9413159cbdf5.slice - libcontainer container kubepods-besteffort-pod443c995a_7c9b_4cf9_bf5b_9413159cbdf5.slice. Sep 3 23:27:02.316037 kubelet[3456]: I0903 23:27:02.085927 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/de303087-ad15-475d-bdd3-49d0d4d476d9-whisker-backend-key-pair\") pod \"whisker-864f5cd4f6-clbjf\" (UID: \"de303087-ad15-475d-bdd3-49d0d4d476d9\") " pod="calico-system/whisker-864f5cd4f6-clbjf" Sep 3 23:27:02.316037 kubelet[3456]: I0903 23:27:02.085943 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lm7z\" (UniqueName: \"kubernetes.io/projected/8b08a17c-b086-43b4-863d-441927f3822c-kube-api-access-7lm7z\") pod \"coredns-7c65d6cfc9-xhw6w\" (UID: \"8b08a17c-b086-43b4-863d-441927f3822c\") " pod="kube-system/coredns-7c65d6cfc9-xhw6w" Sep 3 23:27:01.934308 systemd[1]: Created slice kubepods-besteffort-pod49b58582_70ce_49c0_bf27_d782f23aaab2.slice - libcontainer container kubepods-besteffort-pod49b58582_70ce_49c0_bf27_d782f23aaab2.slice. Sep 3 23:27:01.943875 systemd[1]: Created slice kubepods-besteffort-pod65a1f6b3_d706_4773_8e38_f93a88d10c00.slice - libcontainer container kubepods-besteffort-pod65a1f6b3_d706_4773_8e38_f93a88d10c00.slice. 
Sep 3 23:27:01.954318 systemd[1]: Created slice kubepods-burstable-pod0bb920d0_73e4_40ad_a06a_66374e99ced3.slice - libcontainer container kubepods-burstable-pod0bb920d0_73e4_40ad_a06a_66374e99ced3.slice. Sep 3 23:27:01.959675 systemd[1]: Created slice kubepods-besteffort-podde303087_ad15_475d_bdd3_49d0d4d476d9.slice - libcontainer container kubepods-besteffort-podde303087_ad15_475d_bdd3_49d0d4d476d9.slice. Sep 3 23:27:02.348779 systemd[1]: Created slice kubepods-besteffort-pod43cbfbbe_3cf1_4bc5_8e46_c7e46d49d7a9.slice - libcontainer container kubepods-besteffort-pod43cbfbbe_3cf1_4bc5_8e46_c7e46d49d7a9.slice. Sep 3 23:27:02.350865 containerd[1871]: time="2025-09-03T23:27:02.350816916Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6rww2,Uid:43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9,Namespace:calico-system,Attempt:0,}" Sep 3 23:27:02.620668 containerd[1871]: time="2025-09-03T23:27:02.620416762Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-864f5cd4f6-clbjf,Uid:de303087-ad15-475d-bdd3-49d0d4d476d9,Namespace:calico-system,Attempt:0,}" Sep 3 23:27:02.620668 containerd[1871]: time="2025-09-03T23:27:02.620493227Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-4qkvb,Uid:65a1f6b3-d706-4773-8e38-f93a88d10c00,Namespace:calico-system,Attempt:0,}" Sep 3 23:27:02.620668 containerd[1871]: time="2025-09-03T23:27:02.620418178Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xhw6w,Uid:8b08a17c-b086-43b4-863d-441927f3822c,Namespace:kube-system,Attempt:0,}" Sep 3 23:27:02.620668 containerd[1871]: time="2025-09-03T23:27:02.620612260Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549b6b9dbd-h5cx7,Uid:b345f694-032e-4f26-8e13-4a3335092435,Namespace:calico-apiserver,Attempt:0,}" Sep 3 23:27:02.620668 containerd[1871]: time="2025-09-03T23:27:02.620441210Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7c65d6cfc9-nvtt7,Uid:0bb920d0-73e4-40ad-a06a-66374e99ced3,Namespace:kube-system,Attempt:0,}" Sep 3 23:27:02.643258 containerd[1871]: time="2025-09-03T23:27:02.643196460Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549b6b9dbd-c6twc,Uid:443c995a-7c9b-4cf9-bf5b-9413159cbdf5,Namespace:calico-apiserver,Attempt:0,}" Sep 3 23:27:02.643361 containerd[1871]: time="2025-09-03T23:27:02.643195844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55f546bb6c-svbsf,Uid:49b58582-70ce-49c0-bf27-d782f23aaab2,Namespace:calico-system,Attempt:0,}" Sep 3 23:27:02.917397 containerd[1871]: time="2025-09-03T23:27:02.916709322Z" level=error msg="Failed to destroy network for sandbox \"7b70c62894faddd66a8736ea47f60bb1662ab27d6addf01823bfdf0d424cdbde\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.918824 systemd[1]: run-netns-cni\x2ddb93a2eb\x2d1789\x2df98e\x2d7560\x2d4d0f2a67f427.mount: Deactivated successfully. 
Sep 3 23:27:02.924954 containerd[1871]: time="2025-09-03T23:27:02.924686678Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6rww2,Uid:43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b70c62894faddd66a8736ea47f60bb1662ab27d6addf01823bfdf0d424cdbde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.925160 kubelet[3456]: E0903 23:27:02.925100 3456 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b70c62894faddd66a8736ea47f60bb1662ab27d6addf01823bfdf0d424cdbde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.925319 kubelet[3456]: E0903 23:27:02.925169 3456 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b70c62894faddd66a8736ea47f60bb1662ab27d6addf01823bfdf0d424cdbde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-6rww2" Sep 3 23:27:02.925319 kubelet[3456]: E0903 23:27:02.925185 3456 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b70c62894faddd66a8736ea47f60bb1662ab27d6addf01823bfdf0d424cdbde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-6rww2" Sep 3 
23:27:02.925319 kubelet[3456]: E0903 23:27:02.925229 3456 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-6rww2_calico-system(43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-6rww2_calico-system(43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7b70c62894faddd66a8736ea47f60bb1662ab27d6addf01823bfdf0d424cdbde\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-6rww2" podUID="43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9" Sep 3 23:27:02.942663 containerd[1871]: time="2025-09-03T23:27:02.942632258Z" level=error msg="Failed to destroy network for sandbox \"f997fdf5ae3bacab84e366ea3c9fc9b08244278d66ff11d02d9e024673f73c9f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.944929 systemd[1]: run-netns-cni\x2d94ea277a\x2dfc9e\x2d1334\x2d4c86\x2d57b0dc58b4d1.mount: Deactivated successfully. 
Sep 3 23:27:02.948087 containerd[1871]: time="2025-09-03T23:27:02.948054809Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-4qkvb,Uid:65a1f6b3-d706-4773-8e38-f93a88d10c00,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f997fdf5ae3bacab84e366ea3c9fc9b08244278d66ff11d02d9e024673f73c9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.948410 kubelet[3456]: E0903 23:27:02.948335 3456 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f997fdf5ae3bacab84e366ea3c9fc9b08244278d66ff11d02d9e024673f73c9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.948410 kubelet[3456]: E0903 23:27:02.948395 3456 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f997fdf5ae3bacab84e366ea3c9fc9b08244278d66ff11d02d9e024673f73c9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-4qkvb" Sep 3 23:27:02.948410 kubelet[3456]: E0903 23:27:02.948410 3456 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f997fdf5ae3bacab84e366ea3c9fc9b08244278d66ff11d02d9e024673f73c9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-7988f88666-4qkvb" Sep 3 23:27:02.948683 kubelet[3456]: E0903 23:27:02.948441 3456 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-4qkvb_calico-system(65a1f6b3-d706-4773-8e38-f93a88d10c00)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-4qkvb_calico-system(65a1f6b3-d706-4773-8e38-f93a88d10c00)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f997fdf5ae3bacab84e366ea3c9fc9b08244278d66ff11d02d9e024673f73c9f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-4qkvb" podUID="65a1f6b3-d706-4773-8e38-f93a88d10c00" Sep 3 23:27:02.954052 containerd[1871]: time="2025-09-03T23:27:02.953368438Z" level=error msg="Failed to destroy network for sandbox \"aea7f73ae42864faf2547c633c33172fc3f94df8516e980e9e7f3bc20bf76ebb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.956401 systemd[1]: run-netns-cni\x2d4c733ba8\x2d05aa\x2dd827\x2d65c3\x2df5703fa95ea8.mount: Deactivated successfully. 
Sep 3 23:27:02.959277 containerd[1871]: time="2025-09-03T23:27:02.959180770Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xhw6w,Uid:8b08a17c-b086-43b4-863d-441927f3822c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"aea7f73ae42864faf2547c633c33172fc3f94df8516e980e9e7f3bc20bf76ebb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.959404 kubelet[3456]: E0903 23:27:02.959375 3456 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aea7f73ae42864faf2547c633c33172fc3f94df8516e980e9e7f3bc20bf76ebb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.959610 kubelet[3456]: E0903 23:27:02.959412 3456 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aea7f73ae42864faf2547c633c33172fc3f94df8516e980e9e7f3bc20bf76ebb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-xhw6w" Sep 3 23:27:02.959610 kubelet[3456]: E0903 23:27:02.959428 3456 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aea7f73ae42864faf2547c633c33172fc3f94df8516e980e9e7f3bc20bf76ebb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-xhw6w" 
Sep 3 23:27:02.959610 kubelet[3456]: E0903 23:27:02.959451 3456 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-xhw6w_kube-system(8b08a17c-b086-43b4-863d-441927f3822c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-xhw6w_kube-system(8b08a17c-b086-43b4-863d-441927f3822c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aea7f73ae42864faf2547c633c33172fc3f94df8516e980e9e7f3bc20bf76ebb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-xhw6w" podUID="8b08a17c-b086-43b4-863d-441927f3822c" Sep 3 23:27:02.980123 containerd[1871]: time="2025-09-03T23:27:02.979983848Z" level=error msg="Failed to destroy network for sandbox \"7132fe54705540f76c70fa9540623e9b1069d95faf6ffc2a05be38f9eaeeeb5d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.982479 systemd[1]: run-netns-cni\x2d78630523\x2dcaad\x2dea45\x2da098\x2d88b5d8e19338.mount: Deactivated successfully. 
Sep 3 23:27:02.985140 containerd[1871]: time="2025-09-03T23:27:02.984882559Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-864f5cd4f6-clbjf,Uid:de303087-ad15-475d-bdd3-49d0d4d476d9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7132fe54705540f76c70fa9540623e9b1069d95faf6ffc2a05be38f9eaeeeb5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.985335 kubelet[3456]: E0903 23:27:02.985269 3456 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7132fe54705540f76c70fa9540623e9b1069d95faf6ffc2a05be38f9eaeeeb5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.985335 kubelet[3456]: E0903 23:27:02.985324 3456 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7132fe54705540f76c70fa9540623e9b1069d95faf6ffc2a05be38f9eaeeeb5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-864f5cd4f6-clbjf" Sep 3 23:27:02.985396 kubelet[3456]: E0903 23:27:02.985338 3456 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7132fe54705540f76c70fa9540623e9b1069d95faf6ffc2a05be38f9eaeeeb5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/whisker-864f5cd4f6-clbjf" Sep 3 23:27:02.985616 kubelet[3456]: E0903 23:27:02.985415 3456 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-864f5cd4f6-clbjf_calico-system(de303087-ad15-475d-bdd3-49d0d4d476d9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-864f5cd4f6-clbjf_calico-system(de303087-ad15-475d-bdd3-49d0d4d476d9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7132fe54705540f76c70fa9540623e9b1069d95faf6ffc2a05be38f9eaeeeb5d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-864f5cd4f6-clbjf" podUID="de303087-ad15-475d-bdd3-49d0d4d476d9" Sep 3 23:27:02.987877 containerd[1871]: time="2025-09-03T23:27:02.987850922Z" level=error msg="Failed to destroy network for sandbox \"f565348dab4dd8be2e01cf13b272047ad06236e27b9e4e3d28a3337e4610f1d6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.992052 containerd[1871]: time="2025-09-03T23:27:02.992010550Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549b6b9dbd-h5cx7,Uid:b345f694-032e-4f26-8e13-4a3335092435,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f565348dab4dd8be2e01cf13b272047ad06236e27b9e4e3d28a3337e4610f1d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.992474 kubelet[3456]: E0903 23:27:02.992445 3456 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"f565348dab4dd8be2e01cf13b272047ad06236e27b9e4e3d28a3337e4610f1d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.992984 kubelet[3456]: E0903 23:27:02.992915 3456 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f565348dab4dd8be2e01cf13b272047ad06236e27b9e4e3d28a3337e4610f1d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549b6b9dbd-h5cx7" Sep 3 23:27:02.992984 kubelet[3456]: E0903 23:27:02.992946 3456 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f565348dab4dd8be2e01cf13b272047ad06236e27b9e4e3d28a3337e4610f1d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549b6b9dbd-h5cx7" Sep 3 23:27:02.992984 kubelet[3456]: E0903 23:27:02.992979 3456 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-549b6b9dbd-h5cx7_calico-apiserver(b345f694-032e-4f26-8e13-4a3335092435)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-549b6b9dbd-h5cx7_calico-apiserver(b345f694-032e-4f26-8e13-4a3335092435)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f565348dab4dd8be2e01cf13b272047ad06236e27b9e4e3d28a3337e4610f1d6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-549b6b9dbd-h5cx7" podUID="b345f694-032e-4f26-8e13-4a3335092435" Sep 3 23:27:02.996074 containerd[1871]: time="2025-09-03T23:27:02.996036089Z" level=error msg="Failed to destroy network for sandbox \"472b864bef46e324e7721ca31fd4a396d9a94f0f73e326912d8cc9640d02e282\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:02.999924 containerd[1871]: time="2025-09-03T23:27:02.999845040Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549b6b9dbd-c6twc,Uid:443c995a-7c9b-4cf9-bf5b-9413159cbdf5,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"472b864bef46e324e7721ca31fd4a396d9a94f0f73e326912d8cc9640d02e282\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:03.000034 kubelet[3456]: E0903 23:27:03.000018 3456 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"472b864bef46e324e7721ca31fd4a396d9a94f0f73e326912d8cc9640d02e282\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:03.000170 kubelet[3456]: E0903 23:27:03.000047 3456 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"472b864bef46e324e7721ca31fd4a396d9a94f0f73e326912d8cc9640d02e282\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-549b6b9dbd-c6twc" Sep 3 23:27:03.000216 kubelet[3456]: E0903 23:27:03.000173 3456 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"472b864bef46e324e7721ca31fd4a396d9a94f0f73e326912d8cc9640d02e282\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549b6b9dbd-c6twc" Sep 3 23:27:03.000356 kubelet[3456]: E0903 23:27:03.000298 3456 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-549b6b9dbd-c6twc_calico-apiserver(443c995a-7c9b-4cf9-bf5b-9413159cbdf5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-549b6b9dbd-c6twc_calico-apiserver(443c995a-7c9b-4cf9-bf5b-9413159cbdf5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"472b864bef46e324e7721ca31fd4a396d9a94f0f73e326912d8cc9640d02e282\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-549b6b9dbd-c6twc" podUID="443c995a-7c9b-4cf9-bf5b-9413159cbdf5" Sep 3 23:27:03.005891 containerd[1871]: time="2025-09-03T23:27:03.005106124Z" level=error msg="Failed to destroy network for sandbox \"d388ffeeaae2b40af634e4d09c06e9ce810b99333a22d60b8b3d579cc8c8d241\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:03.009327 containerd[1871]: time="2025-09-03T23:27:03.009146183Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 3 23:27:03.009997 containerd[1871]: 
time="2025-09-03T23:27:03.009962987Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-nvtt7,Uid:0bb920d0-73e4-40ad-a06a-66374e99ced3,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d388ffeeaae2b40af634e4d09c06e9ce810b99333a22d60b8b3d579cc8c8d241\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:03.010078 containerd[1871]: time="2025-09-03T23:27:03.010062548Z" level=error msg="Failed to destroy network for sandbox \"ec353c4e9ac417f2f827f8d829a65024709bd61b79363ba461430249fb4125f1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:03.010603 kubelet[3456]: E0903 23:27:03.010576 3456 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d388ffeeaae2b40af634e4d09c06e9ce810b99333a22d60b8b3d579cc8c8d241\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:03.010661 kubelet[3456]: E0903 23:27:03.010614 3456 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d388ffeeaae2b40af634e4d09c06e9ce810b99333a22d60b8b3d579cc8c8d241\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-nvtt7" Sep 3 23:27:03.010661 kubelet[3456]: E0903 23:27:03.010627 3456 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"d388ffeeaae2b40af634e4d09c06e9ce810b99333a22d60b8b3d579cc8c8d241\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-nvtt7" Sep 3 23:27:03.010661 kubelet[3456]: E0903 23:27:03.010650 3456 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-nvtt7_kube-system(0bb920d0-73e4-40ad-a06a-66374e99ced3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-nvtt7_kube-system(0bb920d0-73e4-40ad-a06a-66374e99ced3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d388ffeeaae2b40af634e4d09c06e9ce810b99333a22d60b8b3d579cc8c8d241\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-nvtt7" podUID="0bb920d0-73e4-40ad-a06a-66374e99ced3" Sep 3 23:27:03.020870 containerd[1871]: time="2025-09-03T23:27:03.020832176Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55f546bb6c-svbsf,Uid:49b58582-70ce-49c0-bf27-d782f23aaab2,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec353c4e9ac417f2f827f8d829a65024709bd61b79363ba461430249fb4125f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:03.021022 kubelet[3456]: E0903 23:27:03.020994 3456 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"ec353c4e9ac417f2f827f8d829a65024709bd61b79363ba461430249fb4125f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 3 23:27:03.021078 kubelet[3456]: E0903 23:27:03.021041 3456 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec353c4e9ac417f2f827f8d829a65024709bd61b79363ba461430249fb4125f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-55f546bb6c-svbsf" Sep 3 23:27:03.021078 kubelet[3456]: E0903 23:27:03.021056 3456 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec353c4e9ac417f2f827f8d829a65024709bd61b79363ba461430249fb4125f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-55f546bb6c-svbsf" Sep 3 23:27:03.021118 kubelet[3456]: E0903 23:27:03.021084 3456 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-55f546bb6c-svbsf_calico-system(49b58582-70ce-49c0-bf27-d782f23aaab2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-55f546bb6c-svbsf_calico-system(49b58582-70ce-49c0-bf27-d782f23aaab2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ec353c4e9ac417f2f827f8d829a65024709bd61b79363ba461430249fb4125f1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/calico-kube-controllers-55f546bb6c-svbsf" podUID="49b58582-70ce-49c0-bf27-d782f23aaab2" Sep 3 23:27:03.845771 systemd[1]: run-netns-cni\x2d48f66c2e\x2dc271\x2d62a3\x2d6c0b\x2d516eb767dc63.mount: Deactivated successfully. Sep 3 23:27:03.846034 systemd[1]: run-netns-cni\x2d20db75b3\x2d7eb1\x2dfda4\x2dd05e\x2de853e378c0d3.mount: Deactivated successfully. Sep 3 23:27:03.846149 systemd[1]: run-netns-cni\x2dc964a170\x2d3876\x2d9bc9\x2dafad\x2d63b39ef736b5.mount: Deactivated successfully. Sep 3 23:27:03.846247 systemd[1]: run-netns-cni\x2dd2dd348d\x2da9ea\x2df29f\x2d364e\x2dde482481de31.mount: Deactivated successfully. Sep 3 23:27:06.708123 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3503583973.mount: Deactivated successfully. Sep 3 23:27:07.130551 containerd[1871]: time="2025-09-03T23:27:07.130221974Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:07.137880 containerd[1871]: time="2025-09-03T23:27:07.137843949Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 3 23:27:07.142646 containerd[1871]: time="2025-09-03T23:27:07.142599779Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:07.147000 containerd[1871]: time="2025-09-03T23:27:07.146957338Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:07.147497 containerd[1871]: time="2025-09-03T23:27:07.147218470Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag 
\"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 4.138019831s" Sep 3 23:27:07.147497 containerd[1871]: time="2025-09-03T23:27:07.147244694Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 3 23:27:07.159180 containerd[1871]: time="2025-09-03T23:27:07.159158629Z" level=info msg="CreateContainer within sandbox \"5c3c193a200ef529cac866ace813456a1ff66d96668416a0072a18650e19b690\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 3 23:27:07.176429 containerd[1871]: time="2025-09-03T23:27:07.176402169Z" level=info msg="Container e31a4f5664542daa5c4c87d8ba49fad1b32bc5ac87370951468a2ddb150232d8: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:07.209916 containerd[1871]: time="2025-09-03T23:27:07.209876010Z" level=info msg="CreateContainer within sandbox \"5c3c193a200ef529cac866ace813456a1ff66d96668416a0072a18650e19b690\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"e31a4f5664542daa5c4c87d8ba49fad1b32bc5ac87370951468a2ddb150232d8\"" Sep 3 23:27:07.210331 containerd[1871]: time="2025-09-03T23:27:07.210308536Z" level=info msg="StartContainer for \"e31a4f5664542daa5c4c87d8ba49fad1b32bc5ac87370951468a2ddb150232d8\"" Sep 3 23:27:07.211769 containerd[1871]: time="2025-09-03T23:27:07.211669212Z" level=info msg="connecting to shim e31a4f5664542daa5c4c87d8ba49fad1b32bc5ac87370951468a2ddb150232d8" address="unix:///run/containerd/s/1f231755159e60f1bf13ab9a43fd285762be97b022dacecfe74b70b2aeac1d23" protocol=ttrpc version=3 Sep 3 23:27:07.234865 systemd[1]: Started cri-containerd-e31a4f5664542daa5c4c87d8ba49fad1b32bc5ac87370951468a2ddb150232d8.scope - libcontainer container e31a4f5664542daa5c4c87d8ba49fad1b32bc5ac87370951468a2ddb150232d8. 
Sep 3 23:27:07.271894 containerd[1871]: time="2025-09-03T23:27:07.271863348Z" level=info msg="StartContainer for \"e31a4f5664542daa5c4c87d8ba49fad1b32bc5ac87370951468a2ddb150232d8\" returns successfully" Sep 3 23:27:07.712295 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 3 23:27:07.712409 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 3 23:27:07.918612 kubelet[3456]: I0903 23:27:07.918499 3456 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbq8d\" (UniqueName: \"kubernetes.io/projected/de303087-ad15-475d-bdd3-49d0d4d476d9-kube-api-access-wbq8d\") pod \"de303087-ad15-475d-bdd3-49d0d4d476d9\" (UID: \"de303087-ad15-475d-bdd3-49d0d4d476d9\") " Sep 3 23:27:07.918612 kubelet[3456]: I0903 23:27:07.918576 3456 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de303087-ad15-475d-bdd3-49d0d4d476d9-whisker-ca-bundle\") pod \"de303087-ad15-475d-bdd3-49d0d4d476d9\" (UID: \"de303087-ad15-475d-bdd3-49d0d4d476d9\") " Sep 3 23:27:07.919633 kubelet[3456]: I0903 23:27:07.919013 3456 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/de303087-ad15-475d-bdd3-49d0d4d476d9-whisker-backend-key-pair\") pod \"de303087-ad15-475d-bdd3-49d0d4d476d9\" (UID: \"de303087-ad15-475d-bdd3-49d0d4d476d9\") " Sep 3 23:27:07.923085 systemd[1]: var-lib-kubelet-pods-de303087\x2dad15\x2d475d\x2dbdd3\x2d49d0d4d476d9-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Sep 3 23:27:07.924601 kubelet[3456]: I0903 23:27:07.923757 3456 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de303087-ad15-475d-bdd3-49d0d4d476d9-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "de303087-ad15-475d-bdd3-49d0d4d476d9" (UID: "de303087-ad15-475d-bdd3-49d0d4d476d9"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 3 23:27:07.924601 kubelet[3456]: I0903 23:27:07.923959 3456 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de303087-ad15-475d-bdd3-49d0d4d476d9-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "de303087-ad15-475d-bdd3-49d0d4d476d9" (UID: "de303087-ad15-475d-bdd3-49d0d4d476d9"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 3 23:27:07.926465 systemd[1]: var-lib-kubelet-pods-de303087\x2dad15\x2d475d\x2dbdd3\x2d49d0d4d476d9-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dwbq8d.mount: Deactivated successfully. Sep 3 23:27:07.927991 kubelet[3456]: I0903 23:27:07.927969 3456 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de303087-ad15-475d-bdd3-49d0d4d476d9-kube-api-access-wbq8d" (OuterVolumeSpecName: "kube-api-access-wbq8d") pod "de303087-ad15-475d-bdd3-49d0d4d476d9" (UID: "de303087-ad15-475d-bdd3-49d0d4d476d9"). InnerVolumeSpecName "kube-api-access-wbq8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 3 23:27:08.020321 kubelet[3456]: I0903 23:27:08.020202 3456 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbq8d\" (UniqueName: \"kubernetes.io/projected/de303087-ad15-475d-bdd3-49d0d4d476d9-kube-api-access-wbq8d\") on node \"ci-4372.1.0-n-71c6c07a75\" DevicePath \"\"" Sep 3 23:27:08.020321 kubelet[3456]: I0903 23:27:08.020222 3456 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de303087-ad15-475d-bdd3-49d0d4d476d9-whisker-ca-bundle\") on node \"ci-4372.1.0-n-71c6c07a75\" DevicePath \"\"" Sep 3 23:27:08.020321 kubelet[3456]: I0903 23:27:08.020229 3456 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/de303087-ad15-475d-bdd3-49d0d4d476d9-whisker-backend-key-pair\") on node \"ci-4372.1.0-n-71c6c07a75\" DevicePath \"\"" Sep 3 23:27:08.027566 systemd[1]: Removed slice kubepods-besteffort-podde303087_ad15_475d_bdd3_49d0d4d476d9.slice - libcontainer container kubepods-besteffort-podde303087_ad15_475d_bdd3_49d0d4d476d9.slice. Sep 3 23:27:08.066333 kubelet[3456]: I0903 23:27:08.066265 3456 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-6476q" podStartSLOduration=2.050504649 podStartE2EDuration="16.066250574s" podCreationTimestamp="2025-09-03 23:26:52 +0000 UTC" firstStartedPulling="2025-09-03 23:26:53.132354798 +0000 UTC m=+19.393911859" lastFinishedPulling="2025-09-03 23:27:07.148100723 +0000 UTC m=+33.409657784" observedRunningTime="2025-09-03 23:27:08.044166292 +0000 UTC m=+34.305723457" watchObservedRunningTime="2025-09-03 23:27:08.066250574 +0000 UTC m=+34.327807635" Sep 3 23:27:08.147129 systemd[1]: Created slice kubepods-besteffort-pod27efc715_4d47_43f5_93c0_8c0f556faf11.slice - libcontainer container kubepods-besteffort-pod27efc715_4d47_43f5_93c0_8c0f556faf11.slice. 
Sep 3 23:27:08.221373 kubelet[3456]: I0903 23:27:08.221324 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/27efc715-4d47-43f5-93c0-8c0f556faf11-whisker-backend-key-pair\") pod \"whisker-fb477d79f-2sqt7\" (UID: \"27efc715-4d47-43f5-93c0-8c0f556faf11\") " pod="calico-system/whisker-fb477d79f-2sqt7" Sep 3 23:27:08.221642 kubelet[3456]: I0903 23:27:08.221360 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27efc715-4d47-43f5-93c0-8c0f556faf11-whisker-ca-bundle\") pod \"whisker-fb477d79f-2sqt7\" (UID: \"27efc715-4d47-43f5-93c0-8c0f556faf11\") " pod="calico-system/whisker-fb477d79f-2sqt7" Sep 3 23:27:08.221642 kubelet[3456]: I0903 23:27:08.221596 3456 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnblc\" (UniqueName: \"kubernetes.io/projected/27efc715-4d47-43f5-93c0-8c0f556faf11-kube-api-access-hnblc\") pod \"whisker-fb477d79f-2sqt7\" (UID: \"27efc715-4d47-43f5-93c0-8c0f556faf11\") " pod="calico-system/whisker-fb477d79f-2sqt7" Sep 3 23:27:08.449545 containerd[1871]: time="2025-09-03T23:27:08.449324285Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-fb477d79f-2sqt7,Uid:27efc715-4d47-43f5-93c0-8c0f556faf11,Namespace:calico-system,Attempt:0,}" Sep 3 23:27:08.569125 systemd-networkd[1698]: calie8ecda0bdaf: Link UP Sep 3 23:27:08.569795 systemd-networkd[1698]: calie8ecda0bdaf: Gained carrier Sep 3 23:27:08.586889 containerd[1871]: 2025-09-03 23:27:08.474 [INFO][4569] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 3 23:27:08.586889 containerd[1871]: 2025-09-03 23:27:08.496 [INFO][4569] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4372.1.0--n--71c6c07a75-k8s-whisker--fb477d79f--2sqt7-eth0 whisker-fb477d79f- calico-system 27efc715-4d47-43f5-93c0-8c0f556faf11 855 0 2025-09-03 23:27:08 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:fb477d79f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4372.1.0-n-71c6c07a75 whisker-fb477d79f-2sqt7 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calie8ecda0bdaf [] [] }} ContainerID="dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" Namespace="calico-system" Pod="whisker-fb477d79f-2sqt7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-whisker--fb477d79f--2sqt7-" Sep 3 23:27:08.586889 containerd[1871]: 2025-09-03 23:27:08.496 [INFO][4569] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" Namespace="calico-system" Pod="whisker-fb477d79f-2sqt7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-whisker--fb477d79f--2sqt7-eth0" Sep 3 23:27:08.586889 containerd[1871]: 2025-09-03 23:27:08.514 [INFO][4582] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" HandleID="k8s-pod-network.dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" Workload="ci--4372.1.0--n--71c6c07a75-k8s-whisker--fb477d79f--2sqt7-eth0" Sep 3 23:27:08.587061 containerd[1871]: 2025-09-03 23:27:08.514 [INFO][4582] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" HandleID="k8s-pod-network.dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" Workload="ci--4372.1.0--n--71c6c07a75-k8s-whisker--fb477d79f--2sqt7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b610), Attrs:map[string]string{"namespace":"calico-system", 
"node":"ci-4372.1.0-n-71c6c07a75", "pod":"whisker-fb477d79f-2sqt7", "timestamp":"2025-09-03 23:27:08.513987118 +0000 UTC"}, Hostname:"ci-4372.1.0-n-71c6c07a75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:08.587061 containerd[1871]: 2025-09-03 23:27:08.514 [INFO][4582] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:08.587061 containerd[1871]: 2025-09-03 23:27:08.514 [INFO][4582] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 3 23:27:08.587061 containerd[1871]: 2025-09-03 23:27:08.514 [INFO][4582] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-71c6c07a75' Sep 3 23:27:08.587061 containerd[1871]: 2025-09-03 23:27:08.518 [INFO][4582] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:08.587061 containerd[1871]: 2025-09-03 23:27:08.521 [INFO][4582] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:08.587061 containerd[1871]: 2025-09-03 23:27:08.524 [INFO][4582] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:08.587061 containerd[1871]: 2025-09-03 23:27:08.525 [INFO][4582] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:08.587061 containerd[1871]: 2025-09-03 23:27:08.527 [INFO][4582] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:08.587193 containerd[1871]: 2025-09-03 23:27:08.527 [INFO][4582] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.0/26 
handle="k8s-pod-network.dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:08.587193 containerd[1871]: 2025-09-03 23:27:08.528 [INFO][4582] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164 Sep 3 23:27:08.587193 containerd[1871]: 2025-09-03 23:27:08.537 [INFO][4582] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:08.587193 containerd[1871]: 2025-09-03 23:27:08.544 [INFO][4582] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.91.1/26] block=192.168.91.0/26 handle="k8s-pod-network.dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:08.587193 containerd[1871]: 2025-09-03 23:27:08.544 [INFO][4582] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.1/26] handle="k8s-pod-network.dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:08.587193 containerd[1871]: 2025-09-03 23:27:08.545 [INFO][4582] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 3 23:27:08.587193 containerd[1871]: 2025-09-03 23:27:08.545 [INFO][4582] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.1/26] IPv6=[] ContainerID="dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" HandleID="k8s-pod-network.dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" Workload="ci--4372.1.0--n--71c6c07a75-k8s-whisker--fb477d79f--2sqt7-eth0" Sep 3 23:27:08.587287 containerd[1871]: 2025-09-03 23:27:08.547 [INFO][4569] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" Namespace="calico-system" Pod="whisker-fb477d79f-2sqt7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-whisker--fb477d79f--2sqt7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-whisker--fb477d79f--2sqt7-eth0", GenerateName:"whisker-fb477d79f-", Namespace:"calico-system", SelfLink:"", UID:"27efc715-4d47-43f5-93c0-8c0f556faf11", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 27, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"fb477d79f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"", Pod:"whisker-fb477d79f-2sqt7", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.91.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"calie8ecda0bdaf", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:08.587287 containerd[1871]: 2025-09-03 23:27:08.547 [INFO][4569] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.1/32] ContainerID="dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" Namespace="calico-system" Pod="whisker-fb477d79f-2sqt7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-whisker--fb477d79f--2sqt7-eth0" Sep 3 23:27:08.587335 containerd[1871]: 2025-09-03 23:27:08.547 [INFO][4569] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie8ecda0bdaf ContainerID="dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" Namespace="calico-system" Pod="whisker-fb477d79f-2sqt7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-whisker--fb477d79f--2sqt7-eth0" Sep 3 23:27:08.587335 containerd[1871]: 2025-09-03 23:27:08.569 [INFO][4569] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" Namespace="calico-system" Pod="whisker-fb477d79f-2sqt7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-whisker--fb477d79f--2sqt7-eth0" Sep 3 23:27:08.587363 containerd[1871]: 2025-09-03 23:27:08.569 [INFO][4569] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" Namespace="calico-system" Pod="whisker-fb477d79f-2sqt7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-whisker--fb477d79f--2sqt7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-whisker--fb477d79f--2sqt7-eth0", GenerateName:"whisker-fb477d79f-", Namespace:"calico-system", SelfLink:"", UID:"27efc715-4d47-43f5-93c0-8c0f556faf11", 
ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 27, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"fb477d79f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164", Pod:"whisker-fb477d79f-2sqt7", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.91.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie8ecda0bdaf", MAC:"66:b6:13:38:d7:a3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:08.587399 containerd[1871]: 2025-09-03 23:27:08.583 [INFO][4569] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" Namespace="calico-system" Pod="whisker-fb477d79f-2sqt7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-whisker--fb477d79f--2sqt7-eth0" Sep 3 23:27:08.625305 containerd[1871]: time="2025-09-03T23:27:08.625233024Z" level=info msg="connecting to shim dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164" address="unix:///run/containerd/s/edb2b3c3fcd2b5c7f6bbb93797b8f01134e9d4879fa44868b358ec4c6d541959" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:08.644648 systemd[1]: Started cri-containerd-dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164.scope - libcontainer container 
dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164. Sep 3 23:27:08.676880 containerd[1871]: time="2025-09-03T23:27:08.676829018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-fb477d79f-2sqt7,Uid:27efc715-4d47-43f5-93c0-8c0f556faf11,Namespace:calico-system,Attempt:0,} returns sandbox id \"dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164\"" Sep 3 23:27:08.678421 containerd[1871]: time="2025-09-03T23:27:08.678393297Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 3 23:27:09.955870 kubelet[3456]: I0903 23:27:09.955675 3456 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de303087-ad15-475d-bdd3-49d0d4d476d9" path="/var/lib/kubelet/pods/de303087-ad15-475d-bdd3-49d0d4d476d9/volumes" Sep 3 23:27:10.499635 systemd-networkd[1698]: calie8ecda0bdaf: Gained IPv6LL Sep 3 23:27:10.911832 containerd[1871]: time="2025-09-03T23:27:10.911784164Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:10.915099 containerd[1871]: time="2025-09-03T23:27:10.915065172Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 3 23:27:10.918107 containerd[1871]: time="2025-09-03T23:27:10.918070784Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:10.921378 containerd[1871]: time="2025-09-03T23:27:10.921342192Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:10.921965 containerd[1871]: time="2025-09-03T23:27:10.921670069Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id 
\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 2.243246188s" Sep 3 23:27:10.921965 containerd[1871]: time="2025-09-03T23:27:10.921697349Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 3 23:27:10.924372 containerd[1871]: time="2025-09-03T23:27:10.924345924Z" level=info msg="CreateContainer within sandbox \"dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 3 23:27:10.946979 containerd[1871]: time="2025-09-03T23:27:10.946828668Z" level=info msg="Container d86268204a7e9badeb6c0ca6858625d8d05be826749c99b4f4e14247cfa79119: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:10.963858 containerd[1871]: time="2025-09-03T23:27:10.963819204Z" level=info msg="CreateContainer within sandbox \"dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"d86268204a7e9badeb6c0ca6858625d8d05be826749c99b4f4e14247cfa79119\"" Sep 3 23:27:10.965012 containerd[1871]: time="2025-09-03T23:27:10.964985126Z" level=info msg="StartContainer for \"d86268204a7e9badeb6c0ca6858625d8d05be826749c99b4f4e14247cfa79119\"" Sep 3 23:27:10.966018 containerd[1871]: time="2025-09-03T23:27:10.965891675Z" level=info msg="connecting to shim d86268204a7e9badeb6c0ca6858625d8d05be826749c99b4f4e14247cfa79119" address="unix:///run/containerd/s/edb2b3c3fcd2b5c7f6bbb93797b8f01134e9d4879fa44868b358ec4c6d541959" protocol=ttrpc version=3 Sep 3 23:27:10.985656 systemd[1]: Started cri-containerd-d86268204a7e9badeb6c0ca6858625d8d05be826749c99b4f4e14247cfa79119.scope - libcontainer container 
d86268204a7e9badeb6c0ca6858625d8d05be826749c99b4f4e14247cfa79119. Sep 3 23:27:11.081283 containerd[1871]: time="2025-09-03T23:27:11.081230408Z" level=info msg="StartContainer for \"d86268204a7e9badeb6c0ca6858625d8d05be826749c99b4f4e14247cfa79119\" returns successfully" Sep 3 23:27:11.082200 containerd[1871]: time="2025-09-03T23:27:11.082180078Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 3 23:27:11.912068 kubelet[3456]: I0903 23:27:11.912000 3456 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 3 23:27:11.968180 containerd[1871]: time="2025-09-03T23:27:11.968137323Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e31a4f5664542daa5c4c87d8ba49fad1b32bc5ac87370951468a2ddb150232d8\" id:\"aa90018da1eb11c609c0bf346290861d9e0557b0122a4c66e0ad43fed43075bb\" pid:4828 exit_status:1 exited_at:{seconds:1756942031 nanos:967683900}" Sep 3 23:27:12.031563 containerd[1871]: time="2025-09-03T23:27:12.031506089Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e31a4f5664542daa5c4c87d8ba49fad1b32bc5ac87370951468a2ddb150232d8\" id:\"1b73791eaf51505f338327d14c2efa6842464a8c0f939af14f50bae04b3d6603\" pid:4851 exit_status:1 exited_at:{seconds:1756942032 nanos:31242957}" Sep 3 23:27:12.758806 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2663381021.mount: Deactivated successfully. 
Sep 3 23:27:12.826957 containerd[1871]: time="2025-09-03T23:27:12.826880090Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:12.830277 containerd[1871]: time="2025-09-03T23:27:12.830247651Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 3 23:27:12.833229 containerd[1871]: time="2025-09-03T23:27:12.833187262Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:12.837256 containerd[1871]: time="2025-09-03T23:27:12.837208697Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:12.837730 containerd[1871]: time="2025-09-03T23:27:12.837614871Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 1.755319951s" Sep 3 23:27:12.837730 containerd[1871]: time="2025-09-03T23:27:12.837644487Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 3 23:27:12.840306 containerd[1871]: time="2025-09-03T23:27:12.840265726Z" level=info msg="CreateContainer within sandbox \"dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 3 23:27:12.864350 
containerd[1871]: time="2025-09-03T23:27:12.862918217Z" level=info msg="Container bda6a4b83d0f39da1d03dde86821448ed7a1b0629c258a6a96c26fe305599571: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:12.882245 containerd[1871]: time="2025-09-03T23:27:12.882212291Z" level=info msg="CreateContainer within sandbox \"dc48144539713c0d6076e911da57eaf92f69497eb140d6080e3fc56b65747164\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"bda6a4b83d0f39da1d03dde86821448ed7a1b0629c258a6a96c26fe305599571\"" Sep 3 23:27:12.883657 containerd[1871]: time="2025-09-03T23:27:12.883630656Z" level=info msg="StartContainer for \"bda6a4b83d0f39da1d03dde86821448ed7a1b0629c258a6a96c26fe305599571\"" Sep 3 23:27:12.884362 containerd[1871]: time="2025-09-03T23:27:12.884334858Z" level=info msg="connecting to shim bda6a4b83d0f39da1d03dde86821448ed7a1b0629c258a6a96c26fe305599571" address="unix:///run/containerd/s/edb2b3c3fcd2b5c7f6bbb93797b8f01134e9d4879fa44868b358ec4c6d541959" protocol=ttrpc version=3 Sep 3 23:27:12.904637 systemd[1]: Started cri-containerd-bda6a4b83d0f39da1d03dde86821448ed7a1b0629c258a6a96c26fe305599571.scope - libcontainer container bda6a4b83d0f39da1d03dde86821448ed7a1b0629c258a6a96c26fe305599571. 
Sep 3 23:27:12.938898 containerd[1871]: time="2025-09-03T23:27:12.938865711Z" level=info msg="StartContainer for \"bda6a4b83d0f39da1d03dde86821448ed7a1b0629c258a6a96c26fe305599571\" returns successfully" Sep 3 23:27:13.099545 kubelet[3456]: I0903 23:27:13.099421 3456 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-fb477d79f-2sqt7" podStartSLOduration=0.938954274 podStartE2EDuration="5.099404834s" podCreationTimestamp="2025-09-03 23:27:08 +0000 UTC" firstStartedPulling="2025-09-03 23:27:08.677997835 +0000 UTC m=+34.939554896" lastFinishedPulling="2025-09-03 23:27:12.838448395 +0000 UTC m=+39.100005456" observedRunningTime="2025-09-03 23:27:13.09871788 +0000 UTC m=+39.360274941" watchObservedRunningTime="2025-09-03 23:27:13.099404834 +0000 UTC m=+39.360961895" Sep 3 23:27:13.911087 containerd[1871]: time="2025-09-03T23:27:13.911010664Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6rww2,Uid:43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9,Namespace:calico-system,Attempt:0,}" Sep 3 23:27:14.002193 systemd-networkd[1698]: calif050a2aa5e4: Link UP Sep 3 23:27:14.003041 systemd-networkd[1698]: calif050a2aa5e4: Gained carrier Sep 3 23:27:14.017669 containerd[1871]: 2025-09-03 23:27:13.934 [INFO][4947] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 3 23:27:14.017669 containerd[1871]: 2025-09-03 23:27:13.941 [INFO][4947] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--71c6c07a75-k8s-csi--node--driver--6rww2-eth0 csi-node-driver- calico-system 43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9 685 0 2025-09-03 23:26:52 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s 
ci-4372.1.0-n-71c6c07a75 csi-node-driver-6rww2 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif050a2aa5e4 [] [] }} ContainerID="a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" Namespace="calico-system" Pod="csi-node-driver-6rww2" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-csi--node--driver--6rww2-" Sep 3 23:27:14.017669 containerd[1871]: 2025-09-03 23:27:13.941 [INFO][4947] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" Namespace="calico-system" Pod="csi-node-driver-6rww2" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-csi--node--driver--6rww2-eth0" Sep 3 23:27:14.017669 containerd[1871]: 2025-09-03 23:27:13.960 [INFO][4958] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" HandleID="k8s-pod-network.a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" Workload="ci--4372.1.0--n--71c6c07a75-k8s-csi--node--driver--6rww2-eth0" Sep 3 23:27:14.017839 containerd[1871]: 2025-09-03 23:27:13.960 [INFO][4958] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" HandleID="k8s-pod-network.a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" Workload="ci--4372.1.0--n--71c6c07a75-k8s-csi--node--driver--6rww2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002caff0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-n-71c6c07a75", "pod":"csi-node-driver-6rww2", "timestamp":"2025-09-03 23:27:13.960217191 +0000 UTC"}, Hostname:"ci-4372.1.0-n-71c6c07a75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:14.017839 containerd[1871]: 
2025-09-03 23:27:13.960 [INFO][4958] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:14.017839 containerd[1871]: 2025-09-03 23:27:13.960 [INFO][4958] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 3 23:27:14.017839 containerd[1871]: 2025-09-03 23:27:13.960 [INFO][4958] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-71c6c07a75' Sep 3 23:27:14.017839 containerd[1871]: 2025-09-03 23:27:13.966 [INFO][4958] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:14.017839 containerd[1871]: 2025-09-03 23:27:13.970 [INFO][4958] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:14.017839 containerd[1871]: 2025-09-03 23:27:13.974 [INFO][4958] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:14.017839 containerd[1871]: 2025-09-03 23:27:13.975 [INFO][4958] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:14.017839 containerd[1871]: 2025-09-03 23:27:13.977 [INFO][4958] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:14.017968 containerd[1871]: 2025-09-03 23:27:13.977 [INFO][4958] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:14.017968 containerd[1871]: 2025-09-03 23:27:13.979 [INFO][4958] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85 Sep 3 23:27:14.017968 containerd[1871]: 2025-09-03 23:27:13.986 [INFO][4958] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.0/26 
handle="k8s-pod-network.a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:14.017968 containerd[1871]: 2025-09-03 23:27:13.996 [INFO][4958] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.91.2/26] block=192.168.91.0/26 handle="k8s-pod-network.a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:14.017968 containerd[1871]: 2025-09-03 23:27:13.996 [INFO][4958] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.2/26] handle="k8s-pod-network.a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:14.017968 containerd[1871]: 2025-09-03 23:27:13.996 [INFO][4958] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:14.017968 containerd[1871]: 2025-09-03 23:27:13.996 [INFO][4958] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.2/26] IPv6=[] ContainerID="a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" HandleID="k8s-pod-network.a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" Workload="ci--4372.1.0--n--71c6c07a75-k8s-csi--node--driver--6rww2-eth0" Sep 3 23:27:14.018070 containerd[1871]: 2025-09-03 23:27:13.998 [INFO][4947] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" Namespace="calico-system" Pod="csi-node-driver-6rww2" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-csi--node--driver--6rww2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-csi--node--driver--6rww2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 52, 0, time.Local), 
DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"", Pod:"csi-node-driver-6rww2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.91.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif050a2aa5e4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:14.018103 containerd[1871]: 2025-09-03 23:27:13.998 [INFO][4947] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.2/32] ContainerID="a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" Namespace="calico-system" Pod="csi-node-driver-6rww2" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-csi--node--driver--6rww2-eth0" Sep 3 23:27:14.018103 containerd[1871]: 2025-09-03 23:27:13.998 [INFO][4947] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif050a2aa5e4 ContainerID="a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" Namespace="calico-system" Pod="csi-node-driver-6rww2" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-csi--node--driver--6rww2-eth0" Sep 3 23:27:14.018103 containerd[1871]: 2025-09-03 23:27:14.003 [INFO][4947] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" Namespace="calico-system" Pod="csi-node-driver-6rww2" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-csi--node--driver--6rww2-eth0" Sep 3 23:27:14.018143 containerd[1871]: 2025-09-03 23:27:14.003 [INFO][4947] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" Namespace="calico-system" Pod="csi-node-driver-6rww2" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-csi--node--driver--6rww2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-csi--node--driver--6rww2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85", Pod:"csi-node-driver-6rww2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.91.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"calif050a2aa5e4", MAC:"ca:15:b1:b1:86:a2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:14.018175 containerd[1871]: 2025-09-03 23:27:14.015 [INFO][4947] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" Namespace="calico-system" Pod="csi-node-driver-6rww2" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-csi--node--driver--6rww2-eth0" Sep 3 23:27:14.062334 containerd[1871]: time="2025-09-03T23:27:14.062298963Z" level=info msg="connecting to shim a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85" address="unix:///run/containerd/s/4e6044a33797c0f05f72e10acaba53461e9c424bbc41f5e4dcf2a2902de1f2b2" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:14.084628 systemd[1]: Started cri-containerd-a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85.scope - libcontainer container a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85. 
Sep 3 23:27:14.104474 containerd[1871]: time="2025-09-03T23:27:14.104439467Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6rww2,Uid:43cbfbbe-3cf1-4bc5-8e46-c7e46d49d7a9,Namespace:calico-system,Attempt:0,} returns sandbox id \"a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85\"" Sep 3 23:27:14.106110 containerd[1871]: time="2025-09-03T23:27:14.105860685Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 3 23:27:15.107687 systemd-networkd[1698]: calif050a2aa5e4: Gained IPv6LL Sep 3 23:27:15.180451 containerd[1871]: time="2025-09-03T23:27:15.180411140Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:15.183440 containerd[1871]: time="2025-09-03T23:27:15.183415105Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 3 23:27:15.186488 containerd[1871]: time="2025-09-03T23:27:15.186428862Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:15.190217 containerd[1871]: time="2025-09-03T23:27:15.190177801Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:15.190711 containerd[1871]: time="2025-09-03T23:27:15.190416819Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.084527838s" Sep 3 23:27:15.190711 containerd[1871]: time="2025-09-03T23:27:15.190443739Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 3 23:27:15.192062 containerd[1871]: time="2025-09-03T23:27:15.192030030Z" level=info msg="CreateContainer within sandbox \"a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 3 23:27:15.215469 containerd[1871]: time="2025-09-03T23:27:15.214796398Z" level=info msg="Container fc985e738d345a3945894e10319948361c3be0bce1d436c4a48dc5d35d53e117: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:15.219546 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1151745212.mount: Deactivated successfully. Sep 3 23:27:15.230293 containerd[1871]: time="2025-09-03T23:27:15.230260483Z" level=info msg="CreateContainer within sandbox \"a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"fc985e738d345a3945894e10319948361c3be0bce1d436c4a48dc5d35d53e117\"" Sep 3 23:27:15.230833 containerd[1871]: time="2025-09-03T23:27:15.230809863Z" level=info msg="StartContainer for \"fc985e738d345a3945894e10319948361c3be0bce1d436c4a48dc5d35d53e117\"" Sep 3 23:27:15.232081 containerd[1871]: time="2025-09-03T23:27:15.232038728Z" level=info msg="connecting to shim fc985e738d345a3945894e10319948361c3be0bce1d436c4a48dc5d35d53e117" address="unix:///run/containerd/s/4e6044a33797c0f05f72e10acaba53461e9c424bbc41f5e4dcf2a2902de1f2b2" protocol=ttrpc version=3 Sep 3 23:27:15.248627 systemd[1]: Started cri-containerd-fc985e738d345a3945894e10319948361c3be0bce1d436c4a48dc5d35d53e117.scope - libcontainer container fc985e738d345a3945894e10319948361c3be0bce1d436c4a48dc5d35d53e117. 
Sep 3 23:27:15.342786 containerd[1871]: time="2025-09-03T23:27:15.342746156Z" level=info msg="StartContainer for \"fc985e738d345a3945894e10319948361c3be0bce1d436c4a48dc5d35d53e117\" returns successfully" Sep 3 23:27:15.344533 containerd[1871]: time="2025-09-03T23:27:15.344251943Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 3 23:27:15.910296 containerd[1871]: time="2025-09-03T23:27:15.909966418Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55f546bb6c-svbsf,Uid:49b58582-70ce-49c0-bf27-d782f23aaab2,Namespace:calico-system,Attempt:0,}" Sep 3 23:27:16.002124 systemd-networkd[1698]: cali6574c21019b: Link UP Sep 3 23:27:16.003039 systemd-networkd[1698]: cali6574c21019b: Gained carrier Sep 3 23:27:16.020996 containerd[1871]: 2025-09-03 23:27:15.939 [INFO][5094] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 3 23:27:16.020996 containerd[1871]: 2025-09-03 23:27:15.952 [INFO][5094] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--71c6c07a75-k8s-calico--kube--controllers--55f546bb6c--svbsf-eth0 calico-kube-controllers-55f546bb6c- calico-system 49b58582-70ce-49c0-bf27-d782f23aaab2 792 0 2025-09-03 23:26:53 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:55f546bb6c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4372.1.0-n-71c6c07a75 calico-kube-controllers-55f546bb6c-svbsf eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali6574c21019b [] [] }} ContainerID="9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" Namespace="calico-system" Pod="calico-kube-controllers-55f546bb6c-svbsf" 
WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--kube--controllers--55f546bb6c--svbsf-" Sep 3 23:27:16.020996 containerd[1871]: 2025-09-03 23:27:15.952 [INFO][5094] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" Namespace="calico-system" Pod="calico-kube-controllers-55f546bb6c-svbsf" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--kube--controllers--55f546bb6c--svbsf-eth0" Sep 3 23:27:16.020996 containerd[1871]: 2025-09-03 23:27:15.969 [INFO][5106] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" HandleID="k8s-pod-network.9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" Workload="ci--4372.1.0--n--71c6c07a75-k8s-calico--kube--controllers--55f546bb6c--svbsf-eth0" Sep 3 23:27:16.021479 containerd[1871]: 2025-09-03 23:27:15.969 [INFO][5106] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" HandleID="k8s-pod-network.9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" Workload="ci--4372.1.0--n--71c6c07a75-k8s-calico--kube--controllers--55f546bb6c--svbsf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-n-71c6c07a75", "pod":"calico-kube-controllers-55f546bb6c-svbsf", "timestamp":"2025-09-03 23:27:15.969214308 +0000 UTC"}, Hostname:"ci-4372.1.0-n-71c6c07a75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:16.021479 containerd[1871]: 2025-09-03 23:27:15.969 [INFO][5106] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 3 23:27:16.021479 containerd[1871]: 2025-09-03 23:27:15.969 [INFO][5106] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 3 23:27:16.021479 containerd[1871]: 2025-09-03 23:27:15.969 [INFO][5106] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-71c6c07a75' Sep 3 23:27:16.021479 containerd[1871]: 2025-09-03 23:27:15.973 [INFO][5106] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:16.021479 containerd[1871]: 2025-09-03 23:27:15.976 [INFO][5106] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:16.021479 containerd[1871]: 2025-09-03 23:27:15.979 [INFO][5106] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:16.021479 containerd[1871]: 2025-09-03 23:27:15.980 [INFO][5106] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:16.021479 containerd[1871]: 2025-09-03 23:27:15.982 [INFO][5106] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:16.021667 containerd[1871]: 2025-09-03 23:27:15.982 [INFO][5106] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:16.021667 containerd[1871]: 2025-09-03 23:27:15.983 [INFO][5106] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d Sep 3 23:27:16.021667 containerd[1871]: 2025-09-03 23:27:15.991 [INFO][5106] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" 
host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:16.021667 containerd[1871]: 2025-09-03 23:27:15.997 [INFO][5106] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.91.3/26] block=192.168.91.0/26 handle="k8s-pod-network.9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:16.021667 containerd[1871]: 2025-09-03 23:27:15.997 [INFO][5106] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.3/26] handle="k8s-pod-network.9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:16.021667 containerd[1871]: 2025-09-03 23:27:15.997 [INFO][5106] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:16.021667 containerd[1871]: 2025-09-03 23:27:15.997 [INFO][5106] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.3/26] IPv6=[] ContainerID="9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" HandleID="k8s-pod-network.9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" Workload="ci--4372.1.0--n--71c6c07a75-k8s-calico--kube--controllers--55f546bb6c--svbsf-eth0" Sep 3 23:27:16.021764 containerd[1871]: 2025-09-03 23:27:15.999 [INFO][5094] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" Namespace="calico-system" Pod="calico-kube-controllers-55f546bb6c-svbsf" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--kube--controllers--55f546bb6c--svbsf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-calico--kube--controllers--55f546bb6c--svbsf-eth0", GenerateName:"calico-kube-controllers-55f546bb6c-", Namespace:"calico-system", SelfLink:"", UID:"49b58582-70ce-49c0-bf27-d782f23aaab2", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 53, 0, time.Local), 
DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"55f546bb6c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"", Pod:"calico-kube-controllers-55f546bb6c-svbsf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.91.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6574c21019b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:16.021803 containerd[1871]: 2025-09-03 23:27:15.999 [INFO][5094] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.3/32] ContainerID="9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" Namespace="calico-system" Pod="calico-kube-controllers-55f546bb6c-svbsf" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--kube--controllers--55f546bb6c--svbsf-eth0" Sep 3 23:27:16.021803 containerd[1871]: 2025-09-03 23:27:15.999 [INFO][5094] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6574c21019b ContainerID="9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" Namespace="calico-system" Pod="calico-kube-controllers-55f546bb6c-svbsf" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--kube--controllers--55f546bb6c--svbsf-eth0" Sep 3 23:27:16.021803 containerd[1871]: 2025-09-03 23:27:16.003 [INFO][5094] cni-plugin/dataplane_linux.go 508: Disabling 
IPv4 forwarding ContainerID="9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" Namespace="calico-system" Pod="calico-kube-controllers-55f546bb6c-svbsf" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--kube--controllers--55f546bb6c--svbsf-eth0" Sep 3 23:27:16.021972 containerd[1871]: 2025-09-03 23:27:16.004 [INFO][5094] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" Namespace="calico-system" Pod="calico-kube-controllers-55f546bb6c-svbsf" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--kube--controllers--55f546bb6c--svbsf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-calico--kube--controllers--55f546bb6c--svbsf-eth0", GenerateName:"calico-kube-controllers-55f546bb6c-", Namespace:"calico-system", SelfLink:"", UID:"49b58582-70ce-49c0-bf27-d782f23aaab2", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"55f546bb6c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d", Pod:"calico-kube-controllers-55f546bb6c-svbsf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", 
IPNetworks:[]string{"192.168.91.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6574c21019b", MAC:"5e:16:39:8a:7a:21", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:16.022074 containerd[1871]: 2025-09-03 23:27:16.018 [INFO][5094] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" Namespace="calico-system" Pod="calico-kube-controllers-55f546bb6c-svbsf" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--kube--controllers--55f546bb6c--svbsf-eth0" Sep 3 23:27:16.056391 containerd[1871]: time="2025-09-03T23:27:16.056312410Z" level=info msg="connecting to shim 9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d" address="unix:///run/containerd/s/a667368fc17ac1bcd80903687f8f5491fca153697a76577129d0214781baa916" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:16.076628 systemd[1]: Started cri-containerd-9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d.scope - libcontainer container 9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d. 
Sep 3 23:27:16.105404 containerd[1871]: time="2025-09-03T23:27:16.105375540Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55f546bb6c-svbsf,Uid:49b58582-70ce-49c0-bf27-d782f23aaab2,Namespace:calico-system,Attempt:0,} returns sandbox id \"9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d\"" Sep 3 23:27:16.581305 containerd[1871]: time="2025-09-03T23:27:16.580849131Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:16.583696 containerd[1871]: time="2025-09-03T23:27:16.583672743Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 3 23:27:16.587297 containerd[1871]: time="2025-09-03T23:27:16.587274729Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:16.591025 containerd[1871]: time="2025-09-03T23:27:16.591003107Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:16.591535 containerd[1871]: time="2025-09-03T23:27:16.591250325Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.246573163s" Sep 3 23:27:16.591535 containerd[1871]: time="2025-09-03T23:27:16.591277293Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns 
image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 3 23:27:16.593697 containerd[1871]: time="2025-09-03T23:27:16.593671758Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 3 23:27:16.594374 containerd[1871]: time="2025-09-03T23:27:16.594347003Z" level=info msg="CreateContainer within sandbox \"a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 3 23:27:16.619033 containerd[1871]: time="2025-09-03T23:27:16.618911344Z" level=info msg="Container dfc91439860ea2d170cbe6e949a2b7a91c1c69721d8bbc855db240ed2225424f: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:16.655723 containerd[1871]: time="2025-09-03T23:27:16.655676347Z" level=info msg="CreateContainer within sandbox \"a9d95e5499a08ec190479a7f2b315e7657d894236aa8e06e319dc3c770032e85\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"dfc91439860ea2d170cbe6e949a2b7a91c1c69721d8bbc855db240ed2225424f\"" Sep 3 23:27:16.657189 containerd[1871]: time="2025-09-03T23:27:16.657161141Z" level=info msg="StartContainer for \"dfc91439860ea2d170cbe6e949a2b7a91c1c69721d8bbc855db240ed2225424f\"" Sep 3 23:27:16.659727 containerd[1871]: time="2025-09-03T23:27:16.659699583Z" level=info msg="connecting to shim dfc91439860ea2d170cbe6e949a2b7a91c1c69721d8bbc855db240ed2225424f" address="unix:///run/containerd/s/4e6044a33797c0f05f72e10acaba53461e9c424bbc41f5e4dcf2a2902de1f2b2" protocol=ttrpc version=3 Sep 3 23:27:16.686889 systemd[1]: Started cri-containerd-dfc91439860ea2d170cbe6e949a2b7a91c1c69721d8bbc855db240ed2225424f.scope - libcontainer container dfc91439860ea2d170cbe6e949a2b7a91c1c69721d8bbc855db240ed2225424f. 
Sep 3 23:27:16.910419 containerd[1871]: time="2025-09-03T23:27:16.910290254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549b6b9dbd-h5cx7,Uid:b345f694-032e-4f26-8e13-4a3335092435,Namespace:calico-apiserver,Attempt:0,}" Sep 3 23:27:16.910419 containerd[1871]: time="2025-09-03T23:27:16.910410287Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-nvtt7,Uid:0bb920d0-73e4-40ad-a06a-66374e99ced3,Namespace:kube-system,Attempt:0,}" Sep 3 23:27:16.911175 containerd[1871]: time="2025-09-03T23:27:16.910560584Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-4qkvb,Uid:65a1f6b3-d706-4773-8e38-f93a88d10c00,Namespace:calico-system,Attempt:0,}" Sep 3 23:27:16.989105 kubelet[3456]: I0903 23:27:16.989075 3456 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 3 23:27:16.989105 kubelet[3456]: I0903 23:27:16.989109 3456 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 3 23:27:17.170826 containerd[1871]: time="2025-09-03T23:27:17.169854926Z" level=info msg="StartContainer for \"dfc91439860ea2d170cbe6e949a2b7a91c1c69721d8bbc855db240ed2225424f\" returns successfully" Sep 3 23:27:17.318630 systemd-networkd[1698]: calia4605a8a775: Link UP Sep 3 23:27:17.319575 systemd-networkd[1698]: calia4605a8a775: Gained carrier Sep 3 23:27:17.332613 containerd[1871]: 2025-09-03 23:27:17.218 [INFO][5223] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 3 23:27:17.332613 containerd[1871]: 2025-09-03 23:27:17.230 [INFO][5223] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--nvtt7-eth0 coredns-7c65d6cfc9- kube-system 0bb920d0-73e4-40ad-a06a-66374e99ced3 794 0 
2025-09-03 23:26:40 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372.1.0-n-71c6c07a75 coredns-7c65d6cfc9-nvtt7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia4605a8a775 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" Namespace="kube-system" Pod="coredns-7c65d6cfc9-nvtt7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--nvtt7-" Sep 3 23:27:17.332613 containerd[1871]: 2025-09-03 23:27:17.230 [INFO][5223] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" Namespace="kube-system" Pod="coredns-7c65d6cfc9-nvtt7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--nvtt7-eth0" Sep 3 23:27:17.332613 containerd[1871]: 2025-09-03 23:27:17.269 [INFO][5256] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" HandleID="k8s-pod-network.9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" Workload="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--nvtt7-eth0" Sep 3 23:27:17.332792 containerd[1871]: 2025-09-03 23:27:17.270 [INFO][5256] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" HandleID="k8s-pod-network.9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" Workload="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--nvtt7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372.1.0-n-71c6c07a75", "pod":"coredns-7c65d6cfc9-nvtt7", "timestamp":"2025-09-03 23:27:17.268601792 +0000 UTC"}, 
Hostname:"ci-4372.1.0-n-71c6c07a75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:17.332792 containerd[1871]: 2025-09-03 23:27:17.270 [INFO][5256] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:17.332792 containerd[1871]: 2025-09-03 23:27:17.270 [INFO][5256] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 3 23:27:17.332792 containerd[1871]: 2025-09-03 23:27:17.270 [INFO][5256] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-71c6c07a75' Sep 3 23:27:17.332792 containerd[1871]: 2025-09-03 23:27:17.277 [INFO][5256] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.332792 containerd[1871]: 2025-09-03 23:27:17.280 [INFO][5256] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.332792 containerd[1871]: 2025-09-03 23:27:17.282 [INFO][5256] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.332792 containerd[1871]: 2025-09-03 23:27:17.285 [INFO][5256] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.332792 containerd[1871]: 2025-09-03 23:27:17.286 [INFO][5256] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.332953 containerd[1871]: 2025-09-03 23:27:17.286 [INFO][5256] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.332953 containerd[1871]: 2025-09-03 23:27:17.287 [INFO][5256] 
ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517 Sep 3 23:27:17.332953 containerd[1871]: 2025-09-03 23:27:17.297 [INFO][5256] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.332953 containerd[1871]: 2025-09-03 23:27:17.308 [INFO][5256] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.91.4/26] block=192.168.91.0/26 handle="k8s-pod-network.9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.332953 containerd[1871]: 2025-09-03 23:27:17.308 [INFO][5256] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.4/26] handle="k8s-pod-network.9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.332953 containerd[1871]: 2025-09-03 23:27:17.308 [INFO][5256] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 3 23:27:17.332953 containerd[1871]: 2025-09-03 23:27:17.308 [INFO][5256] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.4/26] IPv6=[] ContainerID="9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" HandleID="k8s-pod-network.9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" Workload="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--nvtt7-eth0" Sep 3 23:27:17.333048 containerd[1871]: 2025-09-03 23:27:17.311 [INFO][5223] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" Namespace="kube-system" Pod="coredns-7c65d6cfc9-nvtt7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--nvtt7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--nvtt7-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"0bb920d0-73e4-40ad-a06a-66374e99ced3", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"", Pod:"coredns-7c65d6cfc9-nvtt7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calia4605a8a775", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:17.333048 containerd[1871]: 2025-09-03 23:27:17.311 [INFO][5223] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.4/32] ContainerID="9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" Namespace="kube-system" Pod="coredns-7c65d6cfc9-nvtt7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--nvtt7-eth0" Sep 3 23:27:17.333048 containerd[1871]: 2025-09-03 23:27:17.311 [INFO][5223] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia4605a8a775 ContainerID="9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" Namespace="kube-system" Pod="coredns-7c65d6cfc9-nvtt7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--nvtt7-eth0" Sep 3 23:27:17.333048 containerd[1871]: 2025-09-03 23:27:17.320 [INFO][5223] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" Namespace="kube-system" Pod="coredns-7c65d6cfc9-nvtt7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--nvtt7-eth0" Sep 3 23:27:17.333048 containerd[1871]: 2025-09-03 23:27:17.320 [INFO][5223] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" Namespace="kube-system" Pod="coredns-7c65d6cfc9-nvtt7" 
WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--nvtt7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--nvtt7-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"0bb920d0-73e4-40ad-a06a-66374e99ced3", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517", Pod:"coredns-7c65d6cfc9-nvtt7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia4605a8a775", MAC:"f6:56:c0:39:63:e0", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:17.333048 
containerd[1871]: 2025-09-03 23:27:17.329 [INFO][5223] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" Namespace="kube-system" Pod="coredns-7c65d6cfc9-nvtt7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--nvtt7-eth0" Sep 3 23:27:17.409910 systemd-networkd[1698]: caliee69136a7cd: Link UP Sep 3 23:27:17.411604 systemd-networkd[1698]: caliee69136a7cd: Gained carrier Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.243 [INFO][5243] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.253 [INFO][5243] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--h5cx7-eth0 calico-apiserver-549b6b9dbd- calico-apiserver b345f694-032e-4f26-8e13-4a3335092435 791 0 2025-09-03 23:26:49 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:549b6b9dbd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372.1.0-n-71c6c07a75 calico-apiserver-549b6b9dbd-h5cx7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliee69136a7cd [] [] }} ContainerID="2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-h5cx7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--h5cx7-" Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.253 [INFO][5243] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-h5cx7" 
WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--h5cx7-eth0" Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.273 [INFO][5267] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" HandleID="k8s-pod-network.2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" Workload="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--h5cx7-eth0" Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.273 [INFO][5267] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" HandleID="k8s-pod-network.2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" Workload="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--h5cx7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cf730), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372.1.0-n-71c6c07a75", "pod":"calico-apiserver-549b6b9dbd-h5cx7", "timestamp":"2025-09-03 23:27:17.2732006 +0000 UTC"}, Hostname:"ci-4372.1.0-n-71c6c07a75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.273 [INFO][5267] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.308 [INFO][5267] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.308 [INFO][5267] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-71c6c07a75' Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.378 [INFO][5267] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.383 [INFO][5267] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.386 [INFO][5267] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.387 [INFO][5267] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.389 [INFO][5267] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.389 [INFO][5267] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.390 [INFO][5267] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.398 [INFO][5267] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.403 [INFO][5267] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.91.5/26] block=192.168.91.0/26 handle="k8s-pod-network.2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.403 [INFO][5267] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.5/26] handle="k8s-pod-network.2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.403 [INFO][5267] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:17.427196 containerd[1871]: 2025-09-03 23:27:17.403 [INFO][5267] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.5/26] IPv6=[] ContainerID="2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" HandleID="k8s-pod-network.2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" Workload="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--h5cx7-eth0" Sep 3 23:27:17.428128 containerd[1871]: 2025-09-03 23:27:17.406 [INFO][5243] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-h5cx7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--h5cx7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--h5cx7-eth0", GenerateName:"calico-apiserver-549b6b9dbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"b345f694-032e-4f26-8e13-4a3335092435", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"549b6b9dbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"", Pod:"calico-apiserver-549b6b9dbd-h5cx7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliee69136a7cd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:17.428128 containerd[1871]: 2025-09-03 23:27:17.406 [INFO][5243] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.5/32] ContainerID="2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-h5cx7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--h5cx7-eth0" Sep 3 23:27:17.428128 containerd[1871]: 2025-09-03 23:27:17.406 [INFO][5243] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliee69136a7cd ContainerID="2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-h5cx7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--h5cx7-eth0" Sep 3 23:27:17.428128 containerd[1871]: 2025-09-03 23:27:17.412 [INFO][5243] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-h5cx7" 
WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--h5cx7-eth0" Sep 3 23:27:17.428128 containerd[1871]: 2025-09-03 23:27:17.412 [INFO][5243] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-h5cx7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--h5cx7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--h5cx7-eth0", GenerateName:"calico-apiserver-549b6b9dbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"b345f694-032e-4f26-8e13-4a3335092435", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"549b6b9dbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb", Pod:"calico-apiserver-549b6b9dbd-h5cx7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliee69136a7cd", MAC:"1a:86:de:3f:d9:41", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:17.428128 containerd[1871]: 2025-09-03 23:27:17.424 [INFO][5243] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-h5cx7" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--h5cx7-eth0" Sep 3 23:27:17.474200 containerd[1871]: time="2025-09-03T23:27:17.474158492Z" level=info msg="connecting to shim 9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517" address="unix:///run/containerd/s/e4005dbb5e247a9ac42b71d4f241803a6c6de97e79c70ebd5bf6809ef65345dd" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:17.503030 containerd[1871]: time="2025-09-03T23:27:17.502716643Z" level=info msg="connecting to shim 2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb" address="unix:///run/containerd/s/015b93c8e97218c5a0bb3da447341567d360c7450e961beb97e1b5cbcb71bf89" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:17.508752 systemd[1]: Started cri-containerd-9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517.scope - libcontainer container 9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517. Sep 3 23:27:17.530781 systemd[1]: Started cri-containerd-2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb.scope - libcontainer container 2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb. 
Sep 3 23:27:17.546576 systemd-networkd[1698]: cali5e36f64e056: Link UP Sep 3 23:27:17.548631 systemd-networkd[1698]: cali5e36f64e056: Gained carrier Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.225 [INFO][5233] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.238 [INFO][5233] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--71c6c07a75-k8s-goldmane--7988f88666--4qkvb-eth0 goldmane-7988f88666- calico-system 65a1f6b3-d706-4773-8e38-f93a88d10c00 787 0 2025-09-03 23:26:52 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4372.1.0-n-71c6c07a75 goldmane-7988f88666-4qkvb eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali5e36f64e056 [] [] }} ContainerID="a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" Namespace="calico-system" Pod="goldmane-7988f88666-4qkvb" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-goldmane--7988f88666--4qkvb-" Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.238 [INFO][5233] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" Namespace="calico-system" Pod="goldmane-7988f88666-4qkvb" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-goldmane--7988f88666--4qkvb-eth0" Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.273 [INFO][5262] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" HandleID="k8s-pod-network.a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" Workload="ci--4372.1.0--n--71c6c07a75-k8s-goldmane--7988f88666--4qkvb-eth0" Sep 3 23:27:17.570222 
containerd[1871]: 2025-09-03 23:27:17.273 [INFO][5262] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" HandleID="k8s-pod-network.a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" Workload="ci--4372.1.0--n--71c6c07a75-k8s-goldmane--7988f88666--4qkvb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3690), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372.1.0-n-71c6c07a75", "pod":"goldmane-7988f88666-4qkvb", "timestamp":"2025-09-03 23:27:17.273846666 +0000 UTC"}, Hostname:"ci-4372.1.0-n-71c6c07a75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.275 [INFO][5262] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.403 [INFO][5262] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.403 [INFO][5262] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-71c6c07a75' Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.483 [INFO][5262] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.490 [INFO][5262] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.493 [INFO][5262] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.496 [INFO][5262] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.505 [INFO][5262] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.506 [INFO][5262] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.508 [INFO][5262] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.519 [INFO][5262] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.535 [INFO][5262] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.91.6/26] block=192.168.91.0/26 handle="k8s-pod-network.a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.535 [INFO][5262] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.6/26] handle="k8s-pod-network.a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.535 [INFO][5262] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:17.570222 containerd[1871]: 2025-09-03 23:27:17.535 [INFO][5262] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.6/26] IPv6=[] ContainerID="a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" HandleID="k8s-pod-network.a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" Workload="ci--4372.1.0--n--71c6c07a75-k8s-goldmane--7988f88666--4qkvb-eth0" Sep 3 23:27:17.570669 containerd[1871]: 2025-09-03 23:27:17.540 [INFO][5233] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" Namespace="calico-system" Pod="goldmane-7988f88666-4qkvb" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-goldmane--7988f88666--4qkvb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-goldmane--7988f88666--4qkvb-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"65a1f6b3-d706-4773-8e38-f93a88d10c00", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"", Pod:"goldmane-7988f88666-4qkvb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.91.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5e36f64e056", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:17.570669 containerd[1871]: 2025-09-03 23:27:17.540 [INFO][5233] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.6/32] ContainerID="a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" Namespace="calico-system" Pod="goldmane-7988f88666-4qkvb" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-goldmane--7988f88666--4qkvb-eth0" Sep 3 23:27:17.570669 containerd[1871]: 2025-09-03 23:27:17.540 [INFO][5233] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5e36f64e056 ContainerID="a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" Namespace="calico-system" Pod="goldmane-7988f88666-4qkvb" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-goldmane--7988f88666--4qkvb-eth0" Sep 3 23:27:17.570669 containerd[1871]: 2025-09-03 23:27:17.549 [INFO][5233] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" Namespace="calico-system" Pod="goldmane-7988f88666-4qkvb" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-goldmane--7988f88666--4qkvb-eth0" Sep 3 23:27:17.570669 containerd[1871]: 2025-09-03 23:27:17.549 [INFO][5233] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" Namespace="calico-system" Pod="goldmane-7988f88666-4qkvb" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-goldmane--7988f88666--4qkvb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-goldmane--7988f88666--4qkvb-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"65a1f6b3-d706-4773-8e38-f93a88d10c00", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f", Pod:"goldmane-7988f88666-4qkvb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.91.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5e36f64e056", MAC:"46:b4:87:69:1b:66", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:17.570669 containerd[1871]: 2025-09-03 23:27:17.567 [INFO][5233] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" Namespace="calico-system" 
Pod="goldmane-7988f88666-4qkvb" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-goldmane--7988f88666--4qkvb-eth0" Sep 3 23:27:17.573425 containerd[1871]: time="2025-09-03T23:27:17.573348509Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-nvtt7,Uid:0bb920d0-73e4-40ad-a06a-66374e99ced3,Namespace:kube-system,Attempt:0,} returns sandbox id \"9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517\"" Sep 3 23:27:17.578481 containerd[1871]: time="2025-09-03T23:27:17.578007022Z" level=info msg="CreateContainer within sandbox \"9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 3 23:27:17.603134 containerd[1871]: time="2025-09-03T23:27:17.603109071Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549b6b9dbd-h5cx7,Uid:b345f694-032e-4f26-8e13-4a3335092435,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb\"" Sep 3 23:27:17.613349 containerd[1871]: time="2025-09-03T23:27:17.613008706Z" level=info msg="Container e100cf3434f0c5ee9e3c4e4fb1571dea5e6c5fa4219c988f4873ab2c6369a428: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:17.646762 containerd[1871]: time="2025-09-03T23:27:17.646736043Z" level=info msg="connecting to shim a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f" address="unix:///run/containerd/s/a8d81daf91ca85f93dc8d54a0aca306f44dcbcfb534af74630164d2d9fee4b8a" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:17.653093 containerd[1871]: time="2025-09-03T23:27:17.653060566Z" level=info msg="CreateContainer within sandbox \"9c178994dcd49f4054a0fad5a609d4497b2243cf27c3c17a0136cb96f272a517\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e100cf3434f0c5ee9e3c4e4fb1571dea5e6c5fa4219c988f4873ab2c6369a428\"" Sep 3 23:27:17.655224 containerd[1871]: time="2025-09-03T23:27:17.655198567Z" level=info 
msg="StartContainer for \"e100cf3434f0c5ee9e3c4e4fb1571dea5e6c5fa4219c988f4873ab2c6369a428\"" Sep 3 23:27:17.656650 containerd[1871]: time="2025-09-03T23:27:17.656619893Z" level=info msg="connecting to shim e100cf3434f0c5ee9e3c4e4fb1571dea5e6c5fa4219c988f4873ab2c6369a428" address="unix:///run/containerd/s/e4005dbb5e247a9ac42b71d4f241803a6c6de97e79c70ebd5bf6809ef65345dd" protocol=ttrpc version=3 Sep 3 23:27:17.665048 systemd[1]: Started cri-containerd-a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f.scope - libcontainer container a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f. Sep 3 23:27:17.682626 systemd[1]: Started cri-containerd-e100cf3434f0c5ee9e3c4e4fb1571dea5e6c5fa4219c988f4873ab2c6369a428.scope - libcontainer container e100cf3434f0c5ee9e3c4e4fb1571dea5e6c5fa4219c988f4873ab2c6369a428. Sep 3 23:27:17.713219 containerd[1871]: time="2025-09-03T23:27:17.713170787Z" level=info msg="StartContainer for \"e100cf3434f0c5ee9e3c4e4fb1571dea5e6c5fa4219c988f4873ab2c6369a428\" returns successfully" Sep 3 23:27:17.736621 containerd[1871]: time="2025-09-03T23:27:17.736568698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-4qkvb,Uid:65a1f6b3-d706-4773-8e38-f93a88d10c00,Namespace:calico-system,Attempt:0,} returns sandbox id \"a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f\"" Sep 3 23:27:17.912020 containerd[1871]: time="2025-09-03T23:27:17.911759593Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549b6b9dbd-c6twc,Uid:443c995a-7c9b-4cf9-bf5b-9413159cbdf5,Namespace:calico-apiserver,Attempt:0,}" Sep 3 23:27:17.912437 containerd[1871]: time="2025-09-03T23:27:17.912416732Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xhw6w,Uid:8b08a17c-b086-43b4-863d-441927f3822c,Namespace:kube-system,Attempt:0,}" Sep 3 23:27:17.924724 systemd-networkd[1698]: cali6574c21019b: Gained IPv6LL Sep 3 23:27:18.036422 systemd-networkd[1698]: calic53e7af23c9: 
Link UP Sep 3 23:27:18.037724 systemd-networkd[1698]: calic53e7af23c9: Gained carrier Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:17.940 [INFO][5487] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:17.954 [INFO][5487] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--c6twc-eth0 calico-apiserver-549b6b9dbd- calico-apiserver 443c995a-7c9b-4cf9-bf5b-9413159cbdf5 793 0 2025-09-03 23:26:49 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:549b6b9dbd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372.1.0-n-71c6c07a75 calico-apiserver-549b6b9dbd-c6twc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic53e7af23c9 [] [] }} ContainerID="2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-c6twc" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--c6twc-" Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:17.954 [INFO][5487] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-c6twc" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--c6twc-eth0" Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:17.988 [INFO][5510] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" HandleID="k8s-pod-network.2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" 
Workload="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--c6twc-eth0" Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:17.989 [INFO][5510] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" HandleID="k8s-pod-network.2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" Workload="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--c6twc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372.1.0-n-71c6c07a75", "pod":"calico-apiserver-549b6b9dbd-c6twc", "timestamp":"2025-09-03 23:27:17.988898226 +0000 UTC"}, Hostname:"ci-4372.1.0-n-71c6c07a75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:17.989 [INFO][5510] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:17.989 [INFO][5510] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:17.989 [INFO][5510] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-71c6c07a75' Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:17.998 [INFO][5510] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:18.002 [INFO][5510] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:18.009 [INFO][5510] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:18.012 [INFO][5510] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:18.014 [INFO][5510] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:18.014 [INFO][5510] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:18.015 [INFO][5510] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7 Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:18.020 [INFO][5510] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:18.028 [INFO][5510] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.91.7/26] block=192.168.91.0/26 handle="k8s-pod-network.2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:18.028 [INFO][5510] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.7/26] handle="k8s-pod-network.2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:18.029 [INFO][5510] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:18.052651 containerd[1871]: 2025-09-03 23:27:18.029 [INFO][5510] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.7/26] IPv6=[] ContainerID="2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" HandleID="k8s-pod-network.2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" Workload="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--c6twc-eth0" Sep 3 23:27:18.053042 containerd[1871]: 2025-09-03 23:27:18.032 [INFO][5487] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-c6twc" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--c6twc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--c6twc-eth0", GenerateName:"calico-apiserver-549b6b9dbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"443c995a-7c9b-4cf9-bf5b-9413159cbdf5", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"549b6b9dbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"", Pod:"calico-apiserver-549b6b9dbd-c6twc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic53e7af23c9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:18.053042 containerd[1871]: 2025-09-03 23:27:18.032 [INFO][5487] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.7/32] ContainerID="2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-c6twc" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--c6twc-eth0" Sep 3 23:27:18.053042 containerd[1871]: 2025-09-03 23:27:18.032 [INFO][5487] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic53e7af23c9 ContainerID="2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-c6twc" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--c6twc-eth0" Sep 3 23:27:18.053042 containerd[1871]: 2025-09-03 23:27:18.036 [INFO][5487] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-c6twc" 
WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--c6twc-eth0" Sep 3 23:27:18.053042 containerd[1871]: 2025-09-03 23:27:18.037 [INFO][5487] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-c6twc" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--c6twc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--c6twc-eth0", GenerateName:"calico-apiserver-549b6b9dbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"443c995a-7c9b-4cf9-bf5b-9413159cbdf5", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"549b6b9dbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7", Pod:"calico-apiserver-549b6b9dbd-c6twc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic53e7af23c9", MAC:"0e:0b:29:61:d4:68", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:18.053042 containerd[1871]: 2025-09-03 23:27:18.051 [INFO][5487] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" Namespace="calico-apiserver" Pod="calico-apiserver-549b6b9dbd-c6twc" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-calico--apiserver--549b6b9dbd--c6twc-eth0" Sep 3 23:27:18.141111 systemd-networkd[1698]: calic629edc3c02: Link UP Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:17.960 [INFO][5496] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:17.973 [INFO][5496] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--xhw6w-eth0 coredns-7c65d6cfc9- kube-system 8b08a17c-b086-43b4-863d-441927f3822c 783 0 2025-09-03 23:26:40 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372.1.0-n-71c6c07a75 coredns-7c65d6cfc9-xhw6w eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic629edc3c02 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhw6w" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--xhw6w-" Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:17.973 [INFO][5496] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhw6w" 
WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--xhw6w-eth0" Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.007 [INFO][5517] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" HandleID="k8s-pod-network.fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" Workload="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--xhw6w-eth0" Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.007 [INFO][5517] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" HandleID="k8s-pod-network.fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" Workload="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--xhw6w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb900), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372.1.0-n-71c6c07a75", "pod":"coredns-7c65d6cfc9-xhw6w", "timestamp":"2025-09-03 23:27:18.00702583 +0000 UTC"}, Hostname:"ci-4372.1.0-n-71c6c07a75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.007 [INFO][5517] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.029 [INFO][5517] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.029 [INFO][5517] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372.1.0-n-71c6c07a75' Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.097 [INFO][5517] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.108 [INFO][5517] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.111 [INFO][5517] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.112 [INFO][5517] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.114 [INFO][5517] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.114 [INFO][5517] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.115 [INFO][5517] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1 Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.119 [INFO][5517] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.129 [INFO][5517] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.91.8/26] block=192.168.91.0/26 handle="k8s-pod-network.fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.129 [INFO][5517] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.8/26] handle="k8s-pod-network.fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" host="ci-4372.1.0-n-71c6c07a75" Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.129 [INFO][5517] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 3 23:27:18.407388 containerd[1871]: 2025-09-03 23:27:18.129 [INFO][5517] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.8/26] IPv6=[] ContainerID="fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" HandleID="k8s-pod-network.fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" Workload="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--xhw6w-eth0" Sep 3 23:27:18.407816 kubelet[3456]: I0903 23:27:18.211420 3456 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-6rww2" podStartSLOduration=23.724920992 podStartE2EDuration="26.211407271s" podCreationTimestamp="2025-09-03 23:26:52 +0000 UTC" firstStartedPulling="2025-09-03 23:27:14.105496019 +0000 UTC m=+40.367053080" lastFinishedPulling="2025-09-03 23:27:16.59198229 +0000 UTC m=+42.853539359" observedRunningTime="2025-09-03 23:27:18.210964544 +0000 UTC m=+44.472521613" watchObservedRunningTime="2025-09-03 23:27:18.211407271 +0000 UTC m=+44.472964332" Sep 3 23:27:18.407816 kubelet[3456]: I0903 23:27:18.230763 3456 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-nvtt7" podStartSLOduration=38.23075167 podStartE2EDuration="38.23075167s" podCreationTimestamp="2025-09-03 23:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-03 23:27:18.230450729 +0000 UTC m=+44.492007790" watchObservedRunningTime="2025-09-03 23:27:18.23075167 +0000 UTC m=+44.492308731" Sep 3 23:27:18.142055 systemd-networkd[1698]: calic629edc3c02: Gained carrier Sep 3 23:27:18.408212 containerd[1871]: 2025-09-03 23:27:18.132 [INFO][5496] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhw6w" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--xhw6w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--xhw6w-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"8b08a17c-b086-43b4-863d-441927f3822c", ResourceVersion:"783", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"", Pod:"coredns-7c65d6cfc9-xhw6w", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic629edc3c02", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, 
HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:18.408212 containerd[1871]: 2025-09-03 23:27:18.133 [INFO][5496] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.8/32] ContainerID="fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhw6w" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--xhw6w-eth0" Sep 3 23:27:18.408212 containerd[1871]: 2025-09-03 23:27:18.133 [INFO][5496] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic629edc3c02 ContainerID="fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhw6w" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--xhw6w-eth0" Sep 3 23:27:18.408212 containerd[1871]: 2025-09-03 23:27:18.141 [INFO][5496] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhw6w" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--xhw6w-eth0" Sep 3 23:27:18.408212 containerd[1871]: 2025-09-03 23:27:18.144 [INFO][5496] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhw6w" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--xhw6w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--xhw6w-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"8b08a17c-b086-43b4-863d-441927f3822c", ResourceVersion:"783", Generation:0, CreationTimestamp:time.Date(2025, time.September, 3, 23, 26, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372.1.0-n-71c6c07a75", ContainerID:"fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1", Pod:"coredns-7c65d6cfc9-xhw6w", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic629edc3c02", MAC:"fa:d8:44:7a:cf:73", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 3 23:27:18.408212 containerd[1871]: 2025-09-03 23:27:18.161 [INFO][5496] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" 
Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhw6w" WorkloadEndpoint="ci--4372.1.0--n--71c6c07a75-k8s-coredns--7c65d6cfc9--xhw6w-eth0" Sep 3 23:27:18.627691 systemd-networkd[1698]: calia4605a8a775: Gained IPv6LL Sep 3 23:27:18.683299 containerd[1871]: time="2025-09-03T23:27:18.682970464Z" level=info msg="connecting to shim 2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7" address="unix:///run/containerd/s/648f7e80e295b4eb18a2194ac996201340cecfadd4a3e9ffdab30290c9e1bc81" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:18.708715 systemd[1]: Started cri-containerd-2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7.scope - libcontainer container 2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7. Sep 3 23:27:18.743809 containerd[1871]: time="2025-09-03T23:27:18.743587302Z" level=info msg="connecting to shim fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1" address="unix:///run/containerd/s/d62bd48a19629816f34c987c525e074857fe9c325d569bb9aa6df0cb26850e28" namespace=k8s.io protocol=ttrpc version=3 Sep 3 23:27:18.769727 containerd[1871]: time="2025-09-03T23:27:18.769693271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549b6b9dbd-c6twc,Uid:443c995a-7c9b-4cf9-bf5b-9413159cbdf5,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7\"" Sep 3 23:27:18.778675 systemd[1]: Started cri-containerd-fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1.scope - libcontainer container fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1. 
Sep 3 23:27:18.829212 containerd[1871]: time="2025-09-03T23:27:18.829168498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xhw6w,Uid:8b08a17c-b086-43b4-863d-441927f3822c,Namespace:kube-system,Attempt:0,} returns sandbox id \"fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1\"" Sep 3 23:27:18.836227 containerd[1871]: time="2025-09-03T23:27:18.835854395Z" level=info msg="CreateContainer within sandbox \"fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 3 23:27:18.872389 containerd[1871]: time="2025-09-03T23:27:18.872344566Z" level=info msg="Container 47faeb11ef4a4fff5552dd33d50bf4a4b7bdbe98229f9413778218cf10a4206a: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:18.894581 containerd[1871]: time="2025-09-03T23:27:18.893888608Z" level=info msg="CreateContainer within sandbox \"fe82172c0a56f9cb7875fbab8540c9c3e9fe5350c881473317b4ba0fee1502c1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"47faeb11ef4a4fff5552dd33d50bf4a4b7bdbe98229f9413778218cf10a4206a\"" Sep 3 23:27:18.895339 containerd[1871]: time="2025-09-03T23:27:18.895140923Z" level=info msg="StartContainer for \"47faeb11ef4a4fff5552dd33d50bf4a4b7bdbe98229f9413778218cf10a4206a\"" Sep 3 23:27:18.897427 containerd[1871]: time="2025-09-03T23:27:18.897316133Z" level=info msg="connecting to shim 47faeb11ef4a4fff5552dd33d50bf4a4b7bdbe98229f9413778218cf10a4206a" address="unix:///run/containerd/s/d62bd48a19629816f34c987c525e074857fe9c325d569bb9aa6df0cb26850e28" protocol=ttrpc version=3 Sep 3 23:27:18.926797 systemd[1]: Started cri-containerd-47faeb11ef4a4fff5552dd33d50bf4a4b7bdbe98229f9413778218cf10a4206a.scope - libcontainer container 47faeb11ef4a4fff5552dd33d50bf4a4b7bdbe98229f9413778218cf10a4206a. 
Sep 3 23:27:18.977297 containerd[1871]: time="2025-09-03T23:27:18.976775994Z" level=info msg="StartContainer for \"47faeb11ef4a4fff5552dd33d50bf4a4b7bdbe98229f9413778218cf10a4206a\" returns successfully" Sep 3 23:27:19.139682 systemd-networkd[1698]: caliee69136a7cd: Gained IPv6LL Sep 3 23:27:19.140121 systemd-networkd[1698]: cali5e36f64e056: Gained IPv6LL Sep 3 23:27:19.390808 containerd[1871]: time="2025-09-03T23:27:19.390755230Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:19.394320 containerd[1871]: time="2025-09-03T23:27:19.394151955Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 3 23:27:19.397535 containerd[1871]: time="2025-09-03T23:27:19.397135826Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:19.401526 containerd[1871]: time="2025-09-03T23:27:19.401468094Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:19.402281 containerd[1871]: time="2025-09-03T23:27:19.402254162Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 2.80855386s" Sep 3 23:27:19.402281 containerd[1871]: time="2025-09-03T23:27:19.402282282Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference 
\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 3 23:27:19.403925 containerd[1871]: time="2025-09-03T23:27:19.403885691Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 3 23:27:19.425708 containerd[1871]: time="2025-09-03T23:27:19.425679441Z" level=info msg="CreateContainer within sandbox \"9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 3 23:27:19.447984 containerd[1871]: time="2025-09-03T23:27:19.447953518Z" level=info msg="Container 02b16a0f4bb078a93c316bc3e904921badf1b7041d8e4ea2aa17058adf440f5f: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:19.463407 containerd[1871]: time="2025-09-03T23:27:19.463337478Z" level=info msg="CreateContainer within sandbox \"9ea3d6f16356437e61c45c0888882b1fbd8ac53d133ee2456256627a2616181d\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"02b16a0f4bb078a93c316bc3e904921badf1b7041d8e4ea2aa17058adf440f5f\"" Sep 3 23:27:19.463875 containerd[1871]: time="2025-09-03T23:27:19.463844414Z" level=info msg="StartContainer for \"02b16a0f4bb078a93c316bc3e904921badf1b7041d8e4ea2aa17058adf440f5f\"" Sep 3 23:27:19.464728 containerd[1871]: time="2025-09-03T23:27:19.464694140Z" level=info msg="connecting to shim 02b16a0f4bb078a93c316bc3e904921badf1b7041d8e4ea2aa17058adf440f5f" address="unix:///run/containerd/s/a667368fc17ac1bcd80903687f8f5491fca153697a76577129d0214781baa916" protocol=ttrpc version=3 Sep 3 23:27:19.480635 systemd[1]: Started cri-containerd-02b16a0f4bb078a93c316bc3e904921badf1b7041d8e4ea2aa17058adf440f5f.scope - libcontainer container 02b16a0f4bb078a93c316bc3e904921badf1b7041d8e4ea2aa17058adf440f5f. 
Sep 3 23:27:19.517074 containerd[1871]: time="2025-09-03T23:27:19.517041624Z" level=info msg="StartContainer for \"02b16a0f4bb078a93c316bc3e904921badf1b7041d8e4ea2aa17058adf440f5f\" returns successfully" Sep 3 23:27:19.907737 systemd-networkd[1698]: calic53e7af23c9: Gained IPv6LL Sep 3 23:27:20.163728 systemd-networkd[1698]: calic629edc3c02: Gained IPv6LL Sep 3 23:27:20.212774 kubelet[3456]: I0903 23:27:20.212721 3456 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-xhw6w" podStartSLOduration=40.212529356 podStartE2EDuration="40.212529356s" podCreationTimestamp="2025-09-03 23:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-03 23:27:19.216033845 +0000 UTC m=+45.477590906" watchObservedRunningTime="2025-09-03 23:27:20.212529356 +0000 UTC m=+46.474086529" Sep 3 23:27:20.213420 kubelet[3456]: I0903 23:27:20.212806 3456 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-55f546bb6c-svbsf" podStartSLOduration=23.915834121 podStartE2EDuration="27.212799888s" podCreationTimestamp="2025-09-03 23:26:53 +0000 UTC" firstStartedPulling="2025-09-03 23:27:16.106209961 +0000 UTC m=+42.367767022" lastFinishedPulling="2025-09-03 23:27:19.403175728 +0000 UTC m=+45.664732789" observedRunningTime="2025-09-03 23:27:20.211031381 +0000 UTC m=+46.472588450" watchObservedRunningTime="2025-09-03 23:27:20.212799888 +0000 UTC m=+46.474356989" Sep 3 23:27:20.809995 kubelet[3456]: I0903 23:27:20.809802 3456 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 3 23:27:21.202186 kubelet[3456]: I0903 23:27:21.200981 3456 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 3 23:27:21.947969 systemd-networkd[1698]: vxlan.calico: Link UP Sep 3 23:27:21.947977 systemd-networkd[1698]: vxlan.calico: Gained carrier Sep 3 23:27:22.359913 
containerd[1871]: time="2025-09-03T23:27:22.359820702Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:22.363845 containerd[1871]: time="2025-09-03T23:27:22.363815086Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 3 23:27:22.367177 containerd[1871]: time="2025-09-03T23:27:22.367070172Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:22.371866 containerd[1871]: time="2025-09-03T23:27:22.371053756Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 3 23:27:22.371866 containerd[1871]: time="2025-09-03T23:27:22.371333159Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 2.966922668s" Sep 3 23:27:22.371866 containerd[1871]: time="2025-09-03T23:27:22.371356679Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 3 23:27:22.372965 containerd[1871]: time="2025-09-03T23:27:22.372938994Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 3 23:27:22.374697 containerd[1871]: time="2025-09-03T23:27:22.374579174Z" level=info msg="CreateContainer within sandbox \"2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb\" for container 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 3 23:27:22.402347 containerd[1871]: time="2025-09-03T23:27:22.402248878Z" level=info msg="Container 6fc026c32c8c4151456beec75650b4cd4ee7e239dbdaa9d14b8a4b2ea5c148cc: CDI devices from CRI Config.CDIDevices: []" Sep 3 23:27:22.423952 containerd[1871]: time="2025-09-03T23:27:22.423927672Z" level=info msg="CreateContainer within sandbox \"2a5eab9a0941419bf971e329f38cd0efe114883bcdf2356488f03bd8d0a5f3bb\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"6fc026c32c8c4151456beec75650b4cd4ee7e239dbdaa9d14b8a4b2ea5c148cc\"" Sep 3 23:27:22.424872 containerd[1871]: time="2025-09-03T23:27:22.424847451Z" level=info msg="StartContainer for \"6fc026c32c8c4151456beec75650b4cd4ee7e239dbdaa9d14b8a4b2ea5c148cc\"" Sep 3 23:27:22.426497 containerd[1871]: time="2025-09-03T23:27:22.426455182Z" level=info msg="connecting to shim 6fc026c32c8c4151456beec75650b4cd4ee7e239dbdaa9d14b8a4b2ea5c148cc" address="unix:///run/containerd/s/015b93c8e97218c5a0bb3da447341567d360c7450e961beb97e1b5cbcb71bf89" protocol=ttrpc version=3 Sep 3 23:27:22.450629 systemd[1]: Started cri-containerd-6fc026c32c8c4151456beec75650b4cd4ee7e239dbdaa9d14b8a4b2ea5c148cc.scope - libcontainer container 6fc026c32c8c4151456beec75650b4cd4ee7e239dbdaa9d14b8a4b2ea5c148cc. 
Sep 3 23:27:22.483549 containerd[1871]: time="2025-09-03T23:27:22.483520444Z" level=info msg="StartContainer for \"6fc026c32c8c4151456beec75650b4cd4ee7e239dbdaa9d14b8a4b2ea5c148cc\" returns successfully" Sep 3 23:27:23.221715 kubelet[3456]: I0903 23:27:23.221643 3456 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-549b6b9dbd-h5cx7" podStartSLOduration=29.453083487 podStartE2EDuration="34.221627982s" podCreationTimestamp="2025-09-03 23:26:49 +0000 UTC" firstStartedPulling="2025-09-03 23:27:17.604231689 +0000 UTC m=+43.865788750" lastFinishedPulling="2025-09-03 23:27:22.372776176 +0000 UTC m=+48.634333245" observedRunningTime="2025-09-03 23:27:23.221545029 +0000 UTC m=+49.483102138" watchObservedRunningTime="2025-09-03 23:27:23.221627982 +0000 UTC m=+49.483185043" Sep 3 23:27:23.619652 systemd-networkd[1698]: vxlan.calico: Gained IPv6LL Sep 3 23:27:24.210198 kubelet[3456]: I0903 23:27:24.209847 3456 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 3 23:27:25.239038 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1705941354.mount: Deactivated successfully. 
Sep 3 23:27:26.025255 containerd[1871]: time="2025-09-03T23:27:26.025203314Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:27:26.028054 containerd[1871]: time="2025-09-03T23:27:26.027987372Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332"
Sep 3 23:27:26.031427 containerd[1871]: time="2025-09-03T23:27:26.031381428Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:27:26.035467 containerd[1871]: time="2025-09-03T23:27:26.035436716Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:27:26.035951 containerd[1871]: time="2025-09-03T23:27:26.035928322Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 3.662960896s"
Sep 3 23:27:26.035992 containerd[1871]: time="2025-09-03T23:27:26.035957042Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\""
Sep 3 23:27:26.038043 containerd[1871]: time="2025-09-03T23:27:26.036888581Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 3 23:27:26.038043 containerd[1871]: time="2025-09-03T23:27:26.038033755Z" level=info msg="CreateContainer within sandbox \"a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Sep 3 23:27:26.065678 containerd[1871]: time="2025-09-03T23:27:26.065648587Z" level=info msg="Container 308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717: CDI devices from CRI Config.CDIDevices: []"
Sep 3 23:27:26.081307 containerd[1871]: time="2025-09-03T23:27:26.081251700Z" level=info msg="CreateContainer within sandbox \"a68dcf09bb313b01d829ffffbbb934917d7421a7b41f48032dce3018642cc45f\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717\""
Sep 3 23:27:26.082826 containerd[1871]: time="2025-09-03T23:27:26.082806167Z" level=info msg="StartContainer for \"308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717\""
Sep 3 23:27:26.084116 containerd[1871]: time="2025-09-03T23:27:26.084044262Z" level=info msg="connecting to shim 308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717" address="unix:///run/containerd/s/a8d81daf91ca85f93dc8d54a0aca306f44dcbcfb534af74630164d2d9fee4b8a" protocol=ttrpc version=3
Sep 3 23:27:26.125780 systemd[1]: Started cri-containerd-308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717.scope - libcontainer container 308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717.
Sep 3 23:27:26.170531 containerd[1871]: time="2025-09-03T23:27:26.170407128Z" level=info msg="StartContainer for \"308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717\" returns successfully"
Sep 3 23:27:26.235659 kubelet[3456]: I0903 23:27:26.235447 3456 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-4qkvb" podStartSLOduration=25.937379563 podStartE2EDuration="34.235434644s" podCreationTimestamp="2025-09-03 23:26:52 +0000 UTC" firstStartedPulling="2025-09-03 23:27:17.738734763 +0000 UTC m=+44.000291824" lastFinishedPulling="2025-09-03 23:27:26.036789844 +0000 UTC m=+52.298346905" observedRunningTime="2025-09-03 23:27:26.235232802 +0000 UTC m=+52.496789863" watchObservedRunningTime="2025-09-03 23:27:26.235434644 +0000 UTC m=+52.496991705"
Sep 3 23:27:26.278332 containerd[1871]: time="2025-09-03T23:27:26.278236265Z" level=info msg="TaskExit event in podsandbox handler container_id:\"308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717\" id:\"4e4b2d42e6403b16e9a57c1eeac468e2019d5099b01943ce11c3b196e6f2d307\" pid:6004 exit_status:1 exited_at:{seconds:1756942046 nanos:274034855}"
Sep 3 23:27:26.541598 containerd[1871]: time="2025-09-03T23:27:26.541164229Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 3 23:27:26.544682 containerd[1871]: time="2025-09-03T23:27:26.544653910Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77"
Sep 3 23:27:26.547461 containerd[1871]: time="2025-09-03T23:27:26.547352550Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 509.449597ms"
Sep 3 23:27:26.547461 containerd[1871]: time="2025-09-03T23:27:26.547380527Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 3 23:27:26.550413 containerd[1871]: time="2025-09-03T23:27:26.549352718Z" level=info msg="CreateContainer within sandbox \"2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 3 23:27:26.572588 containerd[1871]: time="2025-09-03T23:27:26.572088292Z" level=info msg="Container 30f3573897ef541c53f13995176c3ff3f42dce32ea6840e9f87d9b7e98bbb31b: CDI devices from CRI Config.CDIDevices: []"
Sep 3 23:27:26.586794 containerd[1871]: time="2025-09-03T23:27:26.586766867Z" level=info msg="CreateContainer within sandbox \"2ed1a6b8fd27865e27755b0e20bac60e79d364392452bc66aa154441da1068c7\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"30f3573897ef541c53f13995176c3ff3f42dce32ea6840e9f87d9b7e98bbb31b\""
Sep 3 23:27:26.587297 containerd[1871]: time="2025-09-03T23:27:26.587263033Z" level=info msg="StartContainer for \"30f3573897ef541c53f13995176c3ff3f42dce32ea6840e9f87d9b7e98bbb31b\""
Sep 3 23:27:26.588263 containerd[1871]: time="2025-09-03T23:27:26.588236236Z" level=info msg="connecting to shim 30f3573897ef541c53f13995176c3ff3f42dce32ea6840e9f87d9b7e98bbb31b" address="unix:///run/containerd/s/648f7e80e295b4eb18a2194ac996201340cecfadd4a3e9ffdab30290c9e1bc81" protocol=ttrpc version=3
Sep 3 23:27:26.607629 systemd[1]: Started cri-containerd-30f3573897ef541c53f13995176c3ff3f42dce32ea6840e9f87d9b7e98bbb31b.scope - libcontainer container 30f3573897ef541c53f13995176c3ff3f42dce32ea6840e9f87d9b7e98bbb31b.
Sep 3 23:27:26.640860 containerd[1871]: time="2025-09-03T23:27:26.640824293Z" level=info msg="StartContainer for \"30f3573897ef541c53f13995176c3ff3f42dce32ea6840e9f87d9b7e98bbb31b\" returns successfully"
Sep 3 23:27:27.322236 containerd[1871]: time="2025-09-03T23:27:27.322178075Z" level=info msg="TaskExit event in podsandbox handler container_id:\"308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717\" id:\"6ba09570144ed9748d211a99c3e185af24ec47c6138a4b187c524f214f6eafbc\" pid:6066 exit_status:1 exited_at:{seconds:1756942047 nanos:321739965}"
Sep 3 23:27:28.227958 kubelet[3456]: I0903 23:27:28.227856 3456 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 3 23:27:31.713106 kubelet[3456]: I0903 23:27:31.713065 3456 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 3 23:27:31.736936 containerd[1871]: time="2025-09-03T23:27:31.736900976Z" level=info msg="TaskExit event in podsandbox handler container_id:\"02b16a0f4bb078a93c316bc3e904921badf1b7041d8e4ea2aa17058adf440f5f\" id:\"f6aad53cfc0e1770bd836f5038d263002287b78ecc410a75b8af0d6b03fadb68\" pid:6099 exited_at:{seconds:1756942051 nanos:736607862}"
Sep 3 23:27:31.759499 kubelet[3456]: I0903 23:27:31.757806 3456 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-549b6b9dbd-c6twc" podStartSLOduration=34.98178985 podStartE2EDuration="42.75779067s" podCreationTimestamp="2025-09-03 23:26:49 +0000 UTC" firstStartedPulling="2025-09-03 23:27:18.77197753 +0000 UTC m=+45.033534591" lastFinishedPulling="2025-09-03 23:27:26.547978342 +0000 UTC m=+52.809535411" observedRunningTime="2025-09-03 23:27:27.265602928 +0000 UTC m=+53.527160029" watchObservedRunningTime="2025-09-03 23:27:31.75779067 +0000 UTC m=+58.019347731"
Sep 3 23:27:31.776523 containerd[1871]: time="2025-09-03T23:27:31.776453325Z" level=info msg="TaskExit event in podsandbox handler container_id:\"02b16a0f4bb078a93c316bc3e904921badf1b7041d8e4ea2aa17058adf440f5f\" id:\"d87a86cd67f2b6ea7f4be7d6e2081db3ec54801946fda70d23441a01b5347ff8\" pid:6121 exited_at:{seconds:1756942051 nanos:776303516}"
Sep 3 23:27:32.760908 containerd[1871]: time="2025-09-03T23:27:32.760865845Z" level=info msg="TaskExit event in podsandbox handler container_id:\"308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717\" id:\"cba773493eedd38a3277414db6672c918106feb9ae235a708c453353a19d84db\" pid:6152 exited_at:{seconds:1756942052 nanos:760550889}"
Sep 3 23:27:41.964210 containerd[1871]: time="2025-09-03T23:27:41.964042508Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e31a4f5664542daa5c4c87d8ba49fad1b32bc5ac87370951468a2ddb150232d8\" id:\"5f7f5430dff76df9cc75dc8848f4fec77aff174ff4c2d5cfa3c89eb658d58921\" pid:6177 exited_at:{seconds:1756942061 nanos:963824441}"
Sep 3 23:27:48.616154 containerd[1871]: time="2025-09-03T23:27:48.616116599Z" level=info msg="TaskExit event in podsandbox handler container_id:\"308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717\" id:\"3b6149c190ba5f07cd46a0fd28af7162263092297f06586e9dce8a97a0638d5b\" pid:6210 exited_at:{seconds:1756942068 nanos:615890332}"
Sep 3 23:28:01.582175 kubelet[3456]: I0903 23:28:01.582014 3456 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 3 23:28:01.750285 containerd[1871]: time="2025-09-03T23:28:01.750187076Z" level=info msg="TaskExit event in podsandbox handler container_id:\"02b16a0f4bb078a93c316bc3e904921badf1b7041d8e4ea2aa17058adf440f5f\" id:\"d12479802956a779877e45c287d0f60e08f1db7700ea47ac455baac4c849c615\" pid:6237 exited_at:{seconds:1756942081 nanos:749966609}"
Sep 3 23:28:02.134252 kubelet[3456]: I0903 23:28:02.133722 3456 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 3 23:28:06.461737 containerd[1871]: time="2025-09-03T23:28:06.461505830Z" level=info msg="TaskExit event in podsandbox handler container_id:\"02b16a0f4bb078a93c316bc3e904921badf1b7041d8e4ea2aa17058adf440f5f\" id:\"e26fb43c80ec34f69fd97f36e9dcd46f081f8b6d05de0dd5c2a11dc714f017e9\" pid:6268 exited_at:{seconds:1756942086 nanos:461062601}"
Sep 3 23:28:11.979840 containerd[1871]: time="2025-09-03T23:28:11.979800373Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e31a4f5664542daa5c4c87d8ba49fad1b32bc5ac87370951468a2ddb150232d8\" id:\"5c39e378c0b0b9d31b533707b5bd466186a22a0373c5e8905f45650fb2301e18\" pid:6293 exited_at:{seconds:1756942091 nanos:979445656}"
Sep 3 23:28:18.651722 containerd[1871]: time="2025-09-03T23:28:18.651678768Z" level=info msg="TaskExit event in podsandbox handler container_id:\"308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717\" id:\"d76337f1efeeaeff251ad57db73e0a3dd41d7f4292cf9e9f76bf0c0666b21065\" pid:6317 exited_at:{seconds:1756942098 nanos:651252682}"
Sep 3 23:28:31.737613 containerd[1871]: time="2025-09-03T23:28:31.737433954Z" level=info msg="TaskExit event in podsandbox handler container_id:\"02b16a0f4bb078a93c316bc3e904921badf1b7041d8e4ea2aa17058adf440f5f\" id:\"b1243490e9180f6378ae3f65e6a2c971c6dd54cb76fa7332485c75437382d257\" pid:6340 exited_at:{seconds:1756942111 nanos:737254143}"
Sep 3 23:28:32.763440 containerd[1871]: time="2025-09-03T23:28:32.763386994Z" level=info msg="TaskExit event in podsandbox handler container_id:\"308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717\" id:\"eb6b341add70ed682883315b944471a632fa498a5401e5c923c19f4266a37029\" pid:6360 exited_at:{seconds:1756942112 nanos:763050366}"
Sep 3 23:28:41.964269 containerd[1871]: time="2025-09-03T23:28:41.964133738Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e31a4f5664542daa5c4c87d8ba49fad1b32bc5ac87370951468a2ddb150232d8\" id:\"f556c8b093926cc5b29d88e6019fc393aa5bff6b3af6fe0737327238a9250a78\" pid:6388 exited_at:{seconds:1756942121 nanos:963939327}"
Sep 3 23:28:48.606624 containerd[1871]: time="2025-09-03T23:28:48.606573258Z" level=info msg="TaskExit event in podsandbox handler container_id:\"308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717\" id:\"3aaffcb0c401239ef73870cb5fd4786496d3341be36dc56d3ee67c735e17f232\" pid:6420 exited_at:{seconds:1756942128 nanos:605858184}"
Sep 3 23:28:51.235152 systemd[1]: Started sshd@7-10.200.20.15:22-10.200.16.10:40914.service - OpenSSH per-connection server daemon (10.200.16.10:40914).
Sep 3 23:28:51.696764 sshd[6434]: Accepted publickey for core from 10.200.16.10 port 40914 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:28:51.698278 sshd-session[6434]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:28:51.704901 systemd-logind[1851]: New session 10 of user core.
Sep 3 23:28:51.708650 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 3 23:28:52.145375 sshd[6436]: Connection closed by 10.200.16.10 port 40914
Sep 3 23:28:52.145214 sshd-session[6434]: pam_unix(sshd:session): session closed for user core
Sep 3 23:28:52.149766 systemd-logind[1851]: Session 10 logged out. Waiting for processes to exit.
Sep 3 23:28:52.151159 systemd[1]: sshd@7-10.200.20.15:22-10.200.16.10:40914.service: Deactivated successfully.
Sep 3 23:28:52.155605 systemd[1]: session-10.scope: Deactivated successfully.
Sep 3 23:28:52.157879 systemd-logind[1851]: Removed session 10.
Sep 3 23:28:57.233208 systemd[1]: Started sshd@8-10.200.20.15:22-10.200.16.10:40926.service - OpenSSH per-connection server daemon (10.200.16.10:40926).
Sep 3 23:28:57.728455 sshd[6470]: Accepted publickey for core from 10.200.16.10 port 40926 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:28:57.729257 sshd-session[6470]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:28:57.733316 systemd-logind[1851]: New session 11 of user core.
Sep 3 23:28:57.738778 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 3 23:28:58.120990 sshd[6472]: Connection closed by 10.200.16.10 port 40926
Sep 3 23:28:58.121574 sshd-session[6470]: pam_unix(sshd:session): session closed for user core
Sep 3 23:28:58.124289 systemd-logind[1851]: Session 11 logged out. Waiting for processes to exit.
Sep 3 23:28:58.125360 systemd[1]: sshd@8-10.200.20.15:22-10.200.16.10:40926.service: Deactivated successfully.
Sep 3 23:28:58.127355 systemd[1]: session-11.scope: Deactivated successfully.
Sep 3 23:28:58.128927 systemd-logind[1851]: Removed session 11.
Sep 3 23:29:01.737182 containerd[1871]: time="2025-09-03T23:29:01.737106774Z" level=info msg="TaskExit event in podsandbox handler container_id:\"02b16a0f4bb078a93c316bc3e904921badf1b7041d8e4ea2aa17058adf440f5f\" id:\"81b8ae0b45cd09abb6a386dbfddf896a8c175f662516b8d1d61435f69d51e167\" pid:6496 exited_at:{seconds:1756942141 nanos:736884627}"
Sep 3 23:29:03.208431 systemd[1]: Started sshd@9-10.200.20.15:22-10.200.16.10:55490.service - OpenSSH per-connection server daemon (10.200.16.10:55490).
Sep 3 23:29:03.661349 sshd[6506]: Accepted publickey for core from 10.200.16.10 port 55490 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:03.662487 sshd-session[6506]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:03.666150 systemd-logind[1851]: New session 12 of user core.
Sep 3 23:29:03.671635 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 3 23:29:04.045345 sshd[6508]: Connection closed by 10.200.16.10 port 55490
Sep 3 23:29:04.046098 sshd-session[6506]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:04.049199 systemd[1]: sshd@9-10.200.20.15:22-10.200.16.10:55490.service: Deactivated successfully.
Sep 3 23:29:04.051144 systemd[1]: session-12.scope: Deactivated successfully.
Sep 3 23:29:04.051928 systemd-logind[1851]: Session 12 logged out. Waiting for processes to exit.
Sep 3 23:29:04.052965 systemd-logind[1851]: Removed session 12.
Sep 3 23:29:04.140441 systemd[1]: Started sshd@10-10.200.20.15:22-10.200.16.10:55496.service - OpenSSH per-connection server daemon (10.200.16.10:55496).
Sep 3 23:29:04.634672 sshd[6520]: Accepted publickey for core from 10.200.16.10 port 55496 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:04.635782 sshd-session[6520]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:04.639395 systemd-logind[1851]: New session 13 of user core.
Sep 3 23:29:04.645620 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 3 23:29:05.060778 sshd[6522]: Connection closed by 10.200.16.10 port 55496
Sep 3 23:29:05.061200 sshd-session[6520]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:05.064190 systemd[1]: sshd@10-10.200.20.15:22-10.200.16.10:55496.service: Deactivated successfully.
Sep 3 23:29:05.066072 systemd[1]: session-13.scope: Deactivated successfully.
Sep 3 23:29:05.066876 systemd-logind[1851]: Session 13 logged out. Waiting for processes to exit.
Sep 3 23:29:05.068335 systemd-logind[1851]: Removed session 13.
Sep 3 23:29:05.152334 systemd[1]: Started sshd@11-10.200.20.15:22-10.200.16.10:55498.service - OpenSSH per-connection server daemon (10.200.16.10:55498).
Sep 3 23:29:05.644057 sshd[6532]: Accepted publickey for core from 10.200.16.10 port 55498 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:05.645168 sshd-session[6532]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:05.648869 systemd-logind[1851]: New session 14 of user core.
Sep 3 23:29:05.656620 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 3 23:29:06.039750 sshd[6538]: Connection closed by 10.200.16.10 port 55498
Sep 3 23:29:06.040185 sshd-session[6532]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:06.043492 systemd[1]: sshd@11-10.200.20.15:22-10.200.16.10:55498.service: Deactivated successfully.
Sep 3 23:29:06.045373 systemd[1]: session-14.scope: Deactivated successfully.
Sep 3 23:29:06.046309 systemd-logind[1851]: Session 14 logged out. Waiting for processes to exit.
Sep 3 23:29:06.047651 systemd-logind[1851]: Removed session 14.
Sep 3 23:29:06.445170 containerd[1871]: time="2025-09-03T23:29:06.445132069Z" level=info msg="TaskExit event in podsandbox handler container_id:\"02b16a0f4bb078a93c316bc3e904921badf1b7041d8e4ea2aa17058adf440f5f\" id:\"8f4c86d66519a49c3318f7e91885842a4ec4e0cc245b87e8e14e3f1752d97377\" pid:6561 exited_at:{seconds:1756942146 nanos:444973347}"
Sep 3 23:29:11.128705 systemd[1]: Started sshd@12-10.200.20.15:22-10.200.16.10:35666.service - OpenSSH per-connection server daemon (10.200.16.10:35666).
Sep 3 23:29:11.633522 sshd[6574]: Accepted publickey for core from 10.200.16.10 port 35666 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:11.634695 sshd-session[6574]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:11.638460 systemd-logind[1851]: New session 15 of user core.
Sep 3 23:29:11.647636 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 3 23:29:11.999231 containerd[1871]: time="2025-09-03T23:29:11.999057408Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e31a4f5664542daa5c4c87d8ba49fad1b32bc5ac87370951468a2ddb150232d8\" id:\"6a83344614800693606c5408071fe3a316fe115adb74c867af1806b54a60f65c\" pid:6596 exited_at:{seconds:1756942151 nanos:998755061}"
Sep 3 23:29:12.050550 sshd[6576]: Connection closed by 10.200.16.10 port 35666
Sep 3 23:29:12.051127 sshd-session[6574]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:12.055007 systemd[1]: sshd@12-10.200.20.15:22-10.200.16.10:35666.service: Deactivated successfully.
Sep 3 23:29:12.058134 systemd[1]: session-15.scope: Deactivated successfully.
Sep 3 23:29:12.059797 systemd-logind[1851]: Session 15 logged out. Waiting for processes to exit.
Sep 3 23:29:12.060975 systemd-logind[1851]: Removed session 15.
Sep 3 23:29:12.155827 systemd[1]: Started sshd@13-10.200.20.15:22-10.200.16.10:35682.service - OpenSSH per-connection server daemon (10.200.16.10:35682).
Sep 3 23:29:12.652035 sshd[6611]: Accepted publickey for core from 10.200.16.10 port 35682 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:12.653182 sshd-session[6611]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:12.657308 systemd-logind[1851]: New session 16 of user core.
Sep 3 23:29:12.662630 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 3 23:29:13.249728 sshd[6613]: Connection closed by 10.200.16.10 port 35682
Sep 3 23:29:13.250448 sshd-session[6611]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:13.254042 systemd[1]: sshd@13-10.200.20.15:22-10.200.16.10:35682.service: Deactivated successfully.
Sep 3 23:29:13.256082 systemd[1]: session-16.scope: Deactivated successfully.
Sep 3 23:29:13.257843 systemd-logind[1851]: Session 16 logged out. Waiting for processes to exit.
Sep 3 23:29:13.259326 systemd-logind[1851]: Removed session 16.
Sep 3 23:29:13.319569 systemd[1]: Started sshd@14-10.200.20.15:22-10.200.16.10:35698.service - OpenSSH per-connection server daemon (10.200.16.10:35698).
Sep 3 23:29:13.736718 sshd[6623]: Accepted publickey for core from 10.200.16.10 port 35698 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:13.737860 sshd-session[6623]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:13.741569 systemd-logind[1851]: New session 17 of user core.
Sep 3 23:29:13.747808 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 3 23:29:15.278529 sshd[6625]: Connection closed by 10.200.16.10 port 35698
Sep 3 23:29:15.278839 sshd-session[6623]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:15.283583 systemd[1]: sshd@14-10.200.20.15:22-10.200.16.10:35698.service: Deactivated successfully.
Sep 3 23:29:15.286008 systemd[1]: session-17.scope: Deactivated successfully.
Sep 3 23:29:15.286232 systemd[1]: session-17.scope: Consumed 333ms CPU time, 70.8M memory peak.
Sep 3 23:29:15.287595 systemd-logind[1851]: Session 17 logged out. Waiting for processes to exit.
Sep 3 23:29:15.288868 systemd-logind[1851]: Removed session 17.
Sep 3 23:29:15.382719 systemd[1]: Started sshd@15-10.200.20.15:22-10.200.16.10:35704.service - OpenSSH per-connection server daemon (10.200.16.10:35704).
Sep 3 23:29:15.847218 sshd[6642]: Accepted publickey for core from 10.200.16.10 port 35704 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:15.848355 sshd-session[6642]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:15.852293 systemd-logind[1851]: New session 18 of user core.
Sep 3 23:29:15.857624 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 3 23:29:16.319368 sshd[6644]: Connection closed by 10.200.16.10 port 35704
Sep 3 23:29:16.318907 sshd-session[6642]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:16.321667 systemd-logind[1851]: Session 18 logged out. Waiting for processes to exit.
Sep 3 23:29:16.322197 systemd[1]: sshd@15-10.200.20.15:22-10.200.16.10:35704.service: Deactivated successfully.
Sep 3 23:29:16.324268 systemd[1]: session-18.scope: Deactivated successfully.
Sep 3 23:29:16.327350 systemd-logind[1851]: Removed session 18.
Sep 3 23:29:16.409390 systemd[1]: Started sshd@16-10.200.20.15:22-10.200.16.10:35716.service - OpenSSH per-connection server daemon (10.200.16.10:35716).
Sep 3 23:29:16.895162 sshd[6654]: Accepted publickey for core from 10.200.16.10 port 35716 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:16.896489 sshd-session[6654]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:16.900032 systemd-logind[1851]: New session 19 of user core.
Sep 3 23:29:16.907799 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 3 23:29:17.277311 sshd[6656]: Connection closed by 10.200.16.10 port 35716
Sep 3 23:29:17.277140 sshd-session[6654]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:17.280149 systemd[1]: sshd@16-10.200.20.15:22-10.200.16.10:35716.service: Deactivated successfully.
Sep 3 23:29:17.281624 systemd[1]: session-19.scope: Deactivated successfully.
Sep 3 23:29:17.282214 systemd-logind[1851]: Session 19 logged out. Waiting for processes to exit.
Sep 3 23:29:17.283571 systemd-logind[1851]: Removed session 19.
Sep 3 23:29:18.612262 containerd[1871]: time="2025-09-03T23:29:18.612214563Z" level=info msg="TaskExit event in podsandbox handler container_id:\"308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717\" id:\"9a1f6f0d2d8b05a2f96ca5ca1e78daf79b6c00099c01d56d8d5f2fc441f2352c\" pid:6678 exited_at:{seconds:1756942158 nanos:607427088}"
Sep 3 23:29:22.363285 systemd[1]: Started sshd@17-10.200.20.15:22-10.200.16.10:53304.service - OpenSSH per-connection server daemon (10.200.16.10:53304).
Sep 3 23:29:22.824989 sshd[6692]: Accepted publickey for core from 10.200.16.10 port 53304 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:22.827027 sshd-session[6692]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:22.832253 systemd-logind[1851]: New session 20 of user core.
Sep 3 23:29:22.836704 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 3 23:29:23.241951 sshd[6696]: Connection closed by 10.200.16.10 port 53304
Sep 3 23:29:23.242637 sshd-session[6692]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:23.247267 systemd[1]: sshd@17-10.200.20.15:22-10.200.16.10:53304.service: Deactivated successfully.
Sep 3 23:29:23.251126 systemd[1]: session-20.scope: Deactivated successfully.
Sep 3 23:29:23.252487 systemd-logind[1851]: Session 20 logged out. Waiting for processes to exit.
Sep 3 23:29:23.253902 systemd-logind[1851]: Removed session 20.
Sep 3 23:29:28.329907 systemd[1]: Started sshd@18-10.200.20.15:22-10.200.16.10:53312.service - OpenSSH per-connection server daemon (10.200.16.10:53312).
Sep 3 23:29:28.788232 sshd[6709]: Accepted publickey for core from 10.200.16.10 port 53312 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:28.789545 sshd-session[6709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:28.793582 systemd-logind[1851]: New session 21 of user core.
Sep 3 23:29:28.797653 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 3 23:29:29.171421 sshd[6711]: Connection closed by 10.200.16.10 port 53312
Sep 3 23:29:29.171314 sshd-session[6709]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:29.174665 systemd-logind[1851]: Session 21 logged out. Waiting for processes to exit.
Sep 3 23:29:29.174816 systemd[1]: sshd@18-10.200.20.15:22-10.200.16.10:53312.service: Deactivated successfully.
Sep 3 23:29:29.177931 systemd[1]: session-21.scope: Deactivated successfully.
Sep 3 23:29:29.180897 systemd-logind[1851]: Removed session 21.
Sep 3 23:29:31.740896 containerd[1871]: time="2025-09-03T23:29:31.740676517Z" level=info msg="TaskExit event in podsandbox handler container_id:\"02b16a0f4bb078a93c316bc3e904921badf1b7041d8e4ea2aa17058adf440f5f\" id:\"26d147d310b9bb931efbed05d1f35751f9d14148734e795d397ba1172228b551\" pid:6734 exited_at:{seconds:1756942171 nanos:740343122}"
Sep 3 23:29:32.764781 containerd[1871]: time="2025-09-03T23:29:32.764725475Z" level=info msg="TaskExit event in podsandbox handler container_id:\"308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717\" id:\"f7f6dff73eed5297f58f5685382e7486a94072e172cf7c465905ddd24cb48a86\" pid:6756 exited_at:{seconds:1756942172 nanos:764378855}"
Sep 3 23:29:34.259114 systemd[1]: Started sshd@19-10.200.20.15:22-10.200.16.10:35688.service - OpenSSH per-connection server daemon (10.200.16.10:35688).
Sep 3 23:29:34.741529 sshd[6769]: Accepted publickey for core from 10.200.16.10 port 35688 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:34.742735 sshd-session[6769]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:34.746678 systemd-logind[1851]: New session 22 of user core.
Sep 3 23:29:34.751636 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 3 23:29:35.123886 sshd[6771]: Connection closed by 10.200.16.10 port 35688
Sep 3 23:29:35.124435 sshd-session[6769]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:35.127437 systemd[1]: sshd@19-10.200.20.15:22-10.200.16.10:35688.service: Deactivated successfully.
Sep 3 23:29:35.129043 systemd[1]: session-22.scope: Deactivated successfully.
Sep 3 23:29:35.129708 systemd-logind[1851]: Session 22 logged out. Waiting for processes to exit.
Sep 3 23:29:35.131268 systemd-logind[1851]: Removed session 22.
Sep 3 23:29:40.214613 systemd[1]: Started sshd@20-10.200.20.15:22-10.200.16.10:56798.service - OpenSSH per-connection server daemon (10.200.16.10:56798).
Sep 3 23:29:40.697807 sshd[6783]: Accepted publickey for core from 10.200.16.10 port 56798 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:40.699110 sshd-session[6783]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:40.703178 systemd-logind[1851]: New session 23 of user core.
Sep 3 23:29:40.711649 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 3 23:29:41.080241 sshd[6785]: Connection closed by 10.200.16.10 port 56798
Sep 3 23:29:41.080948 sshd-session[6783]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:41.084029 systemd[1]: sshd@20-10.200.20.15:22-10.200.16.10:56798.service: Deactivated successfully.
Sep 3 23:29:41.085759 systemd[1]: session-23.scope: Deactivated successfully.
Sep 3 23:29:41.086616 systemd-logind[1851]: Session 23 logged out. Waiting for processes to exit.
Sep 3 23:29:41.088287 systemd-logind[1851]: Removed session 23.
Sep 3 23:29:41.966382 containerd[1871]: time="2025-09-03T23:29:41.966345504Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e31a4f5664542daa5c4c87d8ba49fad1b32bc5ac87370951468a2ddb150232d8\" id:\"2af178b459ce7963759935b5e1041d9cb0aa3faa8cf79b96f1345f76dd8ca9b8\" pid:6810 exited_at:{seconds:1756942181 nanos:965868003}"
Sep 3 23:29:46.167722 systemd[1]: Started sshd@21-10.200.20.15:22-10.200.16.10:56808.service - OpenSSH per-connection server daemon (10.200.16.10:56808).
Sep 3 23:29:46.643896 sshd[6821]: Accepted publickey for core from 10.200.16.10 port 56808 ssh2: RSA SHA256:+LoyTczYPQZz35LneG7EaruCG6YAUVWd39QoXAwwCdw
Sep 3 23:29:46.645133 sshd-session[6821]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 3 23:29:46.648980 systemd-logind[1851]: New session 24 of user core.
Sep 3 23:29:46.657634 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 3 23:29:47.022791 sshd[6823]: Connection closed by 10.200.16.10 port 56808
Sep 3 23:29:47.023434 sshd-session[6821]: pam_unix(sshd:session): session closed for user core
Sep 3 23:29:47.026673 systemd[1]: sshd@21-10.200.20.15:22-10.200.16.10:56808.service: Deactivated successfully.
Sep 3 23:29:47.028144 systemd[1]: session-24.scope: Deactivated successfully.
Sep 3 23:29:47.029422 systemd-logind[1851]: Session 24 logged out. Waiting for processes to exit.
Sep 3 23:29:47.031073 systemd-logind[1851]: Removed session 24.
Sep 3 23:29:48.617623 containerd[1871]: time="2025-09-03T23:29:48.617577331Z" level=info msg="TaskExit event in podsandbox handler container_id:\"308696593bf52903e0c4f520b2e614b84b853ef3d5f7885c4a4a3899e5282717\" id:\"6b73ca38bca25946c613f18a33031949dbf9adff8f7e86213f467207bf4a32a4\" pid:6846 exited_at:{seconds:1756942188 nanos:617185775}"