Sep 9 23:42:15.012149 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
Sep 9 23:42:15.012166 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Tue Sep 9 22:10:22 -00 2025
Sep 9 23:42:15.012173 kernel: KASLR enabled
Sep 9 23:42:15.012177 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Sep 9 23:42:15.012182 kernel: printk: legacy bootconsole [pl11] enabled
Sep 9 23:42:15.012185 kernel: efi: EFI v2.7 by EDK II
Sep 9 23:42:15.012190 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3f20f698 RNG=0x3fd5f998 MEMRESERVE=0x3e477598
Sep 9 23:42:15.012194 kernel: random: crng init done
Sep 9 23:42:15.012198 kernel: secureboot: Secure boot disabled
Sep 9 23:42:15.012202 kernel: ACPI: Early table checksum verification disabled
Sep 9 23:42:15.012206 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL)
Sep 9 23:42:15.012210 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 23:42:15.012214 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 23:42:15.012218 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Sep 9 23:42:15.012223 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 23:42:15.012228 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 23:42:15.012232 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 23:42:15.012236 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 23:42:15.012241 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 23:42:15.012245 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 23:42:15.012250 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Sep 9 23:42:15.012254 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 23:42:15.012258 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Sep 9 23:42:15.012262 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 9 23:42:15.012266 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Sep 9 23:42:15.012271 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
Sep 9 23:42:15.012275 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
Sep 9 23:42:15.012279 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Sep 9 23:42:15.012283 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Sep 9 23:42:15.012288 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Sep 9 23:42:15.012292 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Sep 9 23:42:15.012297 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Sep 9 23:42:15.012301 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Sep 9 23:42:15.012305 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Sep 9 23:42:15.012309 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Sep 9 23:42:15.012314 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Sep 9 23:42:15.012318 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
Sep 9 23:42:15.012322 kernel: NODE_DATA(0) allocated [mem 0x1bf7fda00-0x1bf804fff]
Sep 9 23:42:15.012326 kernel: Zone ranges:
Sep 9 23:42:15.012330 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Sep 9 23:42:15.012337 kernel: DMA32 empty
Sep 9 23:42:15.012342 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Sep 9 23:42:15.012346 kernel: Device empty
Sep 9 23:42:15.012350 kernel: Movable zone start for each node
Sep 9 23:42:15.012355 kernel: Early memory node ranges
Sep 9 23:42:15.012360 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Sep 9 23:42:15.012364 kernel: node 0: [mem 0x0000000000824000-0x000000003e45ffff]
Sep 9 23:42:15.012368 kernel: node 0: [mem 0x000000003e460000-0x000000003e46ffff]
Sep 9 23:42:15.012373 kernel: node 0: [mem 0x000000003e470000-0x000000003e54ffff]
Sep 9 23:42:15.012377 kernel: node 0: [mem 0x000000003e550000-0x000000003e87ffff]
Sep 9 23:42:15.012382 kernel: node 0: [mem 0x000000003e880000-0x000000003fc7ffff]
Sep 9 23:42:15.012399 kernel: node 0: [mem 0x000000003fc80000-0x000000003fcfffff]
Sep 9 23:42:15.012404 kernel: node 0: [mem 0x000000003fd00000-0x000000003fffffff]
Sep 9 23:42:15.012408 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Sep 9 23:42:15.012413 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Sep 9 23:42:15.012417 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Sep 9 23:42:15.012421 kernel: cma: Reserved 16 MiB at 0x000000003d400000 on node -1
Sep 9 23:42:15.012427 kernel: psci: probing for conduit method from ACPI.
Sep 9 23:42:15.012431 kernel: psci: PSCIv1.1 detected in firmware.
Sep 9 23:42:15.012436 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 9 23:42:15.012440 kernel: psci: MIGRATE_INFO_TYPE not supported.
Sep 9 23:42:15.012444 kernel: psci: SMC Calling Convention v1.4
Sep 9 23:42:15.012449 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Sep 9 23:42:15.012453 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Sep 9 23:42:15.012457 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 9 23:42:15.012462 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 9 23:42:15.012466 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 9 23:42:15.012471 kernel: Detected PIPT I-cache on CPU0
Sep 9 23:42:15.012476 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
Sep 9 23:42:15.012480 kernel: CPU features: detected: GIC system register CPU interface
Sep 9 23:42:15.012485 kernel: CPU features: detected: Spectre-v4
Sep 9 23:42:15.012489 kernel: CPU features: detected: Spectre-BHB
Sep 9 23:42:15.012494 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 9 23:42:15.012498 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 9 23:42:15.012502 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
Sep 9 23:42:15.012507 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 9 23:42:15.012511 kernel: alternatives: applying boot alternatives
Sep 9 23:42:15.012516 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=fc7b279c2d918629032c01551b74c66c198cf923a976f9b3bc0d959e7c2302db
Sep 9 23:42:15.012521 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 23:42:15.012526 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 9 23:42:15.012531 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 9 23:42:15.012535 kernel: Fallback order for Node 0: 0
Sep 9 23:42:15.012540 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
Sep 9 23:42:15.012544 kernel: Policy zone: Normal
Sep 9 23:42:15.012548 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 23:42:15.012553 kernel: software IO TLB: area num 2.
Sep 9 23:42:15.012557 kernel: software IO TLB: mapped [mem 0x0000000036290000-0x000000003a290000] (64MB)
Sep 9 23:42:15.012562 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 9 23:42:15.012566 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 23:42:15.012571 kernel: rcu: RCU event tracing is enabled.
Sep 9 23:42:15.012576 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 9 23:42:15.012581 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 23:42:15.012585 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 23:42:15.012590 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 23:42:15.012594 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 9 23:42:15.012598 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 23:42:15.012603 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 23:42:15.012607 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 9 23:42:15.012612 kernel: GICv3: 960 SPIs implemented
Sep 9 23:42:15.012616 kernel: GICv3: 0 Extended SPIs implemented
Sep 9 23:42:15.012620 kernel: Root IRQ handler: gic_handle_irq
Sep 9 23:42:15.012625 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Sep 9 23:42:15.012630 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
Sep 9 23:42:15.012634 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Sep 9 23:42:15.012638 kernel: ITS: No ITS available, not enabling LPIs
Sep 9 23:42:15.012643 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 9 23:42:15.012647 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
Sep 9 23:42:15.012652 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 9 23:42:15.012656 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
Sep 9 23:42:15.012661 kernel: Console: colour dummy device 80x25
Sep 9 23:42:15.012665 kernel: printk: legacy console [tty1] enabled
Sep 9 23:42:15.012670 kernel: ACPI: Core revision 20240827
Sep 9 23:42:15.012675 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
Sep 9 23:42:15.012680 kernel: pid_max: default: 32768 minimum: 301
Sep 9 23:42:15.012685 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 9 23:42:15.012689 kernel: landlock: Up and running.
Sep 9 23:42:15.012694 kernel: SELinux: Initializing.
Sep 9 23:42:15.012698 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 23:42:15.012706 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 23:42:15.012712 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0x1a0000e, misc 0x31e1
Sep 9 23:42:15.012717 kernel: Hyper-V: Host Build 10.0.26100.1261-1-0
Sep 9 23:42:15.012721 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Sep 9 23:42:15.012726 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 23:42:15.012731 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 23:42:15.012736 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 9 23:42:15.012741 kernel: Remapping and enabling EFI services.
Sep 9 23:42:15.012746 kernel: smp: Bringing up secondary CPUs ...
Sep 9 23:42:15.012751 kernel: Detected PIPT I-cache on CPU1
Sep 9 23:42:15.012756 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Sep 9 23:42:15.012761 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
Sep 9 23:42:15.012766 kernel: smp: Brought up 1 node, 2 CPUs
Sep 9 23:42:15.012771 kernel: SMP: Total of 2 processors activated.
Sep 9 23:42:15.012776 kernel: CPU: All CPU(s) started at EL1
Sep 9 23:42:15.012780 kernel: CPU features: detected: 32-bit EL0 Support
Sep 9 23:42:15.012785 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Sep 9 23:42:15.012790 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 9 23:42:15.012795 kernel: CPU features: detected: Common not Private translations
Sep 9 23:42:15.012800 kernel: CPU features: detected: CRC32 instructions
Sep 9 23:42:15.012805 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
Sep 9 23:42:15.012810 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 9 23:42:15.012815 kernel: CPU features: detected: LSE atomic instructions
Sep 9 23:42:15.012820 kernel: CPU features: detected: Privileged Access Never
Sep 9 23:42:15.012825 kernel: CPU features: detected: Speculation barrier (SB)
Sep 9 23:42:15.012829 kernel: CPU features: detected: TLB range maintenance instructions
Sep 9 23:42:15.012834 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 9 23:42:15.012839 kernel: CPU features: detected: Scalable Vector Extension
Sep 9 23:42:15.012844 kernel: alternatives: applying system-wide alternatives
Sep 9 23:42:15.012850 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Sep 9 23:42:15.012854 kernel: SVE: maximum available vector length 16 bytes per vector
Sep 9 23:42:15.012859 kernel: SVE: default vector length 16 bytes per vector
Sep 9 23:42:15.012864 kernel: Memory: 3959668K/4194160K available (11136K kernel code, 2436K rwdata, 9060K rodata, 38912K init, 1038K bss, 213304K reserved, 16384K cma-reserved)
Sep 9 23:42:15.012869 kernel: devtmpfs: initialized
Sep 9 23:42:15.012874 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 23:42:15.012879 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 9 23:42:15.012883 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 9 23:42:15.012888 kernel: 0 pages in range for non-PLT usage
Sep 9 23:42:15.012894 kernel: 508576 pages in range for PLT usage
Sep 9 23:42:15.012898 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 23:42:15.012903 kernel: SMBIOS 3.1.0 present.
Sep 9 23:42:15.012908 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024
Sep 9 23:42:15.012913 kernel: DMI: Memory slots populated: 2/2
Sep 9 23:42:15.012918 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 23:42:15.012922 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 9 23:42:15.012927 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 9 23:42:15.012932 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 9 23:42:15.012937 kernel: audit: initializing netlink subsys (disabled)
Sep 9 23:42:15.012942 kernel: audit: type=2000 audit(0.058:1): state=initialized audit_enabled=0 res=1
Sep 9 23:42:15.012947 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 23:42:15.012952 kernel: cpuidle: using governor menu
Sep 9 23:42:15.012956 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 9 23:42:15.012961 kernel: ASID allocator initialised with 32768 entries
Sep 9 23:42:15.012966 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 23:42:15.012971 kernel: Serial: AMBA PL011 UART driver
Sep 9 23:42:15.012976 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 23:42:15.012981 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 23:42:15.012986 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 9 23:42:15.012991 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 9 23:42:15.012995 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 23:42:15.013000 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 23:42:15.013005 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 9 23:42:15.013010 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 9 23:42:15.013014 kernel: ACPI: Added _OSI(Module Device)
Sep 9 23:42:15.013019 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 23:42:15.013025 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 23:42:15.013029 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 9 23:42:15.013034 kernel: ACPI: Interpreter enabled
Sep 9 23:42:15.013039 kernel: ACPI: Using GIC for interrupt routing
Sep 9 23:42:15.013044 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Sep 9 23:42:15.013048 kernel: printk: legacy console [ttyAMA0] enabled
Sep 9 23:42:15.013053 kernel: printk: legacy bootconsole [pl11] disabled
Sep 9 23:42:15.013058 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Sep 9 23:42:15.013063 kernel: ACPI: CPU0 has been hot-added
Sep 9 23:42:15.013068 kernel: ACPI: CPU1 has been hot-added
Sep 9 23:42:15.013073 kernel: iommu: Default domain type: Translated
Sep 9 23:42:15.013078 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 9 23:42:15.013082 kernel: efivars: Registered efivars operations
Sep 9 23:42:15.013087 kernel: vgaarb: loaded
Sep 9 23:42:15.013092 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 9 23:42:15.013097 kernel: VFS: Disk quotas dquot_6.6.0
Sep 9 23:42:15.013101 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 9 23:42:15.013106 kernel: pnp: PnP ACPI init
Sep 9 23:42:15.013112 kernel: pnp: PnP ACPI: found 0 devices
Sep 9 23:42:15.013116 kernel: NET: Registered PF_INET protocol family
Sep 9 23:42:15.013121 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 9 23:42:15.013126 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 9 23:42:15.013131 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 9 23:42:15.013136 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 9 23:42:15.013141 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 9 23:42:15.013145 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 9 23:42:15.013150 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 23:42:15.013156 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 23:42:15.013161 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 9 23:42:15.013165 kernel: PCI: CLS 0 bytes, default 64
Sep 9 23:42:15.013170 kernel: kvm [1]: HYP mode not available
Sep 9 23:42:15.013175 kernel: Initialise system trusted keyrings
Sep 9 23:42:15.013179 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 9 23:42:15.013184 kernel: Key type asymmetric registered
Sep 9 23:42:15.013189 kernel: Asymmetric key parser 'x509' registered
Sep 9 23:42:15.013193 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Sep 9 23:42:15.013199 kernel: io scheduler mq-deadline registered
Sep 9 23:42:15.013204 kernel: io scheduler kyber registered
Sep 9 23:42:15.013209 kernel: io scheduler bfq registered
Sep 9 23:42:15.013213 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 9 23:42:15.013218 kernel: thunder_xcv, ver 1.0
Sep 9 23:42:15.013223 kernel: thunder_bgx, ver 1.0
Sep 9 23:42:15.013227 kernel: nicpf, ver 1.0
Sep 9 23:42:15.013232 kernel: nicvf, ver 1.0
Sep 9 23:42:15.013336 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 9 23:42:15.013396 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-09T23:42:14 UTC (1757461334)
Sep 9 23:42:15.013403 kernel: efifb: probing for efifb
Sep 9 23:42:15.013408 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Sep 9 23:42:15.013413 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Sep 9 23:42:15.013417 kernel: efifb: scrolling: redraw
Sep 9 23:42:15.013422 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Sep 9 23:42:15.013427 kernel: Console: switching to colour frame buffer device 128x48
Sep 9 23:42:15.013432 kernel: fb0: EFI VGA frame buffer device
Sep 9 23:42:15.013438 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Sep 9 23:42:15.013442 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 9 23:42:15.013447 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Sep 9 23:42:15.013452 kernel: watchdog: NMI not fully supported
Sep 9 23:42:15.013457 kernel: NET: Registered PF_INET6 protocol family
Sep 9 23:42:15.013462 kernel: watchdog: Hard watchdog permanently disabled
Sep 9 23:42:15.013466 kernel: Segment Routing with IPv6
Sep 9 23:42:15.013471 kernel: In-situ OAM (IOAM) with IPv6
Sep 9 23:42:15.013476 kernel: NET: Registered PF_PACKET protocol family
Sep 9 23:42:15.013482 kernel: Key type dns_resolver registered
Sep 9 23:42:15.013486 kernel: registered taskstats version 1
Sep 9 23:42:15.013491 kernel: Loading compiled-in X.509 certificates
Sep 9 23:42:15.013496 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 61217a1897415238555e2058a4e44c51622b0f87'
Sep 9 23:42:15.013501 kernel: Demotion targets for Node 0: null
Sep 9 23:42:15.013506 kernel: Key type .fscrypt registered
Sep 9 23:42:15.013510 kernel: Key type fscrypt-provisioning registered
Sep 9 23:42:15.013515 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 9 23:42:15.013520 kernel: ima: Allocated hash algorithm: sha1
Sep 9 23:42:15.013525 kernel: ima: No architecture policies found
Sep 9 23:42:15.013530 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 9 23:42:15.013535 kernel: clk: Disabling unused clocks
Sep 9 23:42:15.013540 kernel: PM: genpd: Disabling unused power domains
Sep 9 23:42:15.013544 kernel: Warning: unable to open an initial console.
Sep 9 23:42:15.013549 kernel: Freeing unused kernel memory: 38912K
Sep 9 23:42:15.013554 kernel: Run /init as init process
Sep 9 23:42:15.013559 kernel: with arguments:
Sep 9 23:42:15.013563 kernel: /init
Sep 9 23:42:15.013569 kernel: with environment:
Sep 9 23:42:15.013574 kernel: HOME=/
Sep 9 23:42:15.013578 kernel: TERM=linux
Sep 9 23:42:15.013583 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 9 23:42:15.013589 systemd[1]: Successfully made /usr/ read-only.
Sep 9 23:42:15.013595 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 23:42:15.013601 systemd[1]: Detected virtualization microsoft.
Sep 9 23:42:15.013607 systemd[1]: Detected architecture arm64.
Sep 9 23:42:15.013612 systemd[1]: Running in initrd.
Sep 9 23:42:15.013617 systemd[1]: No hostname configured, using default hostname.
Sep 9 23:42:15.013622 systemd[1]: Hostname set to .
Sep 9 23:42:15.013627 systemd[1]: Initializing machine ID from random generator.
Sep 9 23:42:15.013633 systemd[1]: Queued start job for default target initrd.target.
Sep 9 23:42:15.013638 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 23:42:15.013643 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 23:42:15.013649 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 9 23:42:15.013655 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 23:42:15.013660 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 9 23:42:15.013666 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 9 23:42:15.013672 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 9 23:42:15.013677 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 9 23:42:15.013682 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 23:42:15.013688 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 23:42:15.013693 systemd[1]: Reached target paths.target - Path Units.
Sep 9 23:42:15.013699 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 23:42:15.013704 systemd[1]: Reached target swap.target - Swaps.
Sep 9 23:42:15.013709 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 23:42:15.013714 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 23:42:15.013719 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 23:42:15.013724 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 9 23:42:15.013730 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 9 23:42:15.013736 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 23:42:15.013741 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 23:42:15.013746 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 23:42:15.013751 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 23:42:15.013756 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 9 23:42:15.013762 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 23:42:15.013767 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 9 23:42:15.013772 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 9 23:42:15.013778 systemd[1]: Starting systemd-fsck-usr.service...
Sep 9 23:42:15.013784 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 23:42:15.013789 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 23:42:15.013803 systemd-journald[224]: Collecting audit messages is disabled.
Sep 9 23:42:15.013817 systemd-journald[224]: Journal started
Sep 9 23:42:15.013831 systemd-journald[224]: Runtime Journal (/run/log/journal/3862794c97044a48a7515e2868f434e3) is 8M, max 78.5M, 70.5M free.
Sep 9 23:42:15.021423 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 23:42:15.026322 systemd-modules-load[226]: Inserted module 'overlay'
Sep 9 23:42:15.044395 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 9 23:42:15.044432 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 23:42:15.053589 kernel: Bridge firewalling registered
Sep 9 23:42:15.053660 systemd-modules-load[226]: Inserted module 'br_netfilter'
Sep 9 23:42:15.054317 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 9 23:42:15.067506 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 23:42:15.072230 systemd[1]: Finished systemd-fsck-usr.service.
Sep 9 23:42:15.075977 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 23:42:15.085470 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:42:15.093463 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 9 23:42:15.114834 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 23:42:15.122087 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 23:42:15.139964 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 23:42:15.160419 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 23:42:15.165985 systemd-tmpfiles[257]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 9 23:42:15.168262 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 23:42:15.176322 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 23:42:15.187406 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 23:42:15.196912 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 9 23:42:15.225526 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 23:42:15.234779 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 23:42:15.247953 dracut-cmdline[262]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=fc7b279c2d918629032c01551b74c66c198cf923a976f9b3bc0d959e7c2302db
Sep 9 23:42:15.251797 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 23:42:15.291785 systemd-resolved[263]: Positive Trust Anchors:
Sep 9 23:42:15.291798 systemd-resolved[263]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 23:42:15.291817 systemd-resolved[263]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 23:42:15.293463 systemd-resolved[263]: Defaulting to hostname 'linux'.
Sep 9 23:42:15.294698 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 23:42:15.299261 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 23:42:15.370398 kernel: SCSI subsystem initialized
Sep 9 23:42:15.375402 kernel: Loading iSCSI transport class v2.0-870.
Sep 9 23:42:15.382421 kernel: iscsi: registered transport (tcp)
Sep 9 23:42:15.395058 kernel: iscsi: registered transport (qla4xxx)
Sep 9 23:42:15.395087 kernel: QLogic iSCSI HBA Driver
Sep 9 23:42:15.407350 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 23:42:15.425291 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 23:42:15.435997 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 23:42:15.477103 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 9 23:42:15.485319 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 9 23:42:15.547407 kernel: raid6: neonx8 gen() 18563 MB/s
Sep 9 23:42:15.561391 kernel: raid6: neonx4 gen() 18553 MB/s
Sep 9 23:42:15.580391 kernel: raid6: neonx2 gen() 17093 MB/s
Sep 9 23:42:15.600392 kernel: raid6: neonx1 gen() 15145 MB/s
Sep 9 23:42:15.619397 kernel: raid6: int64x8 gen() 10770 MB/s
Sep 9 23:42:15.638392 kernel: raid6: int64x4 gen() 10678 MB/s
Sep 9 23:42:15.658395 kernel: raid6: int64x2 gen() 9019 MB/s
Sep 9 23:42:15.679159 kernel: raid6: int64x1 gen() 7092 MB/s
Sep 9 23:42:15.679176 kernel: raid6: using algorithm neonx8 gen() 18563 MB/s
Sep 9 23:42:15.700168 kernel: raid6: .... xor() 14906 MB/s, rmw enabled
Sep 9 23:42:15.700189 kernel: raid6: using neon recovery algorithm
Sep 9 23:42:15.707861 kernel: xor: measuring software checksum speed
Sep 9 23:42:15.707869 kernel: 8regs : 28771 MB/sec
Sep 9 23:42:15.711051 kernel: 32regs : 29152 MB/sec
Sep 9 23:42:15.713275 kernel: arm64_neon : 38917 MB/sec
Sep 9 23:42:15.715861 kernel: xor: using function: arm64_neon (38917 MB/sec)
Sep 9 23:42:15.752418 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 9 23:42:15.757147 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 23:42:15.765504 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 23:42:15.786796 systemd-udevd[474]: Using default interface naming scheme 'v255'.
Sep 9 23:42:15.790299 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 23:42:15.800721 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 9 23:42:15.824601 dracut-pre-trigger[483]: rd.md=0: removing MD RAID activation
Sep 9 23:42:15.843129 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 23:42:15.853454 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 23:42:15.896130 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 23:42:15.905511 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 9 23:42:15.976073 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 23:42:15.977339 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:42:15.994164 kernel: hv_vmbus: Vmbus version:5.3
Sep 9 23:42:15.994183 kernel: hv_vmbus: registering driver hid_hyperv
Sep 9 23:42:15.990310 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 23:42:16.009353 kernel: hv_vmbus: registering driver hyperv_keyboard
Sep 9 23:42:16.009390 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input0
Sep 9 23:42:16.009491 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 23:42:16.053189 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input1
Sep 9 23:42:16.053209 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
Sep 9 23:42:16.053330 kernel: pps_core: LinuxPPS API ver. 1 registered
Sep 9 23:42:16.053338 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Sep 9 23:42:16.053345 kernel: PTP clock support registered
Sep 9 23:42:16.053351 kernel: hv_vmbus: registering driver hv_netvsc
Sep 9 23:42:16.053357 kernel: hv_utils: Registering HyperV Utility Driver
Sep 9 23:42:16.053363 kernel: hv_vmbus: registering driver hv_utils
Sep 9 23:42:16.053369 kernel: hv_utils: Heartbeat IC version 3.0
Sep 9 23:42:16.053375 kernel: hv_utils: Shutdown IC version 3.2
Sep 9 23:42:16.053381 kernel: hv_utils: TimeSync IC version 4.0
Sep 9 23:42:16.053389 kernel: hv_vmbus: registering driver hv_storvsc
Sep 9 23:42:16.053395 kernel: scsi host0: storvsc_host_t
Sep 9 23:42:16.053471 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5
Sep 9 23:42:16.053486 kernel: scsi host1: storvsc_host_t
Sep 9 23:42:16.053549 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5
Sep 9 23:42:16.004035 systemd-resolved[263]: Clock change detected. Flushing caches.
Sep 9 23:42:16.016663 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 9 23:42:16.057596 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 23:42:16.057668 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:42:16.125977 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Sep 9 23:42:16.126145 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Sep 9 23:42:16.126223 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 9 23:42:16.126231 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Sep 9 23:42:16.126293 kernel: sd 0:0:0:0: [sda] Write Protect is off
Sep 9 23:42:16.126351 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Sep 9 23:42:16.126416 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Sep 9 23:42:16.126474 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#198 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 9 23:42:16.126537 kernel: hv_netvsc 000d3afc-720b-000d-3afc-720b000d3afc eth0: VF slot 1 added
Sep 9 23:42:16.126594 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#205 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 9 23:42:16.126645 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Sep 9 23:42:16.074109 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 23:42:16.136778 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 9 23:42:16.136803 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Sep 9 23:42:16.151018 kernel: hv_vmbus: registering driver hv_pci
Sep 9 23:42:16.151066 kernel: hv_pci aae76c33-df86-4cef-a881-0e2f9b70901b: PCI VMBus probing: Using version 0x10004
Sep 9 23:42:16.153091 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:42:16.170185 kernel: hv_pci aae76c33-df86-4cef-a881-0e2f9b70901b: PCI host bridge to bus df86:00
Sep 9 23:42:16.170311 kernel: pci_bus df86:00: root bus resource [mem 0xfc0000000-0xfc00fffff window]
Sep 9 23:42:16.175601 kernel: pci_bus df86:00: No busn resource found for root bus, will use [bus 00-ff]
Sep 9 23:42:16.181298 kernel: pci df86:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint
Sep 9 23:42:16.192027 kernel: pci df86:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]
Sep 9 23:42:16.192070 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#287 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Sep 9 23:42:16.194846 kernel: pci df86:00:02.0: enabling Extended Tags
Sep 9 23:42:16.221091 kernel: pci df86:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at df86:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link)
Sep 9 23:42:16.229415 kernel: pci_bus df86:00: busn_res: [bus 00-ff] end is updated to 00
Sep 9 23:42:16.229548 kernel: pci df86:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned
Sep 9 23:42:16.235172 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#267 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Sep 9 23:42:16.292049 kernel: mlx5_core df86:00:02.0: enabling device (0000 -> 0002)
Sep 9 23:42:16.299490 kernel: mlx5_core df86:00:02.0: PTM is not supported by PCIe
Sep 9 23:42:16.299645 kernel: mlx5_core df86:00:02.0: firmware version: 16.30.5006
Sep 9 23:42:16.470041 kernel: hv_netvsc 000d3afc-720b-000d-3afc-720b000d3afc eth0: VF registering: eth1
Sep 9 23:42:16.470229 kernel: mlx5_core df86:00:02.0 eth1: joined to eth0
Sep 9 23:42:16.474910 kernel: mlx5_core df86:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic)
Sep 9 23:42:16.485027 kernel: mlx5_core df86:00:02.0 enP57222s1: renamed from eth1
Sep 9 23:42:16.744076 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Sep 9 23:42:16.827462 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Sep 9 23:42:16.850622 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Sep 9 23:42:16.855530 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 9 23:42:16.870854 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Sep 9 23:42:16.882321 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Sep 9 23:42:16.891343 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 23:42:16.900882 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 23:42:16.909193 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 23:42:16.917049 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 9 23:42:16.923074 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 9 23:42:16.948234 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 23:42:16.963027 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#264 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 9 23:42:16.978948 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 9 23:42:18.003876 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#225 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 9 23:42:18.016030 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 9 23:42:18.016994 disk-uuid[660]: The operation has completed successfully.
Sep 9 23:42:18.088991 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 9 23:42:18.089095 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 9 23:42:18.107307 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 9 23:42:18.127094 sh[818]: Success
Sep 9 23:42:18.160735 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 9 23:42:18.160775 kernel: device-mapper: uevent: version 1.0.3
Sep 9 23:42:18.165420 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 9 23:42:18.175064 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Sep 9 23:42:18.565332 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 9 23:42:18.578391 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 9 23:42:18.586478 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 9 23:42:18.609015 kernel: BTRFS: device fsid 2bc16190-0dd5-44d6-b331-3d703f5a1d1f devid 1 transid 40 /dev/mapper/usr (254:0) scanned by mount (836)
Sep 9 23:42:18.618470 kernel: BTRFS info (device dm-0): first mount of filesystem 2bc16190-0dd5-44d6-b331-3d703f5a1d1f
Sep 9 23:42:18.618502 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 9 23:42:19.084763 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 9 23:42:19.084838 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 9 23:42:19.169780 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 9 23:42:19.173545 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 23:42:19.180534 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 9 23:42:19.181155 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 9 23:42:19.201561 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 9 23:42:19.232027 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (872)
Sep 9 23:42:19.243400 kernel: BTRFS info (device sda6): first mount of filesystem 3a7d3e29-58a5-4f0c-ac69-b528108338f5
Sep 9 23:42:19.243431 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 23:42:19.291271 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 23:42:19.304374 kernel: BTRFS info (device sda6): turning on async discard
Sep 9 23:42:19.304396 kernel: BTRFS info (device sda6): enabling free space tree
Sep 9 23:42:19.314036 kernel: BTRFS info (device sda6): last unmount of filesystem 3a7d3e29-58a5-4f0c-ac69-b528108338f5
Sep 9 23:42:19.314272 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 23:42:19.324391 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 9 23:42:19.330526 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 9 23:42:19.357634 systemd-networkd[1003]: lo: Link UP
Sep 9 23:42:19.357647 systemd-networkd[1003]: lo: Gained carrier
Sep 9 23:42:19.358343 systemd-networkd[1003]: Enumeration completed
Sep 9 23:42:19.360341 systemd-networkd[1003]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 23:42:19.360344 systemd-networkd[1003]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 23:42:19.360422 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 23:42:19.367951 systemd[1]: Reached target network.target - Network.
Sep 9 23:42:19.437873 kernel: mlx5_core df86:00:02.0 enP57222s1: Link up
Sep 9 23:42:19.438087 kernel: buffer_size[0]=0 is not enough for lossless buffer
Sep 9 23:42:19.475026 kernel: hv_netvsc 000d3afc-720b-000d-3afc-720b000d3afc eth0: Data path switched to VF: enP57222s1
Sep 9 23:42:19.475139 systemd-networkd[1003]: enP57222s1: Link UP
Sep 9 23:42:19.475195 systemd-networkd[1003]: eth0: Link UP
Sep 9 23:42:19.475330 systemd-networkd[1003]: eth0: Gained carrier
Sep 9 23:42:19.475343 systemd-networkd[1003]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 23:42:19.494167 systemd-networkd[1003]: enP57222s1: Gained carrier
Sep 9 23:42:19.518045 systemd-networkd[1003]: eth0: DHCPv4 address 10.200.20.16/24, gateway 10.200.20.1 acquired from 168.63.129.16
Sep 9 23:42:21.406172 systemd-networkd[1003]: eth0: Gained IPv6LL
Sep 9 23:42:21.801495 ignition[1006]: Ignition 2.21.0
Sep 9 23:42:21.801513 ignition[1006]: Stage: fetch-offline
Sep 9 23:42:21.805279 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 23:42:21.801595 ignition[1006]: no configs at "/usr/lib/ignition/base.d"
Sep 9 23:42:21.812761 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Sep 9 23:42:21.801601 ignition[1006]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 9 23:42:21.801688 ignition[1006]: parsed url from cmdline: ""
Sep 9 23:42:21.801690 ignition[1006]: no config URL provided
Sep 9 23:42:21.801694 ignition[1006]: reading system config file "/usr/lib/ignition/user.ign"
Sep 9 23:42:21.801699 ignition[1006]: no config at "/usr/lib/ignition/user.ign"
Sep 9 23:42:21.801702 ignition[1006]: failed to fetch config: resource requires networking
Sep 9 23:42:21.801930 ignition[1006]: Ignition finished successfully
Sep 9 23:42:21.837887 ignition[1016]: Ignition 2.21.0
Sep 9 23:42:21.837892 ignition[1016]: Stage: fetch
Sep 9 23:42:21.838122 ignition[1016]: no configs at "/usr/lib/ignition/base.d"
Sep 9 23:42:21.838130 ignition[1016]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 9 23:42:21.838212 ignition[1016]: parsed url from cmdline: ""
Sep 9 23:42:21.838214 ignition[1016]: no config URL provided
Sep 9 23:42:21.838218 ignition[1016]: reading system config file "/usr/lib/ignition/user.ign"
Sep 9 23:42:21.838223 ignition[1016]: no config at "/usr/lib/ignition/user.ign"
Sep 9 23:42:21.838247 ignition[1016]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Sep 9 23:42:22.017908 ignition[1016]: GET result: OK
Sep 9 23:42:22.017985 ignition[1016]: config has been read from IMDS userdata
Sep 9 23:42:22.018025 ignition[1016]: parsing config with SHA512: c383e6aac61bc57bfb3b667520ffdef2953e756c110b217e4e713e4074cff7cf491eaf9961e4d0774a902e6d73cc3e497b5a46625c7e9422f1fd03a8ffd6862e
Sep 9 23:42:22.021077 unknown[1016]: fetched base config from "system"
Sep 9 23:42:22.021288 ignition[1016]: fetch: fetch complete
Sep 9 23:42:22.021082 unknown[1016]: fetched base config from "system"
Sep 9 23:42:22.021292 ignition[1016]: fetch: fetch passed
Sep 9 23:42:22.021093 unknown[1016]: fetched user config from "azure"
Sep 9 23:42:22.021339 ignition[1016]: Ignition finished successfully
Sep 9 23:42:22.027352 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 9 23:42:22.033289 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 9 23:42:22.068532 ignition[1022]: Ignition 2.21.0
Sep 9 23:42:22.068550 ignition[1022]: Stage: kargs
Sep 9 23:42:22.068706 ignition[1022]: no configs at "/usr/lib/ignition/base.d"
Sep 9 23:42:22.068713 ignition[1022]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 9 23:42:22.072498 ignition[1022]: kargs: kargs passed
Sep 9 23:42:22.081700 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 9 23:42:22.074217 ignition[1022]: Ignition finished successfully
Sep 9 23:42:22.093679 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 9 23:42:22.129743 ignition[1029]: Ignition 2.21.0
Sep 9 23:42:22.131956 ignition[1029]: Stage: disks
Sep 9 23:42:22.135954 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 9 23:42:22.132150 ignition[1029]: no configs at "/usr/lib/ignition/base.d"
Sep 9 23:42:22.132158 ignition[1029]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 9 23:42:22.143293 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 9 23:42:22.132676 ignition[1029]: disks: disks passed
Sep 9 23:42:22.150300 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 9 23:42:22.132724 ignition[1029]: Ignition finished successfully
Sep 9 23:42:22.158394 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 23:42:22.165982 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 23:42:22.171751 systemd[1]: Reached target basic.target - Basic System.
Sep 9 23:42:22.179828 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 9 23:42:22.249758 systemd-fsck[1038]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks
Sep 9 23:42:22.257819 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 9 23:42:22.264257 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 9 23:42:24.517019 kernel: EXT4-fs (sda9): mounted filesystem 7cc0d7f3-e4a1-4dc4-8b58-ceece0d874c1 r/w with ordered data mode. Quota mode: none.
Sep 9 23:42:24.517827 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 9 23:42:24.522196 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 9 23:42:24.575442 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 23:42:24.594145 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 9 23:42:24.597770 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Sep 9 23:42:24.610183 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 9 23:42:24.610214 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 23:42:24.618118 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 9 23:42:24.626997 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 9 23:42:24.649081 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1052)
Sep 9 23:42:24.658293 kernel: BTRFS info (device sda6): first mount of filesystem 3a7d3e29-58a5-4f0c-ac69-b528108338f5
Sep 9 23:42:24.658326 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 23:42:24.667025 kernel: BTRFS info (device sda6): turning on async discard
Sep 9 23:42:24.667058 kernel: BTRFS info (device sda6): enabling free space tree
Sep 9 23:42:24.668328 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 23:42:25.268464 coreos-metadata[1054]: Sep 09 23:42:25.268 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Sep 9 23:42:25.275994 coreos-metadata[1054]: Sep 09 23:42:25.275 INFO Fetch successful
Sep 9 23:42:25.280245 coreos-metadata[1054]: Sep 09 23:42:25.276 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Sep 9 23:42:25.287949 coreos-metadata[1054]: Sep 09 23:42:25.285 INFO Fetch successful
Sep 9 23:42:25.305481 coreos-metadata[1054]: Sep 09 23:42:25.305 INFO wrote hostname ci-4426.0.0-n-c59ad9327c to /sysroot/etc/hostname
Sep 9 23:42:25.312096 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 9 23:42:25.510993 initrd-setup-root[1083]: cut: /sysroot/etc/passwd: No such file or directory
Sep 9 23:42:25.566092 initrd-setup-root[1090]: cut: /sysroot/etc/group: No such file or directory
Sep 9 23:42:25.594297 initrd-setup-root[1097]: cut: /sysroot/etc/shadow: No such file or directory
Sep 9 23:42:25.598598 initrd-setup-root[1104]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 9 23:42:26.941722 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 9 23:42:26.946851 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 9 23:42:26.962396 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 9 23:42:26.973081 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 9 23:42:26.980920 kernel: BTRFS info (device sda6): last unmount of filesystem 3a7d3e29-58a5-4f0c-ac69-b528108338f5
Sep 9 23:42:26.996961 ignition[1172]: INFO : Ignition 2.21.0
Sep 9 23:42:26.996961 ignition[1172]: INFO : Stage: mount
Sep 9 23:42:27.007583 ignition[1172]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 23:42:27.007583 ignition[1172]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 9 23:42:27.007583 ignition[1172]: INFO : mount: mount passed
Sep 9 23:42:27.007583 ignition[1172]: INFO : Ignition finished successfully
Sep 9 23:42:27.000173 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 9 23:42:27.008133 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 9 23:42:27.026113 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 9 23:42:27.039324 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 23:42:27.066335 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1183)
Sep 9 23:42:27.066366 kernel: BTRFS info (device sda6): first mount of filesystem 3a7d3e29-58a5-4f0c-ac69-b528108338f5
Sep 9 23:42:27.075570 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 23:42:27.084693 kernel: BTRFS info (device sda6): turning on async discard
Sep 9 23:42:27.084718 kernel: BTRFS info (device sda6): enabling free space tree
Sep 9 23:42:27.086200 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 23:42:27.111208 ignition[1200]: INFO : Ignition 2.21.0
Sep 9 23:42:27.111208 ignition[1200]: INFO : Stage: files
Sep 9 23:42:27.116580 ignition[1200]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 23:42:27.116580 ignition[1200]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 9 23:42:27.116580 ignition[1200]: DEBUG : files: compiled without relabeling support, skipping
Sep 9 23:42:27.132086 ignition[1200]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 9 23:42:27.132086 ignition[1200]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 9 23:42:27.181640 ignition[1200]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 9 23:42:27.187050 ignition[1200]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 9 23:42:27.192656 ignition[1200]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 9 23:42:27.187173 unknown[1200]: wrote ssh authorized keys file for user: core
Sep 9 23:42:27.253837 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 9 23:42:27.261033 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1
Sep 9 23:42:27.395509 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 9 23:42:27.661992 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 9 23:42:27.669655 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 9 23:42:27.669655 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 9 23:42:27.682579 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 23:42:27.682579 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 23:42:27.682579 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 23:42:27.682579 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 23:42:27.682579 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 23:42:27.682579 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 23:42:27.682579 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 23:42:27.682579 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 23:42:27.682579 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 23:42:27.743929 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 23:42:27.743929 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 23:42:27.743929 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1
Sep 9 23:42:28.181258 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 9 23:42:28.411846 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 23:42:28.411846 ignition[1200]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 9 23:42:28.449651 ignition[1200]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 23:42:28.464571 ignition[1200]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 23:42:28.464571 ignition[1200]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 9 23:42:28.464571 ignition[1200]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Sep 9 23:42:28.489921 ignition[1200]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Sep 9 23:42:28.489921 ignition[1200]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 23:42:28.489921 ignition[1200]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 23:42:28.489921 ignition[1200]: INFO : files: files passed
Sep 9 23:42:28.489921 ignition[1200]: INFO : Ignition finished successfully
Sep 9 23:42:28.472721 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 9 23:42:28.481589 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 9 23:42:28.509650 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 9 23:42:28.521733 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 9 23:42:28.521801 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 9 23:42:28.551152 initrd-setup-root-after-ignition[1230]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 23:42:28.551152 initrd-setup-root-after-ignition[1230]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 23:42:28.562940 initrd-setup-root-after-ignition[1234]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 23:42:28.557950 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 23:42:28.568312 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 9 23:42:28.578134 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 9 23:42:28.617399 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 9 23:42:28.618064 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 9 23:42:28.625669 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 9 23:42:28.634324 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 9 23:42:28.641467 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 9 23:42:28.642115 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 9 23:42:28.673184 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 23:42:28.684915 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 9 23:42:28.707842 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 9 23:42:28.712341 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 23:42:28.720571 systemd[1]: Stopped target timers.target - Timer Units.
Sep 9 23:42:28.728404 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 9 23:42:28.728499 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 23:42:28.739425 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 9 23:42:28.743301 systemd[1]: Stopped target basic.target - Basic System.
Sep 9 23:42:28.750670 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 9 23:42:28.758188 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 23:42:28.765776 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 9 23:42:28.774169 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 23:42:28.782562 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 9 23:42:28.790755 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 23:42:28.799446 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 9 23:42:28.807240 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 9 23:42:28.815161 systemd[1]: Stopped target swap.target - Swaps.
Sep 9 23:42:28.821813 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 9 23:42:28.821932 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 23:42:28.831565 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 9 23:42:28.835943 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 23:42:28.843608 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 9 23:42:28.843675 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 23:42:28.852337 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 9 23:42:28.852430 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 9 23:42:28.864590 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 9 23:42:28.864667 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 23:42:28.869643 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 9 23:42:28.869709 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 9 23:42:28.876584 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Sep 9 23:42:28.938085 ignition[1254]: INFO : Ignition 2.21.0
Sep 9 23:42:28.938085 ignition[1254]: INFO : Stage: umount
Sep 9 23:42:28.938085 ignition[1254]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 23:42:28.938085 ignition[1254]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 9 23:42:28.938085 ignition[1254]: INFO : umount: umount passed
Sep 9 23:42:28.938085 ignition[1254]: INFO : Ignition finished successfully
Sep 9 23:42:28.876645 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 9 23:42:28.886877 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 9 23:42:28.898847 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 9 23:42:28.898960 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 23:42:28.915805 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 9 23:42:28.924999 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 9 23:42:28.925127 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 23:42:28.934459 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 9 23:42:28.934578 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 23:42:28.946605 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 9 23:42:28.947574 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 9 23:42:28.953620 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 9 23:42:28.953689 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 9 23:42:28.959373 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 9 23:42:28.959421 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 9 23:42:28.967289 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 9 23:42:28.967322 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 9 23:42:28.973755 systemd[1]: Stopped target network.target - Network.
Sep 9 23:42:28.980044 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 9 23:42:28.980088 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 23:42:28.989117 systemd[1]: Stopped target paths.target - Path Units.
Sep 9 23:42:28.995942 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 9 23:42:28.999369 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 23:42:29.004487 systemd[1]: Stopped target slices.target - Slice Units.
Sep 9 23:42:29.013230 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 9 23:42:29.024967 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 9 23:42:29.025032 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 23:42:29.032400 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 9 23:42:29.032436 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 23:42:29.041540 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 9 23:42:29.041598 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 9 23:42:29.049135 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 9 23:42:29.049173 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 9 23:42:29.059842 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 9 23:42:29.068580 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 9 23:42:29.088610 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 9 23:42:29.089112 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 9 23:42:29.089199 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 9 23:42:29.100565 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 9 23:42:29.100735 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 9 23:42:29.276763 kernel: hv_netvsc 000d3afc-720b-000d-3afc-720b000d3afc eth0: Data path switched from VF: enP57222s1
Sep 9 23:42:29.100812 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 9 23:42:29.112376 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 9 23:42:29.112541 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 9 23:42:29.114022 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 9 23:42:29.121923 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 9 23:42:29.128689 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 9 23:42:29.128730 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 23:42:29.145176 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 9 23:42:29.151224 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 9 23:42:29.151279 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 23:42:29.161468 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 9 23:42:29.161517 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 9 23:42:29.174192 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 9 23:42:29.174238 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 9 23:42:29.181396 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 9 23:42:29.181445 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 23:42:29.193058 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 23:42:29.203888 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 9 23:42:29.203936 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 9 23:42:29.223139 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 9 23:42:29.225035 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 23:42:29.232013 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 9 23:42:29.232044 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 9 23:42:29.239651 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 9 23:42:29.239671 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 23:42:29.248536 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 9 23:42:29.248587 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 23:42:29.259599 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 9 23:42:29.259640 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 9 23:42:29.276841 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 9 23:42:29.276895 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 23:42:29.291172 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 9 23:42:29.297980 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 9 23:42:29.298068 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 23:42:29.310279 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 9 23:42:29.310323 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 23:42:29.323076 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 23:42:29.323124 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:42:29.334554 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 9 23:42:29.334595 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 9 23:42:29.505717 systemd-journald[224]: Received SIGTERM from PID 1 (systemd).
Sep 9 23:42:29.334622 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 9 23:42:29.334860 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 9 23:42:29.334965 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 9 23:42:29.349797 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 9 23:42:29.349862 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 9 23:42:29.357470 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 9 23:42:29.357567 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 9 23:42:29.365574 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 9 23:42:29.367029 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 9 23:42:29.372748 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 9 23:42:29.380497 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 9 23:42:29.416509 systemd[1]: Switching root.
Sep 9 23:42:29.550991 systemd-journald[224]: Journal stopped
Sep 9 23:42:38.046537 kernel: SELinux: policy capability network_peer_controls=1
Sep 9 23:42:38.046555 kernel: SELinux: policy capability open_perms=1
Sep 9 23:42:38.046562 kernel: SELinux: policy capability extended_socket_class=1
Sep 9 23:42:38.046568 kernel: SELinux: policy capability always_check_network=0
Sep 9 23:42:38.046574 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 9 23:42:38.046579 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 9 23:42:38.046586 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 9 23:42:38.046591 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 9 23:42:38.046597 kernel: SELinux: policy capability userspace_initial_context=0
Sep 9 23:42:38.046602 kernel: audit: type=1403 audit(1757461350.920:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 9 23:42:38.046609 systemd[1]: Successfully loaded SELinux policy in 244.772ms.
Sep 9 23:42:38.046617 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.261ms.
Sep 9 23:42:38.046624 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 23:42:38.046630 systemd[1]: Detected virtualization microsoft.
Sep 9 23:42:38.046636 systemd[1]: Detected architecture arm64.
Sep 9 23:42:38.046643 systemd[1]: Detected first boot.
Sep 9 23:42:38.046650 systemd[1]: Hostname set to .
Sep 9 23:42:38.046656 systemd[1]: Initializing machine ID from random generator.
Sep 9 23:42:38.046662 zram_generator::config[1296]: No configuration found.
Sep 9 23:42:38.046668 kernel: NET: Registered PF_VSOCK protocol family
Sep 9 23:42:38.046674 systemd[1]: Populated /etc with preset unit settings.
Sep 9 23:42:38.046682 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 9 23:42:38.046689 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 9 23:42:38.046694 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 9 23:42:38.046700 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 9 23:42:38.046706 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 9 23:42:38.046712 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 9 23:42:38.046718 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 9 23:42:38.046724 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 9 23:42:38.046731 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 9 23:42:38.046737 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 9 23:42:38.046744 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 9 23:42:38.046750 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 9 23:42:38.046756 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 23:42:38.046762 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 23:42:38.046768 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 9 23:42:38.046774 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 9 23:42:38.046780 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 9 23:42:38.046787 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 23:42:38.046794 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Sep 9 23:42:38.046801 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 23:42:38.046807 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 23:42:38.046814 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 9 23:42:38.046821 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 9 23:42:38.046827 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 9 23:42:38.046834 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 9 23:42:38.046840 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 23:42:38.046846 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 23:42:38.046853 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 23:42:38.046859 systemd[1]: Reached target swap.target - Swaps.
Sep 9 23:42:38.046865 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 9 23:42:38.046871 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 9 23:42:38.046879 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 9 23:42:38.046885 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 23:42:38.046892 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 23:42:38.046898 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 23:42:38.046904 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 9 23:42:38.046911 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 9 23:42:38.046918 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 9 23:42:38.046924 systemd[1]: Mounting media.mount - External Media Directory...
Sep 9 23:42:38.046930 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 9 23:42:38.046937 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 9 23:42:38.046943 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 9 23:42:38.046950 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 9 23:42:38.046956 systemd[1]: Reached target machines.target - Containers.
Sep 9 23:42:38.046962 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 9 23:42:38.046970 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 23:42:38.046976 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 23:42:38.046982 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 9 23:42:38.046988 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 23:42:38.046995 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 23:42:38.047030 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 23:42:38.047037 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 9 23:42:38.047044 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 23:42:38.047050 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 9 23:42:38.047058 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 9 23:42:38.047064 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 9 23:42:38.047071 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 9 23:42:38.047077 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 9 23:42:38.047083 kernel: fuse: init (API version 7.41)
Sep 9 23:42:38.047089 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 23:42:38.047096 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 23:42:38.047102 kernel: loop: module loaded
Sep 9 23:42:38.047109 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 23:42:38.047116 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 23:42:38.047122 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 9 23:42:38.047128 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 9 23:42:38.047134 kernel: ACPI: bus type drm_connector registered
Sep 9 23:42:38.047157 systemd-journald[1379]: Collecting audit messages is disabled.
Sep 9 23:42:38.047171 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 23:42:38.047178 systemd-journald[1379]: Journal started
Sep 9 23:42:38.047193 systemd-journald[1379]: Runtime Journal (/run/log/journal/14e625ec501241be89463dbbed233224) is 8M, max 78.5M, 70.5M free.
Sep 9 23:42:37.253231 systemd[1]: Queued start job for default target multi-user.target.
Sep 9 23:42:37.260403 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 9 23:42:37.260759 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 9 23:42:37.260993 systemd[1]: systemd-journald.service: Consumed 2.153s CPU time.
Sep 9 23:42:38.059037 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 9 23:42:38.059078 systemd[1]: Stopped verity-setup.service.
Sep 9 23:42:38.072837 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 23:42:38.073406 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 9 23:42:38.078033 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 9 23:42:38.082349 systemd[1]: Mounted media.mount - External Media Directory.
Sep 9 23:42:38.086136 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 9 23:42:38.090334 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 9 23:42:38.095335 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 9 23:42:38.099270 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 9 23:42:38.103988 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 23:42:38.109510 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 9 23:42:38.109638 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 9 23:42:38.114332 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 23:42:38.114465 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 23:42:38.118706 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 23:42:38.118828 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 23:42:38.122854 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 23:42:38.122956 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 23:42:38.127765 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 9 23:42:38.127873 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 9 23:42:38.132264 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 23:42:38.132383 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 23:42:38.136434 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 23:42:38.141404 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 23:42:38.146897 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 9 23:42:38.160609 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 23:42:38.165672 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 9 23:42:38.175533 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 9 23:42:38.182480 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 9 23:42:38.182506 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 23:42:38.186882 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 9 23:42:38.193113 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 9 23:42:38.198604 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 23:42:38.214919 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 9 23:42:38.229535 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 9 23:42:38.234168 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 23:42:38.234829 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 9 23:42:38.239229 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 23:42:38.239841 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 23:42:38.246111 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 9 23:42:38.251351 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 9 23:42:38.256433 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 9 23:42:38.263116 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 23:42:38.268328 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 9 23:42:38.273081 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 9 23:42:38.280993 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 9 23:42:38.289948 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 9 23:42:38.299207 systemd-journald[1379]: Time spent on flushing to /var/log/journal/14e625ec501241be89463dbbed233224 is 9.182ms for 941 entries.
Sep 9 23:42:38.299207 systemd-journald[1379]: System Journal (/var/log/journal/14e625ec501241be89463dbbed233224) is 8M, max 2.6G, 2.6G free.
Sep 9 23:42:38.341295 systemd-journald[1379]: Received client request to flush runtime journal.
Sep 9 23:42:38.341342 kernel: loop0: detected capacity change from 0 to 119320
Sep 9 23:42:38.299198 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 9 23:42:38.342433 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 9 23:42:38.375810 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 9 23:42:38.376362 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 9 23:42:38.398171 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 23:42:39.022030 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 9 23:42:39.043035 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 9 23:42:39.048215 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 23:42:39.079050 kernel: loop1: detected capacity change from 0 to 29264
Sep 9 23:42:39.256634 systemd-tmpfiles[1452]: ACLs are not supported, ignoring.
Sep 9 23:42:39.256652 systemd-tmpfiles[1452]: ACLs are not supported, ignoring.
Sep 9 23:42:39.260361 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 23:42:39.731039 kernel: loop2: detected capacity change from 0 to 100608
Sep 9 23:42:40.261026 kernel: loop3: detected capacity change from 0 to 207008
Sep 9 23:42:40.309051 kernel: loop4: detected capacity change from 0 to 119320
Sep 9 23:42:40.319022 kernel: loop5: detected capacity change from 0 to 29264
Sep 9 23:42:40.328628 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 9 23:42:40.333135 kernel: loop6: detected capacity change from 0 to 100608
Sep 9 23:42:40.336231 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 23:42:40.342759 kernel: loop7: detected capacity change from 0 to 207008
Sep 9 23:42:40.353038 (sd-merge)[1460]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Sep 9 23:42:40.353649 (sd-merge)[1460]: Merged extensions into '/usr'.
Sep 9 23:42:40.356602 systemd[1]: Reload requested from client PID 1433 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 9 23:42:40.356696 systemd[1]: Reloading...
Sep 9 23:42:40.362713 systemd-udevd[1462]: Using default interface naming scheme 'v255'.
Sep 9 23:42:40.408029 zram_generator::config[1487]: No configuration found.
Sep 9 23:42:40.609491 systemd[1]: Reloading finished in 252 ms.
Sep 9 23:42:40.633052 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 9 23:42:40.646809 systemd[1]: Starting ensure-sysext.service...
Sep 9 23:42:40.652110 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 23:42:40.689663 systemd[1]: Reload requested from client PID 1543 ('systemctl') (unit ensure-sysext.service)...
Sep 9 23:42:40.689775 systemd[1]: Reloading...
Sep 9 23:42:40.693146 systemd-tmpfiles[1544]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 9 23:42:40.693168 systemd-tmpfiles[1544]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 9 23:42:40.693378 systemd-tmpfiles[1544]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 9 23:42:40.693515 systemd-tmpfiles[1544]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 9 23:42:40.693934 systemd-tmpfiles[1544]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 9 23:42:40.694190 systemd-tmpfiles[1544]: ACLs are not supported, ignoring.
Sep 9 23:42:40.694219 systemd-tmpfiles[1544]: ACLs are not supported, ignoring.
Sep 9 23:42:40.733044 zram_generator::config[1569]: No configuration found.
Sep 9 23:42:40.752935 systemd-tmpfiles[1544]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 23:42:40.752946 systemd-tmpfiles[1544]: Skipping /boot
Sep 9 23:42:40.759040 systemd-tmpfiles[1544]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 23:42:40.759051 systemd-tmpfiles[1544]: Skipping /boot
Sep 9 23:42:40.870971 systemd[1]: Reloading finished in 180 ms.
Sep 9 23:42:40.890947 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 23:42:40.906134 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 23:42:40.943773 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 9 23:42:40.957158 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 9 23:42:40.964581 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 23:42:40.972397 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 9 23:42:40.980483 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 23:42:40.981631 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 23:42:40.987488 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 23:42:40.994601 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 23:42:41.000369 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 23:42:41.000533 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 23:42:41.001207 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 23:42:41.001353 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 23:42:41.005857 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 23:42:41.006553 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 23:42:41.011475 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 23:42:41.011602 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 23:42:41.023437 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 9 23:42:41.030354 systemd[1]: Expecting device dev-ptp_hyperv.device - /dev/ptp_hyperv...
Sep 9 23:42:41.035254 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 23:42:41.036237 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 23:42:41.047178 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 23:42:41.052943 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 23:42:41.059448 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 23:42:41.064577 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 23:42:41.064668 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 23:42:41.064769 systemd[1]: Reached target time-set.target - System Time Set.
Sep 9 23:42:41.069754 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 23:42:41.069897 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 23:42:41.074354 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 23:42:41.074463 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 23:42:41.079291 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 23:42:41.079408 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 23:42:41.084205 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 23:42:41.084315 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 23:42:41.090718 systemd[1]: Finished ensure-sysext.service.
Sep 9 23:42:41.095692 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 23:42:41.095756 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 23:42:41.096884 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 9 23:42:41.143805 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 9 23:42:41.247676 systemd-resolved[1632]: Positive Trust Anchors:
Sep 9 23:42:41.247981 systemd-resolved[1632]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 23:42:41.248050 systemd-resolved[1632]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 23:42:41.250980 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 23:42:41.262162 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 23:42:41.330099 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Sep 9 23:42:41.341243 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 9 23:42:41.359708 augenrules[1707]: No rules
Sep 9 23:42:41.361199 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 23:42:41.361386 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 23:42:41.386373 systemd-resolved[1632]: Using system hostname 'ci-4426.0.0-n-c59ad9327c'. Sep 9 23:42:41.387919 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 23:42:41.392820 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 23:42:41.460854 systemd-networkd[1684]: lo: Link UP Sep 9 23:42:41.460862 systemd-networkd[1684]: lo: Gained carrier Sep 9 23:42:41.464291 systemd-networkd[1684]: Enumeration completed Sep 9 23:42:41.464975 systemd-networkd[1684]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 23:42:41.464978 systemd-networkd[1684]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 23:42:41.468019 kernel: mousedev: PS/2 mouse device common for all mice Sep 9 23:42:41.468073 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#239 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Sep 9 23:42:41.468203 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 23:42:41.474046 systemd[1]: Reached target network.target - Network. Sep 9 23:42:41.485144 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 9 23:42:41.491838 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 9 23:42:41.500660 systemd[1]: Condition check resulted in dev-ptp_hyperv.device - /dev/ptp_hyperv being skipped. Sep 9 23:42:41.514193 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 23:42:41.542033 kernel: hv_vmbus: registering driver hv_balloon Sep 9 23:42:41.542091 kernel: hv_vmbus: registering driver hyperv_fb Sep 9 23:42:41.542123 kernel: mlx5_core df86:00:02.0 enP57222s1: Link up Sep 9 23:42:41.547126 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Sep 9 23:42:41.549080 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 23:42:41.550052 kernel: buffer_size[0]=0 is not enough for lossless buffer Sep 9 23:42:41.558141 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 23:42:41.564036 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0 Sep 9 23:42:41.577031 kernel: hv_balloon: Memory hot add disabled on ARM64 Sep 9 23:42:41.577090 kernel: hv_netvsc 000d3afc-720b-000d-3afc-720b000d3afc eth0: Data path switched to VF: enP57222s1 Sep 9 23:42:41.577249 kernel: hyperv_fb: Synthvid Version major 3, minor 5 Sep 9 23:42:41.583016 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608 Sep 9 23:42:41.584073 systemd-networkd[1684]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 23:42:41.587449 systemd-networkd[1684]: enP57222s1: Link UP Sep 9 23:42:41.587886 systemd-networkd[1684]: eth0: Link UP Sep 9 23:42:41.588027 systemd-networkd[1684]: eth0: Gained carrier Sep 9 23:42:41.588105 systemd-networkd[1684]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 23:42:41.590854 kernel: Console: switching to colour dummy device 80x25 Sep 9 23:42:41.597762 kernel: Console: switching to colour frame buffer device 128x48 Sep 9 23:42:41.605274 systemd-networkd[1684]: enP57222s1: Gained carrier Sep 9 23:42:41.612077 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 9 23:42:41.622168 systemd-networkd[1684]: eth0: DHCPv4 address 10.200.20.16/24, gateway 10.200.20.1 acquired from 168.63.129.16 Sep 9 23:42:41.624431 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 23:42:41.624603 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 9 23:42:41.632486 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 9 23:42:41.635931 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 23:42:41.705175 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Sep 9 23:42:41.710799 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 9 23:42:41.759039 kernel: MACsec IEEE 802.1AE Sep 9 23:42:41.801223 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 9 23:42:42.782167 systemd-networkd[1684]: eth0: Gained IPv6LL Sep 9 23:42:42.784277 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 9 23:42:42.789259 systemd[1]: Reached target network-online.target - Network is Online. Sep 9 23:42:42.940131 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 23:42:43.883408 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 9 23:42:43.888629 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 9 23:42:47.435029 ldconfig[1428]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 9 23:42:47.464290 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 9 23:42:47.470765 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 9 23:42:47.499019 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 9 23:42:47.503542 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 23:42:47.507941 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
Sep 9 23:42:47.512539 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 9 23:42:47.517337 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 9 23:42:47.521532 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 9 23:42:47.526456 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 9 23:42:47.531186 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 9 23:42:47.531210 systemd[1]: Reached target paths.target - Path Units. Sep 9 23:42:47.534509 systemd[1]: Reached target timers.target - Timer Units. Sep 9 23:42:47.571956 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 9 23:42:47.577535 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 9 23:42:47.582563 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 9 23:42:47.587918 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 9 23:42:47.592840 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 9 23:42:47.598639 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 9 23:42:47.602912 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 9 23:42:47.607765 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 9 23:42:47.611806 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 23:42:47.615430 systemd[1]: Reached target basic.target - Basic System. Sep 9 23:42:47.618824 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. 
Sep 9 23:42:47.618844 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 9 23:42:47.633985 systemd[1]: Starting chronyd.service - NTP client/server... Sep 9 23:42:47.645104 systemd[1]: Starting containerd.service - containerd container runtime... Sep 9 23:42:47.650993 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 9 23:42:47.662318 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 9 23:42:47.668042 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 9 23:42:47.674823 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 9 23:42:47.683818 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 9 23:42:47.687896 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 9 23:42:47.690136 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon. Sep 9 23:42:47.694492 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss). Sep 9 23:42:47.695326 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 23:42:47.701209 jq[1840]: false Sep 9 23:42:47.701876 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 9 23:42:47.707144 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 9 23:42:47.711813 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 9 23:42:47.716570 KVP[1842]: KVP starting; pid is:1842 Sep 9 23:42:47.719630 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... 
Sep 9 23:42:47.721108 chronyd[1832]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Sep 9 23:42:47.725303 KVP[1842]: KVP LIC Version: 3.1 Sep 9 23:42:47.728030 kernel: hv_utils: KVP IC version 4.0 Sep 9 23:42:47.728583 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 9 23:42:47.735138 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 9 23:42:47.739718 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 9 23:42:47.740110 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 9 23:42:47.741322 systemd[1]: Starting update-engine.service - Update Engine... Sep 9 23:42:47.751795 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 9 23:42:47.761074 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 9 23:42:47.764451 extend-filesystems[1841]: Found /dev/sda6 Sep 9 23:42:47.782103 jq[1857]: true Sep 9 23:42:47.775328 chronyd[1832]: Timezone right/UTC failed leap second check, ignoring Sep 9 23:42:47.767567 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 9 23:42:47.775476 chronyd[1832]: Loaded seccomp filter (level 2) Sep 9 23:42:47.767741 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 9 23:42:47.771634 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 9 23:42:47.773014 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 9 23:42:47.782967 systemd[1]: Started chronyd.service - NTP client/server. 
Sep 9 23:42:47.787409 extend-filesystems[1841]: Found /dev/sda9 Sep 9 23:42:47.801901 extend-filesystems[1841]: Checking size of /dev/sda9 Sep 9 23:42:47.800779 systemd[1]: motdgen.service: Deactivated successfully. Sep 9 23:42:47.801653 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 9 23:42:47.816742 (ntainerd)[1873]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 9 23:42:47.820955 jq[1869]: true Sep 9 23:42:47.824412 update_engine[1855]: I20250909 23:42:47.824277 1855 main.cc:92] Flatcar Update Engine starting Sep 9 23:42:47.830052 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 9 23:42:47.845150 tar[1863]: linux-arm64/LICENSE Sep 9 23:42:47.845150 tar[1863]: linux-arm64/helm Sep 9 23:42:47.851895 extend-filesystems[1841]: Old size kept for /dev/sda9 Sep 9 23:42:47.852622 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 9 23:42:47.852862 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 9 23:42:47.878247 systemd-logind[1853]: New seat seat0. Sep 9 23:42:47.879854 systemd-logind[1853]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard) Sep 9 23:42:47.880060 systemd[1]: Started systemd-logind.service - User Login Management. Sep 9 23:42:48.088437 bash[1911]: Updated "/home/core/.ssh/authorized_keys" Sep 9 23:42:48.092968 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 9 23:42:48.100134 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 9 23:42:48.176043 dbus-daemon[1838]: [system] SELinux support is enabled Sep 9 23:42:48.176471 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
Sep 9 23:42:48.184525 update_engine[1855]: I20250909 23:42:48.184474 1855 update_check_scheduler.cc:74] Next update check in 6m54s Sep 9 23:42:48.185717 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 9 23:42:48.185742 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 9 23:42:48.187401 dbus-daemon[1838]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 9 23:42:48.196371 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 9 23:42:48.196392 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 9 23:42:48.202802 systemd[1]: Started update-engine.service - Update Engine. Sep 9 23:42:48.211309 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 9 23:42:48.266892 sshd_keygen[1883]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 9 23:42:48.295210 coreos-metadata[1834]: Sep 09 23:42:48.295 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Sep 9 23:42:48.299784 coreos-metadata[1834]: Sep 09 23:42:48.299 INFO Fetch successful Sep 9 23:42:48.300472 coreos-metadata[1834]: Sep 09 23:42:48.299 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1 Sep 9 23:42:48.304667 coreos-metadata[1834]: Sep 09 23:42:48.304 INFO Fetch successful Sep 9 23:42:48.304667 coreos-metadata[1834]: Sep 09 23:42:48.304 INFO Fetching http://168.63.129.16/machine/ed155f9a-3cfa-4347-9d3d-16e8ec3646f2/6d9d93ae%2D5da1%2D43d1%2D824e%2D33091dbd511f.%5Fci%2D4426.0.0%2Dn%2Dc59ad9327c?comp=config&type=sharedConfig&incarnation=1: Attempt #1 Sep 9 23:42:48.305295 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. 
Sep 9 23:42:48.310149 coreos-metadata[1834]: Sep 09 23:42:48.310 INFO Fetch successful Sep 9 23:42:48.310667 coreos-metadata[1834]: Sep 09 23:42:48.310 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1 Sep 9 23:42:48.319662 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 9 23:42:48.321395 coreos-metadata[1834]: Sep 09 23:42:48.319 INFO Fetch successful Sep 9 23:42:48.328595 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... Sep 9 23:42:48.364761 systemd[1]: issuegen.service: Deactivated successfully. Sep 9 23:42:48.365253 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 9 23:42:48.374293 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 9 23:42:48.384159 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. Sep 9 23:42:48.390185 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 9 23:42:48.399033 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 9 23:42:48.406184 tar[1863]: linux-arm64/README.md Sep 9 23:42:48.419673 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 9 23:42:48.427299 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 9 23:42:48.437125 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 9 23:42:48.444172 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Sep 9 23:42:48.452529 systemd[1]: Reached target getty.target - Login Prompts. Sep 9 23:42:48.483580 locksmithd[1976]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 9 23:42:48.610854 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 9 23:42:48.615779 (kubelet)[2022]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 23:42:48.657030 containerd[1873]: time="2025-09-09T23:42:48Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 9 23:42:48.658027 containerd[1873]: time="2025-09-09T23:42:48.657779276Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 9 23:42:48.664601 containerd[1873]: time="2025-09-09T23:42:48.664453228Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.624µs" Sep 9 23:42:48.664601 containerd[1873]: time="2025-09-09T23:42:48.664481692Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 9 23:42:48.664601 containerd[1873]: time="2025-09-09T23:42:48.664496188Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 9 23:42:48.665035 containerd[1873]: time="2025-09-09T23:42:48.664993908Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 9 23:42:48.665075 containerd[1873]: time="2025-09-09T23:42:48.665036100Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 9 23:42:48.665075 containerd[1873]: time="2025-09-09T23:42:48.665060420Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 23:42:48.665132 containerd[1873]: time="2025-09-09T23:42:48.665115844Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 23:42:48.665132 containerd[1873]: 
time="2025-09-09T23:42:48.665128788Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 23:42:48.665703 containerd[1873]: time="2025-09-09T23:42:48.665681652Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 23:42:48.665724 containerd[1873]: time="2025-09-09T23:42:48.665702132Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 23:42:48.665724 containerd[1873]: time="2025-09-09T23:42:48.665712652Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 23:42:48.665724 containerd[1873]: time="2025-09-09T23:42:48.665718196Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 9 23:42:48.665828 containerd[1873]: time="2025-09-09T23:42:48.665811468Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 9 23:42:48.666456 containerd[1873]: time="2025-09-09T23:42:48.666430020Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 23:42:48.666479 containerd[1873]: time="2025-09-09T23:42:48.666469036Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 23:42:48.666479 containerd[1873]: time="2025-09-09T23:42:48.666476852Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 9 23:42:48.666952 containerd[1873]: time="2025-09-09T23:42:48.666498300Z" 
level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 9 23:42:48.667612 containerd[1873]: time="2025-09-09T23:42:48.667592260Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 9 23:42:48.667683 containerd[1873]: time="2025-09-09T23:42:48.667668004Z" level=info msg="metadata content store policy set" policy=shared Sep 9 23:42:48.703383 containerd[1873]: time="2025-09-09T23:42:48.703336308Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 9 23:42:48.703680 containerd[1873]: time="2025-09-09T23:42:48.703555732Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 9 23:42:48.703680 containerd[1873]: time="2025-09-09T23:42:48.703573836Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 9 23:42:48.703680 containerd[1873]: time="2025-09-09T23:42:48.703582444Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 9 23:42:48.704268 containerd[1873]: time="2025-09-09T23:42:48.704065948Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 9 23:42:48.704268 containerd[1873]: time="2025-09-09T23:42:48.704096204Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 9 23:42:48.704268 containerd[1873]: time="2025-09-09T23:42:48.704115764Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 9 23:42:48.704268 containerd[1873]: time="2025-09-09T23:42:48.704125900Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 9 23:42:48.704268 containerd[1873]: time="2025-09-09T23:42:48.704134396Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 9 23:42:48.704268 containerd[1873]: time="2025-09-09T23:42:48.704140884Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 9 23:42:48.704268 containerd[1873]: time="2025-09-09T23:42:48.704147276Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 9 23:42:48.704268 containerd[1873]: time="2025-09-09T23:42:48.704173116Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 9 23:42:48.704868 containerd[1873]: time="2025-09-09T23:42:48.704725660Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 9 23:42:48.704868 containerd[1873]: time="2025-09-09T23:42:48.704775028Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 9 23:42:48.704868 containerd[1873]: time="2025-09-09T23:42:48.704797108Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 9 23:42:48.704868 containerd[1873]: time="2025-09-09T23:42:48.704805300Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 9 23:42:48.704868 containerd[1873]: time="2025-09-09T23:42:48.704813420Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 9 23:42:48.704868 containerd[1873]: time="2025-09-09T23:42:48.704820476Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 9 23:42:48.704868 containerd[1873]: time="2025-09-09T23:42:48.704827980Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 9 23:42:48.704868 containerd[1873]: time="2025-09-09T23:42:48.704834796Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 9 
23:42:48.704868 containerd[1873]: time="2025-09-09T23:42:48.704842244Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 9 23:42:48.704868 containerd[1873]: time="2025-09-09T23:42:48.704848836Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 9 23:42:48.704868 containerd[1873]: time="2025-09-09T23:42:48.704854876Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 9 23:42:48.705366 containerd[1873]: time="2025-09-09T23:42:48.705124356Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 9 23:42:48.705366 containerd[1873]: time="2025-09-09T23:42:48.705337700Z" level=info msg="Start snapshots syncer" Sep 9 23:42:48.705580 containerd[1873]: time="2025-09-09T23:42:48.705403524Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 9 23:42:48.707039 containerd[1873]: time="2025-09-09T23:42:48.706030076Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 9 23:42:48.707039 containerd[1873]: time="2025-09-09T23:42:48.706080692Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 9 23:42:48.707157 containerd[1873]: time="2025-09-09T23:42:48.706138532Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 9 23:42:48.707157 containerd[1873]: time="2025-09-09T23:42:48.706261372Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 9 23:42:48.707157 containerd[1873]: time="2025-09-09T23:42:48.706278348Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 9 23:42:48.707157 containerd[1873]: time="2025-09-09T23:42:48.706291708Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 9 23:42:48.707157 containerd[1873]: time="2025-09-09T23:42:48.706299204Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 9 23:42:48.707157 containerd[1873]: time="2025-09-09T23:42:48.706306892Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 9 23:42:48.707157 containerd[1873]: time="2025-09-09T23:42:48.706313284Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 9 23:42:48.707157 containerd[1873]: time="2025-09-09T23:42:48.706319948Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 9 23:42:48.707157 containerd[1873]: time="2025-09-09T23:42:48.706337684Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 9 23:42:48.707157 containerd[1873]: time="2025-09-09T23:42:48.706345004Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 9 23:42:48.707157 containerd[1873]: time="2025-09-09T23:42:48.706351420Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 9 23:42:48.707157 containerd[1873]: time="2025-09-09T23:42:48.706373396Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 9 23:42:48.707157 containerd[1873]: time="2025-09-09T23:42:48.706383068Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 9 23:42:48.707157 containerd[1873]: time="2025-09-09T23:42:48.706388444Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 9 23:42:48.707317 containerd[1873]: time="2025-09-09T23:42:48.706394276Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 9 23:42:48.707317 containerd[1873]: time="2025-09-09T23:42:48.706400252Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 9 23:42:48.707317 containerd[1873]: time="2025-09-09T23:42:48.706408652Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Sep 9 23:42:48.707317 containerd[1873]: time="2025-09-09T23:42:48.706417484Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Sep 9 23:42:48.707317 containerd[1873]: time="2025-09-09T23:42:48.706431028Z" level=info msg="runtime interface created"
Sep 9 23:42:48.707317 containerd[1873]: time="2025-09-09T23:42:48.706434116Z" level=info msg="created NRI interface"
Sep 9 23:42:48.707317 containerd[1873]: time="2025-09-09T23:42:48.706439292Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Sep 9 23:42:48.707317 containerd[1873]: time="2025-09-09T23:42:48.706447004Z" level=info msg="Connect containerd service"
Sep 9 23:42:48.707317 containerd[1873]: time="2025-09-09T23:42:48.706465892Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 9 23:42:48.707556 containerd[1873]: time="2025-09-09T23:42:48.707537820Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 9 23:42:48.958524 kubelet[2022]: E0909 23:42:48.958386 2022 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 23:42:48.960329 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 23:42:48.960440 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 23:42:48.962076 systemd[1]: kubelet.service: Consumed 533ms CPU time, 256.5M memory peak.
Sep 9 23:42:49.582032 containerd[1873]: time="2025-09-09T23:42:49.581193156Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 9 23:42:49.582032 containerd[1873]: time="2025-09-09T23:42:49.581252372Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 9 23:42:49.582032 containerd[1873]: time="2025-09-09T23:42:49.581273556Z" level=info msg="Start subscribing containerd event"
Sep 9 23:42:49.582032 containerd[1873]: time="2025-09-09T23:42:49.581308652Z" level=info msg="Start recovering state"
Sep 9 23:42:49.582032 containerd[1873]: time="2025-09-09T23:42:49.581373404Z" level=info msg="Start event monitor"
Sep 9 23:42:49.582032 containerd[1873]: time="2025-09-09T23:42:49.581383188Z" level=info msg="Start cni network conf syncer for default"
Sep 9 23:42:49.582032 containerd[1873]: time="2025-09-09T23:42:49.581387988Z" level=info msg="Start streaming server"
Sep 9 23:42:49.582032 containerd[1873]: time="2025-09-09T23:42:49.581394172Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 9 23:42:49.582032 containerd[1873]: time="2025-09-09T23:42:49.581398716Z" level=info msg="runtime interface starting up..."
Sep 9 23:42:49.582032 containerd[1873]: time="2025-09-09T23:42:49.581402348Z" level=info msg="starting plugins..."
Sep 9 23:42:49.582032 containerd[1873]: time="2025-09-09T23:42:49.581412772Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Sep 9 23:42:49.582032 containerd[1873]: time="2025-09-09T23:42:49.581504812Z" level=info msg="containerd successfully booted in 0.925708s"
Sep 9 23:42:49.581627 systemd[1]: Started containerd.service - containerd container runtime.
Sep 9 23:42:49.586139 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 9 23:42:49.591147 systemd[1]: Startup finished in 1.528s (kernel) + 16.094s (initrd) + 18.913s (userspace) = 36.536s.
Sep 9 23:42:50.289044 login[2012]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying
Sep 9 23:42:50.289249 login[2011]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:42:50.299854 systemd-logind[1853]: New session 1 of user core.
Sep 9 23:42:50.300829 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 9 23:42:50.302338 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 9 23:42:50.337903 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 9 23:42:50.341478 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 9 23:42:50.382308 (systemd)[2049]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 9 23:42:50.384683 systemd-logind[1853]: New session c1 of user core.
Sep 9 23:42:50.666923 systemd[2049]: Queued start job for default target default.target.
Sep 9 23:42:50.675111 systemd[2049]: Created slice app.slice - User Application Slice.
Sep 9 23:42:50.675134 systemd[2049]: Reached target paths.target - Paths.
Sep 9 23:42:50.675162 systemd[2049]: Reached target timers.target - Timers.
Sep 9 23:42:50.676100 systemd[2049]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 9 23:42:50.682670 systemd[2049]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 9 23:42:50.682709 systemd[2049]: Reached target sockets.target - Sockets.
Sep 9 23:42:50.682736 systemd[2049]: Reached target basic.target - Basic System.
Sep 9 23:42:50.682755 systemd[2049]: Reached target default.target - Main User Target.
Sep 9 23:42:50.682773 systemd[2049]: Startup finished in 292ms.
Sep 9 23:42:50.682924 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 9 23:42:50.687107 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 9 23:42:50.953305 waagent[2005]: 2025-09-09T23:42:50.949620Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4
Sep 9 23:42:50.953580 waagent[2005]: 2025-09-09T23:42:50.953486Z INFO Daemon Daemon OS: flatcar 4426.0.0
Sep 9 23:42:50.957269 waagent[2005]: 2025-09-09T23:42:50.957227Z INFO Daemon Daemon Python: 3.11.13
Sep 9 23:42:50.960266 waagent[2005]: 2025-09-09T23:42:50.960229Z INFO Daemon Daemon Run daemon
Sep 9 23:42:50.962808 waagent[2005]: 2025-09-09T23:42:50.962777Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4426.0.0'
Sep 9 23:42:50.968396 waagent[2005]: 2025-09-09T23:42:50.968313Z INFO Daemon Daemon Using waagent for provisioning
Sep 9 23:42:50.971955 waagent[2005]: 2025-09-09T23:42:50.971920Z INFO Daemon Daemon Activate resource disk
Sep 9 23:42:50.974983 waagent[2005]: 2025-09-09T23:42:50.974949Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb
Sep 9 23:42:50.982471 waagent[2005]: 2025-09-09T23:42:50.982435Z INFO Daemon Daemon Found device: None
Sep 9 23:42:50.985387 waagent[2005]: 2025-09-09T23:42:50.985358Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology
Sep 9 23:42:50.990879 waagent[2005]: 2025-09-09T23:42:50.990853Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0
Sep 9 23:42:50.998265 waagent[2005]: 2025-09-09T23:42:50.998228Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Sep 9 23:42:51.001899 waagent[2005]: 2025-09-09T23:42:51.001868Z INFO Daemon Daemon Running default provisioning handler
Sep 9 23:42:51.010993 waagent[2005]: 2025-09-09T23:42:51.010950Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4.
Sep 9 23:42:51.020581 waagent[2005]: 2025-09-09T23:42:51.020544Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service'
Sep 9 23:42:51.027081 waagent[2005]: 2025-09-09T23:42:51.027047Z INFO Daemon Daemon cloud-init is enabled: False
Sep 9 23:42:51.030413 waagent[2005]: 2025-09-09T23:42:51.030386Z INFO Daemon Daemon Copying ovf-env.xml
Sep 9 23:42:51.140229 waagent[2005]: 2025-09-09T23:42:51.140132Z INFO Daemon Daemon Successfully mounted dvd
Sep 9 23:42:51.165535 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully.
Sep 9 23:42:51.170807 waagent[2005]: 2025-09-09T23:42:51.167622Z INFO Daemon Daemon Detect protocol endpoint
Sep 9 23:42:51.171101 waagent[2005]: 2025-09-09T23:42:51.171065Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Sep 9 23:42:51.175239 waagent[2005]: 2025-09-09T23:42:51.175203Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler
Sep 9 23:42:51.180206 waagent[2005]: 2025-09-09T23:42:51.180172Z INFO Daemon Daemon Test for route to 168.63.129.16
Sep 9 23:42:51.183792 waagent[2005]: 2025-09-09T23:42:51.183756Z INFO Daemon Daemon Route to 168.63.129.16 exists
Sep 9 23:42:51.187147 waagent[2005]: 2025-09-09T23:42:51.187119Z INFO Daemon Daemon Wire server endpoint:168.63.129.16
Sep 9 23:42:51.265156 waagent[2005]: 2025-09-09T23:42:51.261043Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05
Sep 9 23:42:51.266037 waagent[2005]: 2025-09-09T23:42:51.265996Z INFO Daemon Daemon Wire protocol version:2012-11-30
Sep 9 23:42:51.269944 waagent[2005]: 2025-09-09T23:42:51.269911Z INFO Daemon Daemon Server preferred version:2015-04-05
Sep 9 23:42:51.290189 login[2012]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:42:51.293879 systemd-logind[1853]: New session 2 of user core.
Sep 9 23:42:51.301121 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 9 23:42:51.455164 waagent[2005]: 2025-09-09T23:42:51.455091Z INFO Daemon Daemon Initializing goal state during protocol detection
Sep 9 23:42:51.459611 waagent[2005]: 2025-09-09T23:42:51.459569Z INFO Daemon Daemon Forcing an update of the goal state.
Sep 9 23:42:51.471614 waagent[2005]: 2025-09-09T23:42:51.466846Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1]
Sep 9 23:42:52.149111 waagent[2005]: 2025-09-09T23:42:52.149061Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.175
Sep 9 23:42:52.153171 waagent[2005]: 2025-09-09T23:42:52.153138Z INFO Daemon
Sep 9 23:42:52.155247 waagent[2005]: 2025-09-09T23:42:52.155220Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: f6b9eaf3-7e51-478a-b51d-cb869eb3e9f3 eTag: 16509345372949372830 source: Fabric]
Sep 9 23:42:52.162618 waagent[2005]: 2025-09-09T23:42:52.162587Z INFO Daemon The vmSettings originated via Fabric; will ignore them.
Sep 9 23:42:52.166777 waagent[2005]: 2025-09-09T23:42:52.166748Z INFO Daemon
Sep 9 23:42:52.168577 waagent[2005]: 2025-09-09T23:42:52.168554Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1]
Sep 9 23:42:52.177210 waagent[2005]: 2025-09-09T23:42:52.177184Z INFO Daemon Daemon Downloading artifacts profile blob
Sep 9 23:42:52.235160 waagent[2005]: 2025-09-09T23:42:52.235108Z INFO Daemon Downloaded certificate {'thumbprint': 'D531184A43339C954F89CA33F43CF9E14510B156', 'hasPrivateKey': True}
Sep 9 23:42:52.241555 waagent[2005]: 2025-09-09T23:42:52.241518Z INFO Daemon Fetch goal state completed
Sep 9 23:42:52.250593 waagent[2005]: 2025-09-09T23:42:52.250549Z INFO Daemon Daemon Starting provisioning
Sep 9 23:42:52.254399 waagent[2005]: 2025-09-09T23:42:52.254367Z INFO Daemon Daemon Handle ovf-env.xml.
Sep 9 23:42:52.257559 waagent[2005]: 2025-09-09T23:42:52.257531Z INFO Daemon Daemon Set hostname [ci-4426.0.0-n-c59ad9327c]
Sep 9 23:42:52.303190 waagent[2005]: 2025-09-09T23:42:52.303135Z INFO Daemon Daemon Publish hostname [ci-4426.0.0-n-c59ad9327c]
Sep 9 23:42:52.307443 waagent[2005]: 2025-09-09T23:42:52.307404Z INFO Daemon Daemon Examine /proc/net/route for primary interface
Sep 9 23:42:52.311654 waagent[2005]: 2025-09-09T23:42:52.311621Z INFO Daemon Daemon Primary interface is [eth0]
Sep 9 23:42:52.321552 systemd-networkd[1684]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 23:42:52.321559 systemd-networkd[1684]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 23:42:52.325429 waagent[2005]: 2025-09-09T23:42:52.321759Z INFO Daemon Daemon Create user account if not exists
Sep 9 23:42:52.321609 systemd-networkd[1684]: eth0: DHCP lease lost
Sep 9 23:42:52.325709 waagent[2005]: 2025-09-09T23:42:52.325674Z INFO Daemon Daemon User core already exists, skip useradd
Sep 9 23:42:52.329631 waagent[2005]: 2025-09-09T23:42:52.329598Z INFO Daemon Daemon Configure sudoer
Sep 9 23:42:52.347027 systemd-networkd[1684]: eth0: DHCPv4 address 10.200.20.16/24, gateway 10.200.20.1 acquired from 168.63.129.16
Sep 9 23:42:52.380916 waagent[2005]: 2025-09-09T23:42:52.378100Z INFO Daemon Daemon Configure sshd
Sep 9 23:42:52.390290 waagent[2005]: 2025-09-09T23:42:52.390247Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive.
Sep 9 23:42:52.398817 waagent[2005]: 2025-09-09T23:42:52.398780Z INFO Daemon Daemon Deploy ssh public key.
Sep 9 23:42:53.510598 waagent[2005]: 2025-09-09T23:42:53.510538Z INFO Daemon Daemon Provisioning complete
Sep 9 23:42:53.523912 waagent[2005]: 2025-09-09T23:42:53.523878Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping
Sep 9 23:42:53.527992 waagent[2005]: 2025-09-09T23:42:53.527960Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions.
Sep 9 23:42:53.535083 waagent[2005]: 2025-09-09T23:42:53.535055Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent
Sep 9 23:42:53.634045 waagent[2102]: 2025-09-09T23:42:53.632988Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4)
Sep 9 23:42:53.634045 waagent[2102]: 2025-09-09T23:42:53.633134Z INFO ExtHandler ExtHandler OS: flatcar 4426.0.0
Sep 9 23:42:53.634045 waagent[2102]: 2025-09-09T23:42:53.633172Z INFO ExtHandler ExtHandler Python: 3.11.13
Sep 9 23:42:53.634045 waagent[2102]: 2025-09-09T23:42:53.633205Z INFO ExtHandler ExtHandler CPU Arch: aarch64
Sep 9 23:42:53.812930 waagent[2102]: 2025-09-09T23:42:53.812808Z INFO ExtHandler ExtHandler Distro: flatcar-4426.0.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0;
Sep 9 23:42:53.813256 waagent[2102]: 2025-09-09T23:42:53.813225Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Sep 9 23:42:53.813385 waagent[2102]: 2025-09-09T23:42:53.813362Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16
Sep 9 23:42:53.819657 waagent[2102]: 2025-09-09T23:42:53.819612Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1]
Sep 9 23:42:53.825068 waagent[2102]: 2025-09-09T23:42:53.825041Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.175
Sep 9 23:42:53.825513 waagent[2102]: 2025-09-09T23:42:53.825480Z INFO ExtHandler
Sep 9 23:42:53.825631 waagent[2102]: 2025-09-09T23:42:53.825609Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 86e65d26-8bb6-41b4-8998-b9016a867d37 eTag: 16509345372949372830 source: Fabric]
Sep 9 23:42:53.825918 waagent[2102]: 2025-09-09T23:42:53.825891Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them.
Sep 9 23:42:53.826433 waagent[2102]: 2025-09-09T23:42:53.826399Z INFO ExtHandler
Sep 9 23:42:53.826546 waagent[2102]: 2025-09-09T23:42:53.826523Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1]
Sep 9 23:42:53.834192 waagent[2102]: 2025-09-09T23:42:53.834164Z INFO ExtHandler ExtHandler Downloading artifacts profile blob
Sep 9 23:42:53.885048 waagent[2102]: 2025-09-09T23:42:53.884141Z INFO ExtHandler Downloaded certificate {'thumbprint': 'D531184A43339C954F89CA33F43CF9E14510B156', 'hasPrivateKey': True}
Sep 9 23:42:53.885048 waagent[2102]: 2025-09-09T23:42:53.884517Z INFO ExtHandler Fetch goal state completed
Sep 9 23:42:53.897713 waagent[2102]: 2025-09-09T23:42:53.897663Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.1 11 Feb 2025 (Library: OpenSSL 3.4.1 11 Feb 2025)
Sep 9 23:42:53.900931 waagent[2102]: 2025-09-09T23:42:53.900883Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2102
Sep 9 23:42:53.901050 waagent[2102]: 2025-09-09T23:42:53.901024Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ********
Sep 9 23:42:53.901288 waagent[2102]: 2025-09-09T23:42:53.901260Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ********
Sep 9 23:42:53.902346 waagent[2102]: 2025-09-09T23:42:53.902310Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4426.0.0', '', 'Flatcar Container Linux by Kinvolk']
Sep 9 23:42:53.902657 waagent[2102]: 2025-09-09T23:42:53.902626Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4426.0.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported
Sep 9 23:42:53.902768 waagent[2102]: 2025-09-09T23:42:53.902744Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False
Sep 9 23:42:53.903206 waagent[2102]: 2025-09-09T23:42:53.903174Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules
Sep 9 23:42:53.988214 waagent[2102]: 2025-09-09T23:42:53.988172Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service
Sep 9 23:42:53.988425 waagent[2102]: 2025-09-09T23:42:53.988392Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup
Sep 9 23:42:53.993081 waagent[2102]: 2025-09-09T23:42:53.992724Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now
Sep 9 23:42:53.997251 systemd[1]: Reload requested from client PID 2117 ('systemctl') (unit waagent.service)...
Sep 9 23:42:53.997479 systemd[1]: Reloading...
Sep 9 23:42:54.069072 zram_generator::config[2153]: No configuration found.
Sep 9 23:42:54.224906 systemd[1]: Reloading finished in 227 ms.
Sep 9 23:42:54.240773 waagent[2102]: 2025-09-09T23:42:54.240668Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service
Sep 9 23:42:54.240860 waagent[2102]: 2025-09-09T23:42:54.240814Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully
Sep 9 23:42:54.756315 waagent[2102]: 2025-09-09T23:42:54.756238Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up.
Sep 9 23:42:54.756602 waagent[2102]: 2025-09-09T23:42:54.756547Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True]
Sep 9 23:42:54.757199 waagent[2102]: 2025-09-09T23:42:54.757158Z INFO ExtHandler ExtHandler Starting env monitor service.
Sep 9 23:42:54.757493 waagent[2102]: 2025-09-09T23:42:54.757418Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service.
Sep 9 23:42:54.758024 waagent[2102]: 2025-09-09T23:42:54.757645Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Sep 9 23:42:54.758024 waagent[2102]: 2025-09-09T23:42:54.757712Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16
Sep 9 23:42:54.758024 waagent[2102]: 2025-09-09T23:42:54.757863Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled.
Sep 9 23:42:54.758231 waagent[2102]: 2025-09-09T23:42:54.758197Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread
Sep 9 23:42:54.758270 waagent[2102]: 2025-09-09T23:42:54.758231Z INFO ExtHandler ExtHandler Start Extension Telemetry service.
Sep 9 23:42:54.758304 waagent[2102]: 2025-09-09T23:42:54.757988Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route:
Sep 9 23:42:54.758304 waagent[2102]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT
Sep 9 23:42:54.758304 waagent[2102]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0
Sep 9 23:42:54.758304 waagent[2102]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0
Sep 9 23:42:54.758304 waagent[2102]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0
Sep 9 23:42:54.758304 waagent[2102]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Sep 9 23:42:54.758304 waagent[2102]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Sep 9 23:42:54.758446 waagent[2102]: 2025-09-09T23:42:54.758410Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True
Sep 9 23:42:54.758479 waagent[2102]: 2025-09-09T23:42:54.758450Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status.
Sep 9 23:42:54.758929 waagent[2102]: 2025-09-09T23:42:54.758907Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread
Sep 9 23:42:54.759293 waagent[2102]: 2025-09-09T23:42:54.759263Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Sep 9 23:42:54.759593 waagent[2102]: 2025-09-09T23:42:54.759567Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16
Sep 9 23:42:54.759819 waagent[2102]: 2025-09-09T23:42:54.759777Z INFO EnvHandler ExtHandler Configure routes
Sep 9 23:42:54.760187 waagent[2102]: 2025-09-09T23:42:54.760162Z INFO EnvHandler ExtHandler Gateway:None
Sep 9 23:42:54.760448 waagent[2102]: 2025-09-09T23:42:54.760420Z INFO EnvHandler ExtHandler Routes:None
Sep 9 23:42:54.764387 waagent[2102]: 2025-09-09T23:42:54.764357Z INFO ExtHandler ExtHandler
Sep 9 23:42:54.764751 waagent[2102]: 2025-09-09T23:42:54.764722Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: e4546f2d-cf18-4e0c-a85c-d7b89dd166f3 correlation b96e368f-a201-4234-836e-26669bec0f3b created: 2025-09-09T23:41:25.101761Z]
Sep 9 23:42:54.766459 waagent[2102]: 2025-09-09T23:42:54.765406Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything.
Sep 9 23:42:54.766459 waagent[2102]: 2025-09-09T23:42:54.765802Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 1 ms]
Sep 9 23:42:54.797739 waagent[2102]: 2025-09-09T23:42:54.797698Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command
Sep 9 23:42:54.797739 waagent[2102]: Try `iptables -h' or 'iptables --help' for more information.)
Sep 9 23:42:54.798188 waagent[2102]: 2025-09-09T23:42:54.798159Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 2E4B922F-42B5-4257-BE61-8F3331696827;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;]
Sep 9 23:42:54.846858 waagent[2102]: 2025-09-09T23:42:54.846811Z INFO MonitorHandler ExtHandler Network interfaces:
Sep 9 23:42:54.846858 waagent[2102]: Executing ['ip', '-a', '-o', 'link']:
Sep 9 23:42:54.846858 waagent[2102]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
Sep 9 23:42:54.846858 waagent[2102]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:fc:72:0b brd ff:ff:ff:ff:ff:ff
Sep 9 23:42:54.846858 waagent[2102]: 3: enP57222s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:fc:72:0b brd ff:ff:ff:ff:ff:ff\ altname enP57222p0s2
Sep 9 23:42:54.846858 waagent[2102]: Executing ['ip', '-4', '-a', '-o', 'address']:
Sep 9 23:42:54.846858 waagent[2102]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever
Sep 9 23:42:54.846858 waagent[2102]: 2: eth0 inet 10.200.20.16/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever
Sep 9 23:42:54.846858 waagent[2102]: Executing ['ip', '-6', '-a', '-o', 'address']:
Sep 9 23:42:54.846858 waagent[2102]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever
Sep 9 23:42:54.846858 waagent[2102]: 2: eth0 inet6 fe80::20d:3aff:fefc:720b/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
Sep 9 23:42:55.940029 waagent[2102]: 2025-09-09T23:42:55.939837Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric:
Sep 9 23:42:55.940029 waagent[2102]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Sep 9 23:42:55.940029 waagent[2102]: pkts bytes target prot opt in out source destination
Sep 9 23:42:55.940029 waagent[2102]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Sep 9 23:42:55.940029 waagent[2102]: pkts bytes target prot opt in out source destination
Sep 9 23:42:55.940029 waagent[2102]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
Sep 9 23:42:55.940029 waagent[2102]: pkts bytes target prot opt in out source destination
Sep 9 23:42:55.940029 waagent[2102]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Sep 9 23:42:55.940029 waagent[2102]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Sep 9 23:42:55.940029 waagent[2102]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Sep 9 23:42:55.942196 waagent[2102]: 2025-09-09T23:42:55.942151Z INFO EnvHandler ExtHandler Current Firewall rules:
Sep 9 23:42:55.942196 waagent[2102]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Sep 9 23:42:55.942196 waagent[2102]: pkts bytes target prot opt in out source destination
Sep 9 23:42:55.942196 waagent[2102]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Sep 9 23:42:55.942196 waagent[2102]: pkts bytes target prot opt in out source destination
Sep 9 23:42:55.942196 waagent[2102]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
Sep 9 23:42:55.942196 waagent[2102]: pkts bytes target prot opt in out source destination
Sep 9 23:42:55.942196 waagent[2102]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Sep 9 23:42:55.942196 waagent[2102]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Sep 9 23:42:55.942196 waagent[2102]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Sep 9 23:42:55.942380 waagent[2102]: 2025-09-09T23:42:55.942354Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300
Sep 9 23:42:58.972287 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 9 23:42:58.973755 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:42:59.077696 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:42:59.080099 (kubelet)[2252]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 23:42:59.214077 kubelet[2252]: E0909 23:42:59.213998 2252 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 23:42:59.216407 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 23:42:59.216519 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 23:42:59.217208 systemd[1]: kubelet.service: Consumed 102ms CPU time, 106.4M memory peak.
Sep 9 23:43:09.223780 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 9 23:43:09.225565 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:43:09.312080 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:43:09.317307 (kubelet)[2267]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 23:43:09.445819 kubelet[2267]: E0909 23:43:09.445770 2267 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 23:43:09.447688 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 23:43:09.447803 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 23:43:09.448348 systemd[1]: kubelet.service: Consumed 95ms CPU time, 105.3M memory peak.
Sep 9 23:43:11.583172 chronyd[1832]: Selected source PHC0
Sep 9 23:43:11.844043 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 9 23:43:11.845241 systemd[1]: Started sshd@0-10.200.20.16:22-10.200.16.10:56642.service - OpenSSH per-connection server daemon (10.200.16.10:56642).
Sep 9 23:43:12.556236 sshd[2275]: Accepted publickey for core from 10.200.16.10 port 56642 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:43:12.557253 sshd-session[2275]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:43:12.560735 systemd-logind[1853]: New session 3 of user core.
Sep 9 23:43:12.571299 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 9 23:43:12.978223 systemd[1]: Started sshd@1-10.200.20.16:22-10.200.16.10:56654.service - OpenSSH per-connection server daemon (10.200.16.10:56654).
Sep 9 23:43:13.389914 sshd[2281]: Accepted publickey for core from 10.200.16.10 port 56654 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:43:13.392354 sshd-session[2281]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:43:13.396073 systemd-logind[1853]: New session 4 of user core.
Sep 9 23:43:13.403113 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 9 23:43:13.707234 sshd[2284]: Connection closed by 10.200.16.10 port 56654
Sep 9 23:43:13.707858 sshd-session[2281]: pam_unix(sshd:session): session closed for user core
Sep 9 23:43:13.710703 systemd[1]: sshd@1-10.200.20.16:22-10.200.16.10:56654.service: Deactivated successfully.
Sep 9 23:43:13.712203 systemd[1]: session-4.scope: Deactivated successfully.
Sep 9 23:43:13.712946 systemd-logind[1853]: Session 4 logged out. Waiting for processes to exit.
Sep 9 23:43:13.714481 systemd-logind[1853]: Removed session 4.
Sep 9 23:43:13.800782 systemd[1]: Started sshd@2-10.200.20.16:22-10.200.16.10:56662.service - OpenSSH per-connection server daemon (10.200.16.10:56662).
Sep 9 23:43:14.292720 sshd[2290]: Accepted publickey for core from 10.200.16.10 port 56662 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:43:14.293762 sshd-session[2290]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:43:14.297186 systemd-logind[1853]: New session 5 of user core.
Sep 9 23:43:14.305213 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 9 23:43:14.639996 sshd[2293]: Connection closed by 10.200.16.10 port 56662
Sep 9 23:43:14.639834 sshd-session[2290]: pam_unix(sshd:session): session closed for user core
Sep 9 23:43:14.642498 systemd[1]: sshd@2-10.200.20.16:22-10.200.16.10:56662.service: Deactivated successfully.
Sep 9 23:43:14.643802 systemd[1]: session-5.scope: Deactivated successfully.
Sep 9 23:43:14.646052 systemd-logind[1853]: Session 5 logged out. Waiting for processes to exit.
Sep 9 23:43:14.647516 systemd-logind[1853]: Removed session 5.
Sep 9 23:43:14.730244 systemd[1]: Started sshd@3-10.200.20.16:22-10.200.16.10:56676.service - OpenSSH per-connection server daemon (10.200.16.10:56676).
Sep 9 23:43:15.213995 sshd[2299]: Accepted publickey for core from 10.200.16.10 port 56676 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:43:15.215057 sshd-session[2299]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:43:15.218406 systemd-logind[1853]: New session 6 of user core.
Sep 9 23:43:15.226288 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 9 23:43:15.576951 sshd[2302]: Connection closed by 10.200.16.10 port 56676
Sep 9 23:43:15.576790 sshd-session[2299]: pam_unix(sshd:session): session closed for user core
Sep 9 23:43:15.580261 systemd[1]: sshd@3-10.200.20.16:22-10.200.16.10:56676.service: Deactivated successfully.
Sep 9 23:43:15.581836 systemd[1]: session-6.scope: Deactivated successfully.
Sep 9 23:43:15.582463 systemd-logind[1853]: Session 6 logged out. Waiting for processes to exit.
Sep 9 23:43:15.583333 systemd-logind[1853]: Removed session 6.
Sep 9 23:43:15.667400 systemd[1]: Started sshd@4-10.200.20.16:22-10.200.16.10:56688.service - OpenSSH per-connection server daemon (10.200.16.10:56688).
Sep 9 23:43:16.164545 sshd[2308]: Accepted publickey for core from 10.200.16.10 port 56688 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:43:16.165576 sshd-session[2308]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:43:16.168868 systemd-logind[1853]: New session 7 of user core.
Sep 9 23:43:16.177114 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 9 23:43:16.657538 sudo[2312]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 9 23:43:16.657753 sudo[2312]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 23:43:16.693282 sudo[2312]: pam_unix(sudo:session): session closed for user root Sep 9 23:43:16.783120 sshd[2311]: Connection closed by 10.200.16.10 port 56688 Sep 9 23:43:16.783785 sshd-session[2308]: pam_unix(sshd:session): session closed for user core Sep 9 23:43:16.787467 systemd[1]: sshd@4-10.200.20.16:22-10.200.16.10:56688.service: Deactivated successfully. Sep 9 23:43:16.787799 systemd-logind[1853]: Session 7 logged out. Waiting for processes to exit. Sep 9 23:43:16.789526 systemd[1]: session-7.scope: Deactivated successfully. Sep 9 23:43:16.790759 systemd-logind[1853]: Removed session 7. Sep 9 23:43:16.870438 systemd[1]: Started sshd@5-10.200.20.16:22-10.200.16.10:56690.service - OpenSSH per-connection server daemon (10.200.16.10:56690). Sep 9 23:43:17.363017 sshd[2318]: Accepted publickey for core from 10.200.16.10 port 56690 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs Sep 9 23:43:17.364119 sshd-session[2318]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:43:17.367852 systemd-logind[1853]: New session 8 of user core. Sep 9 23:43:17.374098 systemd[1]: Started session-8.scope - Session 8 of User core. 
Sep 9 23:43:17.637626 sudo[2323]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 9 23:43:17.638352 sudo[2323]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 23:43:17.645386 sudo[2323]: pam_unix(sudo:session): session closed for user root
Sep 9 23:43:17.648775 sudo[2322]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 9 23:43:17.649223 sudo[2322]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 23:43:17.655335 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 23:43:17.684910 augenrules[2345]: No rules
Sep 9 23:43:17.685890 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 23:43:17.687246 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 23:43:17.688163 sudo[2322]: pam_unix(sudo:session): session closed for user root
Sep 9 23:43:17.765272 sshd[2321]: Connection closed by 10.200.16.10 port 56690
Sep 9 23:43:17.765579 sshd-session[2318]: pam_unix(sshd:session): session closed for user core
Sep 9 23:43:17.768431 systemd-logind[1853]: Session 8 logged out. Waiting for processes to exit.
Sep 9 23:43:17.769733 systemd[1]: sshd@5-10.200.20.16:22-10.200.16.10:56690.service: Deactivated successfully.
Sep 9 23:43:17.772036 systemd[1]: session-8.scope: Deactivated successfully.
Sep 9 23:43:17.773420 systemd-logind[1853]: Removed session 8.
Sep 9 23:43:17.846476 systemd[1]: Started sshd@6-10.200.20.16:22-10.200.16.10:56698.service - OpenSSH per-connection server daemon (10.200.16.10:56698).
Sep 9 23:43:18.295835 sshd[2354]: Accepted publickey for core from 10.200.16.10 port 56698 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:43:18.296850 sshd-session[2354]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:43:18.300268 systemd-logind[1853]: New session 9 of user core.
Sep 9 23:43:18.307275 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 9 23:43:18.550687 sudo[2358]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 9 23:43:18.551073 sudo[2358]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 23:43:19.472292 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Sep 9 23:43:19.474906 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:43:19.868752 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:43:19.874197 (kubelet)[2382]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 23:43:19.901506 kubelet[2382]: E0909 23:43:19.901426 2382 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 23:43:19.903363 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 23:43:19.903560 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 23:43:19.904135 systemd[1]: kubelet.service: Consumed 99ms CPU time, 107.2M memory peak.
Sep 9 23:43:20.642115 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 9 23:43:20.653240 (dockerd)[2391]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 9 23:43:21.750305 dockerd[2391]: time="2025-09-09T23:43:21.750256229Z" level=info msg="Starting up"
Sep 9 23:43:21.752521 dockerd[2391]: time="2025-09-09T23:43:21.752493754Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 9 23:43:21.761633 dockerd[2391]: time="2025-09-09T23:43:21.761545805Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 9 23:43:21.868765 dockerd[2391]: time="2025-09-09T23:43:21.868722390Z" level=info msg="Loading containers: start."
Sep 9 23:43:21.954024 kernel: Initializing XFRM netlink socket
Sep 9 23:43:22.553717 systemd-networkd[1684]: docker0: Link UP
Sep 9 23:43:22.572572 dockerd[2391]: time="2025-09-09T23:43:22.572481405Z" level=info msg="Loading containers: done."
Sep 9 23:43:22.581988 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1504278014-merged.mount: Deactivated successfully.
Sep 9 23:43:22.607182 dockerd[2391]: time="2025-09-09T23:43:22.607144784Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 9 23:43:22.607327 dockerd[2391]: time="2025-09-09T23:43:22.607220410Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 9 23:43:22.607327 dockerd[2391]: time="2025-09-09T23:43:22.607301884Z" level=info msg="Initializing buildkit"
Sep 9 23:43:22.658303 dockerd[2391]: time="2025-09-09T23:43:22.658265886Z" level=info msg="Completed buildkit initialization"
Sep 9 23:43:22.663381 dockerd[2391]: time="2025-09-09T23:43:22.663278326Z" level=info msg="Daemon has completed initialization"
Sep 9 23:43:22.663539 dockerd[2391]: time="2025-09-09T23:43:22.663509453Z" level=info msg="API listen on /run/docker.sock"
Sep 9 23:43:22.663534 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 9 23:43:23.299080 containerd[1873]: time="2025-09-09T23:43:23.299045764Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\""
Sep 9 23:43:24.186857 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount657661095.mount: Deactivated successfully.
Sep 9 23:43:25.369529 containerd[1873]: time="2025-09-09T23:43:25.369477362Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:25.375505 containerd[1873]: time="2025-09-09T23:43:25.375473376Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.8: active requests=0, bytes read=26328357"
Sep 9 23:43:25.379564 containerd[1873]: time="2025-09-09T23:43:25.379535251Z" level=info msg="ImageCreate event name:\"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:25.385574 containerd[1873]: time="2025-09-09T23:43:25.385547905Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:25.386576 containerd[1873]: time="2025-09-09T23:43:25.386547712Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.8\" with image id \"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\", size \"26325157\" in 2.087470578s"
Sep 9 23:43:25.386595 containerd[1873]: time="2025-09-09T23:43:25.386585649Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\" returns image reference \"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\""
Sep 9 23:43:25.387127 containerd[1873]: time="2025-09-09T23:43:25.387107232Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\""
Sep 9 23:43:26.777035 containerd[1873]: time="2025-09-09T23:43:26.776959610Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:26.780509 containerd[1873]: time="2025-09-09T23:43:26.780324328Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.8: active requests=0, bytes read=22528552"
Sep 9 23:43:26.784503 containerd[1873]: time="2025-09-09T23:43:26.784476262Z" level=info msg="ImageCreate event name:\"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:26.789480 containerd[1873]: time="2025-09-09T23:43:26.789448797Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:26.790244 containerd[1873]: time="2025-09-09T23:43:26.790103433Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.8\" with image id \"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\", size \"24065666\" in 1.402975039s"
Sep 9 23:43:26.790244 containerd[1873]: time="2025-09-09T23:43:26.790135586Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\" returns image reference \"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\""
Sep 9 23:43:26.790710 containerd[1873]: time="2025-09-09T23:43:26.790691314Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\""
Sep 9 23:43:28.038072 containerd[1873]: time="2025-09-09T23:43:28.037975842Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:28.041758 containerd[1873]: time="2025-09-09T23:43:28.041733103Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.8: active requests=0, bytes read=17483527"
Sep 9 23:43:28.046765 containerd[1873]: time="2025-09-09T23:43:28.046727264Z" level=info msg="ImageCreate event name:\"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:28.052499 containerd[1873]: time="2025-09-09T23:43:28.052446639Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:28.053079 containerd[1873]: time="2025-09-09T23:43:28.052950844Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.8\" with image id \"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\", size \"19020659\" in 1.262177727s"
Sep 9 23:43:28.053079 containerd[1873]: time="2025-09-09T23:43:28.052976469Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\" returns image reference \"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\""
Sep 9 23:43:28.053499 containerd[1873]: time="2025-09-09T23:43:28.053468626Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\""
Sep 9 23:43:29.125648 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2653146103.mount: Deactivated successfully.
Sep 9 23:43:29.422726 containerd[1873]: time="2025-09-09T23:43:29.422598978Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:29.426946 containerd[1873]: time="2025-09-09T23:43:29.426820515Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.8: active requests=0, bytes read=27376724"
Sep 9 23:43:29.432061 containerd[1873]: time="2025-09-09T23:43:29.432033325Z" level=info msg="ImageCreate event name:\"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:29.437519 containerd[1873]: time="2025-09-09T23:43:29.437467328Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:29.437835 containerd[1873]: time="2025-09-09T23:43:29.437717610Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.8\" with image id \"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\", repo tag \"registry.k8s.io/kube-proxy:v1.32.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\", size \"27375743\" in 1.384221919s"
Sep 9 23:43:29.437835 containerd[1873]: time="2025-09-09T23:43:29.437744620Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\" returns image reference \"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\""
Sep 9 23:43:29.438302 containerd[1873]: time="2025-09-09T23:43:29.438269433Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 9 23:43:29.683165 kernel: hv_balloon: Max. dynamic memory size: 4096 MB
Sep 9 23:43:29.972232 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Sep 9 23:43:29.975167 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:43:30.249616 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:43:30.252052 (kubelet)[2675]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 23:43:30.278628 kubelet[2675]: E0909 23:43:30.278579 2675 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 23:43:30.280385 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 23:43:30.280579 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 23:43:30.281052 systemd[1]: kubelet.service: Consumed 97ms CPU time, 105.7M memory peak.
Sep 9 23:43:30.590589 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2969647606.mount: Deactivated successfully.
Sep 9 23:43:32.465372 containerd[1873]: time="2025-09-09T23:43:32.465317942Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:32.470199 containerd[1873]: time="2025-09-09T23:43:32.470168433Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951622"
Sep 9 23:43:32.477570 containerd[1873]: time="2025-09-09T23:43:32.477520516Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:32.483692 containerd[1873]: time="2025-09-09T23:43:32.483078756Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:32.483692 containerd[1873]: time="2025-09-09T23:43:32.483587962Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 3.045290279s"
Sep 9 23:43:32.483692 containerd[1873]: time="2025-09-09T23:43:32.483612355Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
Sep 9 23:43:32.484263 containerd[1873]: time="2025-09-09T23:43:32.484203955Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 9 23:43:33.057042 update_engine[1855]: I20250909 23:43:33.056749 1855 update_attempter.cc:509] Updating boot flags...
Sep 9 23:43:33.116897 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1244200508.mount: Deactivated successfully.
Sep 9 23:43:33.152029 containerd[1873]: time="2025-09-09T23:43:33.151219430Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 23:43:33.159495 containerd[1873]: time="2025-09-09T23:43:33.159474287Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703"
Sep 9 23:43:33.169024 containerd[1873]: time="2025-09-09T23:43:33.165317515Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 23:43:33.174323 containerd[1873]: time="2025-09-09T23:43:33.174247256Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 23:43:33.175296 containerd[1873]: time="2025-09-09T23:43:33.175275027Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 690.912545ms"
Sep 9 23:43:33.175381 containerd[1873]: time="2025-09-09T23:43:33.175368895Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 9 23:43:33.177272 containerd[1873]: time="2025-09-09T23:43:33.177214125Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Sep 9 23:43:33.866326 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3653559862.mount: Deactivated successfully.
Sep 9 23:43:36.120970 containerd[1873]: time="2025-09-09T23:43:36.120921154Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:36.125265 containerd[1873]: time="2025-09-09T23:43:36.125238742Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67943165"
Sep 9 23:43:36.129010 containerd[1873]: time="2025-09-09T23:43:36.128963318Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:36.133670 containerd[1873]: time="2025-09-09T23:43:36.133624732Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:43:36.134484 containerd[1873]: time="2025-09-09T23:43:36.134220552Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 2.95679609s"
Sep 9 23:43:36.134484 containerd[1873]: time="2025-09-09T23:43:36.134245016Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\""
Sep 9 23:43:38.458558 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:43:38.459074 systemd[1]: kubelet.service: Consumed 97ms CPU time, 105.7M memory peak.
Sep 9 23:43:38.460611 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:43:38.477828 systemd[1]: Reload requested from client PID 2933 ('systemctl') (unit session-9.scope)...
Sep 9 23:43:38.477843 systemd[1]: Reloading...
Sep 9 23:43:38.570122 zram_generator::config[2983]: No configuration found.
Sep 9 23:43:38.716243 systemd[1]: Reloading finished in 238 ms.
Sep 9 23:43:38.739418 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 9 23:43:38.739477 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 9 23:43:38.739650 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:43:38.739684 systemd[1]: kubelet.service: Consumed 62ms CPU time, 95M memory peak.
Sep 9 23:43:38.740805 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:43:39.011076 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:43:39.020241 (kubelet)[3045]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 23:43:39.091229 kubelet[3045]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 23:43:39.091229 kubelet[3045]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 9 23:43:39.091229 kubelet[3045]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 23:43:39.091563 kubelet[3045]: I0909 23:43:39.091267 3045 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 23:43:39.328335 kubelet[3045]: I0909 23:43:39.327997 3045 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 9 23:43:39.328335 kubelet[3045]: I0909 23:43:39.328034 3045 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 23:43:39.328335 kubelet[3045]: I0909 23:43:39.328227 3045 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 9 23:43:39.345928 kubelet[3045]: E0909 23:43:39.345904 3045 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.20.16:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.16:6443: connect: connection refused" logger="UnhandledError"
Sep 9 23:43:39.347773 kubelet[3045]: I0909 23:43:39.347632 3045 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 23:43:39.351449 kubelet[3045]: I0909 23:43:39.351435 3045 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 23:43:39.353888 kubelet[3045]: I0909 23:43:39.353870 3045 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 23:43:39.354712 kubelet[3045]: I0909 23:43:39.354685 3045 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 23:43:39.354926 kubelet[3045]: I0909 23:43:39.354780 3045 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426.0.0-n-c59ad9327c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 23:43:39.355129 kubelet[3045]: I0909 23:43:39.355048 3045 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 23:43:39.355129 kubelet[3045]: I0909 23:43:39.355066 3045 container_manager_linux.go:304] "Creating device plugin manager"
Sep 9 23:43:39.355266 kubelet[3045]: I0909 23:43:39.355256 3045 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 23:43:39.357393 kubelet[3045]: I0909 23:43:39.357379 3045 kubelet.go:446] "Attempting to sync node with API server"
Sep 9 23:43:39.357474 kubelet[3045]: I0909 23:43:39.357465 3045 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 23:43:39.357613 kubelet[3045]: I0909 23:43:39.357547 3045 kubelet.go:352] "Adding apiserver pod source"
Sep 9 23:43:39.357613 kubelet[3045]: I0909 23:43:39.357564 3045 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 23:43:39.358983 kubelet[3045]: W0909 23:43:39.358951 3045 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.20.16:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.0.0-n-c59ad9327c&limit=500&resourceVersion=0": dial tcp 10.200.20.16:6443: connect: connection refused
Sep 9 23:43:39.359055 kubelet[3045]: E0909 23:43:39.358993 3045 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.20.16:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.0.0-n-c59ad9327c&limit=500&resourceVersion=0\": dial tcp 10.200.20.16:6443: connect: connection refused" logger="UnhandledError"
Sep 9 23:43:39.359496 kubelet[3045]: W0909 23:43:39.359403 3045 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.20.16:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.20.16:6443: connect: connection refused
Sep 9 23:43:39.359496 kubelet[3045]: E0909 23:43:39.359433 3045 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.20.16:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.16:6443: connect: connection refused" logger="UnhandledError"
Sep 9 23:43:39.360835 kubelet[3045]: I0909 23:43:39.360193 3045 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 23:43:39.360835 kubelet[3045]: I0909 23:43:39.360472 3045 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 23:43:39.360835 kubelet[3045]: W0909 23:43:39.360511 3045 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 9 23:43:39.361791 kubelet[3045]: I0909 23:43:39.361774 3045 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 9 23:43:39.361881 kubelet[3045]: I0909 23:43:39.361873 3045 server.go:1287] "Started kubelet"
Sep 9 23:43:39.365018 kubelet[3045]: I0909 23:43:39.364982 3045 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 23:43:39.365253 kubelet[3045]: I0909 23:43:39.365226 3045 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 23:43:39.366147 kubelet[3045]: I0909 23:43:39.366133 3045 server.go:479] "Adding debug handlers to kubelet server"
Sep 9 23:43:39.367390 kubelet[3045]: I0909 23:43:39.367348 3045 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 23:43:39.367635 kubelet[3045]: I0909 23:43:39.367620 3045 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 23:43:39.368404 kubelet[3045]: I0909 23:43:39.368375 3045 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 23:43:39.368888 kubelet[3045]: I0909 23:43:39.368863 3045 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 9 23:43:39.369073 kubelet[3045]: E0909 23:43:39.369054 3045 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-n-c59ad9327c\" not found"
Sep 9 23:43:39.369745 kubelet[3045]: I0909 23:43:39.369719 3045 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 9 23:43:39.369786 kubelet[3045]: I0909 23:43:39.369769 3045 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 23:43:39.372049 kubelet[3045]: W0909 23:43:39.371997 3045 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.20.16:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.16:6443: connect: connection refused
Sep 9 23:43:39.372107 kubelet[3045]: E0909 23:43:39.372052 3045 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.20.16:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.16:6443: connect: connection refused" logger="UnhandledError"
Sep 9 23:43:39.372107 kubelet[3045]: E0909 23:43:39.372095 3045 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.16:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.0.0-n-c59ad9327c?timeout=10s\": dial tcp 10.200.20.16:6443: connect: connection refused" interval="200ms"
Sep 9 23:43:39.372209 kubelet[3045]: I0909 23:43:39.372191 3045 factory.go:221] Registration of the systemd container factory successfully
Sep 9 23:43:39.372260 kubelet[3045]: I0909 23:43:39.372247 3045 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 23:43:39.374854 kubelet[3045]: E0909 23:43:39.374761 3045 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.16:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.16:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4426.0.0-n-c59ad9327c.1863c1d9778454bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4426.0.0-n-c59ad9327c,UID:ci-4426.0.0-n-c59ad9327c,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4426.0.0-n-c59ad9327c,},FirstTimestamp:2025-09-09 23:43:39.361850556 +0000 UTC m=+0.339505589,LastTimestamp:2025-09-09 23:43:39.361850556 +0000 UTC m=+0.339505589,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4426.0.0-n-c59ad9327c,}"
Sep 9 23:43:39.376058 kubelet[3045]: I0909 23:43:39.375227 3045 factory.go:221] Registration of the containerd container factory successfully
Sep 9 23:43:39.391965 kubelet[3045]: E0909 23:43:39.391908 3045 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 23:43:39.394692 kubelet[3045]: I0909 23:43:39.394675 3045 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 9 23:43:39.394692 kubelet[3045]: I0909 23:43:39.394689 3045 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 9 23:43:39.394772 kubelet[3045]: I0909 23:43:39.394705 3045 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 23:43:39.469904 kubelet[3045]: E0909 23:43:39.469874 3045 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-n-c59ad9327c\" not found"
Sep 9 23:43:39.548172 kubelet[3045]: I0909 23:43:39.548141 3045 policy_none.go:49] "None policy: Start"
Sep 9 23:43:39.548172 kubelet[3045]: I0909 23:43:39.548169 3045 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 9 23:43:39.548172 kubelet[3045]: I0909 23:43:39.548182 3045 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 23:43:39.557847 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 9 23:43:39.565368 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 9 23:43:39.568188 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 9 23:43:39.570767 kubelet[3045]: E0909 23:43:39.570753 3045 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-n-c59ad9327c\" not found" Sep 9 23:43:39.573313 kubelet[3045]: E0909 23:43:39.573282 3045 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.16:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.0.0-n-c59ad9327c?timeout=10s\": dial tcp 10.200.20.16:6443: connect: connection refused" interval="400ms" Sep 9 23:43:39.580583 kubelet[3045]: I0909 23:43:39.579653 3045 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 23:43:39.580669 kubelet[3045]: I0909 23:43:39.580658 3045 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 23:43:39.580740 kubelet[3045]: I0909 23:43:39.580712 3045 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 23:43:39.581897 kubelet[3045]: I0909 23:43:39.581072 3045 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 23:43:39.582725 kubelet[3045]: E0909 23:43:39.582697 3045 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 9 23:43:39.582789 kubelet[3045]: E0909 23:43:39.582739 3045 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4426.0.0-n-c59ad9327c\" not found" Sep 9 23:43:39.600807 kubelet[3045]: I0909 23:43:39.600768 3045 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 23:43:39.602591 kubelet[3045]: I0909 23:43:39.602569 3045 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 9 23:43:39.602591 kubelet[3045]: I0909 23:43:39.602590 3045 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 9 23:43:39.602743 kubelet[3045]: I0909 23:43:39.602607 3045 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 9 23:43:39.602743 kubelet[3045]: I0909 23:43:39.602612 3045 kubelet.go:2382] "Starting kubelet main sync loop" Sep 9 23:43:39.602743 kubelet[3045]: E0909 23:43:39.602641 3045 kubelet.go:2406] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Sep 9 23:43:39.603167 kubelet[3045]: W0909 23:43:39.602920 3045 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.20.16:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.16:6443: connect: connection refused Sep 9 23:43:39.603167 kubelet[3045]: E0909 23:43:39.602943 3045 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.20.16:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.16:6443: connect: connection refused" logger="UnhandledError" Sep 9 23:43:39.682999 kubelet[3045]: I0909 23:43:39.682970 3045 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.683353 kubelet[3045]: E0909 23:43:39.683325 3045 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.16:6443/api/v1/nodes\": dial tcp 10.200.20.16:6443: connect: connection refused" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.711932 systemd[1]: Created slice kubepods-burstable-pod90cad30b2305d4c47838c6290eeda1da.slice - libcontainer container kubepods-burstable-pod90cad30b2305d4c47838c6290eeda1da.slice. 
Sep 9 23:43:39.727987 kubelet[3045]: E0909 23:43:39.727895 3045 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-n-c59ad9327c\" not found" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.730410 systemd[1]: Created slice kubepods-burstable-pod78c8cbf9bccdf912596e33daf7d64a1e.slice - libcontainer container kubepods-burstable-pod78c8cbf9bccdf912596e33daf7d64a1e.slice. Sep 9 23:43:39.737830 kubelet[3045]: E0909 23:43:39.737812 3045 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-n-c59ad9327c\" not found" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.740453 systemd[1]: Created slice kubepods-burstable-pod4298eb6720ba95bdf9d84c517f7cf988.slice - libcontainer container kubepods-burstable-pod4298eb6720ba95bdf9d84c517f7cf988.slice. Sep 9 23:43:39.741794 kubelet[3045]: E0909 23:43:39.741673 3045 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-n-c59ad9327c\" not found" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.771945 kubelet[3045]: I0909 23:43:39.771924 3045 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/90cad30b2305d4c47838c6290eeda1da-k8s-certs\") pod \"kube-apiserver-ci-4426.0.0-n-c59ad9327c\" (UID: \"90cad30b2305d4c47838c6290eeda1da\") " pod="kube-system/kube-apiserver-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.772079 kubelet[3045]: I0909 23:43:39.772065 3045 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/78c8cbf9bccdf912596e33daf7d64a1e-ca-certs\") pod \"kube-controller-manager-ci-4426.0.0-n-c59ad9327c\" (UID: \"78c8cbf9bccdf912596e33daf7d64a1e\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.772164 kubelet[3045]: 
I0909 23:43:39.772153 3045 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/78c8cbf9bccdf912596e33daf7d64a1e-flexvolume-dir\") pod \"kube-controller-manager-ci-4426.0.0-n-c59ad9327c\" (UID: \"78c8cbf9bccdf912596e33daf7d64a1e\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.772238 kubelet[3045]: I0909 23:43:39.772223 3045 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/78c8cbf9bccdf912596e33daf7d64a1e-k8s-certs\") pod \"kube-controller-manager-ci-4426.0.0-n-c59ad9327c\" (UID: \"78c8cbf9bccdf912596e33daf7d64a1e\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.772306 kubelet[3045]: I0909 23:43:39.772295 3045 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/90cad30b2305d4c47838c6290eeda1da-ca-certs\") pod \"kube-apiserver-ci-4426.0.0-n-c59ad9327c\" (UID: \"90cad30b2305d4c47838c6290eeda1da\") " pod="kube-system/kube-apiserver-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.772360 kubelet[3045]: I0909 23:43:39.772350 3045 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/90cad30b2305d4c47838c6290eeda1da-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426.0.0-n-c59ad9327c\" (UID: \"90cad30b2305d4c47838c6290eeda1da\") " pod="kube-system/kube-apiserver-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.772421 kubelet[3045]: I0909 23:43:39.772411 3045 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/78c8cbf9bccdf912596e33daf7d64a1e-kubeconfig\") pod \"kube-controller-manager-ci-4426.0.0-n-c59ad9327c\" 
(UID: \"78c8cbf9bccdf912596e33daf7d64a1e\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.772490 kubelet[3045]: I0909 23:43:39.772476 3045 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/78c8cbf9bccdf912596e33daf7d64a1e-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426.0.0-n-c59ad9327c\" (UID: \"78c8cbf9bccdf912596e33daf7d64a1e\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.772569 kubelet[3045]: I0909 23:43:39.772557 3045 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4298eb6720ba95bdf9d84c517f7cf988-kubeconfig\") pod \"kube-scheduler-ci-4426.0.0-n-c59ad9327c\" (UID: \"4298eb6720ba95bdf9d84c517f7cf988\") " pod="kube-system/kube-scheduler-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.885307 kubelet[3045]: I0909 23:43:39.885209 3045 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.885621 kubelet[3045]: E0909 23:43:39.885507 3045 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.16:6443/api/v1/nodes\": dial tcp 10.200.20.16:6443: connect: connection refused" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:39.974174 kubelet[3045]: E0909 23:43:39.974128 3045 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.16:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.0.0-n-c59ad9327c?timeout=10s\": dial tcp 10.200.20.16:6443: connect: connection refused" interval="800ms" Sep 9 23:43:40.030195 containerd[1873]: time="2025-09-09T23:43:40.029924312Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4426.0.0-n-c59ad9327c,Uid:90cad30b2305d4c47838c6290eeda1da,Namespace:kube-system,Attempt:0,}" Sep 9 23:43:40.038609 containerd[1873]: time="2025-09-09T23:43:40.038579440Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426.0.0-n-c59ad9327c,Uid:78c8cbf9bccdf912596e33daf7d64a1e,Namespace:kube-system,Attempt:0,}" Sep 9 23:43:40.044536 containerd[1873]: time="2025-09-09T23:43:40.044513888Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426.0.0-n-c59ad9327c,Uid:4298eb6720ba95bdf9d84c517f7cf988,Namespace:kube-system,Attempt:0,}" Sep 9 23:43:40.150906 containerd[1873]: time="2025-09-09T23:43:40.150119987Z" level=info msg="connecting to shim 1e24bf0f99b3c565a586fc68adbc36600b4f876fc0f7d7b70ef5dc0349cc1430" address="unix:///run/containerd/s/31d44b552903968b3d358bdebd87a878c9ea5d3f886027a5162df946bfa62f4e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:43:40.172613 containerd[1873]: time="2025-09-09T23:43:40.172582537Z" level=info msg="connecting to shim 8f368522ab2ee41aff09a3aaa8f6495921e643833c68608c6c493a0bda595d3f" address="unix:///run/containerd/s/a27e17e66e6329100cec79acf6c511dd5fec88087892a7edf239d3b56a5e6de7" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:43:40.173141 systemd[1]: Started cri-containerd-1e24bf0f99b3c565a586fc68adbc36600b4f876fc0f7d7b70ef5dc0349cc1430.scope - libcontainer container 1e24bf0f99b3c565a586fc68adbc36600b4f876fc0f7d7b70ef5dc0349cc1430. 
Sep 9 23:43:40.186740 containerd[1873]: time="2025-09-09T23:43:40.186714009Z" level=info msg="connecting to shim 56b4956a54118dff1a358ca12515b710d526bbd945a75405a67b4cbf713443ab" address="unix:///run/containerd/s/86b4187e95047f96c9cca1a80c89c5d85e404e72672ca79a4b20e5158e6cf92d" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:43:40.199477 systemd[1]: Started cri-containerd-8f368522ab2ee41aff09a3aaa8f6495921e643833c68608c6c493a0bda595d3f.scope - libcontainer container 8f368522ab2ee41aff09a3aaa8f6495921e643833c68608c6c493a0bda595d3f. Sep 9 23:43:40.211146 systemd[1]: Started cri-containerd-56b4956a54118dff1a358ca12515b710d526bbd945a75405a67b4cbf713443ab.scope - libcontainer container 56b4956a54118dff1a358ca12515b710d526bbd945a75405a67b4cbf713443ab. Sep 9 23:43:40.231630 containerd[1873]: time="2025-09-09T23:43:40.231577682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426.0.0-n-c59ad9327c,Uid:78c8cbf9bccdf912596e33daf7d64a1e,Namespace:kube-system,Attempt:0,} returns sandbox id \"1e24bf0f99b3c565a586fc68adbc36600b4f876fc0f7d7b70ef5dc0349cc1430\"" Sep 9 23:43:40.237648 containerd[1873]: time="2025-09-09T23:43:40.237617981Z" level=info msg="CreateContainer within sandbox \"1e24bf0f99b3c565a586fc68adbc36600b4f876fc0f7d7b70ef5dc0349cc1430\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 9 23:43:40.254962 containerd[1873]: time="2025-09-09T23:43:40.254925932Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426.0.0-n-c59ad9327c,Uid:90cad30b2305d4c47838c6290eeda1da,Namespace:kube-system,Attempt:0,} returns sandbox id \"8f368522ab2ee41aff09a3aaa8f6495921e643833c68608c6c493a0bda595d3f\"" Sep 9 23:43:40.260602 containerd[1873]: time="2025-09-09T23:43:40.260573354Z" level=info msg="CreateContainer within sandbox \"8f368522ab2ee41aff09a3aaa8f6495921e643833c68608c6c493a0bda595d3f\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 9 23:43:40.264112 
containerd[1873]: time="2025-09-09T23:43:40.264087084Z" level=info msg="Container 9ccb41c72f2aad2b55a5d18574ba0783346cd01dfbf5f6fbd1cf167a3cbc2ca7: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:43:40.279196 containerd[1873]: time="2025-09-09T23:43:40.279058231Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426.0.0-n-c59ad9327c,Uid:4298eb6720ba95bdf9d84c517f7cf988,Namespace:kube-system,Attempt:0,} returns sandbox id \"56b4956a54118dff1a358ca12515b710d526bbd945a75405a67b4cbf713443ab\"" Sep 9 23:43:40.280939 containerd[1873]: time="2025-09-09T23:43:40.280917123Z" level=info msg="CreateContainer within sandbox \"56b4956a54118dff1a358ca12515b710d526bbd945a75405a67b4cbf713443ab\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 9 23:43:40.287400 kubelet[3045]: I0909 23:43:40.287374 3045 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:40.287670 kubelet[3045]: E0909 23:43:40.287651 3045 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.16:6443/api/v1/nodes\": dial tcp 10.200.20.16:6443: connect: connection refused" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:40.306043 containerd[1873]: time="2025-09-09T23:43:40.305634921Z" level=info msg="Container d401a97290928ad80beab234eb88c85cc3a3a47a9b0f2c802811d5fae6bf3185: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:43:40.318272 containerd[1873]: time="2025-09-09T23:43:40.318240617Z" level=info msg="CreateContainer within sandbox \"1e24bf0f99b3c565a586fc68adbc36600b4f876fc0f7d7b70ef5dc0349cc1430\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"9ccb41c72f2aad2b55a5d18574ba0783346cd01dfbf5f6fbd1cf167a3cbc2ca7\"" Sep 9 23:43:40.318755 containerd[1873]: time="2025-09-09T23:43:40.318716320Z" level=info msg="StartContainer for \"9ccb41c72f2aad2b55a5d18574ba0783346cd01dfbf5f6fbd1cf167a3cbc2ca7\"" Sep 9 23:43:40.319471 
containerd[1873]: time="2025-09-09T23:43:40.319443063Z" level=info msg="connecting to shim 9ccb41c72f2aad2b55a5d18574ba0783346cd01dfbf5f6fbd1cf167a3cbc2ca7" address="unix:///run/containerd/s/31d44b552903968b3d358bdebd87a878c9ea5d3f886027a5162df946bfa62f4e" protocol=ttrpc version=3 Sep 9 23:43:40.331718 containerd[1873]: time="2025-09-09T23:43:40.331690659Z" level=info msg="CreateContainer within sandbox \"8f368522ab2ee41aff09a3aaa8f6495921e643833c68608c6c493a0bda595d3f\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"d401a97290928ad80beab234eb88c85cc3a3a47a9b0f2c802811d5fae6bf3185\"" Sep 9 23:43:40.332068 containerd[1873]: time="2025-09-09T23:43:40.331997085Z" level=info msg="StartContainer for \"d401a97290928ad80beab234eb88c85cc3a3a47a9b0f2c802811d5fae6bf3185\"" Sep 9 23:43:40.332854 containerd[1873]: time="2025-09-09T23:43:40.332828000Z" level=info msg="connecting to shim d401a97290928ad80beab234eb88c85cc3a3a47a9b0f2c802811d5fae6bf3185" address="unix:///run/containerd/s/a27e17e66e6329100cec79acf6c511dd5fec88087892a7edf239d3b56a5e6de7" protocol=ttrpc version=3 Sep 9 23:43:40.333121 systemd[1]: Started cri-containerd-9ccb41c72f2aad2b55a5d18574ba0783346cd01dfbf5f6fbd1cf167a3cbc2ca7.scope - libcontainer container 9ccb41c72f2aad2b55a5d18574ba0783346cd01dfbf5f6fbd1cf167a3cbc2ca7. Sep 9 23:43:40.338757 containerd[1873]: time="2025-09-09T23:43:40.338432285Z" level=info msg="Container 3d26f85a1b7d9db8637c4dc04e1b5674c1f32f2a49299a17b689d642cfe4cdb0: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:43:40.352108 systemd[1]: Started cri-containerd-d401a97290928ad80beab234eb88c85cc3a3a47a9b0f2c802811d5fae6bf3185.scope - libcontainer container d401a97290928ad80beab234eb88c85cc3a3a47a9b0f2c802811d5fae6bf3185. 
Sep 9 23:43:40.361501 containerd[1873]: time="2025-09-09T23:43:40.361456404Z" level=info msg="CreateContainer within sandbox \"56b4956a54118dff1a358ca12515b710d526bbd945a75405a67b4cbf713443ab\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"3d26f85a1b7d9db8637c4dc04e1b5674c1f32f2a49299a17b689d642cfe4cdb0\"" Sep 9 23:43:40.362156 containerd[1873]: time="2025-09-09T23:43:40.362129186Z" level=info msg="StartContainer for \"3d26f85a1b7d9db8637c4dc04e1b5674c1f32f2a49299a17b689d642cfe4cdb0\"" Sep 9 23:43:40.363573 containerd[1873]: time="2025-09-09T23:43:40.363534879Z" level=info msg="connecting to shim 3d26f85a1b7d9db8637c4dc04e1b5674c1f32f2a49299a17b689d642cfe4cdb0" address="unix:///run/containerd/s/86b4187e95047f96c9cca1a80c89c5d85e404e72672ca79a4b20e5158e6cf92d" protocol=ttrpc version=3 Sep 9 23:43:40.378122 systemd[1]: Started cri-containerd-3d26f85a1b7d9db8637c4dc04e1b5674c1f32f2a49299a17b689d642cfe4cdb0.scope - libcontainer container 3d26f85a1b7d9db8637c4dc04e1b5674c1f32f2a49299a17b689d642cfe4cdb0. 
Sep 9 23:43:40.417532 containerd[1873]: time="2025-09-09T23:43:40.417273671Z" level=info msg="StartContainer for \"d401a97290928ad80beab234eb88c85cc3a3a47a9b0f2c802811d5fae6bf3185\" returns successfully" Sep 9 23:43:40.418191 containerd[1873]: time="2025-09-09T23:43:40.417686780Z" level=info msg="StartContainer for \"9ccb41c72f2aad2b55a5d18574ba0783346cd01dfbf5f6fbd1cf167a3cbc2ca7\" returns successfully" Sep 9 23:43:40.462616 containerd[1873]: time="2025-09-09T23:43:40.462221746Z" level=info msg="StartContainer for \"3d26f85a1b7d9db8637c4dc04e1b5674c1f32f2a49299a17b689d642cfe4cdb0\" returns successfully" Sep 9 23:43:40.611579 kubelet[3045]: E0909 23:43:40.611549 3045 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-n-c59ad9327c\" not found" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:40.614667 kubelet[3045]: E0909 23:43:40.614643 3045 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-n-c59ad9327c\" not found" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:40.614877 kubelet[3045]: E0909 23:43:40.614863 3045 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.0.0-n-c59ad9327c\" not found" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:41.089294 kubelet[3045]: I0909 23:43:41.089252 3045 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:41.485957 kubelet[3045]: E0909 23:43:41.485922 3045 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4426.0.0-n-c59ad9327c\" not found" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:41.536141 kubelet[3045]: I0909 23:43:41.536108 3045 kubelet_node_status.go:78] "Successfully registered node" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:41.569541 kubelet[3045]: I0909 23:43:41.569509 3045 kubelet.go:3194] "Creating a mirror pod for static 
pod" pod="kube-system/kube-scheduler-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:41.616079 kubelet[3045]: I0909 23:43:41.615122 3045 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:41.616079 kubelet[3045]: I0909 23:43:41.615162 3045 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:41.666319 kubelet[3045]: E0909 23:43:41.666284 3045 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426.0.0-n-c59ad9327c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:41.666529 kubelet[3045]: E0909 23:43:41.666479 3045 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426.0.0-n-c59ad9327c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:41.666529 kubelet[3045]: I0909 23:43:41.666498 3045 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:41.666717 kubelet[3045]: E0909 23:43:41.666694 3045 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426.0.0-n-c59ad9327c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:41.668291 kubelet[3045]: E0909 23:43:41.668268 3045 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426.0.0-n-c59ad9327c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:41.668291 kubelet[3045]: I0909 23:43:41.668288 3045 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:41.669474 
kubelet[3045]: E0909 23:43:41.669453 3045 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4426.0.0-n-c59ad9327c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:42.361279 kubelet[3045]: I0909 23:43:42.361245 3045 apiserver.go:52] "Watching apiserver" Sep 9 23:43:42.370209 kubelet[3045]: I0909 23:43:42.370185 3045 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 23:43:44.020425 systemd[1]: Reload requested from client PID 3318 ('systemctl') (unit session-9.scope)... Sep 9 23:43:44.020444 systemd[1]: Reloading... Sep 9 23:43:44.104038 zram_generator::config[3366]: No configuration found. Sep 9 23:43:44.261597 systemd[1]: Reloading finished in 240 ms. Sep 9 23:43:44.279631 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 23:43:44.300522 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 23:43:44.300840 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 23:43:44.300902 systemd[1]: kubelet.service: Consumed 533ms CPU time, 129.4M memory peak. Sep 9 23:43:44.302848 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 23:43:44.408134 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 23:43:44.420227 (kubelet)[3429]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 23:43:44.443812 kubelet[3429]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 23:43:44.443812 kubelet[3429]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. 
Image garbage collector will get sandbox image information from CRI. Sep 9 23:43:44.443812 kubelet[3429]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 23:43:44.444084 kubelet[3429]: I0909 23:43:44.443813 3429 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 23:43:44.450795 kubelet[3429]: I0909 23:43:44.450762 3429 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 9 23:43:44.450795 kubelet[3429]: I0909 23:43:44.450792 3429 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 23:43:44.450972 kubelet[3429]: I0909 23:43:44.450946 3429 server.go:954] "Client rotation is on, will bootstrap in background" Sep 9 23:43:44.453169 kubelet[3429]: I0909 23:43:44.453149 3429 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 9 23:43:44.456033 kubelet[3429]: I0909 23:43:44.455032 3429 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 23:43:44.461745 kubelet[3429]: I0909 23:43:44.461729 3429 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 23:43:44.466122 kubelet[3429]: I0909 23:43:44.466101 3429 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 9 23:43:44.466404 kubelet[3429]: I0909 23:43:44.466379 3429 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 23:43:44.466637 kubelet[3429]: I0909 23:43:44.466467 3429 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426.0.0-n-c59ad9327c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 23:43:44.466749 kubelet[3429]: I0909 23:43:44.466738 3429 topology_manager.go:138] "Creating topology manager with 
none policy" Sep 9 23:43:44.466805 kubelet[3429]: I0909 23:43:44.466796 3429 container_manager_linux.go:304] "Creating device plugin manager" Sep 9 23:43:44.466887 kubelet[3429]: I0909 23:43:44.466878 3429 state_mem.go:36] "Initialized new in-memory state store" Sep 9 23:43:44.467093 kubelet[3429]: I0909 23:43:44.467081 3429 kubelet.go:446] "Attempting to sync node with API server" Sep 9 23:43:44.467169 kubelet[3429]: I0909 23:43:44.467158 3429 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 23:43:44.467230 kubelet[3429]: I0909 23:43:44.467224 3429 kubelet.go:352] "Adding apiserver pod source" Sep 9 23:43:44.467280 kubelet[3429]: I0909 23:43:44.467274 3429 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 23:43:44.472036 kubelet[3429]: I0909 23:43:44.471745 3429 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 23:43:44.472127 kubelet[3429]: I0909 23:43:44.472114 3429 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 23:43:44.472500 kubelet[3429]: I0909 23:43:44.472481 3429 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 9 23:43:44.472586 kubelet[3429]: I0909 23:43:44.472577 3429 server.go:1287] "Started kubelet" Sep 9 23:43:44.473899 kubelet[3429]: I0909 23:43:44.473874 3429 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 23:43:44.483270 kubelet[3429]: E0909 23:43:44.483240 3429 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 23:43:44.484172 kubelet[3429]: I0909 23:43:44.484150 3429 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 23:43:44.486036 kubelet[3429]: I0909 23:43:44.484824 3429 server.go:479] "Adding debug handlers to kubelet server" Sep 9 23:43:44.486682 kubelet[3429]: I0909 23:43:44.485222 3429 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 9 23:43:44.486879 kubelet[3429]: I0909 23:43:44.486842 3429 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 23:43:44.487122 kubelet[3429]: I0909 23:43:44.487106 3429 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 23:43:44.487220 kubelet[3429]: E0909 23:43:44.485349 3429 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.0.0-n-c59ad9327c\" not found" Sep 9 23:43:44.487615 kubelet[3429]: I0909 23:43:44.487599 3429 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 23:43:44.488566 kubelet[3429]: I0909 23:43:44.485245 3429 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 9 23:43:44.488725 kubelet[3429]: I0909 23:43:44.488712 3429 reconciler.go:26] "Reconciler: start to sync state" Sep 9 23:43:44.489355 kubelet[3429]: I0909 23:43:44.489128 3429 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 23:43:44.490461 kubelet[3429]: I0909 23:43:44.490439 3429 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 9 23:43:44.490461 kubelet[3429]: I0909 23:43:44.490461 3429 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 9 23:43:44.490544 kubelet[3429]: I0909 23:43:44.490476 3429 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 9 23:43:44.490544 kubelet[3429]: I0909 23:43:44.490481 3429 kubelet.go:2382] "Starting kubelet main sync loop" Sep 9 23:43:44.490544 kubelet[3429]: E0909 23:43:44.490521 3429 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 23:43:44.500523 kubelet[3429]: I0909 23:43:44.500135 3429 factory.go:221] Registration of the containerd container factory successfully Sep 9 23:43:44.500610 kubelet[3429]: I0909 23:43:44.500596 3429 factory.go:221] Registration of the systemd container factory successfully Sep 9 23:43:44.500759 kubelet[3429]: I0909 23:43:44.500737 3429 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 23:43:44.559427 kubelet[3429]: I0909 23:43:44.558747 3429 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 23:43:44.559427 kubelet[3429]: I0909 23:43:44.558780 3429 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 23:43:44.559427 kubelet[3429]: I0909 23:43:44.558801 3429 state_mem.go:36] "Initialized new in-memory state store" Sep 9 23:43:44.559427 kubelet[3429]: I0909 23:43:44.558927 3429 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 9 23:43:44.559427 kubelet[3429]: I0909 23:43:44.558935 3429 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 9 23:43:44.559427 kubelet[3429]: I0909 23:43:44.558950 3429 policy_none.go:49] "None policy: Start" Sep 9 23:43:44.559427 kubelet[3429]: I0909 23:43:44.558958 3429 
memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 23:43:44.559427 kubelet[3429]: I0909 23:43:44.558965 3429 state_mem.go:35] "Initializing new in-memory state store" Sep 9 23:43:44.559427 kubelet[3429]: I0909 23:43:44.559087 3429 state_mem.go:75] "Updated machine memory state" Sep 9 23:43:44.562929 kubelet[3429]: I0909 23:43:44.562905 3429 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 23:43:44.563850 kubelet[3429]: I0909 23:43:44.563829 3429 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 23:43:44.563913 kubelet[3429]: I0909 23:43:44.563851 3429 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 23:43:44.564132 kubelet[3429]: I0909 23:43:44.564065 3429 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 23:43:44.565689 kubelet[3429]: E0909 23:43:44.565660 3429 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 9 23:43:44.591633 kubelet[3429]: I0909 23:43:44.591606 3429 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:44.592152 kubelet[3429]: I0909 23:43:44.592122 3429 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:44.592802 kubelet[3429]: I0909 23:43:44.592776 3429 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:44.603838 kubelet[3429]: W0909 23:43:44.603805 3429 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 23:43:44.609978 kubelet[3429]: W0909 23:43:44.609940 3429 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 23:43:44.610265 kubelet[3429]: W0909 23:43:44.610244 3429 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 23:43:44.666982 kubelet[3429]: I0909 23:43:44.666957 3429 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:44.682900 kubelet[3429]: I0909 23:43:44.682858 3429 kubelet_node_status.go:124] "Node was previously registered" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:44.683072 kubelet[3429]: I0909 23:43:44.683046 3429 kubelet_node_status.go:78] "Successfully registered node" node="ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:44.689976 kubelet[3429]: I0909 23:43:44.689918 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/78c8cbf9bccdf912596e33daf7d64a1e-flexvolume-dir\") pod 
\"kube-controller-manager-ci-4426.0.0-n-c59ad9327c\" (UID: \"78c8cbf9bccdf912596e33daf7d64a1e\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:44.690138 kubelet[3429]: I0909 23:43:44.690038 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/78c8cbf9bccdf912596e33daf7d64a1e-k8s-certs\") pod \"kube-controller-manager-ci-4426.0.0-n-c59ad9327c\" (UID: \"78c8cbf9bccdf912596e33daf7d64a1e\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:44.690138 kubelet[3429]: I0909 23:43:44.690055 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/90cad30b2305d4c47838c6290eeda1da-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426.0.0-n-c59ad9327c\" (UID: \"90cad30b2305d4c47838c6290eeda1da\") " pod="kube-system/kube-apiserver-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:44.690138 kubelet[3429]: I0909 23:43:44.690070 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/78c8cbf9bccdf912596e33daf7d64a1e-ca-certs\") pod \"kube-controller-manager-ci-4426.0.0-n-c59ad9327c\" (UID: \"78c8cbf9bccdf912596e33daf7d64a1e\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:44.690323 kubelet[3429]: I0909 23:43:44.690228 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/78c8cbf9bccdf912596e33daf7d64a1e-kubeconfig\") pod \"kube-controller-manager-ci-4426.0.0-n-c59ad9327c\" (UID: \"78c8cbf9bccdf912596e33daf7d64a1e\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:44.690323 kubelet[3429]: I0909 23:43:44.690247 3429 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/78c8cbf9bccdf912596e33daf7d64a1e-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426.0.0-n-c59ad9327c\" (UID: \"78c8cbf9bccdf912596e33daf7d64a1e\") " pod="kube-system/kube-controller-manager-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:44.690323 kubelet[3429]: I0909 23:43:44.690259 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4298eb6720ba95bdf9d84c517f7cf988-kubeconfig\") pod \"kube-scheduler-ci-4426.0.0-n-c59ad9327c\" (UID: \"4298eb6720ba95bdf9d84c517f7cf988\") " pod="kube-system/kube-scheduler-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:44.690323 kubelet[3429]: I0909 23:43:44.690295 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/90cad30b2305d4c47838c6290eeda1da-ca-certs\") pod \"kube-apiserver-ci-4426.0.0-n-c59ad9327c\" (UID: \"90cad30b2305d4c47838c6290eeda1da\") " pod="kube-system/kube-apiserver-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:44.690323 kubelet[3429]: I0909 23:43:44.690304 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/90cad30b2305d4c47838c6290eeda1da-k8s-certs\") pod \"kube-apiserver-ci-4426.0.0-n-c59ad9327c\" (UID: \"90cad30b2305d4c47838c6290eeda1da\") " pod="kube-system/kube-apiserver-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:45.468720 kubelet[3429]: I0909 23:43:45.468688 3429 apiserver.go:52] "Watching apiserver" Sep 9 23:43:45.488982 kubelet[3429]: I0909 23:43:45.488948 3429 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 23:43:45.539599 kubelet[3429]: I0909 23:43:45.539571 3429 kubelet.go:3194] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:45.552031 kubelet[3429]: W0909 23:43:45.551174 3429 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 23:43:45.552031 kubelet[3429]: E0909 23:43:45.551221 3429 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426.0.0-n-c59ad9327c\" already exists" pod="kube-system/kube-apiserver-ci-4426.0.0-n-c59ad9327c" Sep 9 23:43:45.579890 kubelet[3429]: I0909 23:43:45.579624 3429 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4426.0.0-n-c59ad9327c" podStartSLOduration=1.5796117299999999 podStartE2EDuration="1.57961173s" podCreationTimestamp="2025-09-09 23:43:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:43:45.56675356 +0000 UTC m=+1.144106717" watchObservedRunningTime="2025-09-09 23:43:45.57961173 +0000 UTC m=+1.156964879" Sep 9 23:43:45.579890 kubelet[3429]: I0909 23:43:45.579705 3429 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4426.0.0-n-c59ad9327c" podStartSLOduration=1.579700829 podStartE2EDuration="1.579700829s" podCreationTimestamp="2025-09-09 23:43:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:43:45.579519639 +0000 UTC m=+1.156872796" watchObservedRunningTime="2025-09-09 23:43:45.579700829 +0000 UTC m=+1.157053978" Sep 9 23:43:45.591356 kubelet[3429]: I0909 23:43:45.591234 3429 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4426.0.0-n-c59ad9327c" podStartSLOduration=1.591225339 podStartE2EDuration="1.591225339s" podCreationTimestamp="2025-09-09 23:43:44 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:43:45.591106847 +0000 UTC m=+1.168459996" watchObservedRunningTime="2025-09-09 23:43:45.591225339 +0000 UTC m=+1.168578488" Sep 9 23:43:48.911556 kubelet[3429]: I0909 23:43:48.911513 3429 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 9 23:43:48.912126 containerd[1873]: time="2025-09-09T23:43:48.912047381Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 9 23:43:48.912322 kubelet[3429]: I0909 23:43:48.912177 3429 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 9 23:43:49.786025 systemd[1]: Created slice kubepods-besteffort-pode6519470_0249_495b_aea7_ed9afcf1126c.slice - libcontainer container kubepods-besteffort-pode6519470_0249_495b_aea7_ed9afcf1126c.slice. Sep 9 23:43:49.819483 kubelet[3429]: I0909 23:43:49.819438 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/e6519470-0249-495b-aea7-ed9afcf1126c-kube-proxy\") pod \"kube-proxy-kbdh8\" (UID: \"e6519470-0249-495b-aea7-ed9afcf1126c\") " pod="kube-system/kube-proxy-kbdh8" Sep 9 23:43:49.819483 kubelet[3429]: I0909 23:43:49.819487 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8btn\" (UniqueName: \"kubernetes.io/projected/e6519470-0249-495b-aea7-ed9afcf1126c-kube-api-access-s8btn\") pod \"kube-proxy-kbdh8\" (UID: \"e6519470-0249-495b-aea7-ed9afcf1126c\") " pod="kube-system/kube-proxy-kbdh8" Sep 9 23:43:49.819663 kubelet[3429]: I0909 23:43:49.819503 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: 
\"kubernetes.io/host-path/e6519470-0249-495b-aea7-ed9afcf1126c-xtables-lock\") pod \"kube-proxy-kbdh8\" (UID: \"e6519470-0249-495b-aea7-ed9afcf1126c\") " pod="kube-system/kube-proxy-kbdh8" Sep 9 23:43:49.819663 kubelet[3429]: I0909 23:43:49.819513 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6519470-0249-495b-aea7-ed9afcf1126c-lib-modules\") pod \"kube-proxy-kbdh8\" (UID: \"e6519470-0249-495b-aea7-ed9afcf1126c\") " pod="kube-system/kube-proxy-kbdh8" Sep 9 23:43:49.993666 systemd[1]: Created slice kubepods-besteffort-podc2917430_427f_45b8_9107_84a2352e8086.slice - libcontainer container kubepods-besteffort-podc2917430_427f_45b8_9107_84a2352e8086.slice. Sep 9 23:43:50.019885 kubelet[3429]: I0909 23:43:50.019862 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c2917430-427f-45b8-9107-84a2352e8086-var-lib-calico\") pod \"tigera-operator-755d956888-zc2z9\" (UID: \"c2917430-427f-45b8-9107-84a2352e8086\") " pod="tigera-operator/tigera-operator-755d956888-zc2z9" Sep 9 23:43:50.020312 kubelet[3429]: I0909 23:43:50.020259 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw9vd\" (UniqueName: \"kubernetes.io/projected/c2917430-427f-45b8-9107-84a2352e8086-kube-api-access-kw9vd\") pod \"tigera-operator-755d956888-zc2z9\" (UID: \"c2917430-427f-45b8-9107-84a2352e8086\") " pod="tigera-operator/tigera-operator-755d956888-zc2z9" Sep 9 23:43:50.095379 containerd[1873]: time="2025-09-09T23:43:50.095275375Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-kbdh8,Uid:e6519470-0249-495b-aea7-ed9afcf1126c,Namespace:kube-system,Attempt:0,}" Sep 9 23:43:50.144821 containerd[1873]: time="2025-09-09T23:43:50.144590850Z" level=info msg="connecting to shim 
962e20976672359e889a6943f0f0cdcb58df1b4f5d629b0d7f3088da1a315442" address="unix:///run/containerd/s/81c7f35d901f03f76e03046a5ab78beaca99cb6d619754a86516c5978d03e438" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:43:50.167111 systemd[1]: Started cri-containerd-962e20976672359e889a6943f0f0cdcb58df1b4f5d629b0d7f3088da1a315442.scope - libcontainer container 962e20976672359e889a6943f0f0cdcb58df1b4f5d629b0d7f3088da1a315442. Sep 9 23:43:50.186653 containerd[1873]: time="2025-09-09T23:43:50.186618312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-kbdh8,Uid:e6519470-0249-495b-aea7-ed9afcf1126c,Namespace:kube-system,Attempt:0,} returns sandbox id \"962e20976672359e889a6943f0f0cdcb58df1b4f5d629b0d7f3088da1a315442\"" Sep 9 23:43:50.188957 containerd[1873]: time="2025-09-09T23:43:50.188928763Z" level=info msg="CreateContainer within sandbox \"962e20976672359e889a6943f0f0cdcb58df1b4f5d629b0d7f3088da1a315442\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 9 23:43:50.218934 containerd[1873]: time="2025-09-09T23:43:50.218554966Z" level=info msg="Container 86b4235f03571380d28ff8a6c782ffba0d45f84e4761340a66b721f7d12039da: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:43:50.239988 containerd[1873]: time="2025-09-09T23:43:50.239924813Z" level=info msg="CreateContainer within sandbox \"962e20976672359e889a6943f0f0cdcb58df1b4f5d629b0d7f3088da1a315442\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"86b4235f03571380d28ff8a6c782ffba0d45f84e4761340a66b721f7d12039da\"" Sep 9 23:43:50.240430 containerd[1873]: time="2025-09-09T23:43:50.240408213Z" level=info msg="StartContainer for \"86b4235f03571380d28ff8a6c782ffba0d45f84e4761340a66b721f7d12039da\"" Sep 9 23:43:50.241427 containerd[1873]: time="2025-09-09T23:43:50.241341691Z" level=info msg="connecting to shim 86b4235f03571380d28ff8a6c782ffba0d45f84e4761340a66b721f7d12039da" 
address="unix:///run/containerd/s/81c7f35d901f03f76e03046a5ab78beaca99cb6d619754a86516c5978d03e438" protocol=ttrpc version=3 Sep 9 23:43:50.259134 systemd[1]: Started cri-containerd-86b4235f03571380d28ff8a6c782ffba0d45f84e4761340a66b721f7d12039da.scope - libcontainer container 86b4235f03571380d28ff8a6c782ffba0d45f84e4761340a66b721f7d12039da. Sep 9 23:43:50.295400 containerd[1873]: time="2025-09-09T23:43:50.295361767Z" level=info msg="StartContainer for \"86b4235f03571380d28ff8a6c782ffba0d45f84e4761340a66b721f7d12039da\" returns successfully" Sep 9 23:43:50.297131 containerd[1873]: time="2025-09-09T23:43:50.296931314Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-zc2z9,Uid:c2917430-427f-45b8-9107-84a2352e8086,Namespace:tigera-operator,Attempt:0,}" Sep 9 23:43:50.357956 containerd[1873]: time="2025-09-09T23:43:50.357612790Z" level=info msg="connecting to shim 91dbd49792050abd5410dfe53bff86c86181bd3068014c178a129907375db69d" address="unix:///run/containerd/s/6212b3341bd56c5b0c8d9e318b45e322925ecf5067bd502e80aa84ff10a06346" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:43:50.381129 systemd[1]: Started cri-containerd-91dbd49792050abd5410dfe53bff86c86181bd3068014c178a129907375db69d.scope - libcontainer container 91dbd49792050abd5410dfe53bff86c86181bd3068014c178a129907375db69d. 
Sep 9 23:43:50.410115 containerd[1873]: time="2025-09-09T23:43:50.410071655Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-zc2z9,Uid:c2917430-427f-45b8-9107-84a2352e8086,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"91dbd49792050abd5410dfe53bff86c86181bd3068014c178a129907375db69d\"" Sep 9 23:43:50.413124 containerd[1873]: time="2025-09-09T23:43:50.412975926Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 9 23:43:50.564579 kubelet[3429]: I0909 23:43:50.564479 3429 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-kbdh8" podStartSLOduration=1.5644642979999999 podStartE2EDuration="1.564464298s" podCreationTimestamp="2025-09-09 23:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:43:50.564399983 +0000 UTC m=+6.141753132" watchObservedRunningTime="2025-09-09 23:43:50.564464298 +0000 UTC m=+6.141817447" Sep 9 23:43:50.928859 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1786611216.mount: Deactivated successfully. Sep 9 23:43:52.613354 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1406490400.mount: Deactivated successfully. 
Sep 9 23:43:52.951417 containerd[1873]: time="2025-09-09T23:43:52.950752746Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:43:52.954196 containerd[1873]: time="2025-09-09T23:43:52.954171121Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 9 23:43:52.958116 containerd[1873]: time="2025-09-09T23:43:52.958091897Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:43:52.962088 containerd[1873]: time="2025-09-09T23:43:52.962051489Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:43:52.962662 containerd[1873]: time="2025-09-09T23:43:52.962367363Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 2.549065667s" Sep 9 23:43:52.962662 containerd[1873]: time="2025-09-09T23:43:52.962391388Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 9 23:43:52.964768 containerd[1873]: time="2025-09-09T23:43:52.964745633Z" level=info msg="CreateContainer within sandbox \"91dbd49792050abd5410dfe53bff86c86181bd3068014c178a129907375db69d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 9 23:43:52.994350 containerd[1873]: time="2025-09-09T23:43:52.994326642Z" level=info msg="Container 
189a93f7cb4c1ca5a6b14893c37bc431d835278d41382df6e422f1633ede83c0: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:43:53.014958 containerd[1873]: time="2025-09-09T23:43:53.014926055Z" level=info msg="CreateContainer within sandbox \"91dbd49792050abd5410dfe53bff86c86181bd3068014c178a129907375db69d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"189a93f7cb4c1ca5a6b14893c37bc431d835278d41382df6e422f1633ede83c0\"" Sep 9 23:43:53.015441 containerd[1873]: time="2025-09-09T23:43:53.015420247Z" level=info msg="StartContainer for \"189a93f7cb4c1ca5a6b14893c37bc431d835278d41382df6e422f1633ede83c0\"" Sep 9 23:43:53.015992 containerd[1873]: time="2025-09-09T23:43:53.015972177Z" level=info msg="connecting to shim 189a93f7cb4c1ca5a6b14893c37bc431d835278d41382df6e422f1633ede83c0" address="unix:///run/containerd/s/6212b3341bd56c5b0c8d9e318b45e322925ecf5067bd502e80aa84ff10a06346" protocol=ttrpc version=3 Sep 9 23:43:53.036123 systemd[1]: Started cri-containerd-189a93f7cb4c1ca5a6b14893c37bc431d835278d41382df6e422f1633ede83c0.scope - libcontainer container 189a93f7cb4c1ca5a6b14893c37bc431d835278d41382df6e422f1633ede83c0. 
Sep 9 23:43:53.060599 containerd[1873]: time="2025-09-09T23:43:53.060568922Z" level=info msg="StartContainer for \"189a93f7cb4c1ca5a6b14893c37bc431d835278d41382df6e422f1633ede83c0\" returns successfully" Sep 9 23:43:56.211914 kubelet[3429]: I0909 23:43:56.211759 3429 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-zc2z9" podStartSLOduration=4.661331437 podStartE2EDuration="7.2117469s" podCreationTimestamp="2025-09-09 23:43:49 +0000 UTC" firstStartedPulling="2025-09-09 23:43:50.412516207 +0000 UTC m=+5.989869356" lastFinishedPulling="2025-09-09 23:43:52.96293167 +0000 UTC m=+8.540284819" observedRunningTime="2025-09-09 23:43:53.570877341 +0000 UTC m=+9.148230490" watchObservedRunningTime="2025-09-09 23:43:56.2117469 +0000 UTC m=+11.789100049" Sep 9 23:43:58.197182 sudo[2358]: pam_unix(sudo:session): session closed for user root Sep 9 23:43:58.287712 sshd[2357]: Connection closed by 10.200.16.10 port 56698 Sep 9 23:43:58.288204 sshd-session[2354]: pam_unix(sshd:session): session closed for user core Sep 9 23:43:58.293171 systemd-logind[1853]: Session 9 logged out. Waiting for processes to exit. Sep 9 23:43:58.293622 systemd[1]: sshd@6-10.200.20.16:22-10.200.16.10:56698.service: Deactivated successfully. Sep 9 23:43:58.296769 systemd[1]: session-9.scope: Deactivated successfully. Sep 9 23:43:58.297308 systemd[1]: session-9.scope: Consumed 2.824s CPU time, 217.5M memory peak. Sep 9 23:43:58.300106 systemd-logind[1853]: Removed session 9. Sep 9 23:44:01.452624 systemd[1]: Created slice kubepods-besteffort-podb12a3d9e_d7ae_4dd3_85df_1398512b3be9.slice - libcontainer container kubepods-besteffort-podb12a3d9e_d7ae_4dd3_85df_1398512b3be9.slice. 
Sep 9 23:44:01.459098 kubelet[3429]: W0909 23:44:01.458975 3429 reflector.go:569] object-"calico-system"/"tigera-ca-bundle": failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:ci-4426.0.0-n-c59ad9327c" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object Sep 9 23:44:01.459582 kubelet[3429]: I0909 23:44:01.459117 3429 status_manager.go:890] "Failed to get status for pod" podUID="b12a3d9e-d7ae-4dd3-85df-1398512b3be9" pod="calico-system/calico-typha-575f9b8656-crpsw" err="pods \"calico-typha-575f9b8656-crpsw\" is forbidden: User \"system:node:ci-4426.0.0-n-c59ad9327c\" cannot get resource \"pods\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object" Sep 9 23:44:01.459582 kubelet[3429]: E0909 23:44:01.459514 3429 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"tigera-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"tigera-ca-bundle\" is forbidden: User \"system:node:ci-4426.0.0-n-c59ad9327c\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object" logger="UnhandledError" Sep 9 23:44:01.459582 kubelet[3429]: W0909 23:44:01.459070 3429 reflector.go:569] object-"calico-system"/"typha-certs": failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:ci-4426.0.0-n-c59ad9327c" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object Sep 9 23:44:01.459582 kubelet[3429]: E0909 23:44:01.459544 3429 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"typha-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"typha-certs\" is 
forbidden: User \"system:node:ci-4426.0.0-n-c59ad9327c\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object" logger="UnhandledError" Sep 9 23:44:01.459692 kubelet[3429]: W0909 23:44:01.459097 3429 reflector.go:569] object-"calico-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4426.0.0-n-c59ad9327c" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object Sep 9 23:44:01.459692 kubelet[3429]: E0909 23:44:01.459555 3429 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4426.0.0-n-c59ad9327c\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object" logger="UnhandledError" Sep 9 23:44:01.480064 kubelet[3429]: I0909 23:44:01.480037 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch7xz\" (UniqueName: \"kubernetes.io/projected/b12a3d9e-d7ae-4dd3-85df-1398512b3be9-kube-api-access-ch7xz\") pod \"calico-typha-575f9b8656-crpsw\" (UID: \"b12a3d9e-d7ae-4dd3-85df-1398512b3be9\") " pod="calico-system/calico-typha-575f9b8656-crpsw" Sep 9 23:44:01.480064 kubelet[3429]: I0909 23:44:01.480067 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b12a3d9e-d7ae-4dd3-85df-1398512b3be9-tigera-ca-bundle\") pod \"calico-typha-575f9b8656-crpsw\" (UID: \"b12a3d9e-d7ae-4dd3-85df-1398512b3be9\") " pod="calico-system/calico-typha-575f9b8656-crpsw" Sep 9 
23:44:01.480167 kubelet[3429]: I0909 23:44:01.480079 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/b12a3d9e-d7ae-4dd3-85df-1398512b3be9-typha-certs\") pod \"calico-typha-575f9b8656-crpsw\" (UID: \"b12a3d9e-d7ae-4dd3-85df-1398512b3be9\") " pod="calico-system/calico-typha-575f9b8656-crpsw" Sep 9 23:44:01.574812 systemd[1]: Created slice kubepods-besteffort-pod15d8495b_d728_464c_b16f_d3f0411ec4b6.slice - libcontainer container kubepods-besteffort-pod15d8495b_d728_464c_b16f_d3f0411ec4b6.slice. Sep 9 23:44:01.580843 kubelet[3429]: I0909 23:44:01.580804 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/15d8495b-d728-464c-b16f-d3f0411ec4b6-lib-modules\") pod \"calico-node-nfnj2\" (UID: \"15d8495b-d728-464c-b16f-d3f0411ec4b6\") " pod="calico-system/calico-node-nfnj2" Sep 9 23:44:01.580940 kubelet[3429]: I0909 23:44:01.580856 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15d8495b-d728-464c-b16f-d3f0411ec4b6-tigera-ca-bundle\") pod \"calico-node-nfnj2\" (UID: \"15d8495b-d728-464c-b16f-d3f0411ec4b6\") " pod="calico-system/calico-node-nfnj2" Sep 9 23:44:01.580940 kubelet[3429]: I0909 23:44:01.580870 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/15d8495b-d728-464c-b16f-d3f0411ec4b6-var-lib-calico\") pod \"calico-node-nfnj2\" (UID: \"15d8495b-d728-464c-b16f-d3f0411ec4b6\") " pod="calico-system/calico-node-nfnj2" Sep 9 23:44:01.580940 kubelet[3429]: I0909 23:44:01.580882 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: 
\"kubernetes.io/host-path/15d8495b-d728-464c-b16f-d3f0411ec4b6-cni-bin-dir\") pod \"calico-node-nfnj2\" (UID: \"15d8495b-d728-464c-b16f-d3f0411ec4b6\") " pod="calico-system/calico-node-nfnj2" Sep 9 23:44:01.580940 kubelet[3429]: I0909 23:44:01.580891 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/15d8495b-d728-464c-b16f-d3f0411ec4b6-cni-log-dir\") pod \"calico-node-nfnj2\" (UID: \"15d8495b-d728-464c-b16f-d3f0411ec4b6\") " pod="calico-system/calico-node-nfnj2" Sep 9 23:44:01.580940 kubelet[3429]: I0909 23:44:01.580902 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/15d8495b-d728-464c-b16f-d3f0411ec4b6-node-certs\") pod \"calico-node-nfnj2\" (UID: \"15d8495b-d728-464c-b16f-d3f0411ec4b6\") " pod="calico-system/calico-node-nfnj2" Sep 9 23:44:01.581059 kubelet[3429]: I0909 23:44:01.580911 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/15d8495b-d728-464c-b16f-d3f0411ec4b6-xtables-lock\") pod \"calico-node-nfnj2\" (UID: \"15d8495b-d728-464c-b16f-d3f0411ec4b6\") " pod="calico-system/calico-node-nfnj2" Sep 9 23:44:01.581059 kubelet[3429]: I0909 23:44:01.580923 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/15d8495b-d728-464c-b16f-d3f0411ec4b6-flexvol-driver-host\") pod \"calico-node-nfnj2\" (UID: \"15d8495b-d728-464c-b16f-d3f0411ec4b6\") " pod="calico-system/calico-node-nfnj2" Sep 9 23:44:01.581059 kubelet[3429]: I0909 23:44:01.580934 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/15d8495b-d728-464c-b16f-d3f0411ec4b6-policysync\") pod 
\"calico-node-nfnj2\" (UID: \"15d8495b-d728-464c-b16f-d3f0411ec4b6\") " pod="calico-system/calico-node-nfnj2" Sep 9 23:44:01.581059 kubelet[3429]: I0909 23:44:01.580946 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/15d8495b-d728-464c-b16f-d3f0411ec4b6-var-run-calico\") pod \"calico-node-nfnj2\" (UID: \"15d8495b-d728-464c-b16f-d3f0411ec4b6\") " pod="calico-system/calico-node-nfnj2" Sep 9 23:44:01.581059 kubelet[3429]: I0909 23:44:01.580955 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/15d8495b-d728-464c-b16f-d3f0411ec4b6-cni-net-dir\") pod \"calico-node-nfnj2\" (UID: \"15d8495b-d728-464c-b16f-d3f0411ec4b6\") " pod="calico-system/calico-node-nfnj2" Sep 9 23:44:01.581141 kubelet[3429]: I0909 23:44:01.580964 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jh4n\" (UniqueName: \"kubernetes.io/projected/15d8495b-d728-464c-b16f-d3f0411ec4b6-kube-api-access-7jh4n\") pod \"calico-node-nfnj2\" (UID: \"15d8495b-d728-464c-b16f-d3f0411ec4b6\") " pod="calico-system/calico-node-nfnj2" Sep 9 23:44:01.683101 kubelet[3429]: E0909 23:44:01.683068 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.683101 kubelet[3429]: W0909 23:44:01.683089 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.683101 kubelet[3429]: E0909 23:44:01.683109 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.684119 kubelet[3429]: E0909 23:44:01.684097 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.684119 kubelet[3429]: W0909 23:44:01.684112 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.684339 kubelet[3429]: E0909 23:44:01.684319 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.684339 kubelet[3429]: W0909 23:44:01.684334 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.684407 kubelet[3429]: E0909 23:44:01.684345 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.684553 kubelet[3429]: E0909 23:44:01.684538 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.684553 kubelet[3429]: W0909 23:44:01.684548 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.684631 kubelet[3429]: E0909 23:44:01.684557 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.685107 kubelet[3429]: E0909 23:44:01.684887 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.685325 kubelet[3429]: E0909 23:44:01.685212 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.685325 kubelet[3429]: W0909 23:44:01.685323 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.685497 kubelet[3429]: E0909 23:44:01.685472 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.685955 kubelet[3429]: E0909 23:44:01.685785 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.685955 kubelet[3429]: W0909 23:44:01.685801 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.685955 kubelet[3429]: E0909 23:44:01.685815 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.686080 kubelet[3429]: E0909 23:44:01.685967 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.686080 kubelet[3429]: W0909 23:44:01.685974 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.686080 kubelet[3429]: E0909 23:44:01.685981 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.686432 kubelet[3429]: E0909 23:44:01.686411 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.686432 kubelet[3429]: W0909 23:44:01.686425 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.686648 kubelet[3429]: E0909 23:44:01.686633 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.686648 kubelet[3429]: W0909 23:44:01.686643 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.686711 kubelet[3429]: E0909 23:44:01.686653 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.686935 kubelet[3429]: E0909 23:44:01.686909 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.687374 kubelet[3429]: E0909 23:44:01.687356 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.687374 kubelet[3429]: W0909 23:44:01.687370 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.687539 kubelet[3429]: E0909 23:44:01.687380 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.687871 kubelet[3429]: E0909 23:44:01.687843 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.690259 kubelet[3429]: W0909 23:44:01.689945 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.690259 kubelet[3429]: E0909 23:44:01.689973 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.690389 kubelet[3429]: E0909 23:44:01.690371 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.690389 kubelet[3429]: W0909 23:44:01.690385 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.690700 kubelet[3429]: E0909 23:44:01.690677 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.691888 kubelet[3429]: E0909 23:44:01.690826 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.691888 kubelet[3429]: W0909 23:44:01.690837 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.691888 kubelet[3429]: E0909 23:44:01.690845 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.709943 kubelet[3429]: E0909 23:44:01.709782 3429 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5n8qp" podUID="c172c6c9-8c08-4d46-a400-b2fd0f4bf93b" Sep 9 23:44:01.775107 kubelet[3429]: E0909 23:44:01.774976 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.775446 kubelet[3429]: W0909 23:44:01.774998 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.775446 kubelet[3429]: E0909 23:44:01.775261 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.775854 kubelet[3429]: E0909 23:44:01.775640 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.775936 kubelet[3429]: W0909 23:44:01.775650 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.776022 kubelet[3429]: E0909 23:44:01.776010 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.776308 kubelet[3429]: E0909 23:44:01.776253 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.776308 kubelet[3429]: W0909 23:44:01.776264 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.776308 kubelet[3429]: E0909 23:44:01.776274 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.776540 kubelet[3429]: E0909 23:44:01.776520 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.776590 kubelet[3429]: W0909 23:44:01.776580 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.776730 kubelet[3429]: E0909 23:44:01.776644 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.776962 kubelet[3429]: E0909 23:44:01.776950 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.777390 kubelet[3429]: W0909 23:44:01.777371 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.777676 kubelet[3429]: E0909 23:44:01.777462 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.777898 kubelet[3429]: E0909 23:44:01.777885 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.778217 kubelet[3429]: W0909 23:44:01.778038 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.778436 kubelet[3429]: E0909 23:44:01.778298 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.778642 kubelet[3429]: E0909 23:44:01.778550 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.778861 kubelet[3429]: W0909 23:44:01.778757 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.779186 kubelet[3429]: E0909 23:44:01.778931 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.779449 kubelet[3429]: E0909 23:44:01.779361 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.779593 kubelet[3429]: W0909 23:44:01.779518 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.779729 kubelet[3429]: E0909 23:44:01.779662 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.780478 kubelet[3429]: E0909 23:44:01.780064 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.780478 kubelet[3429]: W0909 23:44:01.780078 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.780478 kubelet[3429]: E0909 23:44:01.780088 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.780637 kubelet[3429]: E0909 23:44:01.780625 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.780690 kubelet[3429]: W0909 23:44:01.780679 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.780821 kubelet[3429]: E0909 23:44:01.780792 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.781462 kubelet[3429]: E0909 23:44:01.781421 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.781462 kubelet[3429]: W0909 23:44:01.781435 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.781653 kubelet[3429]: E0909 23:44:01.781445 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.781928 kubelet[3429]: E0909 23:44:01.781916 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.781998 kubelet[3429]: W0909 23:44:01.781987 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.782093 kubelet[3429]: E0909 23:44:01.782079 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.782363 kubelet[3429]: E0909 23:44:01.782288 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.782363 kubelet[3429]: W0909 23:44:01.782298 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.782363 kubelet[3429]: E0909 23:44:01.782307 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.782567 kubelet[3429]: E0909 23:44:01.782486 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.782567 kubelet[3429]: W0909 23:44:01.782497 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.782567 kubelet[3429]: E0909 23:44:01.782506 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.782699 kubelet[3429]: E0909 23:44:01.782690 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.782825 kubelet[3429]: W0909 23:44:01.782737 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.782825 kubelet[3429]: E0909 23:44:01.782751 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.782940 kubelet[3429]: E0909 23:44:01.782931 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.782992 kubelet[3429]: W0909 23:44:01.782981 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.783226 kubelet[3429]: E0909 23:44:01.783032 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.783310 kubelet[3429]: E0909 23:44:01.783298 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.783442 kubelet[3429]: W0909 23:44:01.783351 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.783442 kubelet[3429]: E0909 23:44:01.783365 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.783554 kubelet[3429]: E0909 23:44:01.783545 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.783594 kubelet[3429]: W0909 23:44:01.783586 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.783713 kubelet[3429]: E0909 23:44:01.783633 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.783812 kubelet[3429]: E0909 23:44:01.783802 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.783855 kubelet[3429]: W0909 23:44:01.783846 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.783896 kubelet[3429]: E0909 23:44:01.783885 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.784712 kubelet[3429]: E0909 23:44:01.784111 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.784712 kubelet[3429]: W0909 23:44:01.784211 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.784712 kubelet[3429]: E0909 23:44:01.784227 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.784969 kubelet[3429]: E0909 23:44:01.784868 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.785067 kubelet[3429]: W0909 23:44:01.785037 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.785067 kubelet[3429]: E0909 23:44:01.785059 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.785114 kubelet[3429]: I0909 23:44:01.785083 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c172c6c9-8c08-4d46-a400-b2fd0f4bf93b-socket-dir\") pod \"csi-node-driver-5n8qp\" (UID: \"c172c6c9-8c08-4d46-a400-b2fd0f4bf93b\") " pod="calico-system/csi-node-driver-5n8qp" Sep 9 23:44:01.787011 kubelet[3429]: E0909 23:44:01.786266 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.787011 kubelet[3429]: W0909 23:44:01.786280 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.787173 kubelet[3429]: E0909 23:44:01.787111 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.787173 kubelet[3429]: I0909 23:44:01.787137 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/c172c6c9-8c08-4d46-a400-b2fd0f4bf93b-varrun\") pod \"csi-node-driver-5n8qp\" (UID: \"c172c6c9-8c08-4d46-a400-b2fd0f4bf93b\") " pod="calico-system/csi-node-driver-5n8qp" Sep 9 23:44:01.787405 kubelet[3429]: E0909 23:44:01.787393 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.787526 kubelet[3429]: W0909 23:44:01.787469 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.787526 kubelet[3429]: E0909 23:44:01.787493 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.787717 kubelet[3429]: E0909 23:44:01.787707 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.787836 kubelet[3429]: W0909 23:44:01.787772 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.787836 kubelet[3429]: E0909 23:44:01.787795 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.788209 kubelet[3429]: E0909 23:44:01.788111 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.788457 kubelet[3429]: W0909 23:44:01.788285 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.788457 kubelet[3429]: E0909 23:44:01.788311 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.788457 kubelet[3429]: I0909 23:44:01.788326 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c172c6c9-8c08-4d46-a400-b2fd0f4bf93b-registration-dir\") pod \"csi-node-driver-5n8qp\" (UID: \"c172c6c9-8c08-4d46-a400-b2fd0f4bf93b\") " pod="calico-system/csi-node-driver-5n8qp" Sep 9 23:44:01.788862 kubelet[3429]: E0909 23:44:01.788837 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.788862 kubelet[3429]: W0909 23:44:01.788849 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.789142 kubelet[3429]: E0909 23:44:01.789079 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.789319 kubelet[3429]: I0909 23:44:01.789304 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c172c6c9-8c08-4d46-a400-b2fd0f4bf93b-kubelet-dir\") pod \"csi-node-driver-5n8qp\" (UID: \"c172c6c9-8c08-4d46-a400-b2fd0f4bf93b\") " pod="calico-system/csi-node-driver-5n8qp" Sep 9 23:44:01.789765 kubelet[3429]: E0909 23:44:01.789731 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.789765 kubelet[3429]: W0909 23:44:01.789742 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.790131 kubelet[3429]: E0909 23:44:01.789998 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.790437 kubelet[3429]: E0909 23:44:01.790405 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.790437 kubelet[3429]: W0909 23:44:01.790416 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.790774 kubelet[3429]: E0909 23:44:01.790682 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:01.791191 kubelet[3429]: E0909 23:44:01.791067 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.791191 kubelet[3429]: W0909 23:44:01.791088 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.791507 kubelet[3429]: E0909 23:44:01.791435 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:01.791507 kubelet[3429]: I0909 23:44:01.791462 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdwzc\" (UniqueName: \"kubernetes.io/projected/c172c6c9-8c08-4d46-a400-b2fd0f4bf93b-kube-api-access-cdwzc\") pod \"csi-node-driver-5n8qp\" (UID: \"c172c6c9-8c08-4d46-a400-b2fd0f4bf93b\") " pod="calico-system/csi-node-driver-5n8qp" Sep 9 23:44:01.792093 kubelet[3429]: E0909 23:44:01.791929 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:01.792093 kubelet[3429]: W0909 23:44:01.792037 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:01.792366 kubelet[3429]: E0909 23:44:01.792351 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:01.793093 kubelet[3429]: E0909 23:44:01.793047 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.793093 kubelet[3429]: W0909 23:44:01.793062 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.793093 kubelet[3429]: E0909 23:44:01.793072 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:01.793525 kubelet[3429]: E0909 23:44:01.793424 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.793703 kubelet[3429]: W0909 23:44:01.793689 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.794147 kubelet[3429]: E0909 23:44:01.793972 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:01.794376 kubelet[3429]: E0909 23:44:01.794248 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.794616 kubelet[3429]: W0909 23:44:01.794259 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.794616 kubelet[3429]: E0909 23:44:01.794532 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:01.795153 kubelet[3429]: E0909 23:44:01.795120 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.795153 kubelet[3429]: W0909 23:44:01.795132 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.795153 kubelet[3429]: E0909 23:44:01.795142 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:01.795661 kubelet[3429]: E0909 23:44:01.795624 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.795661 kubelet[3429]: W0909 23:44:01.795637 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.795661 kubelet[3429]: E0909 23:44:01.795647 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:01.894218 kubelet[3429]: E0909 23:44:01.894176 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.894218 kubelet[3429]: W0909 23:44:01.894198 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.894348 kubelet[3429]: E0909 23:44:01.894230 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:01.894602 kubelet[3429]: E0909 23:44:01.894397 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.894602 kubelet[3429]: W0909 23:44:01.894408 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.894602 kubelet[3429]: E0909 23:44:01.894417 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:01.895118 kubelet[3429]: E0909 23:44:01.894858 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.895118 kubelet[3429]: W0909 23:44:01.894873 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.895118 kubelet[3429]: E0909 23:44:01.894895 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:01.895425 kubelet[3429]: E0909 23:44:01.895347 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.895503 kubelet[3429]: W0909 23:44:01.895491 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.895641 kubelet[3429]: E0909 23:44:01.895568 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:01.895813 kubelet[3429]: E0909 23:44:01.895803 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.895869 kubelet[3429]: W0909 23:44:01.895859 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.895945 kubelet[3429]: E0909 23:44:01.895922 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:01.896199 kubelet[3429]: E0909 23:44:01.896188 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.896257 kubelet[3429]: W0909 23:44:01.896248 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.896335 kubelet[3429]: E0909 23:44:01.896318 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:01.896517 kubelet[3429]: E0909 23:44:01.896505 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.896679 kubelet[3429]: W0909 23:44:01.896568 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.896679 kubelet[3429]: E0909 23:44:01.896592 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:01.896795 kubelet[3429]: E0909 23:44:01.896785 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.896843 kubelet[3429]: W0909 23:44:01.896834 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.896895 kubelet[3429]: E0909 23:44:01.896886 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:01.897113 kubelet[3429]: E0909 23:44:01.897098 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.897113 kubelet[3429]: W0909 23:44:01.897109 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.897179 kubelet[3429]: E0909 23:44:01.897120 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:01.897347 kubelet[3429]: E0909 23:44:01.897244 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.897347 kubelet[3429]: W0909 23:44:01.897255 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.897347 kubelet[3429]: E0909 23:44:01.897262 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:01.897566 kubelet[3429]: E0909 23:44:01.897473 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.897566 kubelet[3429]: W0909 23:44:01.897484 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.897566 kubelet[3429]: E0909 23:44:01.897499 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:01.898054 kubelet[3429]: E0909 23:44:01.898036 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.898141 kubelet[3429]: W0909 23:44:01.898129 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.898211 kubelet[3429]: E0909 23:44:01.898201 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:01.898448 kubelet[3429]: E0909 23:44:01.898429 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.898448 kubelet[3429]: W0909 23:44:01.898442 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.898560 kubelet[3429]: E0909 23:44:01.898533 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:01.898862 kubelet[3429]: E0909 23:44:01.898825 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.898862 kubelet[3429]: W0909 23:44:01.898860 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.899178 kubelet[3429]: E0909 23:44:01.898886 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:01.899298 kubelet[3429]: E0909 23:44:01.899281 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.899298 kubelet[3429]: W0909 23:44:01.899294 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.899413 kubelet[3429]: E0909 23:44:01.899398 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:01.899501 kubelet[3429]: E0909 23:44:01.899489 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.899501 kubelet[3429]: W0909 23:44:01.899499 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.899684 kubelet[3429]: E0909 23:44:01.899550 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:01.899814 kubelet[3429]: E0909 23:44:01.899804 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.899814 kubelet[3429]: W0909 23:44:01.899811 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.899866 kubelet[3429]: E0909 23:44:01.899824 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:01.899935 kubelet[3429]: E0909 23:44:01.899924 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.899935 kubelet[3429]: W0909 23:44:01.899934 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.899977 kubelet[3429]: E0909 23:44:01.899943 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:01.900120 kubelet[3429]: E0909 23:44:01.900105 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.900120 kubelet[3429]: W0909 23:44:01.900115 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.900348 kubelet[3429]: E0909 23:44:01.900181 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:01.900348 kubelet[3429]: E0909 23:44:01.900202 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.900348 kubelet[3429]: W0909 23:44:01.900207 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.900510 kubelet[3429]: E0909 23:44:01.900496 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.900510 kubelet[3429]: W0909 23:44:01.900505 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.900559 kubelet[3429]: E0909 23:44:01.900512 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:01.900672 kubelet[3429]: E0909 23:44:01.900652 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:01.901064 kubelet[3429]: E0909 23:44:01.901048 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.901064 kubelet[3429]: W0909 23:44:01.901062 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.901138 kubelet[3429]: E0909 23:44:01.901082 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:01.901228 kubelet[3429]: E0909 23:44:01.901216 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.901228 kubelet[3429]: W0909 23:44:01.901224 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.901288 kubelet[3429]: E0909 23:44:01.901239 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:01.901615 kubelet[3429]: E0909 23:44:01.901594 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.901615 kubelet[3429]: W0909 23:44:01.901610 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.901820 kubelet[3429]: E0909 23:44:01.901689 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:01.902208 kubelet[3429]: E0909 23:44:01.902191 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:01.902208 kubelet[3429]: W0909 23:44:01.902205 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:01.902263 kubelet[3429]: E0909 23:44:01.902216 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:02.356386 kubelet[3429]: E0909 23:44:02.356325 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:02.356584 kubelet[3429]: W0909 23:44:02.356469 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:02.356584 kubelet[3429]: E0909 23:44:02.356493 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:02.490572 kubelet[3429]: E0909 23:44:02.490495 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:02.492043 kubelet[3429]: W0909 23:44:02.490646 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:02.492043 kubelet[3429]: E0909 23:44:02.490997 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:02.494727 kubelet[3429]: E0909 23:44:02.494700 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:02.494727 kubelet[3429]: W0909 23:44:02.494717 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:02.494727 kubelet[3429]: E0909 23:44:02.494727 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:02.498203 kubelet[3429]: E0909 23:44:02.498191 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:02.498306 kubelet[3429]: W0909 23:44:02.498293 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:02.498384 kubelet[3429]: E0909 23:44:02.498371 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:02.582647 kubelet[3429]: E0909 23:44:02.582621 3429 configmap.go:193] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Sep 9 23:44:02.582851 kubelet[3429]: E0909 23:44:02.582840 3429 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b12a3d9e-d7ae-4dd3-85df-1398512b3be9-tigera-ca-bundle podName:b12a3d9e-d7ae-4dd3-85df-1398512b3be9 nodeName:}" failed. No retries permitted until 2025-09-09 23:44:03.082820111 +0000 UTC m=+18.660173260 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/b12a3d9e-d7ae-4dd3-85df-1398512b3be9-tigera-ca-bundle") pod "calico-typha-575f9b8656-crpsw" (UID: "b12a3d9e-d7ae-4dd3-85df-1398512b3be9") : failed to sync configmap cache: timed out waiting for the condition
Sep 9 23:44:02.601421 kubelet[3429]: E0909 23:44:02.601401 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:02.601578 kubelet[3429]: W0909 23:44:02.601564 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:02.601635 kubelet[3429]: E0909 23:44:02.601623 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:02.682145 kubelet[3429]: E0909 23:44:02.682040 3429 configmap.go:193] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Sep 9 23:44:02.682504 kubelet[3429]: E0909 23:44:02.682490 3429 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/15d8495b-d728-464c-b16f-d3f0411ec4b6-tigera-ca-bundle podName:15d8495b-d728-464c-b16f-d3f0411ec4b6 nodeName:}" failed. No retries permitted until 2025-09-09 23:44:03.182463961 +0000 UTC m=+18.759817110 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/15d8495b-d728-464c-b16f-d3f0411ec4b6-tigera-ca-bundle") pod "calico-node-nfnj2" (UID: "15d8495b-d728-464c-b16f-d3f0411ec4b6") : failed to sync configmap cache: timed out waiting for the condition
Sep 9 23:44:02.702309 kubelet[3429]: E0909 23:44:02.702285 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:02.702309 kubelet[3429]: W0909 23:44:02.702303 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:02.702578 kubelet[3429]: E0909 23:44:02.702319 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:02.702949 kubelet[3429]: E0909 23:44:02.702785 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:02.702949 kubelet[3429]: W0909 23:44:02.702798 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:02.702949 kubelet[3429]: E0909 23:44:02.702818 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:02.804162 kubelet[3429]: E0909 23:44:02.804134 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:02.804162 kubelet[3429]: W0909 23:44:02.804153 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:02.804162 kubelet[3429]: E0909 23:44:02.804171 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:02.804515 kubelet[3429]: E0909 23:44:02.804471 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:02.804515 kubelet[3429]: W0909 23:44:02.804483 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:02.804515 kubelet[3429]: E0909 23:44:02.804493 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:02.905676 kubelet[3429]: E0909 23:44:02.905580 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:02.905676 kubelet[3429]: W0909 23:44:02.905604 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:02.905676 kubelet[3429]: E0909 23:44:02.905623 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:02.906057 kubelet[3429]: E0909 23:44:02.905983 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:02.906057 kubelet[3429]: W0909 23:44:02.905994 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:02.906154 kubelet[3429]: E0909 23:44:02.906140 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:03.007182 kubelet[3429]: E0909 23:44:03.007086 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:03.007182 kubelet[3429]: W0909 23:44:03.007110 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:03.007182 kubelet[3429]: E0909 23:44:03.007128 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:03.007540 kubelet[3429]: E0909 23:44:03.007491 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:03.007540 kubelet[3429]: W0909 23:44:03.007502 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:03.007540 kubelet[3429]: E0909 23:44:03.007512 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:44:03.108061 kubelet[3429]: E0909 23:44:03.107911 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:03.108061 kubelet[3429]: W0909 23:44:03.107932 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:03.108061 kubelet[3429]: E0909 23:44:03.107949 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:44:03.108436 kubelet[3429]: E0909 23:44:03.108319 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:44:03.108436 kubelet[3429]: W0909 23:44:03.108329 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:44:03.108436 kubelet[3429]: E0909 23:44:03.108344 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:03.108573 kubelet[3429]: E0909 23:44:03.108561 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:03.108713 kubelet[3429]: W0909 23:44:03.108620 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:03.108713 kubelet[3429]: E0909 23:44:03.108634 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:03.108821 kubelet[3429]: E0909 23:44:03.108811 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:03.108864 kubelet[3429]: W0909 23:44:03.108855 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:03.108912 kubelet[3429]: E0909 23:44:03.108900 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:03.109082 kubelet[3429]: E0909 23:44:03.109071 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:03.109269 kubelet[3429]: W0909 23:44:03.109145 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:03.109269 kubelet[3429]: E0909 23:44:03.109161 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:03.109384 kubelet[3429]: E0909 23:44:03.109374 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:03.109428 kubelet[3429]: W0909 23:44:03.109419 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:03.109469 kubelet[3429]: E0909 23:44:03.109458 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:03.110146 kubelet[3429]: E0909 23:44:03.110118 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:03.110265 kubelet[3429]: W0909 23:44:03.110221 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:03.110265 kubelet[3429]: E0909 23:44:03.110238 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:03.209844 kubelet[3429]: E0909 23:44:03.209719 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:03.209844 kubelet[3429]: W0909 23:44:03.209739 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:03.209844 kubelet[3429]: E0909 23:44:03.209753 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:03.210170 kubelet[3429]: E0909 23:44:03.210156 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:03.210297 kubelet[3429]: W0909 23:44:03.210238 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:03.210297 kubelet[3429]: E0909 23:44:03.210256 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:03.210482 kubelet[3429]: E0909 23:44:03.210471 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:03.210548 kubelet[3429]: W0909 23:44:03.210537 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:03.210592 kubelet[3429]: E0909 23:44:03.210580 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:03.210835 kubelet[3429]: E0909 23:44:03.210767 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:03.210835 kubelet[3429]: W0909 23:44:03.210778 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:03.210835 kubelet[3429]: E0909 23:44:03.210786 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:03.211581 kubelet[3429]: E0909 23:44:03.211068 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:03.211581 kubelet[3429]: W0909 23:44:03.211078 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:03.211581 kubelet[3429]: E0909 23:44:03.211088 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:03.211827 kubelet[3429]: E0909 23:44:03.211814 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:03.211903 kubelet[3429]: W0909 23:44:03.211891 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:03.211957 kubelet[3429]: E0909 23:44:03.211945 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:03.254905 containerd[1873]: time="2025-09-09T23:44:03.254871877Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-575f9b8656-crpsw,Uid:b12a3d9e-d7ae-4dd3-85df-1398512b3be9,Namespace:calico-system,Attempt:0,}" Sep 9 23:44:03.304637 containerd[1873]: time="2025-09-09T23:44:03.304516844Z" level=info msg="connecting to shim 15539d7aa69c561f873bdbedf345f31022db48b95fb7eff12927b80b26e34897" address="unix:///run/containerd/s/05917e2bcf47a1e319532f1d1a78d4fd3061c475c81233dc23c4073ea91b48d6" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:44:03.328152 systemd[1]: Started cri-containerd-15539d7aa69c561f873bdbedf345f31022db48b95fb7eff12927b80b26e34897.scope - libcontainer container 15539d7aa69c561f873bdbedf345f31022db48b95fb7eff12927b80b26e34897. 
Sep 9 23:44:03.355713 containerd[1873]: time="2025-09-09T23:44:03.355681796Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-575f9b8656-crpsw,Uid:b12a3d9e-d7ae-4dd3-85df-1398512b3be9,Namespace:calico-system,Attempt:0,} returns sandbox id \"15539d7aa69c561f873bdbedf345f31022db48b95fb7eff12927b80b26e34897\"" Sep 9 23:44:03.357817 containerd[1873]: time="2025-09-09T23:44:03.357791880Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 9 23:44:03.378042 containerd[1873]: time="2025-09-09T23:44:03.377993867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-nfnj2,Uid:15d8495b-d728-464c-b16f-d3f0411ec4b6,Namespace:calico-system,Attempt:0,}" Sep 9 23:44:03.432456 containerd[1873]: time="2025-09-09T23:44:03.432367986Z" level=info msg="connecting to shim 0eef625273c03b17495753ecdd5ad0cd3471e6692286bccffb1349816c1ff7f4" address="unix:///run/containerd/s/71fcb82fd916cf429527ebe302f36571af58dd17fd13b21692a31d33b88dbd6b" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:44:03.453138 systemd[1]: Started cri-containerd-0eef625273c03b17495753ecdd5ad0cd3471e6692286bccffb1349816c1ff7f4.scope - libcontainer container 0eef625273c03b17495753ecdd5ad0cd3471e6692286bccffb1349816c1ff7f4. 
Sep 9 23:44:03.477734 containerd[1873]: time="2025-09-09T23:44:03.477621372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-nfnj2,Uid:15d8495b-d728-464c-b16f-d3f0411ec4b6,Namespace:calico-system,Attempt:0,} returns sandbox id \"0eef625273c03b17495753ecdd5ad0cd3471e6692286bccffb1349816c1ff7f4\"" Sep 9 23:44:03.491247 kubelet[3429]: E0909 23:44:03.491203 3429 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5n8qp" podUID="c172c6c9-8c08-4d46-a400-b2fd0f4bf93b" Sep 9 23:44:04.668574 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1683664084.mount: Deactivated successfully. Sep 9 23:44:05.042382 containerd[1873]: time="2025-09-09T23:44:05.042337282Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:05.046571 containerd[1873]: time="2025-09-09T23:44:05.046541905Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 9 23:44:05.049984 containerd[1873]: time="2025-09-09T23:44:05.049936351Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:05.054124 containerd[1873]: time="2025-09-09T23:44:05.054072924Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:05.054806 containerd[1873]: time="2025-09-09T23:44:05.054559675Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id 
\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 1.696738514s" Sep 9 23:44:05.054806 containerd[1873]: time="2025-09-09T23:44:05.054588036Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 9 23:44:05.057801 containerd[1873]: time="2025-09-09T23:44:05.057753506Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 9 23:44:05.064466 containerd[1873]: time="2025-09-09T23:44:05.064188490Z" level=info msg="CreateContainer within sandbox \"15539d7aa69c561f873bdbedf345f31022db48b95fb7eff12927b80b26e34897\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 9 23:44:05.090085 containerd[1873]: time="2025-09-09T23:44:05.090057459Z" level=info msg="Container 57d57bd52332430f036860c225621e6e40729d010a4533a27ec5e77f10e33171: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:44:05.120531 containerd[1873]: time="2025-09-09T23:44:05.120431365Z" level=info msg="CreateContainer within sandbox \"15539d7aa69c561f873bdbedf345f31022db48b95fb7eff12927b80b26e34897\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"57d57bd52332430f036860c225621e6e40729d010a4533a27ec5e77f10e33171\"" Sep 9 23:44:05.120925 containerd[1873]: time="2025-09-09T23:44:05.120866635Z" level=info msg="StartContainer for \"57d57bd52332430f036860c225621e6e40729d010a4533a27ec5e77f10e33171\"" Sep 9 23:44:05.121778 containerd[1873]: time="2025-09-09T23:44:05.121734655Z" level=info msg="connecting to shim 57d57bd52332430f036860c225621e6e40729d010a4533a27ec5e77f10e33171" address="unix:///run/containerd/s/05917e2bcf47a1e319532f1d1a78d4fd3061c475c81233dc23c4073ea91b48d6" protocol=ttrpc version=3 Sep 9 
23:44:05.150170 systemd[1]: Started cri-containerd-57d57bd52332430f036860c225621e6e40729d010a4533a27ec5e77f10e33171.scope - libcontainer container 57d57bd52332430f036860c225621e6e40729d010a4533a27ec5e77f10e33171. Sep 9 23:44:05.207329 containerd[1873]: time="2025-09-09T23:44:05.207285715Z" level=info msg="StartContainer for \"57d57bd52332430f036860c225621e6e40729d010a4533a27ec5e77f10e33171\" returns successfully" Sep 9 23:44:05.491695 kubelet[3429]: E0909 23:44:05.491643 3429 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5n8qp" podUID="c172c6c9-8c08-4d46-a400-b2fd0f4bf93b" Sep 9 23:44:05.605545 kubelet[3429]: E0909 23:44:05.605502 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.605761 kubelet[3429]: W0909 23:44:05.605522 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.605761 kubelet[3429]: E0909 23:44:05.605602 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.606092 kubelet[3429]: E0909 23:44:05.606028 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.606092 kubelet[3429]: W0909 23:44:05.606042 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.606490 kubelet[3429]: E0909 23:44:05.606268 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.606793 kubelet[3429]: E0909 23:44:05.606725 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.607129 kubelet[3429]: W0909 23:44:05.607107 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.607284 kubelet[3429]: E0909 23:44:05.607187 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.608049 kubelet[3429]: E0909 23:44:05.608034 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.608253 kubelet[3429]: W0909 23:44:05.608107 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.608253 kubelet[3429]: E0909 23:44:05.608121 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.608935 kubelet[3429]: E0909 23:44:05.608636 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.608935 kubelet[3429]: W0909 23:44:05.608653 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.608935 kubelet[3429]: E0909 23:44:05.608665 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.609312 kubelet[3429]: E0909 23:44:05.609271 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.609403 kubelet[3429]: W0909 23:44:05.609392 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.609541 kubelet[3429]: E0909 23:44:05.609449 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.609703 kubelet[3429]: E0909 23:44:05.609692 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.609774 kubelet[3429]: W0909 23:44:05.609764 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.609834 kubelet[3429]: E0909 23:44:05.609814 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.610094 kubelet[3429]: E0909 23:44:05.610024 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.610094 kubelet[3429]: W0909 23:44:05.610034 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.610094 kubelet[3429]: E0909 23:44:05.610043 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.610325 kubelet[3429]: E0909 23:44:05.610284 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.610735 kubelet[3429]: W0909 23:44:05.610295 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.610735 kubelet[3429]: E0909 23:44:05.610625 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.610927 kubelet[3429]: E0909 23:44:05.610886 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.610927 kubelet[3429]: W0909 23:44:05.610897 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.611097 kubelet[3429]: E0909 23:44:05.611019 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.611229 kubelet[3429]: E0909 23:44:05.611215 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.611229 kubelet[3429]: W0909 23:44:05.611227 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.611349 kubelet[3429]: E0909 23:44:05.611236 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.611454 kubelet[3429]: E0909 23:44:05.611444 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.611454 kubelet[3429]: W0909 23:44:05.611453 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.611520 kubelet[3429]: E0909 23:44:05.611459 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.611634 kubelet[3429]: E0909 23:44:05.611623 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.611634 kubelet[3429]: W0909 23:44:05.611631 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.611737 kubelet[3429]: E0909 23:44:05.611637 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.611811 kubelet[3429]: E0909 23:44:05.611801 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.611811 kubelet[3429]: W0909 23:44:05.611809 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.611902 kubelet[3429]: E0909 23:44:05.611815 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.611967 kubelet[3429]: E0909 23:44:05.611958 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.611967 kubelet[3429]: W0909 23:44:05.611966 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.612038 kubelet[3429]: E0909 23:44:05.611971 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.625690 kubelet[3429]: E0909 23:44:05.625635 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.625690 kubelet[3429]: W0909 23:44:05.625648 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.625690 kubelet[3429]: E0909 23:44:05.625662 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.626373 kubelet[3429]: E0909 23:44:05.626083 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.626542 kubelet[3429]: W0909 23:44:05.626458 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.626542 kubelet[3429]: E0909 23:44:05.626491 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.626778 kubelet[3429]: E0909 23:44:05.626754 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.626778 kubelet[3429]: W0909 23:44:05.626765 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.627009 kubelet[3429]: E0909 23:44:05.626949 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.627329 kubelet[3429]: E0909 23:44:05.627305 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.627329 kubelet[3429]: W0909 23:44:05.627317 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.627500 kubelet[3429]: E0909 23:44:05.627471 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.627697 kubelet[3429]: E0909 23:44:05.627647 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.627697 kubelet[3429]: W0909 23:44:05.627658 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.627697 kubelet[3429]: E0909 23:44:05.627684 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.627957 kubelet[3429]: E0909 23:44:05.627900 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.627957 kubelet[3429]: W0909 23:44:05.627912 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.627957 kubelet[3429]: E0909 23:44:05.627932 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.628204 kubelet[3429]: E0909 23:44:05.628194 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.628303 kubelet[3429]: W0909 23:44:05.628246 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.628303 kubelet[3429]: E0909 23:44:05.628271 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.628559 kubelet[3429]: E0909 23:44:05.628483 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.628559 kubelet[3429]: W0909 23:44:05.628494 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.628559 kubelet[3429]: E0909 23:44:05.628508 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.628878 kubelet[3429]: E0909 23:44:05.628756 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.628878 kubelet[3429]: W0909 23:44:05.628767 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.628878 kubelet[3429]: E0909 23:44:05.628783 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.628955 kubelet[3429]: E0909 23:44:05.628947 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.628972 kubelet[3429]: W0909 23:44:05.628957 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.628972 kubelet[3429]: E0909 23:44:05.628967 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.629090 kubelet[3429]: E0909 23:44:05.629079 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.629090 kubelet[3429]: W0909 23:44:05.629087 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.629140 kubelet[3429]: E0909 23:44:05.629094 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.629207 kubelet[3429]: E0909 23:44:05.629192 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.629207 kubelet[3429]: W0909 23:44:05.629201 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.629250 kubelet[3429]: E0909 23:44:05.629211 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.629491 kubelet[3429]: E0909 23:44:05.629481 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.629555 kubelet[3429]: W0909 23:44:05.629544 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.629615 kubelet[3429]: E0909 23:44:05.629605 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.629743 kubelet[3429]: E0909 23:44:05.629729 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.629743 kubelet[3429]: W0909 23:44:05.629740 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.629809 kubelet[3429]: E0909 23:44:05.629752 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.629863 kubelet[3429]: E0909 23:44:05.629842 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.629863 kubelet[3429]: W0909 23:44:05.629847 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.629863 kubelet[3429]: E0909 23:44:05.629857 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.629971 kubelet[3429]: E0909 23:44:05.629961 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.629971 kubelet[3429]: W0909 23:44:05.629968 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.630019 kubelet[3429]: E0909 23:44:05.629977 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:05.630294 kubelet[3429]: E0909 23:44:05.630230 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.630294 kubelet[3429]: W0909 23:44:05.630242 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.630294 kubelet[3429]: E0909 23:44:05.630258 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:44:05.630532 kubelet[3429]: E0909 23:44:05.630497 3429 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:44:05.630532 kubelet[3429]: W0909 23:44:05.630508 3429 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:44:05.630532 kubelet[3429]: E0909 23:44:05.630516 3429 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:44:06.328396 containerd[1873]: time="2025-09-09T23:44:06.328335662Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:06.331519 containerd[1873]: time="2025-09-09T23:44:06.331490468Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 9 23:44:06.334653 containerd[1873]: time="2025-09-09T23:44:06.334625321Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:06.344783 containerd[1873]: time="2025-09-09T23:44:06.344751127Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:06.345368 containerd[1873]: time="2025-09-09T23:44:06.345131220Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.287347744s" Sep 9 23:44:06.345368 containerd[1873]: time="2025-09-09T23:44:06.345162733Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 9 23:44:06.348271 containerd[1873]: time="2025-09-09T23:44:06.347401789Z" level=info msg="CreateContainer within sandbox \"0eef625273c03b17495753ecdd5ad0cd3471e6692286bccffb1349816c1ff7f4\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 23:44:06.385826 containerd[1873]: time="2025-09-09T23:44:06.385802377Z" level=info msg="Container e6ac4810c01c966ee9ae53aa570a337db19dc68ea3df4540ea4c14702e33e36c: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:44:06.405737 containerd[1873]: time="2025-09-09T23:44:06.405709291Z" level=info msg="CreateContainer within sandbox \"0eef625273c03b17495753ecdd5ad0cd3471e6692286bccffb1349816c1ff7f4\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e6ac4810c01c966ee9ae53aa570a337db19dc68ea3df4540ea4c14702e33e36c\"" Sep 9 23:44:06.406360 containerd[1873]: time="2025-09-09T23:44:06.406345127Z" level=info msg="StartContainer for \"e6ac4810c01c966ee9ae53aa570a337db19dc68ea3df4540ea4c14702e33e36c\"" Sep 9 23:44:06.407512 containerd[1873]: time="2025-09-09T23:44:06.407485652Z" level=info msg="connecting to shim e6ac4810c01c966ee9ae53aa570a337db19dc68ea3df4540ea4c14702e33e36c" address="unix:///run/containerd/s/71fcb82fd916cf429527ebe302f36571af58dd17fd13b21692a31d33b88dbd6b" protocol=ttrpc version=3 Sep 9 23:44:06.430144 systemd[1]: Started cri-containerd-e6ac4810c01c966ee9ae53aa570a337db19dc68ea3df4540ea4c14702e33e36c.scope - libcontainer container e6ac4810c01c966ee9ae53aa570a337db19dc68ea3df4540ea4c14702e33e36c. Sep 9 23:44:06.471086 systemd[1]: cri-containerd-e6ac4810c01c966ee9ae53aa570a337db19dc68ea3df4540ea4c14702e33e36c.scope: Deactivated successfully. 
Sep 9 23:44:06.471882 containerd[1873]: time="2025-09-09T23:44:06.471741905Z" level=info msg="StartContainer for \"e6ac4810c01c966ee9ae53aa570a337db19dc68ea3df4540ea4c14702e33e36c\" returns successfully" Sep 9 23:44:06.479137 containerd[1873]: time="2025-09-09T23:44:06.479046717Z" level=info msg="received exit event container_id:\"e6ac4810c01c966ee9ae53aa570a337db19dc68ea3df4540ea4c14702e33e36c\" id:\"e6ac4810c01c966ee9ae53aa570a337db19dc68ea3df4540ea4c14702e33e36c\" pid:4125 exited_at:{seconds:1757461446 nanos:478784804}" Sep 9 23:44:06.479240 containerd[1873]: time="2025-09-09T23:44:06.479206178Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e6ac4810c01c966ee9ae53aa570a337db19dc68ea3df4540ea4c14702e33e36c\" id:\"e6ac4810c01c966ee9ae53aa570a337db19dc68ea3df4540ea4c14702e33e36c\" pid:4125 exited_at:{seconds:1757461446 nanos:478784804}" Sep 9 23:44:06.505732 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e6ac4810c01c966ee9ae53aa570a337db19dc68ea3df4540ea4c14702e33e36c-rootfs.mount: Deactivated successfully. 
Sep 9 23:44:06.589252 kubelet[3429]: I0909 23:44:06.588413 3429 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 23:44:06.604939 kubelet[3429]: I0909 23:44:06.604885 3429 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-575f9b8656-crpsw" podStartSLOduration=3.906150999 podStartE2EDuration="5.604869681s" podCreationTimestamp="2025-09-09 23:44:01 +0000 UTC" firstStartedPulling="2025-09-09 23:44:03.356872523 +0000 UTC m=+18.934225672" lastFinishedPulling="2025-09-09 23:44:05.055591205 +0000 UTC m=+20.632944354" observedRunningTime="2025-09-09 23:44:05.598483019 +0000 UTC m=+21.175836168" watchObservedRunningTime="2025-09-09 23:44:06.604869681 +0000 UTC m=+22.182222862" Sep 9 23:44:07.490853 kubelet[3429]: E0909 23:44:07.490784 3429 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5n8qp" podUID="c172c6c9-8c08-4d46-a400-b2fd0f4bf93b" Sep 9 23:44:07.593020 containerd[1873]: time="2025-09-09T23:44:07.592865743Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 23:44:09.492688 kubelet[3429]: E0909 23:44:09.491958 3429 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5n8qp" podUID="c172c6c9-8c08-4d46-a400-b2fd0f4bf93b" Sep 9 23:44:09.821609 containerd[1873]: time="2025-09-09T23:44:09.821402291Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:09.835743 containerd[1873]: time="2025-09-09T23:44:09.835710445Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 9 23:44:09.839645 containerd[1873]: time="2025-09-09T23:44:09.839619005Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:09.844629 containerd[1873]: time="2025-09-09T23:44:09.844600863Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:09.845372 containerd[1873]: time="2025-09-09T23:44:09.845345854Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 2.25244771s" Sep 9 23:44:09.845406 containerd[1873]: time="2025-09-09T23:44:09.845375527Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 9 23:44:09.848142 containerd[1873]: time="2025-09-09T23:44:09.848108788Z" level=info msg="CreateContainer within sandbox \"0eef625273c03b17495753ecdd5ad0cd3471e6692286bccffb1349816c1ff7f4\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 23:44:09.876827 containerd[1873]: time="2025-09-09T23:44:09.876795890Z" level=info msg="Container 9e45afbd02ef9f0c54ec80ec6e9d01d2168c81d9437094cc4ab0c27b043f4951: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:44:09.879242 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2577434526.mount: Deactivated successfully. 
Sep 9 23:44:09.902893 containerd[1873]: time="2025-09-09T23:44:09.902838638Z" level=info msg="CreateContainer within sandbox \"0eef625273c03b17495753ecdd5ad0cd3471e6692286bccffb1349816c1ff7f4\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"9e45afbd02ef9f0c54ec80ec6e9d01d2168c81d9437094cc4ab0c27b043f4951\"" Sep 9 23:44:09.903652 containerd[1873]: time="2025-09-09T23:44:09.903586381Z" level=info msg="StartContainer for \"9e45afbd02ef9f0c54ec80ec6e9d01d2168c81d9437094cc4ab0c27b043f4951\"" Sep 9 23:44:09.904830 containerd[1873]: time="2025-09-09T23:44:09.904807579Z" level=info msg="connecting to shim 9e45afbd02ef9f0c54ec80ec6e9d01d2168c81d9437094cc4ab0c27b043f4951" address="unix:///run/containerd/s/71fcb82fd916cf429527ebe302f36571af58dd17fd13b21692a31d33b88dbd6b" protocol=ttrpc version=3 Sep 9 23:44:09.925117 systemd[1]: Started cri-containerd-9e45afbd02ef9f0c54ec80ec6e9d01d2168c81d9437094cc4ab0c27b043f4951.scope - libcontainer container 9e45afbd02ef9f0c54ec80ec6e9d01d2168c81d9437094cc4ab0c27b043f4951. Sep 9 23:44:09.955256 containerd[1873]: time="2025-09-09T23:44:09.955213072Z" level=info msg="StartContainer for \"9e45afbd02ef9f0c54ec80ec6e9d01d2168c81d9437094cc4ab0c27b043f4951\" returns successfully" Sep 9 23:44:11.102917 systemd[1]: cri-containerd-9e45afbd02ef9f0c54ec80ec6e9d01d2168c81d9437094cc4ab0c27b043f4951.scope: Deactivated successfully. Sep 9 23:44:11.104107 systemd[1]: cri-containerd-9e45afbd02ef9f0c54ec80ec6e9d01d2168c81d9437094cc4ab0c27b043f4951.scope: Consumed 296ms CPU time, 189.9M memory peak, 165.8M written to disk. 
Sep 9 23:44:11.105489 containerd[1873]: time="2025-09-09T23:44:11.105440449Z" level=info msg="received exit event container_id:\"9e45afbd02ef9f0c54ec80ec6e9d01d2168c81d9437094cc4ab0c27b043f4951\" id:\"9e45afbd02ef9f0c54ec80ec6e9d01d2168c81d9437094cc4ab0c27b043f4951\" pid:4182 exited_at:{seconds:1757461451 nanos:105191673}" Sep 9 23:44:11.106180 containerd[1873]: time="2025-09-09T23:44:11.105504307Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9e45afbd02ef9f0c54ec80ec6e9d01d2168c81d9437094cc4ab0c27b043f4951\" id:\"9e45afbd02ef9f0c54ec80ec6e9d01d2168c81d9437094cc4ab0c27b043f4951\" pid:4182 exited_at:{seconds:1757461451 nanos:105191673}" Sep 9 23:44:11.121956 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9e45afbd02ef9f0c54ec80ec6e9d01d2168c81d9437094cc4ab0c27b043f4951-rootfs.mount: Deactivated successfully. Sep 9 23:44:11.154727 kubelet[3429]: I0909 23:44:11.154699 3429 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 9 23:44:11.499269 kubelet[3429]: W0909 23:44:11.209785 3429 reflector.go:569] object-"calico-system"/"whisker-ca-bundle": failed to list *v1.ConfigMap: configmaps "whisker-ca-bundle" is forbidden: User "system:node:ci-4426.0.0-n-c59ad9327c" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object Sep 9 23:44:11.499269 kubelet[3429]: E0909 23:44:11.209815 3429 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"whisker-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"whisker-ca-bundle\" is forbidden: User \"system:node:ci-4426.0.0-n-c59ad9327c\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object" logger="UnhandledError" Sep 9 23:44:11.499269 kubelet[3429]: W0909 23:44:11.210068 3429 reflector.go:569] 
object-"calico-system"/"whisker-backend-key-pair": failed to list *v1.Secret: secrets "whisker-backend-key-pair" is forbidden: User "system:node:ci-4426.0.0-n-c59ad9327c" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object Sep 9 23:44:11.499269 kubelet[3429]: E0909 23:44:11.210191 3429 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"whisker-backend-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"whisker-backend-key-pair\" is forbidden: User \"system:node:ci-4426.0.0-n-c59ad9327c\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object" logger="UnhandledError" Sep 9 23:44:11.199304 systemd[1]: Created slice kubepods-burstable-pod7e2b0195_849c_4cd6_920b_8ad423651e3a.slice - libcontainer container kubepods-burstable-pod7e2b0195_849c_4cd6_920b_8ad423651e3a.slice. 
Sep 9 23:44:11.499457 kubelet[3429]: W0909 23:44:11.210964 3429 reflector.go:569] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-4426.0.0-n-c59ad9327c" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object Sep 9 23:44:11.499457 kubelet[3429]: E0909 23:44:11.210981 3429 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ci-4426.0.0-n-c59ad9327c\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object" logger="UnhandledError" Sep 9 23:44:11.499457 kubelet[3429]: W0909 23:44:11.211029 3429 reflector.go:569] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4426.0.0-n-c59ad9327c" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object Sep 9 23:44:11.499457 kubelet[3429]: E0909 23:44:11.211040 3429 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4426.0.0-n-c59ad9327c\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object" logger="UnhandledError" Sep 9 23:44:11.211249 systemd[1]: Created slice kubepods-besteffort-pod8214ef4d_09c4_4978_9043_c9a3b2f63681.slice - libcontainer container 
kubepods-besteffort-pod8214ef4d_09c4_4978_9043_c9a3b2f63681.slice. Sep 9 23:44:11.499550 kubelet[3429]: I0909 23:44:11.258698 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7694c43b-1384-4c4a-886a-a74f20763acd-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-pl29w\" (UID: \"7694c43b-1384-4c4a-886a-a74f20763acd\") " pod="calico-system/goldmane-54d579b49d-pl29w" Sep 9 23:44:11.499550 kubelet[3429]: I0909 23:44:11.258744 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh8b6\" (UniqueName: \"kubernetes.io/projected/953db4df-f8b1-44fd-8681-7064a8d9e059-kube-api-access-mh8b6\") pod \"coredns-668d6bf9bc-56bns\" (UID: \"953db4df-f8b1-44fd-8681-7064a8d9e059\") " pod="kube-system/coredns-668d6bf9bc-56bns" Sep 9 23:44:11.499550 kubelet[3429]: I0909 23:44:11.258761 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/82838c02-ad3d-4792-979e-e1bdb0c4a89c-calico-apiserver-certs\") pod \"calico-apiserver-f95989557-t9r46\" (UID: \"82838c02-ad3d-4792-979e-e1bdb0c4a89c\") " pod="calico-apiserver/calico-apiserver-f95989557-t9r46" Sep 9 23:44:11.499550 kubelet[3429]: I0909 23:44:11.258775 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8214ef4d-09c4-4978-9043-c9a3b2f63681-calico-apiserver-certs\") pod \"calico-apiserver-f95989557-fwwz2\" (UID: \"8214ef4d-09c4-4978-9043-c9a3b2f63681\") " pod="calico-apiserver/calico-apiserver-f95989557-fwwz2" Sep 9 23:44:11.499550 kubelet[3429]: I0909 23:44:11.258794 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qtz2\" (UniqueName: 
\"kubernetes.io/projected/7694c43b-1384-4c4a-886a-a74f20763acd-kube-api-access-2qtz2\") pod \"goldmane-54d579b49d-pl29w\" (UID: \"7694c43b-1384-4c4a-886a-a74f20763acd\") " pod="calico-system/goldmane-54d579b49d-pl29w" Sep 9 23:44:11.218391 systemd[1]: Created slice kubepods-besteffort-pod17f88a19_b388_4ae6_bd2d_76c9f0f7cf7f.slice - libcontainer container kubepods-besteffort-pod17f88a19_b388_4ae6_bd2d_76c9f0f7cf7f.slice. Sep 9 23:44:11.499651 kubelet[3429]: I0909 23:44:11.258806 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/953db4df-f8b1-44fd-8681-7064a8d9e059-config-volume\") pod \"coredns-668d6bf9bc-56bns\" (UID: \"953db4df-f8b1-44fd-8681-7064a8d9e059\") " pod="kube-system/coredns-668d6bf9bc-56bns" Sep 9 23:44:11.499651 kubelet[3429]: I0909 23:44:11.258819 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17f88a19-b388-4ae6-bd2d-76c9f0f7cf7f-tigera-ca-bundle\") pod \"calico-kube-controllers-66795654f5-xmhj4\" (UID: \"17f88a19-b388-4ae6-bd2d-76c9f0f7cf7f\") " pod="calico-system/calico-kube-controllers-66795654f5-xmhj4" Sep 9 23:44:11.499651 kubelet[3429]: I0909 23:44:11.258833 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e-whisker-ca-bundle\") pod \"whisker-867fd5fd46-tdf5f\" (UID: \"ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e\") " pod="calico-system/whisker-867fd5fd46-tdf5f" Sep 9 23:44:11.499651 kubelet[3429]: I0909 23:44:11.258846 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e2b0195-849c-4cd6-920b-8ad423651e3a-config-volume\") pod \"coredns-668d6bf9bc-nwh2c\" (UID: 
\"7e2b0195-849c-4cd6-920b-8ad423651e3a\") " pod="kube-system/coredns-668d6bf9bc-nwh2c" Sep 9 23:44:11.499651 kubelet[3429]: I0909 23:44:11.258861 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcwd4\" (UniqueName: \"kubernetes.io/projected/ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e-kube-api-access-vcwd4\") pod \"whisker-867fd5fd46-tdf5f\" (UID: \"ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e\") " pod="calico-system/whisker-867fd5fd46-tdf5f" Sep 9 23:44:11.226638 systemd[1]: Created slice kubepods-besteffort-podca74bb3e_ee36_4da1_a63e_d2ce1fe1f57e.slice - libcontainer container kubepods-besteffort-podca74bb3e_ee36_4da1_a63e_d2ce1fe1f57e.slice. Sep 9 23:44:11.499758 kubelet[3429]: I0909 23:44:11.258889 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4lfj\" (UniqueName: \"kubernetes.io/projected/7e2b0195-849c-4cd6-920b-8ad423651e3a-kube-api-access-p4lfj\") pod \"coredns-668d6bf9bc-nwh2c\" (UID: \"7e2b0195-849c-4cd6-920b-8ad423651e3a\") " pod="kube-system/coredns-668d6bf9bc-nwh2c" Sep 9 23:44:11.499758 kubelet[3429]: I0909 23:44:11.258903 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/7694c43b-1384-4c4a-886a-a74f20763acd-goldmane-key-pair\") pod \"goldmane-54d579b49d-pl29w\" (UID: \"7694c43b-1384-4c4a-886a-a74f20763acd\") " pod="calico-system/goldmane-54d579b49d-pl29w" Sep 9 23:44:11.499758 kubelet[3429]: I0909 23:44:11.258962 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn78k\" (UniqueName: \"kubernetes.io/projected/82838c02-ad3d-4792-979e-e1bdb0c4a89c-kube-api-access-xn78k\") pod \"calico-apiserver-f95989557-t9r46\" (UID: \"82838c02-ad3d-4792-979e-e1bdb0c4a89c\") " pod="calico-apiserver/calico-apiserver-f95989557-t9r46" Sep 9 23:44:11.499758 kubelet[3429]: I0909 
23:44:11.258984 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4wkz\" (UniqueName: \"kubernetes.io/projected/8214ef4d-09c4-4978-9043-c9a3b2f63681-kube-api-access-p4wkz\") pod \"calico-apiserver-f95989557-fwwz2\" (UID: \"8214ef4d-09c4-4978-9043-c9a3b2f63681\") " pod="calico-apiserver/calico-apiserver-f95989557-fwwz2" Sep 9 23:44:11.499758 kubelet[3429]: I0909 23:44:11.259630 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7s7c\" (UniqueName: \"kubernetes.io/projected/17f88a19-b388-4ae6-bd2d-76c9f0f7cf7f-kube-api-access-g7s7c\") pod \"calico-kube-controllers-66795654f5-xmhj4\" (UID: \"17f88a19-b388-4ae6-bd2d-76c9f0f7cf7f\") " pod="calico-system/calico-kube-controllers-66795654f5-xmhj4" Sep 9 23:44:11.230875 systemd[1]: Created slice kubepods-burstable-pod953db4df_f8b1_44fd_8681_7064a8d9e059.slice - libcontainer container kubepods-burstable-pod953db4df_f8b1_44fd_8681_7064a8d9e059.slice. 
Sep 9 23:44:11.499859 kubelet[3429]: I0909 23:44:11.261067 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7694c43b-1384-4c4a-886a-a74f20763acd-config\") pod \"goldmane-54d579b49d-pl29w\" (UID: \"7694c43b-1384-4c4a-886a-a74f20763acd\") " pod="calico-system/goldmane-54d579b49d-pl29w" Sep 9 23:44:11.499859 kubelet[3429]: I0909 23:44:11.261101 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e-whisker-backend-key-pair\") pod \"whisker-867fd5fd46-tdf5f\" (UID: \"ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e\") " pod="calico-system/whisker-867fd5fd46-tdf5f" Sep 9 23:44:11.237937 systemd[1]: Created slice kubepods-besteffort-pod7694c43b_1384_4c4a_886a_a74f20763acd.slice - libcontainer container kubepods-besteffort-pod7694c43b_1384_4c4a_886a_a74f20763acd.slice. Sep 9 23:44:11.243964 systemd[1]: Created slice kubepods-besteffort-pod82838c02_ad3d_4792_979e_e1bdb0c4a89c.slice - libcontainer container kubepods-besteffort-pod82838c02_ad3d_4792_979e_e1bdb0c4a89c.slice. Sep 9 23:44:11.496456 systemd[1]: Created slice kubepods-besteffort-podc172c6c9_8c08_4d46_a400_b2fd0f4bf93b.slice - libcontainer container kubepods-besteffort-podc172c6c9_8c08_4d46_a400_b2fd0f4bf93b.slice. 
Sep 9 23:44:11.505421 containerd[1873]: time="2025-09-09T23:44:11.505389171Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5n8qp,Uid:c172c6c9-8c08-4d46-a400-b2fd0f4bf93b,Namespace:calico-system,Attempt:0,}" Sep 9 23:44:11.799037 containerd[1873]: time="2025-09-09T23:44:11.798819635Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nwh2c,Uid:7e2b0195-849c-4cd6-920b-8ad423651e3a,Namespace:kube-system,Attempt:0,}" Sep 9 23:44:11.804642 containerd[1873]: time="2025-09-09T23:44:11.804609469Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-pl29w,Uid:7694c43b-1384-4c4a-886a-a74f20763acd,Namespace:calico-system,Attempt:0,}" Sep 9 23:44:11.806115 containerd[1873]: time="2025-09-09T23:44:11.806088259Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66795654f5-xmhj4,Uid:17f88a19-b388-4ae6-bd2d-76c9f0f7cf7f,Namespace:calico-system,Attempt:0,}" Sep 9 23:44:11.817117 containerd[1873]: time="2025-09-09T23:44:11.817093543Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-56bns,Uid:953db4df-f8b1-44fd-8681-7064a8d9e059,Namespace:kube-system,Attempt:0,}" Sep 9 23:44:12.028302 containerd[1873]: time="2025-09-09T23:44:12.028223024Z" level=error msg="Failed to destroy network for sandbox \"e4d5908e1d3ff1090de3f3ff4a4c639e9995df2a66fb425f2cd062c28aa5cbe7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:12.045356 containerd[1873]: time="2025-09-09T23:44:12.045309800Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5n8qp,Uid:c172c6c9-8c08-4d46-a400-b2fd0f4bf93b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4d5908e1d3ff1090de3f3ff4a4c639e9995df2a66fb425f2cd062c28aa5cbe7\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:12.047307 kubelet[3429]: E0909 23:44:12.045801 3429 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4d5908e1d3ff1090de3f3ff4a4c639e9995df2a66fb425f2cd062c28aa5cbe7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:12.047307 kubelet[3429]: E0909 23:44:12.046497 3429 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4d5908e1d3ff1090de3f3ff4a4c639e9995df2a66fb425f2cd062c28aa5cbe7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5n8qp" Sep 9 23:44:12.047307 kubelet[3429]: E0909 23:44:12.046518 3429 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4d5908e1d3ff1090de3f3ff4a4c639e9995df2a66fb425f2cd062c28aa5cbe7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5n8qp" Sep 9 23:44:12.047467 kubelet[3429]: E0909 23:44:12.046556 3429 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-5n8qp_calico-system(c172c6c9-8c08-4d46-a400-b2fd0f4bf93b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-5n8qp_calico-system(c172c6c9-8c08-4d46-a400-b2fd0f4bf93b)\\\": rpc error: code = 
Unknown desc = failed to setup network for sandbox \\\"e4d5908e1d3ff1090de3f3ff4a4c639e9995df2a66fb425f2cd062c28aa5cbe7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-5n8qp" podUID="c172c6c9-8c08-4d46-a400-b2fd0f4bf93b" Sep 9 23:44:12.071038 containerd[1873]: time="2025-09-09T23:44:12.070818020Z" level=error msg="Failed to destroy network for sandbox \"a9e3c8048dea1ad6609da0e2fa3c09296828678b86cbef1eee6018fb8c0f69a8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:12.075259 containerd[1873]: time="2025-09-09T23:44:12.075216124Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nwh2c,Uid:7e2b0195-849c-4cd6-920b-8ad423651e3a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9e3c8048dea1ad6609da0e2fa3c09296828678b86cbef1eee6018fb8c0f69a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:12.075719 kubelet[3429]: E0909 23:44:12.075370 3429 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9e3c8048dea1ad6609da0e2fa3c09296828678b86cbef1eee6018fb8c0f69a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:12.075719 kubelet[3429]: E0909 23:44:12.075408 3429 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"a9e3c8048dea1ad6609da0e2fa3c09296828678b86cbef1eee6018fb8c0f69a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-nwh2c" Sep 9 23:44:12.075719 kubelet[3429]: E0909 23:44:12.075424 3429 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9e3c8048dea1ad6609da0e2fa3c09296828678b86cbef1eee6018fb8c0f69a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-nwh2c" Sep 9 23:44:12.075796 kubelet[3429]: E0909 23:44:12.075450 3429 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-nwh2c_kube-system(7e2b0195-849c-4cd6-920b-8ad423651e3a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-nwh2c_kube-system(7e2b0195-849c-4cd6-920b-8ad423651e3a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a9e3c8048dea1ad6609da0e2fa3c09296828678b86cbef1eee6018fb8c0f69a8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-nwh2c" podUID="7e2b0195-849c-4cd6-920b-8ad423651e3a" Sep 9 23:44:12.089951 containerd[1873]: time="2025-09-09T23:44:12.089923170Z" level=error msg="Failed to destroy network for sandbox \"d6872db27a06d5a26e375068b8c759770d43ee7def532aa46fe71040b68d1b5b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:12.090691 
containerd[1873]: time="2025-09-09T23:44:12.090656129Z" level=error msg="Failed to destroy network for sandbox \"2e5ff16a5df08c722cc8be9e08383601b007c3391914be87ca8ecf0a71795e34\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:12.095060 containerd[1873]: time="2025-09-09T23:44:12.095025952Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66795654f5-xmhj4,Uid:17f88a19-b388-4ae6-bd2d-76c9f0f7cf7f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d6872db27a06d5a26e375068b8c759770d43ee7def532aa46fe71040b68d1b5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:12.095254 kubelet[3429]: E0909 23:44:12.095232 3429 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d6872db27a06d5a26e375068b8c759770d43ee7def532aa46fe71040b68d1b5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:12.095359 kubelet[3429]: E0909 23:44:12.095337 3429 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d6872db27a06d5a26e375068b8c759770d43ee7def532aa46fe71040b68d1b5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66795654f5-xmhj4" Sep 9 23:44:12.095441 kubelet[3429]: E0909 23:44:12.095410 3429 
kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d6872db27a06d5a26e375068b8c759770d43ee7def532aa46fe71040b68d1b5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-66795654f5-xmhj4" Sep 9 23:44:12.095524 kubelet[3429]: E0909 23:44:12.095508 3429 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-66795654f5-xmhj4_calico-system(17f88a19-b388-4ae6-bd2d-76c9f0f7cf7f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-66795654f5-xmhj4_calico-system(17f88a19-b388-4ae6-bd2d-76c9f0f7cf7f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d6872db27a06d5a26e375068b8c759770d43ee7def532aa46fe71040b68d1b5b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-66795654f5-xmhj4" podUID="17f88a19-b388-4ae6-bd2d-76c9f0f7cf7f" Sep 9 23:44:12.099582 containerd[1873]: time="2025-09-09T23:44:12.099553612Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-pl29w,Uid:7694c43b-1384-4c4a-886a-a74f20763acd,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e5ff16a5df08c722cc8be9e08383601b007c3391914be87ca8ecf0a71795e34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:12.099954 kubelet[3429]: E0909 23:44:12.099739 3429 log.go:32] "RunPodSandbox from runtime service failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e5ff16a5df08c722cc8be9e08383601b007c3391914be87ca8ecf0a71795e34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:12.099954 kubelet[3429]: E0909 23:44:12.099875 3429 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e5ff16a5df08c722cc8be9e08383601b007c3391914be87ca8ecf0a71795e34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-pl29w" Sep 9 23:44:12.099954 kubelet[3429]: E0909 23:44:12.099889 3429 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e5ff16a5df08c722cc8be9e08383601b007c3391914be87ca8ecf0a71795e34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-pl29w" Sep 9 23:44:12.100152 kubelet[3429]: E0909 23:44:12.099922 3429 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-pl29w_calico-system(7694c43b-1384-4c4a-886a-a74f20763acd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-pl29w_calico-system(7694c43b-1384-4c4a-886a-a74f20763acd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2e5ff16a5df08c722cc8be9e08383601b007c3391914be87ca8ecf0a71795e34\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-pl29w" podUID="7694c43b-1384-4c4a-886a-a74f20763acd" Sep 9 23:44:12.100800 containerd[1873]: time="2025-09-09T23:44:12.100769569Z" level=error msg="Failed to destroy network for sandbox \"364576da4620e1e9704e1f913066f8d81d33b288031bb93154d8bdb4df9f833e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:12.104423 containerd[1873]: time="2025-09-09T23:44:12.104398481Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-56bns,Uid:953db4df-f8b1-44fd-8681-7064a8d9e059,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"364576da4620e1e9704e1f913066f8d81d33b288031bb93154d8bdb4df9f833e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:12.104742 kubelet[3429]: E0909 23:44:12.104708 3429 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"364576da4620e1e9704e1f913066f8d81d33b288031bb93154d8bdb4df9f833e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:12.104798 kubelet[3429]: E0909 23:44:12.104744 3429 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"364576da4620e1e9704e1f913066f8d81d33b288031bb93154d8bdb4df9f833e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-668d6bf9bc-56bns" Sep 9 23:44:12.104798 kubelet[3429]: E0909 23:44:12.104762 3429 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"364576da4620e1e9704e1f913066f8d81d33b288031bb93154d8bdb4df9f833e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-56bns" Sep 9 23:44:12.104798 kubelet[3429]: E0909 23:44:12.104787 3429 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-56bns_kube-system(953db4df-f8b1-44fd-8681-7064a8d9e059)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-56bns_kube-system(953db4df-f8b1-44fd-8681-7064a8d9e059)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"364576da4620e1e9704e1f913066f8d81d33b288031bb93154d8bdb4df9f833e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-56bns" podUID="953db4df-f8b1-44fd-8681-7064a8d9e059" Sep 9 23:44:12.121093 systemd[1]: run-netns-cni\x2db8fb1bb9\x2d7f68\x2dc782\x2dee47\x2d0d5588f74ae3.mount: Deactivated successfully. Sep 9 23:44:12.121175 systemd[1]: run-netns-cni\x2d5385c4f8\x2d39d8\x2d7e38\x2df8e1\x2d182277d88547.mount: Deactivated successfully. 
Sep 9 23:44:12.362812 kubelet[3429]: E0909 23:44:12.362710 3429 configmap.go:193] Couldn't get configMap calico-system/whisker-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Sep 9 23:44:12.362812 kubelet[3429]: E0909 23:44:12.362775 3429 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e-whisker-ca-bundle podName:ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e nodeName:}" failed. No retries permitted until 2025-09-09 23:44:12.862760894 +0000 UTC m=+28.440114043 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "whisker-ca-bundle" (UniqueName: "kubernetes.io/configmap/ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e-whisker-ca-bundle") pod "whisker-867fd5fd46-tdf5f" (UID: "ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e") : failed to sync configmap cache: timed out waiting for the condition Sep 9 23:44:12.372450 kubelet[3429]: E0909 23:44:12.372230 3429 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Sep 9 23:44:12.372450 kubelet[3429]: E0909 23:44:12.372256 3429 projected.go:194] Error preparing data for projected volume kube-api-access-xn78k for pod calico-apiserver/calico-apiserver-f95989557-t9r46: failed to sync configmap cache: timed out waiting for the condition Sep 9 23:44:12.372450 kubelet[3429]: E0909 23:44:12.372297 3429 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82838c02-ad3d-4792-979e-e1bdb0c4a89c-kube-api-access-xn78k podName:82838c02-ad3d-4792-979e-e1bdb0c4a89c nodeName:}" failed. No retries permitted until 2025-09-09 23:44:12.872287596 +0000 UTC m=+28.449640745 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xn78k" (UniqueName: "kubernetes.io/projected/82838c02-ad3d-4792-979e-e1bdb0c4a89c-kube-api-access-xn78k") pod "calico-apiserver-f95989557-t9r46" (UID: "82838c02-ad3d-4792-979e-e1bdb0c4a89c") : failed to sync configmap cache: timed out waiting for the condition Sep 9 23:44:12.374434 kubelet[3429]: E0909 23:44:12.374371 3429 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Sep 9 23:44:12.374434 kubelet[3429]: E0909 23:44:12.374392 3429 projected.go:194] Error preparing data for projected volume kube-api-access-p4wkz for pod calico-apiserver/calico-apiserver-f95989557-fwwz2: failed to sync configmap cache: timed out waiting for the condition Sep 9 23:44:12.374434 kubelet[3429]: E0909 23:44:12.374419 3429 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8214ef4d-09c4-4978-9043-c9a3b2f63681-kube-api-access-p4wkz podName:8214ef4d-09c4-4978-9043-c9a3b2f63681 nodeName:}" failed. No retries permitted until 2025-09-09 23:44:12.874410598 +0000 UTC m=+28.451763747 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-p4wkz" (UniqueName: "kubernetes.io/projected/8214ef4d-09c4-4978-9043-c9a3b2f63681-kube-api-access-p4wkz") pod "calico-apiserver-f95989557-fwwz2" (UID: "8214ef4d-09c4-4978-9043-c9a3b2f63681") : failed to sync configmap cache: timed out waiting for the condition Sep 9 23:44:12.610189 containerd[1873]: time="2025-09-09T23:44:12.609273037Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 23:44:13.001337 containerd[1873]: time="2025-09-09T23:44:13.001103499Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f95989557-fwwz2,Uid:8214ef4d-09c4-4978-9043-c9a3b2f63681,Namespace:calico-apiserver,Attempt:0,}" Sep 9 23:44:13.006826 containerd[1873]: time="2025-09-09T23:44:13.006793787Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f95989557-t9r46,Uid:82838c02-ad3d-4792-979e-e1bdb0c4a89c,Namespace:calico-apiserver,Attempt:0,}" Sep 9 23:44:13.029308 containerd[1873]: time="2025-09-09T23:44:13.029280514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-867fd5fd46-tdf5f,Uid:ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e,Namespace:calico-system,Attempt:0,}" Sep 9 23:44:13.053991 containerd[1873]: time="2025-09-09T23:44:13.053951164Z" level=error msg="Failed to destroy network for sandbox \"9237b8d6c423e7654c0a7ff377517cc68a8fcc73592d4e220143863cb6e2079a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:13.061442 containerd[1873]: time="2025-09-09T23:44:13.061385305Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f95989557-fwwz2,Uid:8214ef4d-09c4-4978-9043-c9a3b2f63681,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"9237b8d6c423e7654c0a7ff377517cc68a8fcc73592d4e220143863cb6e2079a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:13.062110 kubelet[3429]: E0909 23:44:13.061759 3429 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9237b8d6c423e7654c0a7ff377517cc68a8fcc73592d4e220143863cb6e2079a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:13.062110 kubelet[3429]: E0909 23:44:13.061816 3429 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9237b8d6c423e7654c0a7ff377517cc68a8fcc73592d4e220143863cb6e2079a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f95989557-fwwz2" Sep 9 23:44:13.062110 kubelet[3429]: E0909 23:44:13.061832 3429 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9237b8d6c423e7654c0a7ff377517cc68a8fcc73592d4e220143863cb6e2079a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f95989557-fwwz2" Sep 9 23:44:13.062208 kubelet[3429]: E0909 23:44:13.061864 3429 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f95989557-fwwz2_calico-apiserver(8214ef4d-09c4-4978-9043-c9a3b2f63681)\" with CreatePodSandboxError: \"Failed to create sandbox for 
pod \\\"calico-apiserver-f95989557-fwwz2_calico-apiserver(8214ef4d-09c4-4978-9043-c9a3b2f63681)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9237b8d6c423e7654c0a7ff377517cc68a8fcc73592d4e220143863cb6e2079a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f95989557-fwwz2" podUID="8214ef4d-09c4-4978-9043-c9a3b2f63681" Sep 9 23:44:13.067850 containerd[1873]: time="2025-09-09T23:44:13.067824048Z" level=error msg="Failed to destroy network for sandbox \"a22b158909bd17feb3e452fc9b5fdb1420de2950b05c559a36c1ce3d820dbeb4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:13.073591 containerd[1873]: time="2025-09-09T23:44:13.073556833Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f95989557-t9r46,Uid:82838c02-ad3d-4792-979e-e1bdb0c4a89c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a22b158909bd17feb3e452fc9b5fdb1420de2950b05c559a36c1ce3d820dbeb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:13.073869 kubelet[3429]: E0909 23:44:13.073839 3429 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a22b158909bd17feb3e452fc9b5fdb1420de2950b05c559a36c1ce3d820dbeb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:13.073920 kubelet[3429]: E0909 
23:44:13.073878 3429 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a22b158909bd17feb3e452fc9b5fdb1420de2950b05c559a36c1ce3d820dbeb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f95989557-t9r46" Sep 9 23:44:13.073920 kubelet[3429]: E0909 23:44:13.073911 3429 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a22b158909bd17feb3e452fc9b5fdb1420de2950b05c559a36c1ce3d820dbeb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f95989557-t9r46" Sep 9 23:44:13.074031 kubelet[3429]: E0909 23:44:13.073942 3429 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f95989557-t9r46_calico-apiserver(82838c02-ad3d-4792-979e-e1bdb0c4a89c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f95989557-t9r46_calico-apiserver(82838c02-ad3d-4792-979e-e1bdb0c4a89c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a22b158909bd17feb3e452fc9b5fdb1420de2950b05c559a36c1ce3d820dbeb4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f95989557-t9r46" podUID="82838c02-ad3d-4792-979e-e1bdb0c4a89c" Sep 9 23:44:13.086766 containerd[1873]: time="2025-09-09T23:44:13.086735560Z" level=error msg="Failed to destroy network for sandbox 
\"f4c0005d08e1f5e3b3af851cdbbe55cdd4c605a98ab02ea6112bd6355fdb9591\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:13.090552 containerd[1873]: time="2025-09-09T23:44:13.090523781Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-867fd5fd46-tdf5f,Uid:ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4c0005d08e1f5e3b3af851cdbbe55cdd4c605a98ab02ea6112bd6355fdb9591\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:13.090698 kubelet[3429]: E0909 23:44:13.090670 3429 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4c0005d08e1f5e3b3af851cdbbe55cdd4c605a98ab02ea6112bd6355fdb9591\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:44:13.090750 kubelet[3429]: E0909 23:44:13.090710 3429 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4c0005d08e1f5e3b3af851cdbbe55cdd4c605a98ab02ea6112bd6355fdb9591\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-867fd5fd46-tdf5f" Sep 9 23:44:13.090750 kubelet[3429]: E0909 23:44:13.090723 3429 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"f4c0005d08e1f5e3b3af851cdbbe55cdd4c605a98ab02ea6112bd6355fdb9591\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-867fd5fd46-tdf5f" Sep 9 23:44:13.090786 kubelet[3429]: E0909 23:44:13.090749 3429 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-867fd5fd46-tdf5f_calico-system(ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-867fd5fd46-tdf5f_calico-system(ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f4c0005d08e1f5e3b3af851cdbbe55cdd4c605a98ab02ea6112bd6355fdb9591\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-867fd5fd46-tdf5f" podUID="ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e" Sep 9 23:44:16.223481 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1632864738.mount: Deactivated successfully. 
Sep 9 23:44:16.398244 kubelet[3429]: I0909 23:44:16.398193 3429 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 23:44:16.618397 containerd[1873]: time="2025-09-09T23:44:16.618285679Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:16.623021 containerd[1873]: time="2025-09-09T23:44:16.622856260Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 9 23:44:16.627774 containerd[1873]: time="2025-09-09T23:44:16.627725794Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:16.632593 containerd[1873]: time="2025-09-09T23:44:16.632547415Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:16.633066 containerd[1873]: time="2025-09-09T23:44:16.632824384Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 4.022773531s" Sep 9 23:44:16.633066 containerd[1873]: time="2025-09-09T23:44:16.632845528Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 9 23:44:16.642991 containerd[1873]: time="2025-09-09T23:44:16.642961817Z" level=info msg="CreateContainer within sandbox \"0eef625273c03b17495753ecdd5ad0cd3471e6692286bccffb1349816c1ff7f4\" for container 
&ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 23:44:16.673418 containerd[1873]: time="2025-09-09T23:44:16.673383421Z" level=info msg="Container cfc6823c76e936eaed3d4de7892478f709db8128192617fcd85283a53955e9b6: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:44:16.699701 containerd[1873]: time="2025-09-09T23:44:16.699659353Z" level=info msg="CreateContainer within sandbox \"0eef625273c03b17495753ecdd5ad0cd3471e6692286bccffb1349816c1ff7f4\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"cfc6823c76e936eaed3d4de7892478f709db8128192617fcd85283a53955e9b6\"" Sep 9 23:44:16.700266 containerd[1873]: time="2025-09-09T23:44:16.700238139Z" level=info msg="StartContainer for \"cfc6823c76e936eaed3d4de7892478f709db8128192617fcd85283a53955e9b6\"" Sep 9 23:44:16.701392 containerd[1873]: time="2025-09-09T23:44:16.701368870Z" level=info msg="connecting to shim cfc6823c76e936eaed3d4de7892478f709db8128192617fcd85283a53955e9b6" address="unix:///run/containerd/s/71fcb82fd916cf429527ebe302f36571af58dd17fd13b21692a31d33b88dbd6b" protocol=ttrpc version=3 Sep 9 23:44:16.718119 systemd[1]: Started cri-containerd-cfc6823c76e936eaed3d4de7892478f709db8128192617fcd85283a53955e9b6.scope - libcontainer container cfc6823c76e936eaed3d4de7892478f709db8128192617fcd85283a53955e9b6. Sep 9 23:44:16.749633 containerd[1873]: time="2025-09-09T23:44:16.749596968Z" level=info msg="StartContainer for \"cfc6823c76e936eaed3d4de7892478f709db8128192617fcd85283a53955e9b6\" returns successfully" Sep 9 23:44:17.161032 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 23:44:17.161171 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 9 23:44:17.398884 kubelet[3429]: I0909 23:44:17.398851 3429 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcwd4\" (UniqueName: \"kubernetes.io/projected/ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e-kube-api-access-vcwd4\") pod \"ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e\" (UID: \"ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e\") " Sep 9 23:44:17.398884 kubelet[3429]: I0909 23:44:17.398885 3429 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e-whisker-ca-bundle\") pod \"ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e\" (UID: \"ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e\") " Sep 9 23:44:17.399902 kubelet[3429]: I0909 23:44:17.398902 3429 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e-whisker-backend-key-pair\") pod \"ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e\" (UID: \"ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e\") " Sep 9 23:44:17.402704 systemd[1]: var-lib-kubelet-pods-ca74bb3e\x2dee36\x2d4da1\x2da63e\x2dd2ce1fe1f57e-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dvcwd4.mount: Deactivated successfully. Sep 9 23:44:17.404675 systemd[1]: var-lib-kubelet-pods-ca74bb3e\x2dee36\x2d4da1\x2da63e\x2dd2ce1fe1f57e-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 23:44:17.405946 kubelet[3429]: I0909 23:44:17.405450 3429 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e-kube-api-access-vcwd4" (OuterVolumeSpecName: "kube-api-access-vcwd4") pod "ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e" (UID: "ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e"). InnerVolumeSpecName "kube-api-access-vcwd4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 23:44:17.407474 kubelet[3429]: I0909 23:44:17.407229 3429 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e" (UID: "ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 23:44:17.408217 kubelet[3429]: I0909 23:44:17.408183 3429 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e" (UID: "ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 9 23:44:17.499659 kubelet[3429]: I0909 23:44:17.499602 3429 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vcwd4\" (UniqueName: \"kubernetes.io/projected/ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e-kube-api-access-vcwd4\") on node \"ci-4426.0.0-n-c59ad9327c\" DevicePath \"\"" Sep 9 23:44:17.499659 kubelet[3429]: I0909 23:44:17.499633 3429 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e-whisker-ca-bundle\") on node \"ci-4426.0.0-n-c59ad9327c\" DevicePath \"\"" Sep 9 23:44:17.499659 kubelet[3429]: I0909 23:44:17.499641 3429 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e-whisker-backend-key-pair\") on node \"ci-4426.0.0-n-c59ad9327c\" DevicePath \"\"" Sep 9 23:44:17.624577 systemd[1]: Removed slice kubepods-besteffort-podca74bb3e_ee36_4da1_a63e_d2ce1fe1f57e.slice - libcontainer container 
kubepods-besteffort-podca74bb3e_ee36_4da1_a63e_d2ce1fe1f57e.slice. Sep 9 23:44:17.639317 kubelet[3429]: I0909 23:44:17.639148 3429 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-nfnj2" podStartSLOduration=3.484960333 podStartE2EDuration="16.639077565s" podCreationTimestamp="2025-09-09 23:44:01 +0000 UTC" firstStartedPulling="2025-09-09 23:44:03.479209999 +0000 UTC m=+19.056563148" lastFinishedPulling="2025-09-09 23:44:16.633327231 +0000 UTC m=+32.210680380" observedRunningTime="2025-09-09 23:44:17.638429033 +0000 UTC m=+33.215782182" watchObservedRunningTime="2025-09-09 23:44:17.639077565 +0000 UTC m=+33.216430714" Sep 9 23:44:17.708984 kubelet[3429]: I0909 23:44:17.708941 3429 status_manager.go:890] "Failed to get status for pod" podUID="84e1941c-59ae-4c4f-8c5c-a668cbb738cc" pod="calico-system/whisker-6979679477-bghc2" err="pods \"whisker-6979679477-bghc2\" is forbidden: User \"system:node:ci-4426.0.0-n-c59ad9327c\" cannot get resource \"pods\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4426.0.0-n-c59ad9327c' and this object" Sep 9 23:44:17.713788 systemd[1]: Created slice kubepods-besteffort-pod84e1941c_59ae_4c4f_8c5c_a668cbb738cc.slice - libcontainer container kubepods-besteffort-pod84e1941c_59ae_4c4f_8c5c_a668cbb738cc.slice. 
Sep 9 23:44:17.801686 kubelet[3429]: I0909 23:44:17.801513 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84e1941c-59ae-4c4f-8c5c-a668cbb738cc-whisker-ca-bundle\") pod \"whisker-6979679477-bghc2\" (UID: \"84e1941c-59ae-4c4f-8c5c-a668cbb738cc\") " pod="calico-system/whisker-6979679477-bghc2" Sep 9 23:44:17.801686 kubelet[3429]: I0909 23:44:17.801556 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/84e1941c-59ae-4c4f-8c5c-a668cbb738cc-whisker-backend-key-pair\") pod \"whisker-6979679477-bghc2\" (UID: \"84e1941c-59ae-4c4f-8c5c-a668cbb738cc\") " pod="calico-system/whisker-6979679477-bghc2" Sep 9 23:44:17.801686 kubelet[3429]: I0909 23:44:17.801571 3429 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7czrs\" (UniqueName: \"kubernetes.io/projected/84e1941c-59ae-4c4f-8c5c-a668cbb738cc-kube-api-access-7czrs\") pod \"whisker-6979679477-bghc2\" (UID: \"84e1941c-59ae-4c4f-8c5c-a668cbb738cc\") " pod="calico-system/whisker-6979679477-bghc2" Sep 9 23:44:18.016865 containerd[1873]: time="2025-09-09T23:44:18.016806605Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6979679477-bghc2,Uid:84e1941c-59ae-4c4f-8c5c-a668cbb738cc,Namespace:calico-system,Attempt:0,}" Sep 9 23:44:18.158230 systemd-networkd[1684]: cali6488fc4893b: Link UP Sep 9 23:44:18.158960 systemd-networkd[1684]: cali6488fc4893b: Gained carrier Sep 9 23:44:18.175589 containerd[1873]: 2025-09-09 23:44:18.046 [INFO][4518] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:44:18.175589 containerd[1873]: 2025-09-09 23:44:18.068 [INFO][4518] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4426.0.0--n--c59ad9327c-k8s-whisker--6979679477--bghc2-eth0 whisker-6979679477- calico-system 84e1941c-59ae-4c4f-8c5c-a668cbb738cc 857 0 2025-09-09 23:44:17 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6979679477 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4426.0.0-n-c59ad9327c whisker-6979679477-bghc2 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali6488fc4893b [] [] }} ContainerID="dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" Namespace="calico-system" Pod="whisker-6979679477-bghc2" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-whisker--6979679477--bghc2-" Sep 9 23:44:18.175589 containerd[1873]: 2025-09-09 23:44:18.068 [INFO][4518] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" Namespace="calico-system" Pod="whisker-6979679477-bghc2" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-whisker--6979679477--bghc2-eth0" Sep 9 23:44:18.175589 containerd[1873]: 2025-09-09 23:44:18.085 [INFO][4530] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" HandleID="k8s-pod-network.dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" Workload="ci--4426.0.0--n--c59ad9327c-k8s-whisker--6979679477--bghc2-eth0" Sep 9 23:44:18.175755 containerd[1873]: 2025-09-09 23:44:18.085 [INFO][4530] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" HandleID="k8s-pod-network.dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" Workload="ci--4426.0.0--n--c59ad9327c-k8s-whisker--6979679477--bghc2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b610), Attrs:map[string]string{"namespace":"calico-system", 
"node":"ci-4426.0.0-n-c59ad9327c", "pod":"whisker-6979679477-bghc2", "timestamp":"2025-09-09 23:44:18.085143708 +0000 UTC"}, Hostname:"ci-4426.0.0-n-c59ad9327c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:44:18.175755 containerd[1873]: 2025-09-09 23:44:18.085 [INFO][4530] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:44:18.175755 containerd[1873]: 2025-09-09 23:44:18.085 [INFO][4530] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 23:44:18.175755 containerd[1873]: 2025-09-09 23:44:18.085 [INFO][4530] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-n-c59ad9327c' Sep 9 23:44:18.175755 containerd[1873]: 2025-09-09 23:44:18.090 [INFO][4530] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:18.175755 containerd[1873]: 2025-09-09 23:44:18.094 [INFO][4530] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:18.175755 containerd[1873]: 2025-09-09 23:44:18.097 [INFO][4530] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:18.175755 containerd[1873]: 2025-09-09 23:44:18.098 [INFO][4530] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:18.175755 containerd[1873]: 2025-09-09 23:44:18.101 [INFO][4530] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:18.175882 containerd[1873]: 2025-09-09 23:44:18.101 [INFO][4530] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 
handle="k8s-pod-network.dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:18.175882 containerd[1873]: 2025-09-09 23:44:18.103 [INFO][4530] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37 Sep 9 23:44:18.175882 containerd[1873]: 2025-09-09 23:44:18.107 [INFO][4530] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:18.175882 containerd[1873]: 2025-09-09 23:44:18.117 [INFO][4530] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.70.1/26] block=192.168.70.0/26 handle="k8s-pod-network.dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:18.175882 containerd[1873]: 2025-09-09 23:44:18.117 [INFO][4530] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.1/26] handle="k8s-pod-network.dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:18.175882 containerd[1873]: 2025-09-09 23:44:18.117 [INFO][4530] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 23:44:18.175882 containerd[1873]: 2025-09-09 23:44:18.117 [INFO][4530] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.1/26] IPv6=[] ContainerID="dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" HandleID="k8s-pod-network.dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" Workload="ci--4426.0.0--n--c59ad9327c-k8s-whisker--6979679477--bghc2-eth0" Sep 9 23:44:18.175972 containerd[1873]: 2025-09-09 23:44:18.118 [INFO][4518] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" Namespace="calico-system" Pod="whisker-6979679477-bghc2" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-whisker--6979679477--bghc2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-whisker--6979679477--bghc2-eth0", GenerateName:"whisker-6979679477-", Namespace:"calico-system", SelfLink:"", UID:"84e1941c-59ae-4c4f-8c5c-a668cbb738cc", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 44, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6979679477", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"", Pod:"whisker-6979679477-bghc2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.70.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"cali6488fc4893b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:18.175972 containerd[1873]: 2025-09-09 23:44:18.118 [INFO][4518] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.1/32] ContainerID="dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" Namespace="calico-system" Pod="whisker-6979679477-bghc2" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-whisker--6979679477--bghc2-eth0" Sep 9 23:44:18.176052 containerd[1873]: 2025-09-09 23:44:18.118 [INFO][4518] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6488fc4893b ContainerID="dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" Namespace="calico-system" Pod="whisker-6979679477-bghc2" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-whisker--6979679477--bghc2-eth0" Sep 9 23:44:18.176052 containerd[1873]: 2025-09-09 23:44:18.158 [INFO][4518] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" Namespace="calico-system" Pod="whisker-6979679477-bghc2" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-whisker--6979679477--bghc2-eth0" Sep 9 23:44:18.176087 containerd[1873]: 2025-09-09 23:44:18.159 [INFO][4518] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" Namespace="calico-system" Pod="whisker-6979679477-bghc2" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-whisker--6979679477--bghc2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-whisker--6979679477--bghc2-eth0", GenerateName:"whisker-6979679477-", Namespace:"calico-system", SelfLink:"", UID:"84e1941c-59ae-4c4f-8c5c-a668cbb738cc", 
ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 44, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6979679477", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37", Pod:"whisker-6979679477-bghc2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.70.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali6488fc4893b", MAC:"9a:aa:5d:6e:6f:c2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:18.176145 containerd[1873]: 2025-09-09 23:44:18.172 [INFO][4518] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" Namespace="calico-system" Pod="whisker-6979679477-bghc2" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-whisker--6979679477--bghc2-eth0" Sep 9 23:44:18.221932 containerd[1873]: time="2025-09-09T23:44:18.221600076Z" level=info msg="connecting to shim dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37" address="unix:///run/containerd/s/59c91a621a29d6f4f4832160864045ccfd9e523c4a67d10a863ae353dfaadf1a" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:44:18.246128 systemd[1]: Started cri-containerd-dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37.scope - libcontainer 
container dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37. Sep 9 23:44:18.277371 containerd[1873]: time="2025-09-09T23:44:18.277298198Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6979679477-bghc2,Uid:84e1941c-59ae-4c4f-8c5c-a668cbb738cc,Namespace:calico-system,Attempt:0,} returns sandbox id \"dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37\"" Sep 9 23:44:18.279109 containerd[1873]: time="2025-09-09T23:44:18.279079373Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 23:44:18.495406 kubelet[3429]: I0909 23:44:18.495268 3429 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e" path="/var/lib/kubelet/pods/ca74bb3e-ee36-4da1-a63e-d2ce1fe1f57e/volumes" Sep 9 23:44:19.013172 systemd-networkd[1684]: vxlan.calico: Link UP Sep 9 23:44:19.013177 systemd-networkd[1684]: vxlan.calico: Gained carrier Sep 9 23:44:19.294140 systemd-networkd[1684]: cali6488fc4893b: Gained IPv6LL Sep 9 23:44:19.488294 containerd[1873]: time="2025-09-09T23:44:19.487772169Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:19.490901 containerd[1873]: time="2025-09-09T23:44:19.490875089Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 9 23:44:19.494241 containerd[1873]: time="2025-09-09T23:44:19.494220064Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:19.498995 containerd[1873]: time="2025-09-09T23:44:19.498964363Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:19.499450 
containerd[1873]: time="2025-09-09T23:44:19.499426153Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.2201806s" Sep 9 23:44:19.499500 containerd[1873]: time="2025-09-09T23:44:19.499452930Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 9 23:44:19.503612 containerd[1873]: time="2025-09-09T23:44:19.503589282Z" level=info msg="CreateContainer within sandbox \"dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 23:44:19.530568 containerd[1873]: time="2025-09-09T23:44:19.529553732Z" level=info msg="Container 6b4515f951e0f30108201e9aebf92f778e00b5b4bd9de13c4fa0410826ba6e74: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:44:19.530417 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1644142037.mount: Deactivated successfully. 
Sep 9 23:44:19.554329 containerd[1873]: time="2025-09-09T23:44:19.554231975Z" level=info msg="CreateContainer within sandbox \"dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"6b4515f951e0f30108201e9aebf92f778e00b5b4bd9de13c4fa0410826ba6e74\"" Sep 9 23:44:19.555344 containerd[1873]: time="2025-09-09T23:44:19.555312200Z" level=info msg="StartContainer for \"6b4515f951e0f30108201e9aebf92f778e00b5b4bd9de13c4fa0410826ba6e74\"" Sep 9 23:44:19.557116 containerd[1873]: time="2025-09-09T23:44:19.557092351Z" level=info msg="connecting to shim 6b4515f951e0f30108201e9aebf92f778e00b5b4bd9de13c4fa0410826ba6e74" address="unix:///run/containerd/s/59c91a621a29d6f4f4832160864045ccfd9e523c4a67d10a863ae353dfaadf1a" protocol=ttrpc version=3 Sep 9 23:44:19.576228 systemd[1]: Started cri-containerd-6b4515f951e0f30108201e9aebf92f778e00b5b4bd9de13c4fa0410826ba6e74.scope - libcontainer container 6b4515f951e0f30108201e9aebf92f778e00b5b4bd9de13c4fa0410826ba6e74. Sep 9 23:44:19.611629 containerd[1873]: time="2025-09-09T23:44:19.611595075Z" level=info msg="StartContainer for \"6b4515f951e0f30108201e9aebf92f778e00b5b4bd9de13c4fa0410826ba6e74\" returns successfully" Sep 9 23:44:19.614130 containerd[1873]: time="2025-09-09T23:44:19.614112601Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 23:44:20.318161 systemd-networkd[1684]: vxlan.calico: Gained IPv6LL Sep 9 23:44:21.451492 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1997620137.mount: Deactivated successfully. 
Sep 9 23:44:21.765344 containerd[1873]: time="2025-09-09T23:44:21.764536912Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:21.768252 containerd[1873]: time="2025-09-09T23:44:21.768226978Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 9 23:44:21.772018 containerd[1873]: time="2025-09-09T23:44:21.771966750Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:21.776363 containerd[1873]: time="2025-09-09T23:44:21.776313260Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:21.777385 containerd[1873]: time="2025-09-09T23:44:21.776839260Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 2.16262476s" Sep 9 23:44:21.777385 containerd[1873]: time="2025-09-09T23:44:21.776872949Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 9 23:44:21.782130 containerd[1873]: time="2025-09-09T23:44:21.782108919Z" level=info msg="CreateContainer within sandbox \"dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 23:44:21.807142 
containerd[1873]: time="2025-09-09T23:44:21.807119356Z" level=info msg="Container 2ea16928c96d6f849211a6b3dd4dfa9c87ca1ff4a2d1a6d97f0dc88ce7765d55: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:44:21.833930 containerd[1873]: time="2025-09-09T23:44:21.833894967Z" level=info msg="CreateContainer within sandbox \"dd86959ab9599388cc6fee01b46293f2a62b5df08d571956cfd4dc1430bf9f37\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"2ea16928c96d6f849211a6b3dd4dfa9c87ca1ff4a2d1a6d97f0dc88ce7765d55\"" Sep 9 23:44:21.834681 containerd[1873]: time="2025-09-09T23:44:21.834520618Z" level=info msg="StartContainer for \"2ea16928c96d6f849211a6b3dd4dfa9c87ca1ff4a2d1a6d97f0dc88ce7765d55\"" Sep 9 23:44:21.835445 containerd[1873]: time="2025-09-09T23:44:21.835421206Z" level=info msg="connecting to shim 2ea16928c96d6f849211a6b3dd4dfa9c87ca1ff4a2d1a6d97f0dc88ce7765d55" address="unix:///run/containerd/s/59c91a621a29d6f4f4832160864045ccfd9e523c4a67d10a863ae353dfaadf1a" protocol=ttrpc version=3 Sep 9 23:44:21.853115 systemd[1]: Started cri-containerd-2ea16928c96d6f849211a6b3dd4dfa9c87ca1ff4a2d1a6d97f0dc88ce7765d55.scope - libcontainer container 2ea16928c96d6f849211a6b3dd4dfa9c87ca1ff4a2d1a6d97f0dc88ce7765d55. 
Sep 9 23:44:21.885957 containerd[1873]: time="2025-09-09T23:44:21.885905854Z" level=info msg="StartContainer for \"2ea16928c96d6f849211a6b3dd4dfa9c87ca1ff4a2d1a6d97f0dc88ce7765d55\" returns successfully" Sep 9 23:44:22.492873 containerd[1873]: time="2025-09-09T23:44:22.492593649Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nwh2c,Uid:7e2b0195-849c-4cd6-920b-8ad423651e3a,Namespace:kube-system,Attempt:0,}" Sep 9 23:44:22.492873 containerd[1873]: time="2025-09-09T23:44:22.492717405Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5n8qp,Uid:c172c6c9-8c08-4d46-a400-b2fd0f4bf93b,Namespace:calico-system,Attempt:0,}" Sep 9 23:44:22.611823 systemd-networkd[1684]: cali5aea1b3008a: Link UP Sep 9 23:44:22.613732 systemd-networkd[1684]: cali5aea1b3008a: Gained carrier Sep 9 23:44:22.627945 containerd[1873]: 2025-09-09 23:44:22.538 [INFO][4868] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--nwh2c-eth0 coredns-668d6bf9bc- kube-system 7e2b0195-849c-4cd6-920b-8ad423651e3a 781 0 2025-09-09 23:43:49 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426.0.0-n-c59ad9327c coredns-668d6bf9bc-nwh2c eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali5aea1b3008a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" Namespace="kube-system" Pod="coredns-668d6bf9bc-nwh2c" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--nwh2c-" Sep 9 23:44:22.627945 containerd[1873]: 2025-09-09 23:44:22.538 [INFO][4868] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" 
Namespace="kube-system" Pod="coredns-668d6bf9bc-nwh2c" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--nwh2c-eth0" Sep 9 23:44:22.627945 containerd[1873]: 2025-09-09 23:44:22.567 [INFO][4893] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" HandleID="k8s-pod-network.db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" Workload="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--nwh2c-eth0" Sep 9 23:44:22.628651 containerd[1873]: 2025-09-09 23:44:22.567 [INFO][4893] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" HandleID="k8s-pod-network.db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" Workload="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--nwh2c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024aff0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426.0.0-n-c59ad9327c", "pod":"coredns-668d6bf9bc-nwh2c", "timestamp":"2025-09-09 23:44:22.567225355 +0000 UTC"}, Hostname:"ci-4426.0.0-n-c59ad9327c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:44:22.628651 containerd[1873]: 2025-09-09 23:44:22.567 [INFO][4893] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:44:22.628651 containerd[1873]: 2025-09-09 23:44:22.567 [INFO][4893] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:44:22.628651 containerd[1873]: 2025-09-09 23:44:22.567 [INFO][4893] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-n-c59ad9327c' Sep 9 23:44:22.628651 containerd[1873]: 2025-09-09 23:44:22.573 [INFO][4893] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.628651 containerd[1873]: 2025-09-09 23:44:22.577 [INFO][4893] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.628651 containerd[1873]: 2025-09-09 23:44:22.580 [INFO][4893] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.628651 containerd[1873]: 2025-09-09 23:44:22.581 [INFO][4893] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.628651 containerd[1873]: 2025-09-09 23:44:22.583 [INFO][4893] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.628802 containerd[1873]: 2025-09-09 23:44:22.583 [INFO][4893] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.628802 containerd[1873]: 2025-09-09 23:44:22.584 [INFO][4893] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3 Sep 9 23:44:22.628802 containerd[1873]: 2025-09-09 23:44:22.593 [INFO][4893] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.628802 containerd[1873]: 2025-09-09 23:44:22.599 [INFO][4893] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.70.2/26] block=192.168.70.0/26 handle="k8s-pod-network.db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.628802 containerd[1873]: 2025-09-09 23:44:22.599 [INFO][4893] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.2/26] handle="k8s-pod-network.db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.628802 containerd[1873]: 2025-09-09 23:44:22.599 [INFO][4893] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:44:22.628802 containerd[1873]: 2025-09-09 23:44:22.599 [INFO][4893] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.2/26] IPv6=[] ContainerID="db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" HandleID="k8s-pod-network.db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" Workload="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--nwh2c-eth0" Sep 9 23:44:22.628907 containerd[1873]: 2025-09-09 23:44:22.603 [INFO][4868] cni-plugin/k8s.go 418: Populated endpoint ContainerID="db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" Namespace="kube-system" Pod="coredns-668d6bf9bc-nwh2c" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--nwh2c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--nwh2c-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7e2b0195-849c-4cd6-920b-8ad423651e3a", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 43, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"", Pod:"coredns-668d6bf9bc-nwh2c", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.70.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5aea1b3008a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:22.628907 containerd[1873]: 2025-09-09 23:44:22.603 [INFO][4868] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.2/32] ContainerID="db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" Namespace="kube-system" Pod="coredns-668d6bf9bc-nwh2c" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--nwh2c-eth0" Sep 9 23:44:22.628907 containerd[1873]: 2025-09-09 23:44:22.603 [INFO][4868] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5aea1b3008a ContainerID="db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" Namespace="kube-system" Pod="coredns-668d6bf9bc-nwh2c" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--nwh2c-eth0" Sep 9 23:44:22.628907 containerd[1873]: 2025-09-09 23:44:22.614 [INFO][4868] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" Namespace="kube-system" Pod="coredns-668d6bf9bc-nwh2c" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--nwh2c-eth0" Sep 9 23:44:22.628907 containerd[1873]: 2025-09-09 23:44:22.614 [INFO][4868] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" Namespace="kube-system" Pod="coredns-668d6bf9bc-nwh2c" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--nwh2c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--nwh2c-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7e2b0195-849c-4cd6-920b-8ad423651e3a", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 43, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3", Pod:"coredns-668d6bf9bc-nwh2c", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.70.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5aea1b3008a", MAC:"3a:a7:de:d9:25:88", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:22.628907 containerd[1873]: 2025-09-09 23:44:22.625 [INFO][4868] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" Namespace="kube-system" Pod="coredns-668d6bf9bc-nwh2c" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--nwh2c-eth0" Sep 9 23:44:22.650239 kubelet[3429]: I0909 23:44:22.649321 3429 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6979679477-bghc2" podStartSLOduration=2.14942041 podStartE2EDuration="5.649307348s" podCreationTimestamp="2025-09-09 23:44:17 +0000 UTC" firstStartedPulling="2025-09-09 23:44:18.278392015 +0000 UTC m=+33.855745172" lastFinishedPulling="2025-09-09 23:44:21.778278961 +0000 UTC m=+37.355632110" observedRunningTime="2025-09-09 23:44:22.649095493 +0000 UTC m=+38.226448658" watchObservedRunningTime="2025-09-09 23:44:22.649307348 +0000 UTC m=+38.226660497" Sep 9 23:44:22.688154 containerd[1873]: time="2025-09-09T23:44:22.687965350Z" level=info msg="connecting to shim db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3" address="unix:///run/containerd/s/38131c3a75d72a8b6f3ef04062db857524a7c9cc59f8c5cf6e0b7d7fa2b1b908" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:44:22.714140 systemd[1]: Started cri-containerd-db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3.scope - libcontainer container 
db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3. Sep 9 23:44:22.730657 systemd-networkd[1684]: cali0543e0c51da: Link UP Sep 9 23:44:22.731750 systemd-networkd[1684]: cali0543e0c51da: Gained carrier Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.550 [INFO][4880] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--n--c59ad9327c-k8s-csi--node--driver--5n8qp-eth0 csi-node-driver- calico-system c172c6c9-8c08-4d46-a400-b2fd0f4bf93b 682 0 2025-09-09 23:44:01 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4426.0.0-n-c59ad9327c csi-node-driver-5n8qp eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali0543e0c51da [] [] }} ContainerID="9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" Namespace="calico-system" Pod="csi-node-driver-5n8qp" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-csi--node--driver--5n8qp-" Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.550 [INFO][4880] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" Namespace="calico-system" Pod="csi-node-driver-5n8qp" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-csi--node--driver--5n8qp-eth0" Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.572 [INFO][4898] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" HandleID="k8s-pod-network.9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" Workload="ci--4426.0.0--n--c59ad9327c-k8s-csi--node--driver--5n8qp-eth0" Sep 9 23:44:22.753503 
containerd[1873]: 2025-09-09 23:44:22.572 [INFO][4898] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" HandleID="k8s-pod-network.9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" Workload="ci--4426.0.0--n--c59ad9327c-k8s-csi--node--driver--5n8qp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b900), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.0.0-n-c59ad9327c", "pod":"csi-node-driver-5n8qp", "timestamp":"2025-09-09 23:44:22.572315513 +0000 UTC"}, Hostname:"ci-4426.0.0-n-c59ad9327c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.572 [INFO][4898] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.599 [INFO][4898] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.599 [INFO][4898] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-n-c59ad9327c' Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.674 [INFO][4898] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.683 [INFO][4898] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.687 [INFO][4898] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.697 [INFO][4898] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.700 [INFO][4898] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.700 [INFO][4898] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.702 [INFO][4898] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.717 [INFO][4898] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.723 [INFO][4898] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.70.3/26] block=192.168.70.0/26 handle="k8s-pod-network.9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.723 [INFO][4898] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.3/26] handle="k8s-pod-network.9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.723 [INFO][4898] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:44:22.753503 containerd[1873]: 2025-09-09 23:44:22.723 [INFO][4898] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.3/26] IPv6=[] ContainerID="9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" HandleID="k8s-pod-network.9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" Workload="ci--4426.0.0--n--c59ad9327c-k8s-csi--node--driver--5n8qp-eth0" Sep 9 23:44:22.755457 containerd[1873]: 2025-09-09 23:44:22.724 [INFO][4880] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" Namespace="calico-system" Pod="csi-node-driver-5n8qp" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-csi--node--driver--5n8qp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-csi--node--driver--5n8qp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c172c6c9-8c08-4d46-a400-b2fd0f4bf93b", ResourceVersion:"682", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 44, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"", Pod:"csi-node-driver-5n8qp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.70.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0543e0c51da", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:22.755457 containerd[1873]: 2025-09-09 23:44:22.725 [INFO][4880] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.3/32] ContainerID="9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" Namespace="calico-system" Pod="csi-node-driver-5n8qp" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-csi--node--driver--5n8qp-eth0" Sep 9 23:44:22.755457 containerd[1873]: 2025-09-09 23:44:22.725 [INFO][4880] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0543e0c51da ContainerID="9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" Namespace="calico-system" Pod="csi-node-driver-5n8qp" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-csi--node--driver--5n8qp-eth0" Sep 9 23:44:22.755457 containerd[1873]: 2025-09-09 23:44:22.733 [INFO][4880] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" Namespace="calico-system" Pod="csi-node-driver-5n8qp" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-csi--node--driver--5n8qp-eth0" Sep 9 23:44:22.755457 containerd[1873]: 2025-09-09 23:44:22.733 
[INFO][4880] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" Namespace="calico-system" Pod="csi-node-driver-5n8qp" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-csi--node--driver--5n8qp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-csi--node--driver--5n8qp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c172c6c9-8c08-4d46-a400-b2fd0f4bf93b", ResourceVersion:"682", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 44, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb", Pod:"csi-node-driver-5n8qp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.70.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0543e0c51da", MAC:"8a:19:0d:c3:d9:0a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:22.755457 containerd[1873]: 2025-09-09 23:44:22.751 [INFO][4880] 
cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" Namespace="calico-system" Pod="csi-node-driver-5n8qp" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-csi--node--driver--5n8qp-eth0" Sep 9 23:44:22.755457 containerd[1873]: time="2025-09-09T23:44:22.753585802Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nwh2c,Uid:7e2b0195-849c-4cd6-920b-8ad423651e3a,Namespace:kube-system,Attempt:0,} returns sandbox id \"db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3\"" Sep 9 23:44:22.758511 containerd[1873]: time="2025-09-09T23:44:22.758483841Z" level=info msg="CreateContainer within sandbox \"db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 23:44:22.796830 containerd[1873]: time="2025-09-09T23:44:22.796444022Z" level=info msg="Container df922dec0fa41db11723414f4d1917b559434ae954272d7ecc47a3ff7e908b27: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:44:22.816399 containerd[1873]: time="2025-09-09T23:44:22.816366478Z" level=info msg="CreateContainer within sandbox \"db82af1b42f6c6f4a23c290548d1734f00240d1fcd5f3504604dc39c6170f4d3\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"df922dec0fa41db11723414f4d1917b559434ae954272d7ecc47a3ff7e908b27\"" Sep 9 23:44:22.816893 containerd[1873]: time="2025-09-09T23:44:22.816854125Z" level=info msg="StartContainer for \"df922dec0fa41db11723414f4d1917b559434ae954272d7ecc47a3ff7e908b27\"" Sep 9 23:44:22.817885 containerd[1873]: time="2025-09-09T23:44:22.817842547Z" level=info msg="connecting to shim df922dec0fa41db11723414f4d1917b559434ae954272d7ecc47a3ff7e908b27" address="unix:///run/containerd/s/38131c3a75d72a8b6f3ef04062db857524a7c9cc59f8c5cf6e0b7d7fa2b1b908" protocol=ttrpc version=3 Sep 9 23:44:22.833123 systemd[1]: Started 
cri-containerd-df922dec0fa41db11723414f4d1917b559434ae954272d7ecc47a3ff7e908b27.scope - libcontainer container df922dec0fa41db11723414f4d1917b559434ae954272d7ecc47a3ff7e908b27. Sep 9 23:44:22.842762 containerd[1873]: time="2025-09-09T23:44:22.842552023Z" level=info msg="connecting to shim 9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb" address="unix:///run/containerd/s/ecc200118ee48ed9e1edf91baca97be3f41c73503ac25b4f7dbae57bae467182" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:44:22.866168 systemd[1]: Started cri-containerd-9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb.scope - libcontainer container 9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb. Sep 9 23:44:22.892401 containerd[1873]: time="2025-09-09T23:44:22.891396380Z" level=info msg="StartContainer for \"df922dec0fa41db11723414f4d1917b559434ae954272d7ecc47a3ff7e908b27\" returns successfully" Sep 9 23:44:22.905524 containerd[1873]: time="2025-09-09T23:44:22.905488800Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5n8qp,Uid:c172c6c9-8c08-4d46-a400-b2fd0f4bf93b,Namespace:calico-system,Attempt:0,} returns sandbox id \"9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb\"" Sep 9 23:44:22.906897 containerd[1873]: time="2025-09-09T23:44:22.906870378Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 23:44:23.652711 kubelet[3429]: I0909 23:44:23.652653 3429 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-nwh2c" podStartSLOduration=34.652582493 podStartE2EDuration="34.652582493s" podCreationTimestamp="2025-09-09 23:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:44:23.65203422 +0000 UTC m=+39.229387369" watchObservedRunningTime="2025-09-09 23:44:23.652582493 +0000 UTC m=+39.229935642" Sep 9 23:44:23.838167 systemd-networkd[1684]: 
cali5aea1b3008a: Gained IPv6LL Sep 9 23:44:24.143116 containerd[1873]: time="2025-09-09T23:44:24.143067915Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:24.146519 containerd[1873]: time="2025-09-09T23:44:24.146476696Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 9 23:44:24.149938 containerd[1873]: time="2025-09-09T23:44:24.149888694Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:24.154454 containerd[1873]: time="2025-09-09T23:44:24.154411831Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:24.154806 containerd[1873]: time="2025-09-09T23:44:24.154701720Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.247804541s" Sep 9 23:44:24.154806 containerd[1873]: time="2025-09-09T23:44:24.154723945Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 9 23:44:24.156860 containerd[1873]: time="2025-09-09T23:44:24.156828964Z" level=info msg="CreateContainer within sandbox \"9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 23:44:24.187022 containerd[1873]: time="2025-09-09T23:44:24.184254765Z" level=info 
msg="Container bb16f37ea4d70496b8500acec741027c78df1034c0c76aa4da6223e8a15222bd: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:44:24.188688 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2211242616.mount: Deactivated successfully. Sep 9 23:44:24.216929 containerd[1873]: time="2025-09-09T23:44:24.216893661Z" level=info msg="CreateContainer within sandbox \"9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"bb16f37ea4d70496b8500acec741027c78df1034c0c76aa4da6223e8a15222bd\"" Sep 9 23:44:24.218506 containerd[1873]: time="2025-09-09T23:44:24.217409446Z" level=info msg="StartContainer for \"bb16f37ea4d70496b8500acec741027c78df1034c0c76aa4da6223e8a15222bd\"" Sep 9 23:44:24.219618 containerd[1873]: time="2025-09-09T23:44:24.219598556Z" level=info msg="connecting to shim bb16f37ea4d70496b8500acec741027c78df1034c0c76aa4da6223e8a15222bd" address="unix:///run/containerd/s/ecc200118ee48ed9e1edf91baca97be3f41c73503ac25b4f7dbae57bae467182" protocol=ttrpc version=3 Sep 9 23:44:24.238129 systemd[1]: Started cri-containerd-bb16f37ea4d70496b8500acec741027c78df1034c0c76aa4da6223e8a15222bd.scope - libcontainer container bb16f37ea4d70496b8500acec741027c78df1034c0c76aa4da6223e8a15222bd. 
Sep 9 23:44:24.273757 containerd[1873]: time="2025-09-09T23:44:24.273617082Z" level=info msg="StartContainer for \"bb16f37ea4d70496b8500acec741027c78df1034c0c76aa4da6223e8a15222bd\" returns successfully" Sep 9 23:44:24.276032 containerd[1873]: time="2025-09-09T23:44:24.275453557Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 23:44:24.286225 systemd-networkd[1684]: cali0543e0c51da: Gained IPv6LL Sep 9 23:44:24.492436 containerd[1873]: time="2025-09-09T23:44:24.492401026Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66795654f5-xmhj4,Uid:17f88a19-b388-4ae6-bd2d-76c9f0f7cf7f,Namespace:calico-system,Attempt:0,}" Sep 9 23:44:24.492795 containerd[1873]: time="2025-09-09T23:44:24.492399946Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f95989557-fwwz2,Uid:8214ef4d-09c4-4978-9043-c9a3b2f63681,Namespace:calico-apiserver,Attempt:0,}" Sep 9 23:44:24.636483 systemd-networkd[1684]: cali41c039880a1: Link UP Sep 9 23:44:24.637626 systemd-networkd[1684]: cali41c039880a1: Gained carrier Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.548 [INFO][5097] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--n--c59ad9327c-k8s-calico--kube--controllers--66795654f5--xmhj4-eth0 calico-kube-controllers-66795654f5- calico-system 17f88a19-b388-4ae6-bd2d-76c9f0f7cf7f 789 0 2025-09-09 23:44:01 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:66795654f5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4426.0.0-n-c59ad9327c calico-kube-controllers-66795654f5-xmhj4 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali41c039880a1 [] [] }} 
ContainerID="9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" Namespace="calico-system" Pod="calico-kube-controllers-66795654f5-xmhj4" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--kube--controllers--66795654f5--xmhj4-" Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.548 [INFO][5097] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" Namespace="calico-system" Pod="calico-kube-controllers-66795654f5-xmhj4" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--kube--controllers--66795654f5--xmhj4-eth0" Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.579 [INFO][5123] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" HandleID="k8s-pod-network.9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" Workload="ci--4426.0.0--n--c59ad9327c-k8s-calico--kube--controllers--66795654f5--xmhj4-eth0" Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.579 [INFO][5123] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" HandleID="k8s-pod-network.9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" Workload="ci--4426.0.0--n--c59ad9327c-k8s-calico--kube--controllers--66795654f5--xmhj4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002aa140), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.0.0-n-c59ad9327c", "pod":"calico-kube-controllers-66795654f5-xmhj4", "timestamp":"2025-09-09 23:44:24.579220357 +0000 UTC"}, Hostname:"ci-4426.0.0-n-c59ad9327c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 
23:44:24.579 [INFO][5123] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.579 [INFO][5123] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.579 [INFO][5123] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-n-c59ad9327c' Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.588 [INFO][5123] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.593 [INFO][5123] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.602 [INFO][5123] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.603 [INFO][5123] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.605 [INFO][5123] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.605 [INFO][5123] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.606 [INFO][5123] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5 Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.611 [INFO][5123] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 
handle="k8s-pod-network.9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.620 [INFO][5123] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.70.4/26] block=192.168.70.0/26 handle="k8s-pod-network.9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.621 [INFO][5123] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.4/26] handle="k8s-pod-network.9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.621 [INFO][5123] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:44:24.659176 containerd[1873]: 2025-09-09 23:44:24.621 [INFO][5123] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.4/26] IPv6=[] ContainerID="9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" HandleID="k8s-pod-network.9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" Workload="ci--4426.0.0--n--c59ad9327c-k8s-calico--kube--controllers--66795654f5--xmhj4-eth0" Sep 9 23:44:24.660372 containerd[1873]: 2025-09-09 23:44:24.623 [INFO][5097] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" Namespace="calico-system" Pod="calico-kube-controllers-66795654f5-xmhj4" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--kube--controllers--66795654f5--xmhj4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-calico--kube--controllers--66795654f5--xmhj4-eth0", GenerateName:"calico-kube-controllers-66795654f5-", Namespace:"calico-system", SelfLink:"", UID:"17f88a19-b388-4ae6-bd2d-76c9f0f7cf7f", ResourceVersion:"789", 
Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 44, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"66795654f5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"", Pod:"calico-kube-controllers-66795654f5-xmhj4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.70.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali41c039880a1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:24.660372 containerd[1873]: 2025-09-09 23:44:24.623 [INFO][5097] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.4/32] ContainerID="9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" Namespace="calico-system" Pod="calico-kube-controllers-66795654f5-xmhj4" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--kube--controllers--66795654f5--xmhj4-eth0" Sep 9 23:44:24.660372 containerd[1873]: 2025-09-09 23:44:24.623 [INFO][5097] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali41c039880a1 ContainerID="9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" Namespace="calico-system" Pod="calico-kube-controllers-66795654f5-xmhj4" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--kube--controllers--66795654f5--xmhj4-eth0" Sep 9 23:44:24.660372 
containerd[1873]: 2025-09-09 23:44:24.638 [INFO][5097] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" Namespace="calico-system" Pod="calico-kube-controllers-66795654f5-xmhj4" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--kube--controllers--66795654f5--xmhj4-eth0" Sep 9 23:44:24.660372 containerd[1873]: 2025-09-09 23:44:24.638 [INFO][5097] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" Namespace="calico-system" Pod="calico-kube-controllers-66795654f5-xmhj4" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--kube--controllers--66795654f5--xmhj4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-calico--kube--controllers--66795654f5--xmhj4-eth0", GenerateName:"calico-kube-controllers-66795654f5-", Namespace:"calico-system", SelfLink:"", UID:"17f88a19-b388-4ae6-bd2d-76c9f0f7cf7f", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 44, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"66795654f5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5", Pod:"calico-kube-controllers-66795654f5-xmhj4", 
Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.70.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali41c039880a1", MAC:"16:ce:eb:79:c9:1f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:24.660372 containerd[1873]: 2025-09-09 23:44:24.656 [INFO][5097] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" Namespace="calico-system" Pod="calico-kube-controllers-66795654f5-xmhj4" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--kube--controllers--66795654f5--xmhj4-eth0" Sep 9 23:44:24.724330 containerd[1873]: time="2025-09-09T23:44:24.724281390Z" level=info msg="connecting to shim 9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5" address="unix:///run/containerd/s/039823e1ddd777d6d5bc666bd1de7640dd3d81b6a3e8e4d156f58484a02bf55e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:44:24.737061 systemd-networkd[1684]: cali673a955ca61: Link UP Sep 9 23:44:24.737793 systemd-networkd[1684]: cali673a955ca61: Gained carrier Sep 9 23:44:24.761201 systemd[1]: Started cri-containerd-9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5.scope - libcontainer container 9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5. 
Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.558 [INFO][5107] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--fwwz2-eth0 calico-apiserver-f95989557- calico-apiserver 8214ef4d-09c4-4978-9043-c9a3b2f63681 788 0 2025-09-09 23:43:59 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:f95989557 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.0.0-n-c59ad9327c calico-apiserver-f95989557-fwwz2 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali673a955ca61 [] [] }} ContainerID="fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-fwwz2" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--fwwz2-" Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.559 [INFO][5107] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-fwwz2" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--fwwz2-eth0" Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.594 [INFO][5128] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" HandleID="k8s-pod-network.fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" Workload="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--fwwz2-eth0" Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.596 [INFO][5128] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" HandleID="k8s-pod-network.fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" Workload="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--fwwz2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2fa0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.0.0-n-c59ad9327c", "pod":"calico-apiserver-f95989557-fwwz2", "timestamp":"2025-09-09 23:44:24.594127044 +0000 UTC"}, Hostname:"ci-4426.0.0-n-c59ad9327c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.596 [INFO][5128] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.621 [INFO][5128] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.621 [INFO][5128] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-n-c59ad9327c' Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.688 [INFO][5128] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.698 [INFO][5128] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.703 [INFO][5128] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.705 [INFO][5128] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.707 [INFO][5128] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.707 [INFO][5128] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.709 [INFO][5128] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203 Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.717 [INFO][5128] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.731 [INFO][5128] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.70.5/26] block=192.168.70.0/26 handle="k8s-pod-network.fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.731 [INFO][5128] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.5/26] handle="k8s-pod-network.fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.731 [INFO][5128] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:44:24.765106 containerd[1873]: 2025-09-09 23:44:24.732 [INFO][5128] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.5/26] IPv6=[] ContainerID="fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" HandleID="k8s-pod-network.fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" Workload="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--fwwz2-eth0" Sep 9 23:44:24.765573 containerd[1873]: 2025-09-09 23:44:24.734 [INFO][5107] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-fwwz2" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--fwwz2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--fwwz2-eth0", GenerateName:"calico-apiserver-f95989557-", Namespace:"calico-apiserver", SelfLink:"", UID:"8214ef4d-09c4-4978-9043-c9a3b2f63681", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 43, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"f95989557", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"", Pod:"calico-apiserver-f95989557-fwwz2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali673a955ca61", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:24.765573 containerd[1873]: 2025-09-09 23:44:24.734 [INFO][5107] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.5/32] ContainerID="fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-fwwz2" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--fwwz2-eth0" Sep 9 23:44:24.765573 containerd[1873]: 2025-09-09 23:44:24.734 [INFO][5107] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali673a955ca61 ContainerID="fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-fwwz2" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--fwwz2-eth0" Sep 9 23:44:24.765573 containerd[1873]: 2025-09-09 23:44:24.737 [INFO][5107] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-fwwz2" 
WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--fwwz2-eth0" Sep 9 23:44:24.765573 containerd[1873]: 2025-09-09 23:44:24.740 [INFO][5107] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-fwwz2" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--fwwz2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--fwwz2-eth0", GenerateName:"calico-apiserver-f95989557-", Namespace:"calico-apiserver", SelfLink:"", UID:"8214ef4d-09c4-4978-9043-c9a3b2f63681", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 43, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f95989557", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203", Pod:"calico-apiserver-f95989557-fwwz2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali673a955ca61", MAC:"b2:a0:71:83:cf:0e", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:24.765573 containerd[1873]: 2025-09-09 23:44:24.760 [INFO][5107] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-fwwz2" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--fwwz2-eth0" Sep 9 23:44:24.816790 containerd[1873]: time="2025-09-09T23:44:24.816747623Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-66795654f5-xmhj4,Uid:17f88a19-b388-4ae6-bd2d-76c9f0f7cf7f,Namespace:calico-system,Attempt:0,} returns sandbox id \"9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5\"" Sep 9 23:44:24.819117 containerd[1873]: time="2025-09-09T23:44:24.818792937Z" level=info msg="connecting to shim fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203" address="unix:///run/containerd/s/41abadc055161007350b7b5973fd9f5425b8b45aeae3686f0f507dfe99d5a20d" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:44:24.839125 systemd[1]: Started cri-containerd-fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203.scope - libcontainer container fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203. 
Sep 9 23:44:24.868927 containerd[1873]: time="2025-09-09T23:44:24.868886657Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f95989557-fwwz2,Uid:8214ef4d-09c4-4978-9043-c9a3b2f63681,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203\"" Sep 9 23:44:25.706046 containerd[1873]: time="2025-09-09T23:44:25.705815190Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:25.709102 containerd[1873]: time="2025-09-09T23:44:25.709074751Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 9 23:44:25.712360 containerd[1873]: time="2025-09-09T23:44:25.712330088Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:25.717358 containerd[1873]: time="2025-09-09T23:44:25.717323576Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:25.717933 containerd[1873]: time="2025-09-09T23:44:25.717614577Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.442042064s" Sep 9 23:44:25.717933 containerd[1873]: time="2025-09-09T23:44:25.717640154Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image 
reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 9 23:44:25.719285 containerd[1873]: time="2025-09-09T23:44:25.719252646Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 23:44:25.720650 containerd[1873]: time="2025-09-09T23:44:25.720624850Z" level=info msg="CreateContainer within sandbox \"9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 23:44:25.747949 containerd[1873]: time="2025-09-09T23:44:25.746844692Z" level=info msg="Container fb7f1687b1c1be1c2bfdec8bfb7e2378479681ff4d8dfbcd6359e7b8f1cea089: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:44:25.772628 containerd[1873]: time="2025-09-09T23:44:25.772564774Z" level=info msg="CreateContainer within sandbox \"9481d119442754b7a1358fb6195c4962fe3510bc91303da77cf0d6511ce7fcdb\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"fb7f1687b1c1be1c2bfdec8bfb7e2378479681ff4d8dfbcd6359e7b8f1cea089\"" Sep 9 23:44:25.774429 containerd[1873]: time="2025-09-09T23:44:25.773225235Z" level=info msg="StartContainer for \"fb7f1687b1c1be1c2bfdec8bfb7e2378479681ff4d8dfbcd6359e7b8f1cea089\"" Sep 9 23:44:25.774429 containerd[1873]: time="2025-09-09T23:44:25.774167785Z" level=info msg="connecting to shim fb7f1687b1c1be1c2bfdec8bfb7e2378479681ff4d8dfbcd6359e7b8f1cea089" address="unix:///run/containerd/s/ecc200118ee48ed9e1edf91baca97be3f41c73503ac25b4f7dbae57bae467182" protocol=ttrpc version=3 Sep 9 23:44:25.793128 systemd[1]: Started cri-containerd-fb7f1687b1c1be1c2bfdec8bfb7e2378479681ff4d8dfbcd6359e7b8f1cea089.scope - libcontainer container fb7f1687b1c1be1c2bfdec8bfb7e2378479681ff4d8dfbcd6359e7b8f1cea089. 
Sep 9 23:44:25.886132 systemd-networkd[1684]: cali673a955ca61: Gained IPv6LL Sep 9 23:44:25.901702 containerd[1873]: time="2025-09-09T23:44:25.901634797Z" level=info msg="StartContainer for \"fb7f1687b1c1be1c2bfdec8bfb7e2378479681ff4d8dfbcd6359e7b8f1cea089\" returns successfully" Sep 9 23:44:26.462344 systemd-networkd[1684]: cali41c039880a1: Gained IPv6LL Sep 9 23:44:26.493413 containerd[1873]: time="2025-09-09T23:44:26.493130194Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-56bns,Uid:953db4df-f8b1-44fd-8681-7064a8d9e059,Namespace:kube-system,Attempt:0,}" Sep 9 23:44:26.493832 containerd[1873]: time="2025-09-09T23:44:26.493183884Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-pl29w,Uid:7694c43b-1384-4c4a-886a-a74f20763acd,Namespace:calico-system,Attempt:0,}" Sep 9 23:44:26.599161 kubelet[3429]: I0909 23:44:26.599122 3429 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 23:44:26.599161 kubelet[3429]: I0909 23:44:26.599214 3429 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 23:44:26.644302 systemd-networkd[1684]: calib04d73b0723: Link UP Sep 9 23:44:26.649032 systemd-networkd[1684]: calib04d73b0723: Gained carrier Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.562 [INFO][5282] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--56bns-eth0 coredns-668d6bf9bc- kube-system 953db4df-f8b1-44fd-8681-7064a8d9e059 790 0 2025-09-09 23:43:49 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426.0.0-n-c59ad9327c 
coredns-668d6bf9bc-56bns eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib04d73b0723 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" Namespace="kube-system" Pod="coredns-668d6bf9bc-56bns" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--56bns-" Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.562 [INFO][5282] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" Namespace="kube-system" Pod="coredns-668d6bf9bc-56bns" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--56bns-eth0" Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.591 [INFO][5306] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" HandleID="k8s-pod-network.cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" Workload="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--56bns-eth0" Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.591 [INFO][5306] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" HandleID="k8s-pod-network.cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" Workload="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--56bns-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b210), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426.0.0-n-c59ad9327c", "pod":"coredns-668d6bf9bc-56bns", "timestamp":"2025-09-09 23:44:26.591445335 +0000 UTC"}, Hostname:"ci-4426.0.0-n-c59ad9327c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 
23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.591 [INFO][5306] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.591 [INFO][5306] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.591 [INFO][5306] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-n-c59ad9327c' Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.600 [INFO][5306] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.606 [INFO][5306] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.610 [INFO][5306] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.612 [INFO][5306] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.616 [INFO][5306] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.616 [INFO][5306] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.620 [INFO][5306] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08 Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.627 [INFO][5306] ipam/ipam.go 1243: Writing block in order to claim IPs 
block=192.168.70.0/26 handle="k8s-pod-network.cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.635 [INFO][5306] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.70.6/26] block=192.168.70.0/26 handle="k8s-pod-network.cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.635 [INFO][5306] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.6/26] handle="k8s-pod-network.cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.635 [INFO][5306] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:44:26.672693 containerd[1873]: 2025-09-09 23:44:26.636 [INFO][5306] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.6/26] IPv6=[] ContainerID="cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" HandleID="k8s-pod-network.cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" Workload="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--56bns-eth0" Sep 9 23:44:26.674072 containerd[1873]: 2025-09-09 23:44:26.639 [INFO][5282] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" Namespace="kube-system" Pod="coredns-668d6bf9bc-56bns" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--56bns-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--56bns-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"953db4df-f8b1-44fd-8681-7064a8d9e059", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 
43, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"", Pod:"coredns-668d6bf9bc-56bns", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.70.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib04d73b0723", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:26.674072 containerd[1873]: 2025-09-09 23:44:26.639 [INFO][5282] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.6/32] ContainerID="cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" Namespace="kube-system" Pod="coredns-668d6bf9bc-56bns" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--56bns-eth0" Sep 9 23:44:26.674072 containerd[1873]: 2025-09-09 23:44:26.639 [INFO][5282] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib04d73b0723 ContainerID="cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-56bns" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--56bns-eth0" Sep 9 23:44:26.674072 containerd[1873]: 2025-09-09 23:44:26.650 [INFO][5282] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" Namespace="kube-system" Pod="coredns-668d6bf9bc-56bns" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--56bns-eth0" Sep 9 23:44:26.674072 containerd[1873]: 2025-09-09 23:44:26.654 [INFO][5282] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" Namespace="kube-system" Pod="coredns-668d6bf9bc-56bns" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--56bns-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--56bns-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"953db4df-f8b1-44fd-8681-7064a8d9e059", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 43, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08", Pod:"coredns-668d6bf9bc-56bns", Endpoint:"eth0", ServiceAccountName:"coredns", 
IPNetworks:[]string{"192.168.70.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib04d73b0723", MAC:"de:e0:1c:6b:34:25", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:26.674072 containerd[1873]: 2025-09-09 23:44:26.671 [INFO][5282] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" Namespace="kube-system" Pod="coredns-668d6bf9bc-56bns" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-coredns--668d6bf9bc--56bns-eth0" Sep 9 23:44:26.675888 kubelet[3429]: I0909 23:44:26.675707 3429 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-5n8qp" podStartSLOduration=22.86376037 podStartE2EDuration="25.675688727s" podCreationTimestamp="2025-09-09 23:44:01 +0000 UTC" firstStartedPulling="2025-09-09 23:44:22.906596402 +0000 UTC m=+38.483949551" lastFinishedPulling="2025-09-09 23:44:25.718524759 +0000 UTC m=+41.295877908" observedRunningTime="2025-09-09 23:44:26.674766746 +0000 UTC m=+42.252119895" watchObservedRunningTime="2025-09-09 23:44:26.675688727 +0000 UTC m=+42.253041876" Sep 9 23:44:26.740136 systemd-networkd[1684]: calie2e9919e62d: Link UP Sep 9 23:44:26.740985 systemd-networkd[1684]: calie2e9919e62d: Gained carrier Sep 9 23:44:26.762994 containerd[1873]: time="2025-09-09T23:44:26.762813364Z" level=info msg="connecting to shim 
cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08" address="unix:///run/containerd/s/7a52d0380d82b85b6a04157f16073e198af53705dbc6f0e07e817dc02abe38fc" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.571 [INFO][5294] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--n--c59ad9327c-k8s-goldmane--54d579b49d--pl29w-eth0 goldmane-54d579b49d- calico-system 7694c43b-1384-4c4a-886a-a74f20763acd 787 0 2025-09-09 23:44:01 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4426.0.0-n-c59ad9327c goldmane-54d579b49d-pl29w eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calie2e9919e62d [] [] }} ContainerID="dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" Namespace="calico-system" Pod="goldmane-54d579b49d-pl29w" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-goldmane--54d579b49d--pl29w-" Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.571 [INFO][5294] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" Namespace="calico-system" Pod="goldmane-54d579b49d-pl29w" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-goldmane--54d579b49d--pl29w-eth0" Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.602 [INFO][5311] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" HandleID="k8s-pod-network.dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" Workload="ci--4426.0.0--n--c59ad9327c-k8s-goldmane--54d579b49d--pl29w-eth0" Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.604 [INFO][5311] ipam/ipam_plugin.go 265: 
Auto assigning IP ContainerID="dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" HandleID="k8s-pod-network.dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" Workload="ci--4426.0.0--n--c59ad9327c-k8s-goldmane--54d579b49d--pl29w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3600), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.0.0-n-c59ad9327c", "pod":"goldmane-54d579b49d-pl29w", "timestamp":"2025-09-09 23:44:26.602429743 +0000 UTC"}, Hostname:"ci-4426.0.0-n-c59ad9327c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.605 [INFO][5311] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.636 [INFO][5311] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.636 [INFO][5311] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-n-c59ad9327c' Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.700 [INFO][5311] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.706 [INFO][5311] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.711 [INFO][5311] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.713 [INFO][5311] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.715 [INFO][5311] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.715 [INFO][5311] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.716 [INFO][5311] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20 Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.725 [INFO][5311] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.735 [INFO][5311] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.70.7/26] block=192.168.70.0/26 handle="k8s-pod-network.dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.735 [INFO][5311] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.7/26] handle="k8s-pod-network.dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.735 [INFO][5311] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:44:26.771105 containerd[1873]: 2025-09-09 23:44:26.735 [INFO][5311] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.7/26] IPv6=[] ContainerID="dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" HandleID="k8s-pod-network.dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" Workload="ci--4426.0.0--n--c59ad9327c-k8s-goldmane--54d579b49d--pl29w-eth0" Sep 9 23:44:26.775569 containerd[1873]: 2025-09-09 23:44:26.736 [INFO][5294] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" Namespace="calico-system" Pod="goldmane-54d579b49d-pl29w" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-goldmane--54d579b49d--pl29w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-goldmane--54d579b49d--pl29w-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"7694c43b-1384-4c4a-886a-a74f20763acd", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 44, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"", Pod:"goldmane-54d579b49d-pl29w", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.70.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie2e9919e62d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:26.775569 containerd[1873]: 2025-09-09 23:44:26.737 [INFO][5294] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.7/32] ContainerID="dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" Namespace="calico-system" Pod="goldmane-54d579b49d-pl29w" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-goldmane--54d579b49d--pl29w-eth0" Sep 9 23:44:26.775569 containerd[1873]: 2025-09-09 23:44:26.737 [INFO][5294] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie2e9919e62d ContainerID="dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" Namespace="calico-system" Pod="goldmane-54d579b49d-pl29w" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-goldmane--54d579b49d--pl29w-eth0" Sep 9 23:44:26.775569 containerd[1873]: 2025-09-09 23:44:26.741 [INFO][5294] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" Namespace="calico-system" Pod="goldmane-54d579b49d-pl29w" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-goldmane--54d579b49d--pl29w-eth0" Sep 9 23:44:26.775569 containerd[1873]: 2025-09-09 23:44:26.741 [INFO][5294] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" Namespace="calico-system" Pod="goldmane-54d579b49d-pl29w" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-goldmane--54d579b49d--pl29w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-goldmane--54d579b49d--pl29w-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"7694c43b-1384-4c4a-886a-a74f20763acd", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 44, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20", Pod:"goldmane-54d579b49d-pl29w", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.70.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie2e9919e62d", MAC:"ba:a3:90:7c:41:5f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:26.775569 containerd[1873]: 2025-09-09 23:44:26.766 [INFO][5294] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" Namespace="calico-system" 
Pod="goldmane-54d579b49d-pl29w" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-goldmane--54d579b49d--pl29w-eth0" Sep 9 23:44:26.789122 systemd[1]: Started cri-containerd-cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08.scope - libcontainer container cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08. Sep 9 23:44:26.828306 containerd[1873]: time="2025-09-09T23:44:26.828269050Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-56bns,Uid:953db4df-f8b1-44fd-8681-7064a8d9e059,Namespace:kube-system,Attempt:0,} returns sandbox id \"cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08\"" Sep 9 23:44:26.833235 containerd[1873]: time="2025-09-09T23:44:26.833195944Z" level=info msg="CreateContainer within sandbox \"cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 23:44:26.898504 containerd[1873]: time="2025-09-09T23:44:26.898143149Z" level=info msg="connecting to shim dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20" address="unix:///run/containerd/s/2a2145efe2e41038521021e22b91c07026ee082d6561198917c4b08d84bc88e1" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:44:26.926272 systemd[1]: Started cri-containerd-dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20.scope - libcontainer container dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20. 
Sep 9 23:44:27.253150 containerd[1873]: time="2025-09-09T23:44:27.253117369Z" level=info msg="Container cb5de343cf3ceb12f41681efdddbe90901370683f981c4efa7664eb2fc4c2ed3: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:44:27.405263 containerd[1873]: time="2025-09-09T23:44:27.405157258Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-pl29w,Uid:7694c43b-1384-4c4a-886a-a74f20763acd,Namespace:calico-system,Attempt:0,} returns sandbox id \"dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20\"" Sep 9 23:44:27.414019 containerd[1873]: time="2025-09-09T23:44:27.413925948Z" level=info msg="CreateContainer within sandbox \"cd1c6528842e2d90bc89e0e28fe58245e958881c9accc60b5869c91821f8fe08\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"cb5de343cf3ceb12f41681efdddbe90901370683f981c4efa7664eb2fc4c2ed3\"" Sep 9 23:44:27.414359 containerd[1873]: time="2025-09-09T23:44:27.414266087Z" level=info msg="StartContainer for \"cb5de343cf3ceb12f41681efdddbe90901370683f981c4efa7664eb2fc4c2ed3\"" Sep 9 23:44:27.415252 containerd[1873]: time="2025-09-09T23:44:27.415192460Z" level=info msg="connecting to shim cb5de343cf3ceb12f41681efdddbe90901370683f981c4efa7664eb2fc4c2ed3" address="unix:///run/containerd/s/7a52d0380d82b85b6a04157f16073e198af53705dbc6f0e07e817dc02abe38fc" protocol=ttrpc version=3 Sep 9 23:44:27.430116 systemd[1]: Started cri-containerd-cb5de343cf3ceb12f41681efdddbe90901370683f981c4efa7664eb2fc4c2ed3.scope - libcontainer container cb5de343cf3ceb12f41681efdddbe90901370683f981c4efa7664eb2fc4c2ed3. 
Sep 9 23:44:27.453763 containerd[1873]: time="2025-09-09T23:44:27.453725361Z" level=info msg="StartContainer for \"cb5de343cf3ceb12f41681efdddbe90901370683f981c4efa7664eb2fc4c2ed3\" returns successfully" Sep 9 23:44:28.190433 systemd-networkd[1684]: calie2e9919e62d: Gained IPv6LL Sep 9 23:44:28.316556 kubelet[3429]: I0909 23:44:28.316520 3429 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 23:44:28.406715 containerd[1873]: time="2025-09-09T23:44:28.406664195Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfc6823c76e936eaed3d4de7892478f709db8128192617fcd85283a53955e9b6\" id:\"d2b2d24d732c0474f668f3830e984400254d7b287398928ce5f90b8c490b3e73\" pid:5480 exited_at:{seconds:1757461468 nanos:406396050}" Sep 9 23:44:28.691967 kubelet[3429]: I0909 23:44:28.444035 3429 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-56bns" podStartSLOduration=39.444018706 podStartE2EDuration="39.444018706s" podCreationTimestamp="2025-09-09 23:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:44:27.677420695 +0000 UTC m=+43.254773844" watchObservedRunningTime="2025-09-09 23:44:28.444018706 +0000 UTC m=+44.021371863" Sep 9 23:44:28.510321 systemd-networkd[1684]: calib04d73b0723: Gained IPv6LL Sep 9 23:44:28.692379 containerd[1873]: time="2025-09-09T23:44:28.493090506Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f95989557-t9r46,Uid:82838c02-ad3d-4792-979e-e1bdb0c4a89c,Namespace:calico-apiserver,Attempt:0,}" Sep 9 23:44:28.692379 containerd[1873]: time="2025-09-09T23:44:28.501715630Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfc6823c76e936eaed3d4de7892478f709db8128192617fcd85283a53955e9b6\" id:\"7ef61c4d5c8dc790092de2e6dbb504eec42fad96fcbad6b1a3418a4de0401cc3\" pid:5503 exited_at:{seconds:1757461468 nanos:501260360}" Sep 9 
23:44:29.179556 containerd[1873]: time="2025-09-09T23:44:29.179483118Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:29.183177 containerd[1873]: time="2025-09-09T23:44:29.183147252Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 9 23:44:29.193021 containerd[1873]: time="2025-09-09T23:44:29.192602243Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:29.202145 containerd[1873]: time="2025-09-09T23:44:29.201968176Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:44:29.202477 containerd[1873]: time="2025-09-09T23:44:29.202249473Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 3.482966898s" Sep 9 23:44:29.202477 containerd[1873]: time="2025-09-09T23:44:29.202384565Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 9 23:44:29.204509 containerd[1873]: time="2025-09-09T23:44:29.204122069Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 23:44:29.229461 containerd[1873]: time="2025-09-09T23:44:29.229427289Z" level=info msg="CreateContainer within sandbox 
\"9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 23:44:29.260192 systemd-networkd[1684]: calidfefcfb691c: Link UP Sep 9 23:44:29.263286 systemd-networkd[1684]: calidfefcfb691c: Gained carrier Sep 9 23:44:29.266790 containerd[1873]: time="2025-09-09T23:44:29.266639516Z" level=info msg="Container 4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.172 [INFO][5519] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--t9r46-eth0 calico-apiserver-f95989557- calico-apiserver 82838c02-ad3d-4792-979e-e1bdb0c4a89c 791 0 2025-09-09 23:43:59 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:f95989557 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.0.0-n-c59ad9327c calico-apiserver-f95989557-t9r46 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calidfefcfb691c [] [] }} ContainerID="0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-t9r46" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--t9r46-" Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.172 [INFO][5519] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-t9r46" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--t9r46-eth0" Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.193 
[INFO][5531] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" HandleID="k8s-pod-network.0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" Workload="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--t9r46-eth0" Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.193 [INFO][5531] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" HandleID="k8s-pod-network.0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" Workload="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--t9r46-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3600), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.0.0-n-c59ad9327c", "pod":"calico-apiserver-f95989557-t9r46", "timestamp":"2025-09-09 23:44:29.193785945 +0000 UTC"}, Hostname:"ci-4426.0.0-n-c59ad9327c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.194 [INFO][5531] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.194 [INFO][5531] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.194 [INFO][5531] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.0.0-n-c59ad9327c' Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.199 [INFO][5531] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.207 [INFO][5531] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.220 [INFO][5531] ipam/ipam.go 511: Trying affinity for 192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.226 [INFO][5531] ipam/ipam.go 158: Attempting to load block cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.230 [INFO][5531] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.70.0/26 host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.230 [INFO][5531] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.70.0/26 handle="k8s-pod-network.0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.232 [INFO][5531] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.242 [INFO][5531] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.70.0/26 handle="k8s-pod-network.0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.255 [INFO][5531] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.70.8/26] block=192.168.70.0/26 handle="k8s-pod-network.0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.255 [INFO][5531] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.70.8/26] handle="k8s-pod-network.0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" host="ci-4426.0.0-n-c59ad9327c" Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.255 [INFO][5531] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:44:29.282364 containerd[1873]: 2025-09-09 23:44:29.255 [INFO][5531] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.70.8/26] IPv6=[] ContainerID="0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" HandleID="k8s-pod-network.0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" Workload="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--t9r46-eth0" Sep 9 23:44:29.282968 containerd[1873]: 2025-09-09 23:44:29.256 [INFO][5519] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-t9r46" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--t9r46-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--t9r46-eth0", GenerateName:"calico-apiserver-f95989557-", Namespace:"calico-apiserver", SelfLink:"", UID:"82838c02-ad3d-4792-979e-e1bdb0c4a89c", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 43, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"f95989557", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"", Pod:"calico-apiserver-f95989557-t9r46", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidfefcfb691c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:29.282968 containerd[1873]: 2025-09-09 23:44:29.256 [INFO][5519] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.70.8/32] ContainerID="0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-t9r46" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--t9r46-eth0" Sep 9 23:44:29.282968 containerd[1873]: 2025-09-09 23:44:29.256 [INFO][5519] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidfefcfb691c ContainerID="0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-t9r46" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--t9r46-eth0" Sep 9 23:44:29.282968 containerd[1873]: 2025-09-09 23:44:29.263 [INFO][5519] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-t9r46" 
WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--t9r46-eth0" Sep 9 23:44:29.282968 containerd[1873]: 2025-09-09 23:44:29.266 [INFO][5519] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-t9r46" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--t9r46-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--t9r46-eth0", GenerateName:"calico-apiserver-f95989557-", Namespace:"calico-apiserver", SelfLink:"", UID:"82838c02-ad3d-4792-979e-e1bdb0c4a89c", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 43, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f95989557", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.0.0-n-c59ad9327c", ContainerID:"0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d", Pod:"calico-apiserver-f95989557-t9r46", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.70.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidfefcfb691c", MAC:"f6:9c:71:b8:2a:14", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:44:29.282968 containerd[1873]: 2025-09-09 23:44:29.279 [INFO][5519] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" Namespace="calico-apiserver" Pod="calico-apiserver-f95989557-t9r46" WorkloadEndpoint="ci--4426.0.0--n--c59ad9327c-k8s-calico--apiserver--f95989557--t9r46-eth0" Sep 9 23:44:29.292016 containerd[1873]: time="2025-09-09T23:44:29.291966161Z" level=info msg="CreateContainer within sandbox \"9a898e74dd4296c9f6bfa87e7c7fcc3fc916999866bd62e0d9c3c9f572675ae5\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77\"" Sep 9 23:44:29.292835 containerd[1873]: time="2025-09-09T23:44:29.292806292Z" level=info msg="StartContainer for \"4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77\"" Sep 9 23:44:29.293955 containerd[1873]: time="2025-09-09T23:44:29.293846677Z" level=info msg="connecting to shim 4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77" address="unix:///run/containerd/s/039823e1ddd777d6d5bc666bd1de7640dd3d81b6a3e8e4d156f58484a02bf55e" protocol=ttrpc version=3 Sep 9 23:44:29.317125 systemd[1]: Started cri-containerd-4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77.scope - libcontainer container 4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77. 
Sep 9 23:44:29.348014 containerd[1873]: time="2025-09-09T23:44:29.347904149Z" level=info msg="connecting to shim 0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d" address="unix:///run/containerd/s/088fff8da9985d0cee647babeae9e4b6966c8b85bb51abbd998ac9f2ce0f0bd8" namespace=k8s.io protocol=ttrpc version=3
Sep 9 23:44:29.354840 containerd[1873]: time="2025-09-09T23:44:29.354675150Z" level=info msg="StartContainer for \"4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77\" returns successfully"
Sep 9 23:44:29.377340 systemd[1]: Started cri-containerd-0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d.scope - libcontainer container 0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d.
Sep 9 23:44:29.424273 containerd[1873]: time="2025-09-09T23:44:29.424238135Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f95989557-t9r46,Uid:82838c02-ad3d-4792-979e-e1bdb0c4a89c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d\""
Sep 9 23:44:29.686710 kubelet[3429]: I0909 23:44:29.686656 3429 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-66795654f5-xmhj4" podStartSLOduration=24.301552988 podStartE2EDuration="28.686639704s" podCreationTimestamp="2025-09-09 23:44:01 +0000 UTC" firstStartedPulling="2025-09-09 23:44:24.818493679 +0000 UTC m=+40.395846836" lastFinishedPulling="2025-09-09 23:44:29.203580395 +0000 UTC m=+44.780933552" observedRunningTime="2025-09-09 23:44:29.685976827 +0000 UTC m=+45.263329984" watchObservedRunningTime="2025-09-09 23:44:29.686639704 +0000 UTC m=+45.263992917"
Sep 9 23:44:29.699841 containerd[1873]: time="2025-09-09T23:44:29.699802591Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77\" id:\"f3d6660bb3cbad326b9547af4fe4b7c078075d8b0f610b7507f0dc306e548f98\" pid:5653 exited_at:{seconds:1757461469 nanos:699551606}"
Sep 9 23:44:31.007158 systemd-networkd[1684]: calidfefcfb691c: Gained IPv6LL
Sep 9 23:44:31.711881 containerd[1873]: time="2025-09-09T23:44:31.711826578Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:44:31.716065 containerd[1873]: time="2025-09-09T23:44:31.716023791Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807"
Sep 9 23:44:31.720204 containerd[1873]: time="2025-09-09T23:44:31.720155138Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:44:31.724951 containerd[1873]: time="2025-09-09T23:44:31.724915521Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:44:31.725504 containerd[1873]: time="2025-09-09T23:44:31.725427209Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 2.521265874s"
Sep 9 23:44:31.725504 containerd[1873]: time="2025-09-09T23:44:31.725452578Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 9 23:44:31.726655 containerd[1873]: time="2025-09-09T23:44:31.726367791Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\""
Sep 9 23:44:31.728758 containerd[1873]: time="2025-09-09T23:44:31.728730810Z" level=info msg="CreateContainer within sandbox \"fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 9 23:44:31.765531 containerd[1873]: time="2025-09-09T23:44:31.765494743Z" level=info msg="Container 792eda2541ba54a961d9274415363cd32ab0c3cdebdafccf31a699190a4a3657: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:44:31.790446 containerd[1873]: time="2025-09-09T23:44:31.790407300Z" level=info msg="CreateContainer within sandbox \"fe44405fa2a7725c8660b13755391f94d3b88c44272a291624b0a104c87de203\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"792eda2541ba54a961d9274415363cd32ab0c3cdebdafccf31a699190a4a3657\""
Sep 9 23:44:31.791492 containerd[1873]: time="2025-09-09T23:44:31.791464214Z" level=info msg="StartContainer for \"792eda2541ba54a961d9274415363cd32ab0c3cdebdafccf31a699190a4a3657\""
Sep 9 23:44:31.795015 containerd[1873]: time="2025-09-09T23:44:31.794820144Z" level=info msg="connecting to shim 792eda2541ba54a961d9274415363cd32ab0c3cdebdafccf31a699190a4a3657" address="unix:///run/containerd/s/41abadc055161007350b7b5973fd9f5425b8b45aeae3686f0f507dfe99d5a20d" protocol=ttrpc version=3
Sep 9 23:44:31.838142 systemd[1]: Started cri-containerd-792eda2541ba54a961d9274415363cd32ab0c3cdebdafccf31a699190a4a3657.scope - libcontainer container 792eda2541ba54a961d9274415363cd32ab0c3cdebdafccf31a699190a4a3657.
Sep 9 23:44:31.944023 containerd[1873]: time="2025-09-09T23:44:31.943895573Z" level=info msg="StartContainer for \"792eda2541ba54a961d9274415363cd32ab0c3cdebdafccf31a699190a4a3657\" returns successfully"
Sep 9 23:44:32.704473 kubelet[3429]: I0909 23:44:32.704413 3429 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-f95989557-fwwz2" podStartSLOduration=26.848194449 podStartE2EDuration="33.704397686s" podCreationTimestamp="2025-09-09 23:43:59 +0000 UTC" firstStartedPulling="2025-09-09 23:44:24.870068335 +0000 UTC m=+40.447421492" lastFinishedPulling="2025-09-09 23:44:31.726271572 +0000 UTC m=+47.303624729" observedRunningTime="2025-09-09 23:44:32.703869077 +0000 UTC m=+48.281222242" watchObservedRunningTime="2025-09-09 23:44:32.704397686 +0000 UTC m=+48.281750835"
Sep 9 23:44:36.300826 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2305167423.mount: Deactivated successfully.
Sep 9 23:44:37.180523 containerd[1873]: time="2025-09-09T23:44:37.180031536Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:44:37.184535 containerd[1873]: time="2025-09-09T23:44:37.184515319Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332"
Sep 9 23:44:37.189138 containerd[1873]: time="2025-09-09T23:44:37.189116096Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:44:37.196040 containerd[1873]: time="2025-09-09T23:44:37.195917736Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:44:37.196389 containerd[1873]: time="2025-09-09T23:44:37.196361958Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 5.469968743s"
Sep 9 23:44:37.196547 containerd[1873]: time="2025-09-09T23:44:37.196457345Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\""
Sep 9 23:44:37.203810 containerd[1873]: time="2025-09-09T23:44:37.203640757Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 9 23:44:37.213187 containerd[1873]: time="2025-09-09T23:44:37.212304751Z" level=info msg="CreateContainer within sandbox \"dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Sep 9 23:44:37.234872 containerd[1873]: time="2025-09-09T23:44:37.234850338Z" level=info msg="Container 6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:44:37.255463 containerd[1873]: time="2025-09-09T23:44:37.255436646Z" level=info msg="CreateContainer within sandbox \"dbf89352ddeedfad359ca674acf3d413649e92218a00a11fa582bda7c1b05e20\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8\""
Sep 9 23:44:37.258125 containerd[1873]: time="2025-09-09T23:44:37.258104611Z" level=info msg="StartContainer for \"6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8\""
Sep 9 23:44:37.259707 containerd[1873]: time="2025-09-09T23:44:37.259665085Z" level=info msg="connecting to shim 6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8" address="unix:///run/containerd/s/2a2145efe2e41038521021e22b91c07026ee082d6561198917c4b08d84bc88e1" protocol=ttrpc version=3
Sep 9 23:44:37.279946 systemd[1]: Started cri-containerd-6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8.scope - libcontainer container 6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8.
Sep 9 23:44:37.316801 containerd[1873]: time="2025-09-09T23:44:37.316777215Z" level=info msg="StartContainer for \"6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8\" returns successfully"
Sep 9 23:44:37.563051 containerd[1873]: time="2025-09-09T23:44:37.562836885Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:44:37.570769 containerd[1873]: time="2025-09-09T23:44:37.570462535Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77"
Sep 9 23:44:37.571774 containerd[1873]: time="2025-09-09T23:44:37.571745712Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 367.647653ms"
Sep 9 23:44:37.571874 containerd[1873]: time="2025-09-09T23:44:37.571859027Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 9 23:44:37.574655 containerd[1873]: time="2025-09-09T23:44:37.574355803Z" level=info msg="CreateContainer within sandbox \"0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 9 23:44:37.601064 containerd[1873]: time="2025-09-09T23:44:37.601033544Z" level=info msg="Container 37896f13753c4e99ddc6be92a1426a2180f4bcecf85c1403606e3816dd22ac83: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:44:37.629991 containerd[1873]: time="2025-09-09T23:44:37.629952309Z" level=info msg="CreateContainer within sandbox \"0e84af48a1e653fcb64b7c7e1de6362455ef3856eab153f2f9bb001edfc9805d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"37896f13753c4e99ddc6be92a1426a2180f4bcecf85c1403606e3816dd22ac83\""
Sep 9 23:44:37.630679 containerd[1873]: time="2025-09-09T23:44:37.630621602Z" level=info msg="StartContainer for \"37896f13753c4e99ddc6be92a1426a2180f4bcecf85c1403606e3816dd22ac83\""
Sep 9 23:44:37.632538 containerd[1873]: time="2025-09-09T23:44:37.632516110Z" level=info msg="connecting to shim 37896f13753c4e99ddc6be92a1426a2180f4bcecf85c1403606e3816dd22ac83" address="unix:///run/containerd/s/088fff8da9985d0cee647babeae9e4b6966c8b85bb51abbd998ac9f2ce0f0bd8" protocol=ttrpc version=3
Sep 9 23:44:37.650121 systemd[1]: Started cri-containerd-37896f13753c4e99ddc6be92a1426a2180f4bcecf85c1403606e3816dd22ac83.scope - libcontainer container 37896f13753c4e99ddc6be92a1426a2180f4bcecf85c1403606e3816dd22ac83.
Sep 9 23:44:37.682936 containerd[1873]: time="2025-09-09T23:44:37.682903131Z" level=info msg="StartContainer for \"37896f13753c4e99ddc6be92a1426a2180f4bcecf85c1403606e3816dd22ac83\" returns successfully"
Sep 9 23:44:37.719434 kubelet[3429]: I0909 23:44:37.719037 3429 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-pl29w" podStartSLOduration=26.921866892 podStartE2EDuration="36.718910304s" podCreationTimestamp="2025-09-09 23:44:01 +0000 UTC" firstStartedPulling="2025-09-09 23:44:27.406133066 +0000 UTC m=+42.983486223" lastFinishedPulling="2025-09-09 23:44:37.203176486 +0000 UTC m=+52.780529635" observedRunningTime="2025-09-09 23:44:37.716166561 +0000 UTC m=+53.293519710" watchObservedRunningTime="2025-09-09 23:44:37.718910304 +0000 UTC m=+53.296263453"
Sep 9 23:44:37.839728 containerd[1873]: time="2025-09-09T23:44:37.839515903Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8\" id:\"8ed9a116de9baea1e031eaaa68912f33f264340aa6a6efd1f881b07aa4eaf7f4\" pid:5805 exit_status:1 exited_at:{seconds:1757461477 nanos:839121491}"
Sep 9 23:44:38.701747 kubelet[3429]: I0909 23:44:38.701682 3429 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 23:44:38.763214 containerd[1873]: time="2025-09-09T23:44:38.763175459Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8\" id:\"ddd01ba0f17464aa8eda1940eace35823197c9930f39a196828eb5e61e534f75\" pid:5836 exit_status:1 exited_at:{seconds:1757461478 nanos:762865545}"
Sep 9 23:44:39.761787 containerd[1873]: time="2025-09-09T23:44:39.761746244Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8\" id:\"5269d7fc0a65b99d59fdf656807676ebc1f594123fbe1795ba0a5be27959f0f7\" pid:5866 exit_status:1 exited_at:{seconds:1757461479 nanos:761530725}"
Sep 9 23:44:41.479688 kubelet[3429]: I0909 23:44:41.479593 3429 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 23:44:41.511366 kubelet[3429]: I0909 23:44:41.510882 3429 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-f95989557-t9r46" podStartSLOduration=34.364207029 podStartE2EDuration="42.510870305s" podCreationTimestamp="2025-09-09 23:43:59 +0000 UTC" firstStartedPulling="2025-09-09 23:44:29.425795698 +0000 UTC m=+45.003148847" lastFinishedPulling="2025-09-09 23:44:37.572458974 +0000 UTC m=+53.149812123" observedRunningTime="2025-09-09 23:44:37.739591232 +0000 UTC m=+53.316944381" watchObservedRunningTime="2025-09-09 23:44:41.510870305 +0000 UTC m=+57.088223454"
Sep 9 23:44:53.606961 containerd[1873]: time="2025-09-09T23:44:53.606919147Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8\" id:\"cca1b941a2ccb29db05464a2ae33b487def065f525f0294f175c928a78812742\" pid:5902 exited_at:{seconds:1757461493 nanos:605797967}"
Sep 9 23:44:58.473882 containerd[1873]: time="2025-09-09T23:44:58.473835698Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfc6823c76e936eaed3d4de7892478f709db8128192617fcd85283a53955e9b6\" id:\"b91e4fcc0b0667d24d52bf643748314e02f4094cddafb472cb295339277371af\" pid:5928 exited_at:{seconds:1757461498 nanos:473558073}"
Sep 9 23:44:59.700957 containerd[1873]: time="2025-09-09T23:44:59.700914121Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77\" id:\"9e7667ba3dba2d4aa89dc9f8d0fd791c42c7edeeae195a3fa34cd58101d7eda9\" pid:5957 exited_at:{seconds:1757461499 nanos:700609055}"
Sep 9 23:45:02.193524 containerd[1873]: time="2025-09-09T23:45:02.193326630Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77\" id:\"26b5353b37bb7a3bebd46c0c73b5fb4d1de903c1c7ed4f443e123b4990f11aa0\" pid:5977 exited_at:{seconds:1757461502 nanos:193106095}"
Sep 9 23:45:09.751222 containerd[1873]: time="2025-09-09T23:45:09.751181545Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8\" id:\"97475fca374d36c916230f00cdf2405296e1c5ee5561876b162e389d4427d198\" pid:6002 exited_at:{seconds:1757461509 nanos:750881496}"
Sep 9 23:45:28.468226 containerd[1873]: time="2025-09-09T23:45:28.468129126Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfc6823c76e936eaed3d4de7892478f709db8128192617fcd85283a53955e9b6\" id:\"ced0e0ce278c5fa85604c5bbc26a6abdbd5b4e02f98ba5b3ff0572c30f9c3701\" pid:6030 exited_at:{seconds:1757461528 nanos:467818156}"
Sep 9 23:45:29.697253 containerd[1873]: time="2025-09-09T23:45:29.697206871Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77\" id:\"224908505cc3255e4d17139d5b2cff3f0a554def308722dc192c4aa430db264d\" pid:6053 exited_at:{seconds:1757461529 nanos:696792922}"
Sep 9 23:45:39.753578 containerd[1873]: time="2025-09-09T23:45:39.753402995Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8\" id:\"1fb9581f101e886e0bbb7494e20100575d03bf681b80845e7a5a21522f775950\" pid:6080 exited_at:{seconds:1757461539 nanos:752850018}"
Sep 9 23:45:53.603028 containerd[1873]: time="2025-09-09T23:45:53.602903404Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8\" id:\"6d9651c2a6a833d33affa4e46f8a625f89cf9270d0c48e7ea3bf16452499993f\" pid:6116 exited_at:{seconds:1757461553 nanos:602565250}"
Sep 9 23:45:58.462237 containerd[1873]: time="2025-09-09T23:45:58.462201723Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfc6823c76e936eaed3d4de7892478f709db8128192617fcd85283a53955e9b6\" id:\"08c7c67f332e24478de0f960efacb900bcc30a211bc979d9e1311aa902b0a4eb\" pid:6152 exited_at:{seconds:1757461558 nanos:460832128}"
Sep 9 23:45:59.696832 containerd[1873]: time="2025-09-09T23:45:59.696794695Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77\" id:\"322519f1ffdb4323bfa78abc13cd3f98ab5575926f830f51ee3e181ae6092d77\" pid:6177 exited_at:{seconds:1757461559 nanos:696559232}"
Sep 9 23:46:01.575307 systemd[1]: Started sshd@7-10.200.20.16:22-10.200.16.10:49620.service - OpenSSH per-connection server daemon (10.200.16.10:49620).
Sep 9 23:46:02.002338 sshd[6191]: Accepted publickey for core from 10.200.16.10 port 49620 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:46:02.004216 sshd-session[6191]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:46:02.011223 systemd-logind[1853]: New session 10 of user core.
Sep 9 23:46:02.020142 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 9 23:46:02.194534 containerd[1873]: time="2025-09-09T23:46:02.194497895Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77\" id:\"82fd23c442a225ba70235c27b658f338b11ac3350d792ad8d0cce529ffeadad9\" pid:6207 exited_at:{seconds:1757461562 nanos:194340682}"
Sep 9 23:46:02.402720 sshd[6194]: Connection closed by 10.200.16.10 port 49620
Sep 9 23:46:02.405219 sshd-session[6191]: pam_unix(sshd:session): session closed for user core
Sep 9 23:46:02.408411 systemd-logind[1853]: Session 10 logged out. Waiting for processes to exit.
Sep 9 23:46:02.408474 systemd[1]: sshd@7-10.200.20.16:22-10.200.16.10:49620.service: Deactivated successfully.
Sep 9 23:46:02.410270 systemd[1]: session-10.scope: Deactivated successfully.
Sep 9 23:46:02.412079 systemd-logind[1853]: Removed session 10.
Sep 9 23:46:07.496650 systemd[1]: Started sshd@8-10.200.20.16:22-10.200.16.10:49636.service - OpenSSH per-connection server daemon (10.200.16.10:49636).
Sep 9 23:46:07.992852 sshd[6229]: Accepted publickey for core from 10.200.16.10 port 49636 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:46:07.994107 sshd-session[6229]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:46:07.998100 systemd-logind[1853]: New session 11 of user core.
Sep 9 23:46:08.004353 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 9 23:46:08.407557 sshd[6232]: Connection closed by 10.200.16.10 port 49636
Sep 9 23:46:08.408163 sshd-session[6229]: pam_unix(sshd:session): session closed for user core
Sep 9 23:46:08.411969 systemd[1]: sshd@8-10.200.20.16:22-10.200.16.10:49636.service: Deactivated successfully.
Sep 9 23:46:08.414013 systemd[1]: session-11.scope: Deactivated successfully.
Sep 9 23:46:08.415107 systemd-logind[1853]: Session 11 logged out. Waiting for processes to exit.
Sep 9 23:46:08.416872 systemd-logind[1853]: Removed session 11.
Sep 9 23:46:09.757610 containerd[1873]: time="2025-09-09T23:46:09.757568582Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8\" id:\"6cc113e2a3eb41b0a13f1cf915a55b4f46c828117ef4138c2a75ef3fd409c18c\" pid:6255 exited_at:{seconds:1757461569 nanos:757136769}"
Sep 9 23:46:13.503183 systemd[1]: Started sshd@9-10.200.20.16:22-10.200.16.10:57568.service - OpenSSH per-connection server daemon (10.200.16.10:57568).
Sep 9 23:46:13.994316 sshd[6267]: Accepted publickey for core from 10.200.16.10 port 57568 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:46:13.995399 sshd-session[6267]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:46:13.999142 systemd-logind[1853]: New session 12 of user core.
Sep 9 23:46:14.011301 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 9 23:46:14.399122 sshd[6270]: Connection closed by 10.200.16.10 port 57568
Sep 9 23:46:14.399705 sshd-session[6267]: pam_unix(sshd:session): session closed for user core
Sep 9 23:46:14.402886 systemd[1]: sshd@9-10.200.20.16:22-10.200.16.10:57568.service: Deactivated successfully.
Sep 9 23:46:14.404375 systemd[1]: session-12.scope: Deactivated successfully.
Sep 9 23:46:14.405138 systemd-logind[1853]: Session 12 logged out. Waiting for processes to exit.
Sep 9 23:46:14.406512 systemd-logind[1853]: Removed session 12.
Sep 9 23:46:14.496609 systemd[1]: Started sshd@10-10.200.20.16:22-10.200.16.10:57570.service - OpenSSH per-connection server daemon (10.200.16.10:57570).
Sep 9 23:46:14.991735 sshd[6282]: Accepted publickey for core from 10.200.16.10 port 57570 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:46:14.992784 sshd-session[6282]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:46:14.996380 systemd-logind[1853]: New session 13 of user core.
Sep 9 23:46:15.004127 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 9 23:46:15.430269 sshd[6285]: Connection closed by 10.200.16.10 port 57570
Sep 9 23:46:15.430689 sshd-session[6282]: pam_unix(sshd:session): session closed for user core
Sep 9 23:46:15.434096 systemd[1]: sshd@10-10.200.20.16:22-10.200.16.10:57570.service: Deactivated successfully.
Sep 9 23:46:15.435656 systemd[1]: session-13.scope: Deactivated successfully.
Sep 9 23:46:15.437543 systemd-logind[1853]: Session 13 logged out. Waiting for processes to exit.
Sep 9 23:46:15.439200 systemd-logind[1853]: Removed session 13.
Sep 9 23:46:15.510315 systemd[1]: Started sshd@11-10.200.20.16:22-10.200.16.10:57580.service - OpenSSH per-connection server daemon (10.200.16.10:57580).
Sep 9 23:46:15.966592 sshd[6295]: Accepted publickey for core from 10.200.16.10 port 57580 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:46:15.967660 sshd-session[6295]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:46:15.971793 systemd-logind[1853]: New session 14 of user core.
Sep 9 23:46:15.984126 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 9 23:46:16.349614 sshd[6298]: Connection closed by 10.200.16.10 port 57580
Sep 9 23:46:16.349926 sshd-session[6295]: pam_unix(sshd:session): session closed for user core
Sep 9 23:46:16.353199 systemd[1]: sshd@11-10.200.20.16:22-10.200.16.10:57580.service: Deactivated successfully.
Sep 9 23:46:16.354892 systemd[1]: session-14.scope: Deactivated successfully.
Sep 9 23:46:16.355827 systemd-logind[1853]: Session 14 logged out. Waiting for processes to exit.
Sep 9 23:46:16.357430 systemd-logind[1853]: Removed session 14.
Sep 9 23:46:21.419657 systemd[1]: Started sshd@12-10.200.20.16:22-10.200.16.10:58334.service - OpenSSH per-connection server daemon (10.200.16.10:58334).
Sep 9 23:46:21.838904 sshd[6315]: Accepted publickey for core from 10.200.16.10 port 58334 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:46:21.840198 sshd-session[6315]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:46:21.843627 systemd-logind[1853]: New session 15 of user core.
Sep 9 23:46:21.849127 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 9 23:46:22.198914 sshd[6318]: Connection closed by 10.200.16.10 port 58334
Sep 9 23:46:22.199507 sshd-session[6315]: pam_unix(sshd:session): session closed for user core
Sep 9 23:46:22.202916 systemd-logind[1853]: Session 15 logged out. Waiting for processes to exit.
Sep 9 23:46:22.202983 systemd[1]: sshd@12-10.200.20.16:22-10.200.16.10:58334.service: Deactivated successfully.
Sep 9 23:46:22.206797 systemd[1]: session-15.scope: Deactivated successfully.
Sep 9 23:46:22.209361 systemd-logind[1853]: Removed session 15.
Sep 9 23:46:27.295319 systemd[1]: Started sshd@13-10.200.20.16:22-10.200.16.10:58346.service - OpenSSH per-connection server daemon (10.200.16.10:58346).
Sep 9 23:46:27.793496 sshd[6331]: Accepted publickey for core from 10.200.16.10 port 58346 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:46:27.794979 sshd-session[6331]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:46:27.798825 systemd-logind[1853]: New session 16 of user core.
Sep 9 23:46:27.803122 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 9 23:46:28.187183 sshd[6334]: Connection closed by 10.200.16.10 port 58346
Sep 9 23:46:28.188837 sshd-session[6331]: pam_unix(sshd:session): session closed for user core
Sep 9 23:46:28.191836 systemd-logind[1853]: Session 16 logged out. Waiting for processes to exit.
Sep 9 23:46:28.192193 systemd[1]: sshd@13-10.200.20.16:22-10.200.16.10:58346.service: Deactivated successfully.
Sep 9 23:46:28.195665 systemd[1]: session-16.scope: Deactivated successfully.
Sep 9 23:46:28.197160 systemd-logind[1853]: Removed session 16.
Sep 9 23:46:28.487512 containerd[1873]: time="2025-09-09T23:46:28.487474913Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfc6823c76e936eaed3d4de7892478f709db8128192617fcd85283a53955e9b6\" id:\"a1627740925ef1e906eb22b124d03cc0129197828dea8f4aab8e85b943ee43e5\" pid:6357 exited_at:{seconds:1757461588 nanos:486738105}"
Sep 9 23:46:29.705359 containerd[1873]: time="2025-09-09T23:46:29.705301217Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77\" id:\"0cf2cecf50d5f604f1b5d296d97ee42adda5a148872f248e9edff8f2c9aaefee\" pid:6383 exited_at:{seconds:1757461589 nanos:705104994}"
Sep 9 23:46:33.277372 systemd[1]: Started sshd@14-10.200.20.16:22-10.200.16.10:57586.service - OpenSSH per-connection server daemon (10.200.16.10:57586).
Sep 9 23:46:33.776774 sshd[6393]: Accepted publickey for core from 10.200.16.10 port 57586 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:46:33.778465 sshd-session[6393]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:46:33.784186 systemd-logind[1853]: New session 17 of user core.
Sep 9 23:46:33.789118 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 9 23:46:34.174825 sshd[6396]: Connection closed by 10.200.16.10 port 57586
Sep 9 23:46:34.174717 sshd-session[6393]: pam_unix(sshd:session): session closed for user core
Sep 9 23:46:34.177709 systemd[1]: session-17.scope: Deactivated successfully.
Sep 9 23:46:34.178489 systemd[1]: sshd@14-10.200.20.16:22-10.200.16.10:57586.service: Deactivated successfully.
Sep 9 23:46:34.181639 systemd-logind[1853]: Session 17 logged out. Waiting for processes to exit.
Sep 9 23:46:34.183292 systemd-logind[1853]: Removed session 17.
Sep 9 23:46:34.263208 systemd[1]: Started sshd@15-10.200.20.16:22-10.200.16.10:57596.service - OpenSSH per-connection server daemon (10.200.16.10:57596).
Sep 9 23:46:34.770492 sshd[6408]: Accepted publickey for core from 10.200.16.10 port 57596 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:46:34.772257 sshd-session[6408]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:46:34.777231 systemd-logind[1853]: New session 18 of user core.
Sep 9 23:46:34.781127 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 9 23:46:35.317245 sshd[6411]: Connection closed by 10.200.16.10 port 57596
Sep 9 23:46:35.317806 sshd-session[6408]: pam_unix(sshd:session): session closed for user core
Sep 9 23:46:35.321394 systemd[1]: sshd@15-10.200.20.16:22-10.200.16.10:57596.service: Deactivated successfully.
Sep 9 23:46:35.322752 systemd[1]: session-18.scope: Deactivated successfully.
Sep 9 23:46:35.323881 systemd-logind[1853]: Session 18 logged out. Waiting for processes to exit.
Sep 9 23:46:35.324866 systemd-logind[1853]: Removed session 18.
Sep 9 23:46:35.405589 systemd[1]: Started sshd@16-10.200.20.16:22-10.200.16.10:57612.service - OpenSSH per-connection server daemon (10.200.16.10:57612).
Sep 9 23:46:35.907098 sshd[6421]: Accepted publickey for core from 10.200.16.10 port 57612 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:46:35.908308 sshd-session[6421]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:46:35.915188 systemd-logind[1853]: New session 19 of user core.
Sep 9 23:46:35.921133 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 9 23:46:36.721729 sshd[6424]: Connection closed by 10.200.16.10 port 57612
Sep 9 23:46:36.722212 sshd-session[6421]: pam_unix(sshd:session): session closed for user core
Sep 9 23:46:36.725960 systemd[1]: sshd@16-10.200.20.16:22-10.200.16.10:57612.service: Deactivated successfully.
Sep 9 23:46:36.727734 systemd[1]: session-19.scope: Deactivated successfully.
Sep 9 23:46:36.729604 systemd-logind[1853]: Session 19 logged out. Waiting for processes to exit.
Sep 9 23:46:36.730949 systemd-logind[1853]: Removed session 19.
Sep 9 23:46:36.805221 systemd[1]: Started sshd@17-10.200.20.16:22-10.200.16.10:57622.service - OpenSSH per-connection server daemon (10.200.16.10:57622).
Sep 9 23:46:37.269750 sshd[6445]: Accepted publickey for core from 10.200.16.10 port 57622 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:46:37.271160 sshd-session[6445]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:46:37.275019 systemd-logind[1853]: New session 20 of user core.
Sep 9 23:46:37.278119 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 9 23:46:37.751205 sshd[6448]: Connection closed by 10.200.16.10 port 57622
Sep 9 23:46:37.751770 sshd-session[6445]: pam_unix(sshd:session): session closed for user core
Sep 9 23:46:37.755672 systemd-logind[1853]: Session 20 logged out. Waiting for processes to exit.
Sep 9 23:46:37.755840 systemd[1]: sshd@17-10.200.20.16:22-10.200.16.10:57622.service: Deactivated successfully.
Sep 9 23:46:37.757360 systemd[1]: session-20.scope: Deactivated successfully.
Sep 9 23:46:37.758812 systemd-logind[1853]: Removed session 20.
Sep 9 23:46:37.832565 systemd[1]: Started sshd@18-10.200.20.16:22-10.200.16.10:57634.service - OpenSSH per-connection server daemon (10.200.16.10:57634).
Sep 9 23:46:38.283645 sshd[6458]: Accepted publickey for core from 10.200.16.10 port 57634 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:46:38.284716 sshd-session[6458]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:46:38.288366 systemd-logind[1853]: New session 21 of user core.
Sep 9 23:46:38.294121 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 9 23:46:38.669936 sshd[6461]: Connection closed by 10.200.16.10 port 57634
Sep 9 23:46:38.671639 sshd-session[6458]: pam_unix(sshd:session): session closed for user core
Sep 9 23:46:38.674301 systemd[1]: sshd@18-10.200.20.16:22-10.200.16.10:57634.service: Deactivated successfully.
Sep 9 23:46:38.676291 systemd[1]: session-21.scope: Deactivated successfully.
Sep 9 23:46:38.677600 systemd-logind[1853]: Session 21 logged out. Waiting for processes to exit.
Sep 9 23:46:38.678840 systemd-logind[1853]: Removed session 21.
Sep 9 23:46:39.752497 containerd[1873]: time="2025-09-09T23:46:39.752355856Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8\" id:\"0d48f6e3f2ba697199518dfded7a6ad8dd3fac9a9e6b372c992edd0632509a22\" pid:6484 exited_at:{seconds:1757461599 nanos:752022557}"
Sep 9 23:46:43.757561 systemd[1]: Started sshd@19-10.200.20.16:22-10.200.16.10:41418.service - OpenSSH per-connection server daemon (10.200.16.10:41418).
Sep 9 23:46:44.214583 sshd[6497]: Accepted publickey for core from 10.200.16.10 port 41418 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:46:44.215558 sshd-session[6497]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:46:44.219145 systemd-logind[1853]: New session 22 of user core.
Sep 9 23:46:44.225116 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 9 23:46:44.600125 sshd[6500]: Connection closed by 10.200.16.10 port 41418
Sep 9 23:46:44.602179 sshd-session[6497]: pam_unix(sshd:session): session closed for user core
Sep 9 23:46:44.605706 systemd[1]: sshd@19-10.200.20.16:22-10.200.16.10:41418.service: Deactivated successfully.
Sep 9 23:46:44.607723 systemd[1]: session-22.scope: Deactivated successfully.
Sep 9 23:46:44.609054 systemd-logind[1853]: Session 22 logged out. Waiting for processes to exit.
Sep 9 23:46:44.610631 systemd-logind[1853]: Removed session 22.
Sep 9 23:46:49.696287 systemd[1]: Started sshd@20-10.200.20.16:22-10.200.16.10:41426.service - OpenSSH per-connection server daemon (10.200.16.10:41426).
Sep 9 23:46:50.142531 sshd[6514]: Accepted publickey for core from 10.200.16.10 port 41426 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:46:50.143546 sshd-session[6514]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:46:50.146942 systemd-logind[1853]: New session 23 of user core.
Sep 9 23:46:50.151209 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 9 23:46:50.525772 sshd[6517]: Connection closed by 10.200.16.10 port 41426
Sep 9 23:46:50.526473 sshd-session[6514]: pam_unix(sshd:session): session closed for user core
Sep 9 23:46:50.529542 systemd[1]: sshd@20-10.200.20.16:22-10.200.16.10:41426.service: Deactivated successfully.
Sep 9 23:46:50.531283 systemd[1]: session-23.scope: Deactivated successfully.
Sep 9 23:46:50.533964 systemd-logind[1853]: Session 23 logged out. Waiting for processes to exit.
Sep 9 23:46:50.535626 systemd-logind[1853]: Removed session 23.
Sep 9 23:46:53.602501 containerd[1873]: time="2025-09-09T23:46:53.602398788Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8\" id:\"a8a2c4e0e93aad11b5f9ba98e1c115d44e793649301fb140855b6d9f6af718f0\" pid:6543 exited_at:{seconds:1757461613 nanos:601982319}"
Sep 9 23:46:55.613323 systemd[1]: Started sshd@21-10.200.20.16:22-10.200.16.10:43118.service - OpenSSH per-connection server daemon (10.200.16.10:43118).
Sep 9 23:46:56.078199 sshd[6553]: Accepted publickey for core from 10.200.16.10 port 43118 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:46:56.079942 sshd-session[6553]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:46:56.083441 systemd-logind[1853]: New session 24 of user core.
Sep 9 23:46:56.091113 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 9 23:46:56.457817 sshd[6556]: Connection closed by 10.200.16.10 port 43118
Sep 9 23:46:56.458444 sshd-session[6553]: pam_unix(sshd:session): session closed for user core
Sep 9 23:46:56.461878 systemd-logind[1853]: Session 24 logged out. Waiting for processes to exit.
Sep 9 23:46:56.462500 systemd[1]: sshd@21-10.200.20.16:22-10.200.16.10:43118.service: Deactivated successfully.
Sep 9 23:46:56.466056 systemd[1]: session-24.scope: Deactivated successfully.
Sep 9 23:46:56.467433 systemd-logind[1853]: Removed session 24.
Sep 9 23:46:58.463935 containerd[1873]: time="2025-09-09T23:46:58.463844543Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfc6823c76e936eaed3d4de7892478f709db8128192617fcd85283a53955e9b6\" id:\"83a4f73b798b12c37866101639d62080846c1d85bf308701e8482581675ec100\" pid:6578 exited_at:{seconds:1757461618 nanos:463591535}"
Sep 9 23:46:59.696507 containerd[1873]: time="2025-09-09T23:46:59.696462763Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77\" id:\"c04b863e495ce77245961dab2e599ddc10dbbfb619cda77a85c13a5e068508c8\" pid:6607 exited_at:{seconds:1757461619 nanos:696193466}"
Sep 9 23:47:01.541943 systemd[1]: Started sshd@22-10.200.20.16:22-10.200.16.10:60900.service - OpenSSH per-connection server daemon (10.200.16.10:60900).
Sep 9 23:47:01.994944 sshd[6617]: Accepted publickey for core from 10.200.16.10 port 60900 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs
Sep 9 23:47:01.996067 sshd-session[6617]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:47:01.999941 systemd-logind[1853]: New session 25 of user core.
Sep 9 23:47:02.007119 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 9 23:47:02.188861 containerd[1873]: time="2025-09-09T23:47:02.188811608Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4e3231a6b775372a359a915ce29989662556e8b6073d41a74861618310a7df77\" id:\"ddcf2ac49f581435f025ed7ca331ada4e9b31af60de2bfcbb620f159474e6647\" pid:6632 exited_at:{seconds:1757461622 nanos:188606690}" Sep 9 23:47:02.373187 sshd[6620]: Connection closed by 10.200.16.10 port 60900 Sep 9 23:47:02.373722 sshd-session[6617]: pam_unix(sshd:session): session closed for user core Sep 9 23:47:02.377216 systemd[1]: sshd@22-10.200.20.16:22-10.200.16.10:60900.service: Deactivated successfully. Sep 9 23:47:02.378801 systemd[1]: session-25.scope: Deactivated successfully. Sep 9 23:47:02.379474 systemd-logind[1853]: Session 25 logged out. Waiting for processes to exit. Sep 9 23:47:02.380997 systemd-logind[1853]: Removed session 25. Sep 9 23:47:07.461698 systemd[1]: Started sshd@23-10.200.20.16:22-10.200.16.10:60914.service - OpenSSH per-connection server daemon (10.200.16.10:60914). Sep 9 23:47:07.911762 sshd[6652]: Accepted publickey for core from 10.200.16.10 port 60914 ssh2: RSA SHA256:KyX5lBKi2eDd1vr6ifAfO0y3trFgfVvc0oH4+isjbRs Sep 9 23:47:07.912941 sshd-session[6652]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:47:07.916796 systemd-logind[1853]: New session 26 of user core. Sep 9 23:47:07.923122 systemd[1]: Started session-26.scope - Session 26 of User core. Sep 9 23:47:08.293084 sshd[6655]: Connection closed by 10.200.16.10 port 60914 Sep 9 23:47:08.293644 sshd-session[6652]: pam_unix(sshd:session): session closed for user core Sep 9 23:47:08.296614 systemd[1]: sshd@23-10.200.20.16:22-10.200.16.10:60914.service: Deactivated successfully. Sep 9 23:47:08.298354 systemd[1]: session-26.scope: Deactivated successfully. Sep 9 23:47:08.299598 systemd-logind[1853]: Session 26 logged out. Waiting for processes to exit. Sep 9 23:47:08.300355 systemd-logind[1853]: Removed session 26. 
Sep 9 23:47:09.761776 containerd[1873]: time="2025-09-09T23:47:09.761628997Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6860e8450f60493442600e68c0b33dac9062e0fedfe0da2fb462842b61e86dc8\" id:\"7818c29ed2cfb514b720c4d214dbed639b513ece68c406d9bee29a907c8f2e7e\" pid:6680 exited_at:{seconds:1757461629 nanos:761358437}"