Sep 9 04:54:49.203985 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
Sep 9 04:54:49.204003 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Tue Sep 9 03:38:34 -00 2025
Sep 9 04:54:49.204010 kernel: KASLR enabled
Sep 9 04:54:49.204014 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Sep 9 04:54:49.204018 kernel: printk: legacy bootconsole [pl11] enabled
Sep 9 04:54:49.204022 kernel: efi: EFI v2.7 by EDK II
Sep 9 04:54:49.204027 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3eac7018 RNG=0x3fd5f998 MEMRESERVE=0x3e477598
Sep 9 04:54:49.204031 kernel: random: crng init done
Sep 9 04:54:49.204035 kernel: secureboot: Secure boot disabled
Sep 9 04:54:49.204039 kernel: ACPI: Early table checksum verification disabled
Sep 9 04:54:49.204043 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL)
Sep 9 04:54:49.204047 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:49.204051 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:49.204056 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Sep 9 04:54:49.204061 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:49.204065 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:49.204069 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:49.204074 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:49.204079 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:49.204083 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:49.204087 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Sep 9 04:54:49.204092 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:49.204096 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Sep 9 04:54:49.204100 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 9 04:54:49.204104 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Sep 9 04:54:49.204109 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
Sep 9 04:54:49.204113 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
Sep 9 04:54:49.204117 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Sep 9 04:54:49.204121 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Sep 9 04:54:49.204126 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Sep 9 04:54:49.204131 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Sep 9 04:54:49.204135 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Sep 9 04:54:49.204139 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Sep 9 04:54:49.204143 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Sep 9 04:54:49.204148 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Sep 9 04:54:49.204152 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Sep 9 04:54:49.204156 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
Sep 9 04:54:49.204160 kernel: NODE_DATA(0) allocated [mem 0x1bf7fda00-0x1bf804fff]
Sep 9 04:54:49.204164 kernel: Zone ranges:
Sep 9 04:54:49.204169 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Sep 9 04:54:49.204176 kernel: DMA32 empty
Sep 9 04:54:49.204180 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Sep 9 04:54:49.204185 kernel: Device empty
Sep 9 04:54:49.204189 kernel: Movable zone start for each node
Sep 9 04:54:49.204193 kernel: Early memory node ranges
Sep 9 04:54:49.204199 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Sep 9 04:54:49.204203 kernel: node 0: [mem 0x0000000000824000-0x000000003e45ffff]
Sep 9 04:54:49.204207 kernel: node 0: [mem 0x000000003e460000-0x000000003e46ffff]
Sep 9 04:54:49.204212 kernel: node 0: [mem 0x000000003e470000-0x000000003e54ffff]
Sep 9 04:54:49.204216 kernel: node 0: [mem 0x000000003e550000-0x000000003e87ffff]
Sep 9 04:54:49.204221 kernel: node 0: [mem 0x000000003e880000-0x000000003fc7ffff]
Sep 9 04:54:49.204225 kernel: node 0: [mem 0x000000003fc80000-0x000000003fcfffff]
Sep 9 04:54:49.204229 kernel: node 0: [mem 0x000000003fd00000-0x000000003fffffff]
Sep 9 04:54:49.204234 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Sep 9 04:54:49.204238 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Sep 9 04:54:49.204242 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Sep 9 04:54:49.204247 kernel: cma: Reserved 16 MiB at 0x000000003ec00000 on node -1
Sep 9 04:54:49.204252 kernel: psci: probing for conduit method from ACPI.
Sep 9 04:54:49.204256 kernel: psci: PSCIv1.1 detected in firmware.
Sep 9 04:54:49.204261 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 9 04:54:49.204265 kernel: psci: MIGRATE_INFO_TYPE not supported.
Sep 9 04:54:49.204270 kernel: psci: SMC Calling Convention v1.4
Sep 9 04:54:49.204274 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Sep 9 04:54:49.204278 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Sep 9 04:54:49.204283 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 9 04:54:49.204287 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 9 04:54:49.204291 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 9 04:54:49.204296 kernel: Detected PIPT I-cache on CPU0
Sep 9 04:54:49.204301 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
Sep 9 04:54:49.204306 kernel: CPU features: detected: GIC system register CPU interface
Sep 9 04:54:49.204310 kernel: CPU features: detected: Spectre-v4
Sep 9 04:54:49.204314 kernel: CPU features: detected: Spectre-BHB
Sep 9 04:54:49.204319 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 9 04:54:49.204323 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 9 04:54:49.204328 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
Sep 9 04:54:49.204332 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 9 04:54:49.204336 kernel: alternatives: applying boot alternatives
Sep 9 04:54:49.204342 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=1e9320fd787e27d01e3b8a1acb67e0c640346112c469b7a652e9dcfc9271bf90
Sep 9 04:54:49.204346 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 04:54:49.204352 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 9 04:54:49.204386 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 9 04:54:49.204391 kernel: Fallback order for Node 0: 0
Sep 9 04:54:49.204396 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
Sep 9 04:54:49.204400 kernel: Policy zone: Normal
Sep 9 04:54:49.204404 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 04:54:49.204409 kernel: software IO TLB: area num 2.
Sep 9 04:54:49.204413 kernel: software IO TLB: mapped [mem 0x000000003a460000-0x000000003e460000] (64MB)
Sep 9 04:54:49.204417 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 9 04:54:49.204422 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 04:54:49.204427 kernel: rcu: RCU event tracing is enabled.
Sep 9 04:54:49.204433 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 9 04:54:49.204437 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 04:54:49.204441 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 04:54:49.204446 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 04:54:49.204450 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 9 04:54:49.204455 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 04:54:49.204459 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 04:54:49.204464 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 9 04:54:49.204468 kernel: GICv3: 960 SPIs implemented
Sep 9 04:54:49.204472 kernel: GICv3: 0 Extended SPIs implemented
Sep 9 04:54:49.204477 kernel: Root IRQ handler: gic_handle_irq
Sep 9 04:54:49.204481 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Sep 9 04:54:49.204486 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
Sep 9 04:54:49.204491 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Sep 9 04:54:49.204495 kernel: ITS: No ITS available, not enabling LPIs
Sep 9 04:54:49.204500 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 9 04:54:49.204504 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
Sep 9 04:54:49.204508 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 9 04:54:49.204513 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
Sep 9 04:54:49.204518 kernel: Console: colour dummy device 80x25
Sep 9 04:54:49.204522 kernel: printk: legacy console [tty1] enabled
Sep 9 04:54:49.204527 kernel: ACPI: Core revision 20240827
Sep 9 04:54:49.204532 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
Sep 9 04:54:49.204537 kernel: pid_max: default: 32768 minimum: 301
Sep 9 04:54:49.204542 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 9 04:54:49.204546 kernel: landlock: Up and running.
Sep 9 04:54:49.204551 kernel: SELinux: Initializing.
Sep 9 04:54:49.204555 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 04:54:49.204564 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 04:54:49.204569 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0x1a0000e, misc 0x31e1
Sep 9 04:54:49.204574 kernel: Hyper-V: Host Build 10.0.26100.1261-1-0
Sep 9 04:54:49.204579 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Sep 9 04:54:49.204583 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 04:54:49.204588 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 04:54:49.204594 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 9 04:54:49.204599 kernel: Remapping and enabling EFI services.
Sep 9 04:54:49.204603 kernel: smp: Bringing up secondary CPUs ...
Sep 9 04:54:49.204608 kernel: Detected PIPT I-cache on CPU1
Sep 9 04:54:49.204613 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Sep 9 04:54:49.204619 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
Sep 9 04:54:49.204623 kernel: smp: Brought up 1 node, 2 CPUs
Sep 9 04:54:49.204628 kernel: SMP: Total of 2 processors activated.
Sep 9 04:54:49.204633 kernel: CPU: All CPU(s) started at EL1
Sep 9 04:54:49.204638 kernel: CPU features: detected: 32-bit EL0 Support
Sep 9 04:54:49.204642 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Sep 9 04:54:49.204647 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 9 04:54:49.204652 kernel: CPU features: detected: Common not Private translations
Sep 9 04:54:49.204657 kernel: CPU features: detected: CRC32 instructions
Sep 9 04:54:49.204662 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
Sep 9 04:54:49.204667 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 9 04:54:49.204672 kernel: CPU features: detected: LSE atomic instructions
Sep 9 04:54:49.204677 kernel: CPU features: detected: Privileged Access Never
Sep 9 04:54:49.204681 kernel: CPU features: detected: Speculation barrier (SB)
Sep 9 04:54:49.204686 kernel: CPU features: detected: TLB range maintenance instructions
Sep 9 04:54:49.204691 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 9 04:54:49.204696 kernel: CPU features: detected: Scalable Vector Extension
Sep 9 04:54:49.204701 kernel: alternatives: applying system-wide alternatives
Sep 9 04:54:49.204706 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Sep 9 04:54:49.204711 kernel: SVE: maximum available vector length 16 bytes per vector
Sep 9 04:54:49.204716 kernel: SVE: default vector length 16 bytes per vector
Sep 9 04:54:49.204721 kernel: Memory: 3959604K/4194160K available (11136K kernel code, 2436K rwdata, 9060K rodata, 38976K init, 1038K bss, 213368K reserved, 16384K cma-reserved)
Sep 9 04:54:49.204726 kernel: devtmpfs: initialized
Sep 9 04:54:49.204731 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 04:54:49.204736 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 9 04:54:49.204740 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 9 04:54:49.204745 kernel: 0 pages in range for non-PLT usage
Sep 9 04:54:49.204751 kernel: 508560 pages in range for PLT usage
Sep 9 04:54:49.204756 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 04:54:49.204761 kernel: SMBIOS 3.1.0 present.
Sep 9 04:54:49.204765 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024
Sep 9 04:54:49.204770 kernel: DMI: Memory slots populated: 2/2
Sep 9 04:54:49.204775 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 04:54:49.204779 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 9 04:54:49.204784 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 9 04:54:49.204789 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 9 04:54:49.204795 kernel: audit: initializing netlink subsys (disabled)
Sep 9 04:54:49.204800 kernel: audit: type=2000 audit(0.059:1): state=initialized audit_enabled=0 res=1
Sep 9 04:54:49.204804 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 04:54:49.204809 kernel: cpuidle: using governor menu
Sep 9 04:54:49.204814 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 9 04:54:49.204818 kernel: ASID allocator initialised with 32768 entries
Sep 9 04:54:49.204823 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 04:54:49.204828 kernel: Serial: AMBA PL011 UART driver
Sep 9 04:54:49.204833 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 04:54:49.204838 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 04:54:49.204843 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 9 04:54:49.204848 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 9 04:54:49.204852 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 04:54:49.204857 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 04:54:49.204862 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 9 04:54:49.204867 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 9 04:54:49.204871 kernel: ACPI: Added _OSI(Module Device)
Sep 9 04:54:49.204876 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 04:54:49.204888 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 04:54:49.204893 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 9 04:54:49.204900 kernel: ACPI: Interpreter enabled
Sep 9 04:54:49.204905 kernel: ACPI: Using GIC for interrupt routing
Sep 9 04:54:49.204910 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Sep 9 04:54:49.204915 kernel: printk: legacy console [ttyAMA0] enabled
Sep 9 04:54:49.204920 kernel: printk: legacy bootconsole [pl11] disabled
Sep 9 04:54:49.204924 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Sep 9 04:54:49.204929 kernel: ACPI: CPU0 has been hot-added
Sep 9 04:54:49.204935 kernel: ACPI: CPU1 has been hot-added
Sep 9 04:54:49.204940 kernel: iommu: Default domain type: Translated
Sep 9 04:54:49.204945 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 9 04:54:49.204949 kernel: efivars: Registered efivars operations
Sep 9 04:54:49.204954 kernel: vgaarb: loaded
Sep 9 04:54:49.204959 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 9 04:54:49.204963 kernel: VFS: Disk quotas dquot_6.6.0
Sep 9 04:54:49.204968 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 9 04:54:49.204973 kernel: pnp: PnP ACPI init
Sep 9 04:54:49.204978 kernel: pnp: PnP ACPI: found 0 devices
Sep 9 04:54:49.204983 kernel: NET: Registered PF_INET protocol family
Sep 9 04:54:49.204988 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 9 04:54:49.204992 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 9 04:54:49.204997 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 9 04:54:49.205002 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 9 04:54:49.205007 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 9 04:54:49.205011 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 9 04:54:49.205016 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 04:54:49.205022 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 04:54:49.205027 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 9 04:54:49.205031 kernel: PCI: CLS 0 bytes, default 64
Sep 9 04:54:49.205036 kernel: kvm [1]: HYP mode not available
Sep 9 04:54:49.205041 kernel: Initialise system trusted keyrings
Sep 9 04:54:49.205045 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 9 04:54:49.205050 kernel: Key type asymmetric registered
Sep 9 04:54:49.205054 kernel: Asymmetric key parser 'x509' registered
Sep 9 04:54:49.205059 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Sep 9 04:54:49.205065 kernel: io scheduler mq-deadline registered
Sep 9 04:54:49.205070 kernel: io scheduler kyber registered
Sep 9 04:54:49.205074 kernel: io scheduler bfq registered
Sep 9 04:54:49.205079 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 9 04:54:49.205084 kernel: thunder_xcv, ver 1.0
Sep 9 04:54:49.205089 kernel: thunder_bgx, ver 1.0
Sep 9 04:54:49.205093 kernel: nicpf, ver 1.0
Sep 9 04:54:49.205098 kernel: nicvf, ver 1.0
Sep 9 04:54:49.205216 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 9 04:54:49.205268 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-09T04:54:48 UTC (1757393688)
Sep 9 04:54:49.205274 kernel: efifb: probing for efifb
Sep 9 04:54:49.205279 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Sep 9 04:54:49.205284 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Sep 9 04:54:49.205288 kernel: efifb: scrolling: redraw
Sep 9 04:54:49.205293 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Sep 9 04:54:49.205298 kernel: Console: switching to colour frame buffer device 128x48
Sep 9 04:54:49.205302 kernel: fb0: EFI VGA frame buffer device
Sep 9 04:54:49.205308 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Sep 9 04:54:49.205313 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 9 04:54:49.205318 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Sep 9 04:54:49.205322 kernel: NET: Registered PF_INET6 protocol family
Sep 9 04:54:49.205327 kernel: watchdog: NMI not fully supported
Sep 9 04:54:49.205332 kernel: watchdog: Hard watchdog permanently disabled
Sep 9 04:54:49.205337 kernel: Segment Routing with IPv6
Sep 9 04:54:49.205341 kernel: In-situ OAM (IOAM) with IPv6
Sep 9 04:54:49.205346 kernel: NET: Registered PF_PACKET protocol family
Sep 9 04:54:49.205352 kernel: Key type dns_resolver registered
Sep 9 04:54:49.205366 kernel: registered taskstats version 1
Sep 9 04:54:49.205371 kernel: Loading compiled-in X.509 certificates
Sep 9 04:54:49.205376 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 44d1e8b5c5ffbaa3cedd99c03d41580671fabec5'
Sep 9 04:54:49.205381 kernel: Demotion targets for Node 0: null
Sep 9 04:54:49.205385 kernel: Key type .fscrypt registered
Sep 9 04:54:49.205390 kernel: Key type fscrypt-provisioning registered
Sep 9 04:54:49.205395 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 9 04:54:49.205400 kernel: ima: Allocated hash algorithm: sha1
Sep 9 04:54:49.205406 kernel: ima: No architecture policies found
Sep 9 04:54:49.205411 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 9 04:54:49.205415 kernel: clk: Disabling unused clocks
Sep 9 04:54:49.205420 kernel: PM: genpd: Disabling unused power domains
Sep 9 04:54:49.205425 kernel: Warning: unable to open an initial console.
Sep 9 04:54:49.205430 kernel: Freeing unused kernel memory: 38976K
Sep 9 04:54:49.205434 kernel: Run /init as init process
Sep 9 04:54:49.205439 kernel: with arguments:
Sep 9 04:54:49.205444 kernel: /init
Sep 9 04:54:49.205449 kernel: with environment:
Sep 9 04:54:49.205454 kernel: HOME=/
Sep 9 04:54:49.205459 kernel: TERM=linux
Sep 9 04:54:49.205463 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 9 04:54:49.205469 systemd[1]: Successfully made /usr/ read-only.
Sep 9 04:54:49.205476 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 04:54:49.205481 systemd[1]: Detected virtualization microsoft.
Sep 9 04:54:49.205487 systemd[1]: Detected architecture arm64.
Sep 9 04:54:49.205493 systemd[1]: Running in initrd.
Sep 9 04:54:49.205498 systemd[1]: No hostname configured, using default hostname.
Sep 9 04:54:49.205503 systemd[1]: Hostname set to .
Sep 9 04:54:49.205508 systemd[1]: Initializing machine ID from random generator.
Sep 9 04:54:49.205513 systemd[1]: Queued start job for default target initrd.target.
Sep 9 04:54:49.205518 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 04:54:49.205523 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:54:49.205529 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 9 04:54:49.205535 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 04:54:49.205541 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 9 04:54:49.205547 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 9 04:54:49.205552 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 9 04:54:49.205558 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 9 04:54:49.205563 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 04:54:49.205569 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 04:54:49.205574 systemd[1]: Reached target paths.target - Path Units.
Sep 9 04:54:49.205579 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 04:54:49.205585 systemd[1]: Reached target swap.target - Swaps.
Sep 9 04:54:49.205590 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 04:54:49.205595 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 04:54:49.205600 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 04:54:49.205605 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 9 04:54:49.205610 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 9 04:54:49.205616 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:54:49.205621 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:54:49.205627 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:54:49.205632 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 04:54:49.205637 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 9 04:54:49.205642 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 04:54:49.205647 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 9 04:54:49.205653 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 9 04:54:49.205659 systemd[1]: Starting systemd-fsck-usr.service...
Sep 9 04:54:49.205664 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 04:54:49.205669 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 04:54:49.205686 systemd-journald[224]: Collecting audit messages is disabled.
Sep 9 04:54:49.205700 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:54:49.205706 systemd-journald[224]: Journal started
Sep 9 04:54:49.205721 systemd-journald[224]: Runtime Journal (/run/log/journal/5df7788799dc4db2bda8dbf0901f301c) is 8M, max 78.5M, 70.5M free.
Sep 9 04:54:49.197335 systemd-modules-load[226]: Inserted module 'overlay'
Sep 9 04:54:49.216430 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 04:54:49.216446 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 9 04:54:49.231199 kernel: Bridge firewalling registered
Sep 9 04:54:49.234578 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 9 04:54:49.236447 systemd-modules-load[226]: Inserted module 'br_netfilter'
Sep 9 04:54:49.246661 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:54:49.258231 systemd[1]: Finished systemd-fsck-usr.service.
Sep 9 04:54:49.268719 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:54:49.279266 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:54:49.292645 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 9 04:54:49.327861 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 04:54:49.334200 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 04:54:49.358128 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 04:54:49.381557 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 04:54:49.388534 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:54:49.401608 systemd-tmpfiles[251]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 9 04:54:49.406063 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 04:54:49.420477 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:54:49.435684 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 9 04:54:49.462479 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 04:54:49.474978 dracut-cmdline[260]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=1e9320fd787e27d01e3b8a1acb67e0c640346112c469b7a652e9dcfc9271bf90
Sep 9 04:54:49.509102 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 04:54:49.527878 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:54:49.554516 systemd-resolved[261]: Positive Trust Anchors:
Sep 9 04:54:49.554527 systemd-resolved[261]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 04:54:49.554547 systemd-resolved[261]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 04:54:49.556306 systemd-resolved[261]: Defaulting to hostname 'linux'.
Sep 9 04:54:49.557020 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 04:54:49.565047 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 04:54:49.655368 kernel: SCSI subsystem initialized
Sep 9 04:54:49.660374 kernel: Loading iSCSI transport class v2.0-870.
Sep 9 04:54:49.668393 kernel: iscsi: registered transport (tcp)
Sep 9 04:54:49.681627 kernel: iscsi: registered transport (qla4xxx)
Sep 9 04:54:49.681661 kernel: QLogic iSCSI HBA Driver
Sep 9 04:54:49.694599 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 04:54:49.717668 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:54:49.724853 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 04:54:49.777724 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 9 04:54:49.785071 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 9 04:54:49.850383 kernel: raid6: neonx8 gen() 18541 MB/s
Sep 9 04:54:49.869367 kernel: raid6: neonx4 gen() 18550 MB/s
Sep 9 04:54:49.888364 kernel: raid6: neonx2 gen() 17102 MB/s
Sep 9 04:54:49.909365 kernel: raid6: neonx1 gen() 15038 MB/s
Sep 9 04:54:49.928365 kernel: raid6: int64x8 gen() 10514 MB/s
Sep 9 04:54:49.948384 kernel: raid6: int64x4 gen() 10611 MB/s
Sep 9 04:54:49.969365 kernel: raid6: int64x2 gen() 8980 MB/s
Sep 9 04:54:49.992533 kernel: raid6: int64x1 gen() 7004 MB/s
Sep 9 04:54:49.992546 kernel: raid6: using algorithm neonx4 gen() 18550 MB/s
Sep 9 04:54:50.016399 kernel: raid6: .... xor() 15145 MB/s, rmw enabled
Sep 9 04:54:50.016475 kernel: raid6: using neon recovery algorithm
Sep 9 04:54:50.026867 kernel: xor: measuring software checksum speed
Sep 9 04:54:50.026925 kernel: 8regs : 28657 MB/sec
Sep 9 04:54:50.030055 kernel: 32regs : 28822 MB/sec
Sep 9 04:54:50.033550 kernel: arm64_neon : 37840 MB/sec
Sep 9 04:54:50.037428 kernel: xor: using function: arm64_neon (37840 MB/sec)
Sep 9 04:54:50.076384 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 9 04:54:50.081856 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 04:54:50.093493 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:54:50.118232 systemd-udevd[473]: Using default interface naming scheme 'v255'.
Sep 9 04:54:50.122897 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:54:50.132496 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 9 04:54:50.164648 dracut-pre-trigger[482]: rd.md=0: removing MD RAID activation
Sep 9 04:54:50.187256 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 04:54:50.194474 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 04:54:50.243289 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 04:54:50.252970 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 9 04:54:50.321466 kernel: hv_vmbus: Vmbus version:5.3 Sep 9 04:54:50.352413 kernel: hv_vmbus: registering driver hyperv_keyboard Sep 9 04:54:50.352464 kernel: hv_vmbus: registering driver hid_hyperv Sep 9 04:54:50.352472 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0 Sep 9 04:54:50.352486 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1 Sep 9 04:54:50.352493 kernel: pps_core: LinuxPPS API ver. 1 registered Sep 9 04:54:50.362523 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on Sep 9 04:54:50.362699 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Sep 9 04:54:50.376566 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 04:54:50.376696 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 04:54:50.404491 kernel: hv_vmbus: registering driver hv_storvsc Sep 9 04:54:50.404512 kernel: hv_vmbus: registering driver hv_netvsc Sep 9 04:54:50.404519 kernel: scsi host1: storvsc_host_t Sep 9 04:54:50.404642 kernel: scsi host0: storvsc_host_t Sep 9 04:54:50.403982 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 04:54:50.429875 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5 Sep 9 04:54:50.429915 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 Sep 9 04:54:50.431218 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 04:54:50.440977 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
Sep 9 04:54:50.478676 kernel: PTP clock support registered Sep 9 04:54:50.478694 kernel: hv_utils: Registering HyperV Utility Driver Sep 9 04:54:50.441780 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 04:54:50.441849 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 04:54:50.799063 kernel: hv_vmbus: registering driver hv_utils Sep 9 04:54:50.799084 kernel: hv_netvsc 000d3afc-c1ec-000d-3afc-c1ec000d3afc eth0: VF slot 1 added Sep 9 04:54:50.799231 kernel: hv_utils: Heartbeat IC version 3.0 Sep 9 04:54:50.799238 kernel: hv_utils: Shutdown IC version 3.2 Sep 9 04:54:50.799245 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB) Sep 9 04:54:50.799331 kernel: hv_utils: TimeSync IC version 4.0 Sep 9 04:54:50.461809 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 04:54:50.831059 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Sep 9 04:54:50.831226 kernel: sd 0:0:0:0: [sda] Write Protect is off Sep 9 04:54:50.831297 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00 Sep 9 04:54:50.831359 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA Sep 9 04:54:50.831419 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#131 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001 Sep 9 04:54:50.805317 systemd-resolved[261]: Clock change detected. Flushing caches. Sep 9 04:54:50.849048 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#138 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001 Sep 9 04:54:50.805446 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 9 04:54:50.870782 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 9 04:54:50.870823 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Sep 9 04:54:50.886461 kernel: hv_vmbus: registering driver hv_pci Sep 9 04:54:50.886503 kernel: sr 0:0:0:2: [sr0] scsi-1 drive Sep 9 04:54:50.886657 kernel: hv_pci cfddb35a-ecfd-4307-9df8-7dc93a2bc2a9: PCI VMBus probing: Using version 0x10004 Sep 9 04:54:50.886732 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 9 04:54:50.892193 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 Sep 9 04:54:50.905075 kernel: hv_pci cfddb35a-ecfd-4307-9df8-7dc93a2bc2a9: PCI host bridge to bus ecfd:00 Sep 9 04:54:50.905213 kernel: pci_bus ecfd:00: root bus resource [mem 0xfc0000000-0xfc00fffff window] Sep 9 04:54:50.911122 kernel: pci_bus ecfd:00: No busn resource found for root bus, will use [bus 00-ff] Sep 9 04:54:50.918516 kernel: pci ecfd:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint Sep 9 04:54:50.918564 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#110 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Sep 9 04:54:50.931051 kernel: pci ecfd:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref] Sep 9 04:54:50.937981 kernel: pci ecfd:00:02.0: enabling Extended Tags Sep 9 04:54:50.969548 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#84 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Sep 9 04:54:50.969726 kernel: pci ecfd:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at ecfd:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link) Sep 9 04:54:50.983059 kernel: pci_bus ecfd:00: busn_res: [bus 00-ff] end is updated to 00 Sep 9 04:54:50.983206 kernel: pci ecfd:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned Sep 9 04:54:51.042490 kernel: mlx5_core ecfd:00:02.0: enabling device (0000 -> 0002) Sep 9 04:54:51.052807 kernel: mlx5_core ecfd:00:02.0: PTM is not supported by PCIe Sep 9 04:54:51.053010 kernel: mlx5_core ecfd:00:02.0: firmware version: 16.30.5006 Sep 9 04:54:51.225017 kernel: hv_netvsc 000d3afc-c1ec-000d-3afc-c1ec000d3afc eth0: VF registering: eth1 Sep 9 04:54:51.225231 kernel: mlx5_core ecfd:00:02.0 eth1: joined to eth0 Sep 9 04:54:51.231054 kernel: mlx5_core ecfd:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic) Sep 9 04:54:51.245277 kernel: mlx5_core ecfd:00:02.0 enP60669s1: renamed from eth1 Sep 9 04:54:51.439062 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM. Sep 9 04:54:51.489215 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Sep 9 04:54:51.513564 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT. Sep 9 04:54:51.524853 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A. Sep 9 04:54:51.533554 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A. Sep 9 04:54:51.552101 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 9 04:54:51.571655 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 9 04:54:51.577711 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 04:54:51.589165 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 04:54:51.602350 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 04:54:51.628228 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#107 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001 Sep 9 04:54:51.622856 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 9 04:54:51.647339 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 9 04:54:51.664830 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 04:54:52.661790 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#87 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001 Sep 9 04:54:52.674002 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 9 04:54:52.674891 disk-uuid[651]: The operation has completed successfully. Sep 9 04:54:52.749314 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 9 04:54:52.749406 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 9 04:54:52.769826 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 9 04:54:52.796433 sh[817]: Success Sep 9 04:54:52.831939 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 9 04:54:52.831999 kernel: device-mapper: uevent: version 1.0.3 Sep 9 04:54:52.838265 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 9 04:54:52.851997 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Sep 9 04:54:53.266016 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 9 04:54:53.281552 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 9 04:54:53.291526 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 9 04:54:53.321006 kernel: BTRFS: device fsid 72a0ff35-b4e8-4772-9a8d-d0e90c3fb364 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (835) Sep 9 04:54:53.322052 kernel: BTRFS info (device dm-0): first mount of filesystem 72a0ff35-b4e8-4772-9a8d-d0e90c3fb364 Sep 9 04:54:53.333895 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Sep 9 04:54:53.694472 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 9 04:54:53.694554 kernel: BTRFS info (device dm-0): enabling free space tree Sep 9 04:54:53.731146 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. 
Sep 9 04:54:53.736223 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 9 04:54:53.745707 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 9 04:54:53.746402 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 9 04:54:53.772638 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 9 04:54:53.799034 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (858) Sep 9 04:54:53.813339 kernel: BTRFS info (device sda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 04:54:53.813381 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 9 04:54:53.869219 kernel: BTRFS info (device sda6): turning on async discard Sep 9 04:54:53.869280 kernel: BTRFS info (device sda6): enabling free space tree Sep 9 04:54:53.871291 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 04:54:53.886215 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 04:54:53.902994 kernel: BTRFS info (device sda6): last unmount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 04:54:53.906105 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 9 04:54:53.914150 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 9 04:54:53.941695 systemd-networkd[1002]: lo: Link UP Sep 9 04:54:53.941706 systemd-networkd[1002]: lo: Gained carrier Sep 9 04:54:53.942480 systemd-networkd[1002]: Enumeration completed Sep 9 04:54:53.945076 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 04:54:53.950709 systemd[1]: Reached target network.target - Network. 
Sep 9 04:54:53.954098 systemd-networkd[1002]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 04:54:53.954101 systemd-networkd[1002]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 04:54:54.024001 kernel: mlx5_core ecfd:00:02.0 enP60669s1: Link up Sep 9 04:54:54.024241 kernel: buffer_size[0]=0 is not enough for lossless buffer Sep 9 04:54:54.057988 kernel: hv_netvsc 000d3afc-c1ec-000d-3afc-c1ec000d3afc eth0: Data path switched to VF: enP60669s1 Sep 9 04:54:54.057995 systemd-networkd[1002]: enP60669s1: Link UP Sep 9 04:54:54.058054 systemd-networkd[1002]: eth0: Link UP Sep 9 04:54:54.058146 systemd-networkd[1002]: eth0: Gained carrier Sep 9 04:54:54.058160 systemd-networkd[1002]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 04:54:54.085440 systemd-networkd[1002]: enP60669s1: Gained carrier Sep 9 04:54:54.098003 systemd-networkd[1002]: eth0: DHCPv4 address 10.200.20.14/24, gateway 10.200.20.1 acquired from 168.63.129.16 Sep 9 04:54:55.074666 ignition[1005]: Ignition 2.22.0 Sep 9 04:54:55.074682 ignition[1005]: Stage: fetch-offline Sep 9 04:54:55.081167 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 04:54:55.074778 ignition[1005]: no configs at "/usr/lib/ignition/base.d" Sep 9 04:54:55.090068 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Sep 9 04:54:55.074785 ignition[1005]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 9 04:54:55.074853 ignition[1005]: parsed url from cmdline: "" Sep 9 04:54:55.074855 ignition[1005]: no config URL provided Sep 9 04:54:55.074858 ignition[1005]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 04:54:55.074863 ignition[1005]: no config at "/usr/lib/ignition/user.ign" Sep 9 04:54:55.074867 ignition[1005]: failed to fetch config: resource requires networking Sep 9 04:54:55.075152 ignition[1005]: Ignition finished successfully Sep 9 04:54:55.126366 ignition[1016]: Ignition 2.22.0 Sep 9 04:54:55.126372 ignition[1016]: Stage: fetch Sep 9 04:54:55.126552 ignition[1016]: no configs at "/usr/lib/ignition/base.d" Sep 9 04:54:55.126559 ignition[1016]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 9 04:54:55.126635 ignition[1016]: parsed url from cmdline: "" Sep 9 04:54:55.126637 ignition[1016]: no config URL provided Sep 9 04:54:55.126640 ignition[1016]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 04:54:55.126647 ignition[1016]: no config at "/usr/lib/ignition/user.ign" Sep 9 04:54:55.126663 ignition[1016]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 Sep 9 04:54:55.197124 ignition[1016]: GET result: OK Sep 9 04:54:55.197176 ignition[1016]: config has been read from IMDS userdata Sep 9 04:54:55.197196 ignition[1016]: parsing config with SHA512: 99502c3aed5465ed997b465bb74f520df4ff8dfe54219c0f3d96231bf0f5e0200f1b25b9ce0babca97add246ce83ddc4ff210ec07d26b4477d2685394cd11072 Sep 9 04:54:55.200142 unknown[1016]: fetched base config from "system" Sep 9 04:54:55.200378 ignition[1016]: fetch: fetch complete Sep 9 04:54:55.200148 unknown[1016]: fetched base config from "system" Sep 9 04:54:55.200381 ignition[1016]: fetch: fetch passed Sep 9 04:54:55.200151 unknown[1016]: fetched user config from "azure" Sep 9 04:54:55.200427 ignition[1016]: Ignition finished successfully Sep 9 04:54:55.205875 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 9 04:54:55.215570 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 9 04:54:55.260644 ignition[1023]: Ignition 2.22.0 Sep 9 04:54:55.260658 ignition[1023]: Stage: kargs Sep 9 04:54:55.265183 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 9 04:54:55.260811 ignition[1023]: no configs at "/usr/lib/ignition/base.d" Sep 9 04:54:55.274804 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 9 04:54:55.260818 ignition[1023]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 9 04:54:55.261347 ignition[1023]: kargs: kargs passed Sep 9 04:54:55.261392 ignition[1023]: Ignition finished successfully Sep 9 04:54:55.310210 ignition[1029]: Ignition 2.22.0 Sep 9 04:54:55.310220 ignition[1029]: Stage: disks Sep 9 04:54:55.314596 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 9 04:54:55.310368 ignition[1029]: no configs at "/usr/lib/ignition/base.d" Sep 9 04:54:55.322828 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 9 04:54:55.310374 ignition[1029]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 9 04:54:55.333091 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 9 04:54:55.310875 ignition[1029]: disks: disks passed Sep 9 04:54:55.343885 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 9 04:54:55.310910 ignition[1029]: Ignition finished successfully Sep 9 04:54:55.354064 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 04:54:55.363935 systemd[1]: Reached target basic.target - Basic System. Sep 9 04:54:55.375490 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 9 04:54:55.468300 systemd-fsck[1037]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks Sep 9 04:54:55.477159 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 9 04:54:55.484551 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 9 04:54:55.505099 systemd-networkd[1002]: eth0: Gained IPv6LL Sep 9 04:54:57.392990 kernel: EXT4-fs (sda9): mounted filesystem 88574756-967d-44b3-be66-46689c8baf27 r/w with ordered data mode. Quota mode: none. Sep 9 04:54:57.393858 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 9 04:54:57.398433 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 9 04:54:57.433444 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 04:54:57.454012 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 9 04:54:57.462788 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Sep 9 04:54:57.483450 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1051) Sep 9 04:54:57.483467 kernel: BTRFS info (device sda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 04:54:57.491064 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 9 04:54:57.515955 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 9 04:54:57.503277 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 04:54:57.522994 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 9 04:54:57.537042 kernel: BTRFS info (device sda6): turning on async discard Sep 9 04:54:57.537061 kernel: BTRFS info (device sda6): enabling free space tree Sep 9 04:54:57.542249 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 9 04:54:57.548604 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Sep 9 04:54:57.998616 coreos-metadata[1053]: Sep 09 04:54:57.998 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Sep 9 04:54:58.006735 coreos-metadata[1053]: Sep 09 04:54:58.006 INFO Fetch successful Sep 9 04:54:58.011616 coreos-metadata[1053]: Sep 09 04:54:58.006 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 Sep 9 04:54:58.022478 coreos-metadata[1053]: Sep 09 04:54:58.021 INFO Fetch successful Sep 9 04:54:58.045940 coreos-metadata[1053]: Sep 09 04:54:58.045 INFO wrote hostname ci-4452.0.0-n-7e0b6f01e2 to /sysroot/etc/hostname Sep 9 04:54:58.047304 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 9 04:54:58.229876 initrd-setup-root[1082]: cut: /sysroot/etc/passwd: No such file or directory Sep 9 04:54:58.302477 initrd-setup-root[1089]: cut: /sysroot/etc/group: No such file or directory Sep 9 04:54:58.319734 initrd-setup-root[1096]: cut: /sysroot/etc/shadow: No such file or directory Sep 9 04:54:58.325548 initrd-setup-root[1103]: cut: /sysroot/etc/gshadow: No such file or directory Sep 9 04:54:59.408435 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 9 04:54:59.414751 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 9 04:54:59.438544 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 9 04:54:59.454338 kernel: BTRFS info (device sda6): last unmount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 04:54:59.451749 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Sep 9 04:54:59.477157 ignition[1171]: INFO : Ignition 2.22.0 Sep 9 04:54:59.477157 ignition[1171]: INFO : Stage: mount Sep 9 04:54:59.477157 ignition[1171]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 04:54:59.477157 ignition[1171]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 9 04:54:59.477157 ignition[1171]: INFO : mount: mount passed Sep 9 04:54:59.477157 ignition[1171]: INFO : Ignition finished successfully Sep 9 04:54:59.485551 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 9 04:54:59.496778 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 9 04:54:59.508593 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 9 04:54:59.539173 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 04:54:59.565001 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1182) Sep 9 04:54:59.576622 kernel: BTRFS info (device sda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 04:54:59.576654 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 9 04:54:59.586942 kernel: BTRFS info (device sda6): turning on async discard Sep 9 04:54:59.586980 kernel: BTRFS info (device sda6): enabling free space tree Sep 9 04:54:59.588379 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 9 04:54:59.615446 ignition[1200]: INFO : Ignition 2.22.0 Sep 9 04:54:59.619778 ignition[1200]: INFO : Stage: files Sep 9 04:54:59.619778 ignition[1200]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 04:54:59.619778 ignition[1200]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 9 04:54:59.619778 ignition[1200]: DEBUG : files: compiled without relabeling support, skipping Sep 9 04:54:59.647235 ignition[1200]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 9 04:54:59.647235 ignition[1200]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 9 04:54:59.729175 ignition[1200]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 9 04:54:59.735360 ignition[1200]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 9 04:54:59.735360 ignition[1200]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 9 04:54:59.729530 unknown[1200]: wrote ssh authorized keys file for user: core Sep 9 04:54:59.773194 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Sep 9 04:54:59.782777 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 Sep 9 04:54:59.819061 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 9 04:55:00.164312 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Sep 9 04:55:00.164312 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 9 04:55:00.184897 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 9 04:55:00.184897 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 9 04:55:00.184897 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 9 04:55:00.184897 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 04:55:00.184897 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 04:55:00.184897 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 04:55:00.184897 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 04:55:00.296205 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 04:55:00.305526 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 04:55:00.305526 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 9 04:55:00.328701 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 9 04:55:00.341355 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 9 04:55:00.341355 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt #1 Sep 9 04:55:00.821567 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 9 04:55:01.089025 ignition[1200]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 9 04:55:01.089025 ignition[1200]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 9 04:55:01.132730 ignition[1200]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 04:55:01.156032 ignition[1200]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 04:55:01.156032 ignition[1200]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 9 04:55:01.172629 ignition[1200]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 9 04:55:01.172629 ignition[1200]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 9 04:55:01.172629 ignition[1200]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 9 04:55:01.172629 ignition[1200]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 9 04:55:01.172629 ignition[1200]: INFO : files: files passed Sep 9 04:55:01.172629 ignition[1200]: INFO : Ignition finished successfully Sep 9 04:55:01.171712 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 9 04:55:01.178934 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 9 04:55:01.211688 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 9 04:55:01.229274 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 9 04:55:01.234598 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 9 04:55:01.270443 initrd-setup-root-after-ignition[1228]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 04:55:01.270443 initrd-setup-root-after-ignition[1228]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 9 04:55:01.293859 initrd-setup-root-after-ignition[1232]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 04:55:01.276008 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 04:55:01.286268 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 9 04:55:01.300391 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 9 04:55:01.355396 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 9 04:55:01.355495 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 9 04:55:01.366820 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 9 04:55:01.378257 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 9 04:55:01.388656 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 9 04:55:01.389399 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 9 04:55:01.430024 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 04:55:01.437699 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 9 04:55:01.464034 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 9 04:55:01.470365 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Sep 9 04:55:01.482526 systemd[1]: Stopped target timers.target - Timer Units. Sep 9 04:55:01.493504 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 9 04:55:01.493605 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 04:55:01.508830 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 9 04:55:01.514753 systemd[1]: Stopped target basic.target - Basic System. Sep 9 04:55:01.525777 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 9 04:55:01.536766 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 04:55:01.547824 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 9 04:55:01.559607 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 9 04:55:01.572443 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 9 04:55:01.583264 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 04:55:01.594888 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 9 04:55:01.605485 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 9 04:55:01.616433 systemd[1]: Stopped target swap.target - Swaps. Sep 9 04:55:01.625376 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 9 04:55:01.625483 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 9 04:55:01.639977 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 9 04:55:01.645886 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 04:55:01.657565 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 9 04:55:01.657632 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 04:55:01.669083 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Sep 9 04:55:01.669177 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 9 04:55:01.685703 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 9 04:55:01.685786 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 04:55:01.692252 systemd[1]: ignition-files.service: Deactivated successfully. Sep 9 04:55:01.692320 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 9 04:55:01.701930 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 9 04:55:01.701999 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 9 04:55:01.715756 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 9 04:55:01.792317 ignition[1252]: INFO : Ignition 2.22.0 Sep 9 04:55:01.792317 ignition[1252]: INFO : Stage: umount Sep 9 04:55:01.792317 ignition[1252]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 04:55:01.792317 ignition[1252]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 9 04:55:01.792317 ignition[1252]: INFO : umount: umount passed Sep 9 04:55:01.792317 ignition[1252]: INFO : Ignition finished successfully Sep 9 04:55:01.751696 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 9 04:55:01.766058 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 9 04:55:01.766195 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 04:55:01.788176 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 9 04:55:01.788265 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 04:55:01.802076 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 9 04:55:01.802172 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 9 04:55:01.811614 systemd[1]: ignition-disks.service: Deactivated successfully. 
Sep 9 04:55:01.811889 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 9 04:55:01.822326 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 9 04:55:01.822376 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 9 04:55:01.833649 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 9 04:55:01.833693 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 9 04:55:01.842438 systemd[1]: Stopped target network.target - Network.
Sep 9 04:55:01.851348 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 9 04:55:01.851391 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 04:55:01.862134 systemd[1]: Stopped target paths.target - Path Units.
Sep 9 04:55:01.872370 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 9 04:55:01.875989 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:55:01.884511 systemd[1]: Stopped target slices.target - Slice Units.
Sep 9 04:55:01.894864 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 9 04:55:01.905605 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 9 04:55:01.905644 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 04:55:01.916219 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 9 04:55:01.916263 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 04:55:01.926786 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 9 04:55:01.926845 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 9 04:55:01.937716 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 9 04:55:01.937750 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 9 04:55:01.948599 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 9 04:55:01.959263 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 9 04:55:01.977109 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 9 04:55:01.977641 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 9 04:55:01.977725 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 9 04:55:01.998963 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 9 04:55:01.999171 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 9 04:55:01.999250 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 9 04:55:02.014696 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 9 04:55:02.014878 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 9 04:55:02.240627 kernel: hv_netvsc 000d3afc-c1ec-000d-3afc-c1ec000d3afc eth0: Data path switched from VF: enP60669s1
Sep 9 04:55:02.014963 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 9 04:55:02.026552 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 9 04:55:02.026633 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 9 04:55:02.041891 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 9 04:55:02.049930 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 9 04:55:02.049998 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:55:02.060745 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 9 04:55:02.060795 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 9 04:55:02.071293 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 9 04:55:02.087032 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 9 04:55:02.087103 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 04:55:02.098214 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 9 04:55:02.098251 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:55:02.116631 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 9 04:55:02.116667 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:55:02.123128 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 9 04:55:02.123176 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:55:02.135777 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:55:02.144851 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 9 04:55:02.144904 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 9 04:55:02.179366 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 9 04:55:02.181009 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:55:02.193718 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 9 04:55:02.193758 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:55:02.204801 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 9 04:55:02.204847 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:55:02.222939 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 9 04:55:02.223008 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 04:55:02.240727 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 9 04:55:02.240774 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 9 04:55:02.252432 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 9 04:55:02.252472 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 04:55:02.268672 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 9 04:55:02.287332 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 9 04:55:02.520640 systemd-journald[224]: Received SIGTERM from PID 1 (systemd).
Sep 9 04:55:02.287404 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:55:02.300372 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 9 04:55:02.300421 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:55:02.319755 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 9 04:55:02.319807 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 04:55:02.331091 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 9 04:55:02.331130 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:55:02.337627 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 04:55:02.337656 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:55:02.356489 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 9 04:55:02.356543 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Sep 9 04:55:02.356566 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 9 04:55:02.356593 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 9 04:55:02.356863 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 9 04:55:02.356958 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 9 04:55:02.365909 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 9 04:55:02.365990 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 9 04:55:02.377827 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 9 04:55:02.388742 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 9 04:55:02.433778 systemd[1]: Switching root.
Sep 9 04:55:02.640236 systemd-journald[224]: Journal stopped
Sep 9 04:55:10.347818 kernel: SELinux: policy capability network_peer_controls=1
Sep 9 04:55:10.347837 kernel: SELinux: policy capability open_perms=1
Sep 9 04:55:10.347845 kernel: SELinux: policy capability extended_socket_class=1
Sep 9 04:55:10.347850 kernel: SELinux: policy capability always_check_network=0
Sep 9 04:55:10.347856 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 9 04:55:10.347861 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 9 04:55:10.347867 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 9 04:55:10.347873 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 9 04:55:10.347878 kernel: SELinux: policy capability userspace_initial_context=0
Sep 9 04:55:10.347884 kernel: audit: type=1403 audit(1757393703.982:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 9 04:55:10.347891 systemd[1]: Successfully loaded SELinux policy in 182.243ms.
Sep 9 04:55:10.347898 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.319ms.
Sep 9 04:55:10.347905 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 04:55:10.347911 systemd[1]: Detected virtualization microsoft.
Sep 9 04:55:10.347918 systemd[1]: Detected architecture arm64.
Sep 9 04:55:10.347924 systemd[1]: Detected first boot.
Sep 9 04:55:10.347931 systemd[1]: Hostname set to .
Sep 9 04:55:10.347936 systemd[1]: Initializing machine ID from random generator.
Sep 9 04:55:10.347944 zram_generator::config[1295]: No configuration found.
Sep 9 04:55:10.347950 kernel: NET: Registered PF_VSOCK protocol family
Sep 9 04:55:10.347956 systemd[1]: Populated /etc with preset unit settings.
Sep 9 04:55:10.347963 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 9 04:55:10.347970 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 9 04:55:10.348793 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 9 04:55:10.348802 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 9 04:55:10.348810 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 9 04:55:10.348817 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 9 04:55:10.348824 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 9 04:55:10.348830 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 9 04:55:10.348842 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 9 04:55:10.348849 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 9 04:55:10.348855 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 9 04:55:10.348862 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 9 04:55:10.348871 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 04:55:10.348877 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:55:10.348884 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 9 04:55:10.348890 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 9 04:55:10.348896 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 9 04:55:10.348904 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 04:55:10.348910 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Sep 9 04:55:10.348918 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 04:55:10.348925 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 04:55:10.348931 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 9 04:55:10.348937 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 9 04:55:10.348943 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 9 04:55:10.348951 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 9 04:55:10.348957 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 04:55:10.348963 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 04:55:10.348969 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 04:55:10.348990 systemd[1]: Reached target swap.target - Swaps.
Sep 9 04:55:10.348997 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 9 04:55:10.349003 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 9 04:55:10.349011 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 9 04:55:10.349017 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:55:10.349024 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:55:10.349030 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:55:10.349036 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 9 04:55:10.349043 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 9 04:55:10.349050 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 9 04:55:10.349056 systemd[1]: Mounting media.mount - External Media Directory...
Sep 9 04:55:10.349063 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 9 04:55:10.349069 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 9 04:55:10.349075 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 9 04:55:10.349082 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 9 04:55:10.349088 systemd[1]: Reached target machines.target - Containers.
Sep 9 04:55:10.349094 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 9 04:55:10.349101 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:55:10.349107 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 04:55:10.349114 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 9 04:55:10.349120 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:55:10.349126 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 04:55:10.349132 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:55:10.349139 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 9 04:55:10.349145 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:55:10.349151 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 9 04:55:10.349159 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 9 04:55:10.349166 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 9 04:55:10.349172 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 9 04:55:10.349178 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 9 04:55:10.349185 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:55:10.349191 kernel: fuse: init (API version 7.41)
Sep 9 04:55:10.349197 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 04:55:10.349203 kernel: ACPI: bus type drm_connector registered
Sep 9 04:55:10.349209 kernel: loop: module loaded
Sep 9 04:55:10.349216 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 04:55:10.349222 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 04:55:10.349257 systemd-journald[1399]: Collecting audit messages is disabled.
Sep 9 04:55:10.349274 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 9 04:55:10.349281 systemd-journald[1399]: Journal started
Sep 9 04:55:10.349296 systemd-journald[1399]: Runtime Journal (/run/log/journal/d5984342e34642078935668b48ea8f8b) is 8M, max 78.5M, 70.5M free.
Sep 9 04:55:09.434265 systemd[1]: Queued start job for default target multi-user.target.
Sep 9 04:55:09.448411 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 9 04:55:09.448795 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 9 04:55:09.449060 systemd[1]: systemd-journald.service: Consumed 3.048s CPU time.
Sep 9 04:55:10.372042 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 9 04:55:10.394265 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 04:55:10.404758 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 9 04:55:10.404817 systemd[1]: Stopped verity-setup.service.
Sep 9 04:55:10.421265 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 04:55:10.422591 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 9 04:55:10.427966 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 9 04:55:10.433283 systemd[1]: Mounted media.mount - External Media Directory.
Sep 9 04:55:10.438243 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 9 04:55:10.444018 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 9 04:55:10.449788 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 9 04:55:10.455654 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 9 04:55:10.463211 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:55:10.470598 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 9 04:55:10.470731 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 9 04:55:10.477707 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:55:10.477824 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:55:10.484576 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 04:55:10.484692 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 04:55:10.490683 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:55:10.490802 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:55:10.497782 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 9 04:55:10.497899 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 9 04:55:10.504577 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:55:10.504732 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:55:10.510821 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:55:10.516820 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:55:10.524365 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 9 04:55:10.530696 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 9 04:55:10.537630 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 04:55:10.552486 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 04:55:10.559273 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 9 04:55:10.574046 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 9 04:55:10.580796 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 9 04:55:10.580825 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 04:55:10.587287 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 9 04:55:10.595072 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 9 04:55:10.600708 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:55:10.614344 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 9 04:55:10.630673 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 9 04:55:10.637389 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 04:55:10.638228 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 9 04:55:10.643911 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 04:55:10.644674 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 04:55:10.652335 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 9 04:55:10.660368 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 04:55:10.667520 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 9 04:55:10.674106 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 9 04:55:10.686181 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 9 04:55:10.694818 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 9 04:55:10.702116 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 9 04:55:10.739240 systemd-journald[1399]: Time spent on flushing to /var/log/journal/d5984342e34642078935668b48ea8f8b is 14.331ms for 945 entries.
Sep 9 04:55:10.739240 systemd-journald[1399]: System Journal (/var/log/journal/d5984342e34642078935668b48ea8f8b) is 8M, max 2.6G, 2.6G free.
Sep 9 04:55:10.816255 systemd-journald[1399]: Received client request to flush runtime journal.
Sep 9 04:55:10.816319 kernel: loop0: detected capacity change from 0 to 119368
Sep 9 04:55:10.768028 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:55:10.817577 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 9 04:55:10.857649 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 9 04:55:10.858928 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 9 04:55:10.916503 systemd-tmpfiles[1436]: ACLs are not supported, ignoring.
Sep 9 04:55:10.916516 systemd-tmpfiles[1436]: ACLs are not supported, ignoring.
Sep 9 04:55:10.919405 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 04:55:10.928134 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 9 04:55:11.274994 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 9 04:55:11.337013 kernel: loop1: detected capacity change from 0 to 100632
Sep 9 04:55:11.493356 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 9 04:55:11.502063 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 04:55:11.522937 systemd-tmpfiles[1453]: ACLs are not supported, ignoring.
Sep 9 04:55:11.523210 systemd-tmpfiles[1453]: ACLs are not supported, ignoring.
Sep 9 04:55:11.525322 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:55:11.827004 kernel: loop2: detected capacity change from 0 to 27936
Sep 9 04:55:12.237777 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 9 04:55:12.245585 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:55:12.272060 systemd-udevd[1459]: Using default interface naming scheme 'v255'.
Sep 9 04:55:12.355996 kernel: loop3: detected capacity change from 0 to 203944
Sep 9 04:55:12.392000 kernel: loop4: detected capacity change from 0 to 119368
Sep 9 04:55:12.403987 kernel: loop5: detected capacity change from 0 to 100632
Sep 9 04:55:12.418984 kernel: loop6: detected capacity change from 0 to 27936
Sep 9 04:55:12.431985 kernel: loop7: detected capacity change from 0 to 203944
Sep 9 04:55:12.449730 (sd-merge)[1461]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Sep 9 04:55:12.450114 (sd-merge)[1461]: Merged extensions into '/usr'.
Sep 9 04:55:12.453584 systemd[1]: Reload requested from client PID 1434 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 9 04:55:12.453693 systemd[1]: Reloading...
Sep 9 04:55:12.503001 zram_generator::config[1486]: No configuration found.
Sep 9 04:55:12.662213 systemd[1]: Reloading finished in 208 ms.
Sep 9 04:55:12.683810 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 9 04:55:12.694874 systemd[1]: Starting ensure-sysext.service...
Sep 9 04:55:12.699474 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 04:55:12.743431 systemd[1]: Reload requested from client PID 1542 ('systemctl') (unit ensure-sysext.service)...
Sep 9 04:55:12.743444 systemd[1]: Reloading...
Sep 9 04:55:12.778459 systemd-tmpfiles[1543]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 9 04:55:12.778510 systemd-tmpfiles[1543]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 9 04:55:12.778741 systemd-tmpfiles[1543]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 9 04:55:12.778883 systemd-tmpfiles[1543]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 9 04:55:12.779316 systemd-tmpfiles[1543]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 9 04:55:12.779447 systemd-tmpfiles[1543]: ACLs are not supported, ignoring.
Sep 9 04:55:12.779473 systemd-tmpfiles[1543]: ACLs are not supported, ignoring.
Sep 9 04:55:12.787999 zram_generator::config[1571]: No configuration found.
Sep 9 04:55:12.854066 systemd-tmpfiles[1543]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 04:55:12.854075 systemd-tmpfiles[1543]: Skipping /boot
Sep 9 04:55:12.858716 systemd-tmpfiles[1543]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 04:55:12.858826 systemd-tmpfiles[1543]: Skipping /boot
Sep 9 04:55:12.934699 systemd[1]: Reloading finished in 191 ms.
Sep 9 04:55:12.940373 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:55:12.958246 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:55:12.981725 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 04:55:13.001521 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 9 04:55:13.009890 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:55:13.011868 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:55:13.021238 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:55:13.031741 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:55:13.039481 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:55:13.039619 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:55:13.042246 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 9 04:55:13.057521 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 04:55:13.072115 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 04:55:13.081446 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 9 04:55:13.092659 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:55:13.094205 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:55:13.105434 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:55:13.107216 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:55:13.118268 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:55:13.119058 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:55:13.132461 systemd[1]: Finished ensure-sysext.service.
Sep 9 04:55:13.142847 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Sep 9 04:55:13.143287 systemd[1]: Expecting device dev-ptp_hyperv.device - /dev/ptp_hyperv...
Sep 9 04:55:13.153313 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:55:13.156065 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:55:13.169112 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 04:55:13.177113 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:55:13.188092 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:55:13.198465 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:55:13.198509 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:55:13.198564 systemd[1]: Reached target time-set.target - System Time Set.
Sep 9 04:55:13.213192 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 9 04:55:13.220775 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:55:13.220935 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:55:13.228622 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 04:55:13.236547 kernel: mousedev: PS/2 mouse device common for all mice
Sep 9 04:55:13.236620 kernel: hv_vmbus: registering driver hv_balloon
Sep 9 04:55:13.236648 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Sep 9 04:55:13.231662 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 04:55:13.244759 kernel: hv_balloon: Memory hot add disabled on ARM64
Sep 9 04:55:13.246118 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:55:13.248671 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:55:13.259991 kernel: hv_vmbus: registering driver hyperv_fb
Sep 9 04:55:13.261134 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:55:13.261270 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:55:13.275384 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#287 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Sep 9 04:55:13.281963 systemd[1]: Condition check resulted in dev-ptp_hyperv.device - /dev/ptp_hyperv being skipped.
Sep 9 04:55:13.283751 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 04:55:13.283835 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 04:55:13.285802 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 9 04:55:13.308384 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Sep 9 04:55:13.308460 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Sep 9 04:55:13.321328 kernel: Console: switching to colour dummy device 80x25
Sep 9 04:55:13.327547 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:55:13.337460 kernel: Console: switching to colour frame buffer device 128x48
Sep 9 04:55:13.376940 augenrules[1737]: No rules
Sep 9 04:55:13.379836 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 04:55:13.384042 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 04:55:13.390483 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 04:55:13.392001 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:55:13.398838 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 9 04:55:13.405135 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:55:13.435095 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 9 04:55:13.446114 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 9 04:55:13.516448 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Sep 9 04:55:13.523529 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 9 04:55:13.548013 kernel: MACsec IEEE 802.1AE
Sep 9 04:55:13.581559 systemd-resolved[1668]: Positive Trust Anchors:
Sep 9 04:55:13.582073 systemd-resolved[1668]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 04:55:13.582170 systemd-resolved[1668]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 04:55:13.614421 systemd-networkd[1664]: lo: Link UP
Sep 9 04:55:13.614695 systemd-networkd[1664]: lo: Gained carrier
Sep 9 04:55:13.615945 systemd-networkd[1664]: Enumeration completed
Sep 9 04:55:13.616151 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 04:55:13.616415 systemd-networkd[1664]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 04:55:13.616496 systemd-networkd[1664]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 04:55:13.624099 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 9 04:55:13.633101 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 9 04:55:13.639545 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 9 04:55:13.641431 systemd-resolved[1668]: Using system hostname 'ci-4452.0.0-n-7e0b6f01e2'.
Sep 9 04:55:13.675506 kernel: mlx5_core ecfd:00:02.0 enP60669s1: Link up
Sep 9 04:55:13.675825 kernel: buffer_size[0]=0 is not enough for lossless buffer
Sep 9 04:55:13.698996 kernel: hv_netvsc 000d3afc-c1ec-000d-3afc-c1ec000d3afc eth0: Data path switched to VF: enP60669s1
Sep 9 04:55:13.701082 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 04:55:13.701274 systemd-networkd[1664]: enP60669s1: Link UP
Sep 9 04:55:13.701389 systemd-networkd[1664]: eth0: Link UP
Sep 9 04:55:13.701391 systemd-networkd[1664]: eth0: Gained carrier
Sep 9 04:55:13.701412 systemd-networkd[1664]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 04:55:13.709382 systemd[1]: Reached target network.target - Network.
Sep 9 04:55:13.715023 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 04:55:13.723538 systemd-networkd[1664]: enP60669s1: Gained carrier
Sep 9 04:55:13.723961 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 9 04:55:13.735026 systemd-networkd[1664]: eth0: DHCPv4 address 10.200.20.14/24, gateway 10.200.20.1 acquired from 168.63.129.16
Sep 9 04:55:14.769184 systemd-networkd[1664]: eth0: Gained IPv6LL
Sep 9 04:55:14.774863 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 9 04:55:14.782520 systemd[1]: Reached target network-online.target - Network is Online.
Sep 9 04:55:14.863870 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:55:15.467681 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 9 04:55:15.474174 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 9 04:55:18.319913 ldconfig[1429]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 9 04:55:18.329325 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 9 04:55:18.336603 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 9 04:55:18.365072 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 9 04:55:18.371157 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 04:55:18.376656 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 9 04:55:18.382917 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 9 04:55:18.390084 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 9 04:55:18.395461 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 9 04:55:18.402297 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 9 04:55:18.408947 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 9 04:55:18.408978 systemd[1]: Reached target paths.target - Path Units.
Sep 9 04:55:18.413767 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 04:55:18.431724 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 9 04:55:18.438857 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 9 04:55:18.444828 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 9 04:55:18.451697 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 9 04:55:18.458196 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 9 04:55:18.465471 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 9 04:55:18.482768 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 9 04:55:18.490227 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 9 04:55:18.496514 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 04:55:18.501576 systemd[1]: Reached target basic.target - Basic System.
Sep 9 04:55:18.506713 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 9 04:55:18.506732 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 9 04:55:18.520167 systemd[1]: Starting chronyd.service - NTP client/server...
Sep 9 04:55:18.532057 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 9 04:55:18.540086 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 9 04:55:18.549505 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 9 04:55:18.556171 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 9 04:55:18.567564 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 9 04:55:18.575106 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 9 04:55:18.581323 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 9 04:55:18.584051 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
Sep 9 04:55:18.584195 chronyd[1828]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG)
Sep 9 04:55:18.591452 jq[1836]: false
Sep 9 04:55:18.591746 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
Sep 9 04:55:18.591947 KVP[1838]: KVP starting; pid is:1838
Sep 9 04:55:18.593317 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:55:18.601781 KVP[1838]: KVP LIC Version: 3.1
Sep 9 04:55:18.602079 kernel: hv_utils: KVP IC version 4.0
Sep 9 04:55:18.603486 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 9 04:55:18.611088 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 9 04:55:18.618597 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 9 04:55:18.628098 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 9 04:55:18.636237 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 9 04:55:18.646118 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 9 04:55:18.651504 chronyd[1828]: Timezone right/UTC failed leap second check, ignoring
Sep 9 04:55:18.651642 chronyd[1828]: Loaded seccomp filter (level 2)
Sep 9 04:55:18.652773 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 9 04:55:18.656175 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 9 04:55:18.657420 systemd[1]: Starting update-engine.service - Update Engine...
Sep 9 04:55:18.665100 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 9 04:55:18.672432 systemd[1]: Started chronyd.service - NTP client/server.
Sep 9 04:55:18.678087 extend-filesystems[1837]: Found /dev/sda6
Sep 9 04:55:18.681305 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 9 04:55:18.694132 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 9 04:55:18.694289 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 9 04:55:18.696253 systemd[1]: motdgen.service: Deactivated successfully.
Sep 9 04:55:18.696391 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 9 04:55:18.699294 jq[1861]: true
Sep 9 04:55:18.703993 extend-filesystems[1837]: Found /dev/sda9
Sep 9 04:55:18.712068 extend-filesystems[1837]: Checking size of /dev/sda9
Sep 9 04:55:18.708732 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 9 04:55:18.708870 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 9 04:55:18.721425 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 9 04:55:18.752752 (ntainerd)[1873]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 9 04:55:18.754718 jq[1872]: true
Sep 9 04:55:18.765516 extend-filesystems[1837]: Old size kept for /dev/sda9
Sep 9 04:55:18.765055 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 9 04:55:18.776385 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 9 04:55:18.777587 systemd-logind[1850]: New seat seat0.
Sep 9 04:55:18.780110 systemd-logind[1850]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Sep 9 04:55:18.790100 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 9 04:55:18.795511 update_engine[1854]: I20250909 04:55:18.795441 1854 main.cc:92] Flatcar Update Engine starting
Sep 9 04:55:18.846656 tar[1870]: linux-arm64/helm
Sep 9 04:55:18.889062 bash[1913]: Updated "/home/core/.ssh/authorized_keys"
Sep 9 04:55:18.892256 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 9 04:55:18.902784 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 9 04:55:18.963664 sshd_keygen[1866]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 9 04:55:18.995986 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 9 04:55:19.008866 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 9 04:55:19.025897 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
Sep 9 04:55:19.037053 systemd[1]: issuegen.service: Deactivated successfully.
Sep 9 04:55:19.039213 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 9 04:55:19.050710 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 9 04:55:19.071119 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
Sep 9 04:55:19.088603 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 9 04:55:19.096146 dbus-daemon[1833]: [system] SELinux support is enabled
Sep 9 04:55:19.097105 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 9 04:55:19.105680 update_engine[1854]: I20250909 04:55:19.105527 1854 update_check_scheduler.cc:74] Next update check in 11m48s
Sep 9 04:55:19.107154 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Sep 9 04:55:19.119439 systemd[1]: Reached target getty.target - Login Prompts.
Sep 9 04:55:19.128790 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 9 04:55:19.137316 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 9 04:55:19.137338 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 9 04:55:19.147379 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 9 04:55:19.147395 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 9 04:55:19.156154 systemd[1]: Started update-engine.service - Update Engine.
Sep 9 04:55:19.156205 dbus-daemon[1833]: [system] Successfully activated service 'org.freedesktop.systemd1'
Sep 9 04:55:19.165177 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 9 04:55:19.188425 coreos-metadata[1830]: Sep 09 04:55:19.188 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Sep 9 04:55:19.191405 coreos-metadata[1830]: Sep 09 04:55:19.190 INFO Fetch successful
Sep 9 04:55:19.191405 coreos-metadata[1830]: Sep 09 04:55:19.191 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
Sep 9 04:55:19.195067 coreos-metadata[1830]: Sep 09 04:55:19.194 INFO Fetch successful
Sep 9 04:55:19.195756 coreos-metadata[1830]: Sep 09 04:55:19.195 INFO Fetching http://168.63.129.16/machine/4c675c30-f8cb-43cc-9b9c-03ce19d0c4a2/f57e0a94%2Da8cf%2D4f5f%2D96a9%2D678708be83b8.%5Fci%2D4452.0.0%2Dn%2D7e0b6f01e2?comp=config&type=sharedConfig&incarnation=1: Attempt #1
Sep 9 04:55:19.197237 coreos-metadata[1830]: Sep 09 04:55:19.197 INFO Fetch successful
Sep 9 04:55:19.198145 coreos-metadata[1830]: Sep 09 04:55:19.198 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
Sep 9 04:55:19.207092 coreos-metadata[1830]: Sep 09 04:55:19.207 INFO Fetch successful
Sep 9 04:55:19.233732 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Sep 9 04:55:19.244795 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 9 04:55:19.261674 tar[1870]: linux-arm64/LICENSE
Sep 9 04:55:19.261674 tar[1870]: linux-arm64/README.md
Sep 9 04:55:19.275870 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 9 04:55:19.330027 locksmithd[1995]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 9 04:55:19.366055 containerd[1873]: time="2025-09-09T04:55:19Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 9 04:55:19.369005 containerd[1873]: time="2025-09-09T04:55:19.367998416Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 9 04:55:19.373885 containerd[1873]: time="2025-09-09T04:55:19.373847432Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.016µs"
Sep 9 04:55:19.373885 containerd[1873]: time="2025-09-09T04:55:19.373874232Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 9 04:55:19.373885 containerd[1873]: time="2025-09-09T04:55:19.373888024Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 9 04:55:19.374039 containerd[1873]: time="2025-09-09T04:55:19.374020752Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 9 04:55:19.374039 containerd[1873]: time="2025-09-09T04:55:19.374037600Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 9 04:55:19.374077 containerd[1873]: time="2025-09-09T04:55:19.374065416Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 04:55:19.374616 containerd[1873]: time="2025-09-09T04:55:19.374106440Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 04:55:19.374616 containerd[1873]: time="2025-09-09T04:55:19.374115608Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 04:55:19.374616 containerd[1873]: time="2025-09-09T04:55:19.374272368Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 04:55:19.374616 containerd[1873]: time="2025-09-09T04:55:19.374294752Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 04:55:19.374616 containerd[1873]: time="2025-09-09T04:55:19.374302088Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 04:55:19.374616 containerd[1873]: time="2025-09-09T04:55:19.374307448Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 9 04:55:19.374616 containerd[1873]: time="2025-09-09T04:55:19.374358128Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 9 04:55:19.374616 containerd[1873]: time="2025-09-09T04:55:19.374489448Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 04:55:19.374616 containerd[1873]: time="2025-09-09T04:55:19.374507560Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 04:55:19.374616 containerd[1873]: time="2025-09-09T04:55:19.374514296Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 9 04:55:19.374616 containerd[1873]: time="2025-09-09T04:55:19.374539048Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 9 04:55:19.374756 containerd[1873]: time="2025-09-09T04:55:19.374669808Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 9 04:55:19.374756 containerd[1873]: time="2025-09-09T04:55:19.374715872Z" level=info msg="metadata content store policy set" policy=shared
Sep 9 04:55:19.386234 containerd[1873]: time="2025-09-09T04:55:19.386202600Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 9 04:55:19.386291 containerd[1873]: time="2025-09-09T04:55:19.386244656Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 9 04:55:19.386291 containerd[1873]: time="2025-09-09T04:55:19.386258328Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 9 04:55:19.386291 containerd[1873]: time="2025-09-09T04:55:19.386266504Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 9 04:55:19.386291 containerd[1873]: time="2025-09-09T04:55:19.386274600Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 9 04:55:19.386291 containerd[1873]: time="2025-09-09T04:55:19.386282336Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 9 04:55:19.386512 containerd[1873]: time="2025-09-09T04:55:19.386299048Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 9 04:55:19.386512 containerd[1873]: time="2025-09-09T04:55:19.386310384Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 9 04:55:19.386512 containerd[1873]: time="2025-09-09T04:55:19.386317152Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 9 04:55:19.386512 containerd[1873]: time="2025-09-09T04:55:19.386331072Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 9 04:55:19.386512 containerd[1873]: time="2025-09-09T04:55:19.386337824Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 9 04:55:19.386512 containerd[1873]: time="2025-09-09T04:55:19.386346552Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 9 04:55:19.386512 containerd[1873]: time="2025-09-09T04:55:19.386440968Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 9 04:55:19.386512 containerd[1873]: time="2025-09-09T04:55:19.386454808Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 9 04:55:19.386512 containerd[1873]: time="2025-09-09T04:55:19.386465040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 9 04:55:19.386512 containerd[1873]: time="2025-09-09T04:55:19.386471824Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 9 04:55:19.386512 containerd[1873]: time="2025-09-09T04:55:19.386478392Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 9 04:55:19.386512 containerd[1873]: time="2025-09-09T04:55:19.386485136Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 9 04:55:19.386512 containerd[1873]: time="2025-09-09T04:55:19.386492032Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 9 04:55:19.386512 containerd[1873]: time="2025-09-09T04:55:19.386499072Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 9 04:55:19.386512 containerd[1873]: time="2025-09-09T04:55:19.386506288Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 9 04:55:19.387076 containerd[1873]: time="2025-09-09T04:55:19.386512864Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 9 04:55:19.387076 containerd[1873]: time="2025-09-09T04:55:19.386520016Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 9 04:55:19.387076 containerd[1873]: time="2025-09-09T04:55:19.386574600Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 9 04:55:19.387076 containerd[1873]: time="2025-09-09T04:55:19.386583960Z" level=info msg="Start snapshots syncer"
Sep 9 04:55:19.387076 containerd[1873]: time="2025-09-09T04:55:19.386607888Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 9 04:55:19.387221 containerd[1873]: time="2025-09-09T04:55:19.386757680Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 9 04:55:19.387221 containerd[1873]: time="2025-09-09T04:55:19.386793568Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 9 04:55:19.387397 containerd[1873]: time="2025-09-09T04:55:19.386845008Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 9 04:55:19.387397 containerd[1873]: time="2025-09-09T04:55:19.386926224Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 9 04:55:19.387397 containerd[1873]: time="2025-09-09T04:55:19.386940040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 9 04:55:19.387397 containerd[1873]: time="2025-09-09T04:55:19.386947648Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 9 04:55:19.387397 containerd[1873]: time="2025-09-09T04:55:19.386954816Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 9 04:55:19.387397 containerd[1873]: time="2025-09-09T04:55:19.386962440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 9 04:55:19.387397 containerd[1873]: time="2025-09-09T04:55:19.386993008Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 9 04:55:19.387397 containerd[1873]: time="2025-09-09T04:55:19.387001720Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 9 04:55:19.387397 containerd[1873]: time="2025-09-09T04:55:19.387019344Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 9 04:55:19.387397 containerd[1873]: time="2025-09-09T04:55:19.387026552Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 9 04:55:19.387397 containerd[1873]: time="2025-09-09T04:55:19.387038048Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 9 04:55:19.387397 containerd[1873]: time="2025-09-09T04:55:19.387064976Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 9 04:55:19.387397 containerd[1873]: time="2025-09-09T04:55:19.387074448Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 9 04:55:19.387397 containerd[1873]: time="2025-09-09T04:55:19.387080200Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 9 04:55:19.387574 containerd[1873]: time="2025-09-09T04:55:19.387086000Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 9 04:55:19.387574 containerd[1873]: time="2025-09-09T04:55:19.387090664Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 9 04:55:19.387574 containerd[1873]: time="2025-09-09T04:55:19.387096544Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Sep 9 04:55:19.387574 containerd[1873]: time="2025-09-09T04:55:19.387107112Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Sep 9 04:55:19.387574 containerd[1873]: time="2025-09-09T04:55:19.387118520Z" level=info msg="runtime interface created"
Sep 9 04:55:19.387574 containerd[1873]: time="2025-09-09T04:55:19.387121760Z" level=info msg="created NRI interface"
Sep 9 04:55:19.387574 containerd[1873]: time="2025-09-09T04:55:19.387127632Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Sep 9 04:55:19.387574 containerd[1873]: time="2025-09-09T04:55:19.387135656Z" level=info msg="Connect containerd service"
Sep 9 04:55:19.387574 containerd[1873]: time="2025-09-09T04:55:19.387157568Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 9 04:55:19.387574 containerd[1873]: time="2025-09-09T04:55:19.387691824Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 9 04:55:19.585681 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:55:19.657268 (kubelet)[2020]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 04:55:19.868914 containerd[1873]: time="2025-09-09T04:55:19.868818080Z" level=info msg="Start subscribing containerd event"
Sep 9 04:55:19.868914 containerd[1873]: time="2025-09-09T04:55:19.868900640Z" level=info msg="Start recovering state"
Sep 9 04:55:19.869057 containerd[1873]: time="2025-09-09T04:55:19.868983880Z" level=info msg="Start event monitor"
Sep 9 04:55:19.869057 containerd[1873]: time="2025-09-09T04:55:19.868994464Z" level=info msg="Start cni network conf syncer for default"
Sep 9 04:55:19.869057 containerd[1873]: time="2025-09-09T04:55:19.868999768Z" level=info msg="Start streaming server"
Sep 9 04:55:19.869057 containerd[1873]: time="2025-09-09T04:55:19.869005432Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 9 04:55:19.869057 containerd[1873]: time="2025-09-09T04:55:19.869010240Z" level=info msg="runtime interface starting up..."
Sep 9 04:55:19.869057 containerd[1873]: time="2025-09-09T04:55:19.869018496Z" level=info msg="starting plugins..."
Sep 9 04:55:19.869057 containerd[1873]: time="2025-09-09T04:55:19.869029480Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Sep 9 04:55:19.869384 containerd[1873]: time="2025-09-09T04:55:19.869349536Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 9 04:55:19.869482 containerd[1873]: time="2025-09-09T04:55:19.869469704Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 9 04:55:19.869731 containerd[1873]: time="2025-09-09T04:55:19.869714128Z" level=info msg="containerd successfully booted in 0.504061s"
Sep 9 04:55:19.870654 systemd[1]: Started containerd.service - containerd container runtime.
Sep 9 04:55:19.878013 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 9 04:55:19.886093 systemd[1]: Startup finished in 1.645s (kernel) + 14.875s (initrd) + 16.083s (userspace) = 32.604s.
Sep 9 04:55:20.060758 kubelet[2020]: E0909 04:55:20.060711 2020 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 04:55:20.062806 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 04:55:20.062911 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 04:55:20.063284 systemd[1]: kubelet.service: Consumed 546ms CPU time, 256.2M memory peak.
Sep 9 04:55:20.473044 login[1990]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:55:20.473653 login[1992]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:55:20.479689 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 9 04:55:20.480537 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 9 04:55:20.485627 systemd-logind[1850]: New session 1 of user core.
Sep 9 04:55:20.488968 systemd-logind[1850]: New session 2 of user core.
Sep 9 04:55:20.508688 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 9 04:55:20.510669 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 9 04:55:20.532777 (systemd)[2044]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 9 04:55:20.534919 systemd-logind[1850]: New session c1 of user core.
Sep 9 04:55:20.844380 systemd[2044]: Queued start job for default target default.target.
Sep 9 04:55:20.856167 systemd[2044]: Created slice app.slice - User Application Slice.
Sep 9 04:55:20.856194 systemd[2044]: Reached target paths.target - Paths.
Sep 9 04:55:20.856227 systemd[2044]: Reached target timers.target - Timers.
Sep 9 04:55:20.857207 systemd[2044]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 9 04:55:20.864323 systemd[2044]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 9 04:55:20.864365 systemd[2044]: Reached target sockets.target - Sockets.
Sep 9 04:55:20.864395 systemd[2044]: Reached target basic.target - Basic System.
Sep 9 04:55:20.864414 systemd[2044]: Reached target default.target - Main User Target.
Sep 9 04:55:20.864433 systemd[2044]: Startup finished in 324ms.
Sep 9 04:55:20.864597 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 9 04:55:20.874109 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 9 04:55:20.874630 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 9 04:55:21.057486 waagent[1988]: 2025-09-09T04:55:21.052193Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4
Sep 9 04:55:21.058056 waagent[1988]: 2025-09-09T04:55:21.058010Z INFO Daemon Daemon OS: flatcar 4452.0.0
Sep 9 04:55:21.062264 waagent[1988]: 2025-09-09T04:55:21.062226Z INFO Daemon Daemon Python: 3.11.13
Sep 9 04:55:21.066352 waagent[1988]: 2025-09-09T04:55:21.066313Z INFO Daemon Daemon Run daemon
Sep 9 04:55:21.071030 waagent[1988]: 2025-09-09T04:55:21.070993Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4452.0.0'
Sep 9 04:55:21.080455 waagent[1988]: 2025-09-09T04:55:21.080421Z INFO Daemon Daemon Using waagent for provisioning
Sep 9 04:55:21.086463 waagent[1988]: 2025-09-09T04:55:21.086424Z INFO Daemon Daemon Activate resource disk
Sep 9 04:55:21.091839 waagent[1988]: 2025-09-09T04:55:21.091803Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb
Sep 9 04:55:21.103322 waagent[1988]: 2025-09-09T04:55:21.103242Z INFO Daemon Daemon Found device: None
Sep 9 04:55:21.107820 waagent[1988]: 2025-09-09T04:55:21.107788Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology
Sep 9 04:55:21.117507 waagent[1988]: 2025-09-09T04:55:21.117466Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0
Sep 9 04:55:21.130203 waagent[1988]: 2025-09-09T04:55:21.130163Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Sep 9 04:55:21.136050 waagent[1988]: 2025-09-09T04:55:21.136018Z INFO Daemon Daemon Running default provisioning handler
Sep 9 04:55:21.146468 waagent[1988]: 2025-09-09T04:55:21.146431Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4.
Sep 9 04:55:21.159966 waagent[1988]: 2025-09-09T04:55:21.159929Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service'
Sep 9 04:55:21.169378 waagent[1988]: 2025-09-09T04:55:21.169342Z INFO Daemon Daemon cloud-init is enabled: False
Sep 9 04:55:21.174178 waagent[1988]: 2025-09-09T04:55:21.174142Z INFO Daemon Daemon Copying ovf-env.xml
Sep 9 04:55:21.286955 waagent[1988]: 2025-09-09T04:55:21.286891Z INFO Daemon Daemon Successfully mounted dvd
Sep 9 04:55:21.313676 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully.
Sep 9 04:55:21.315604 waagent[1988]: 2025-09-09T04:55:21.315550Z INFO Daemon Daemon Detect protocol endpoint
Sep 9 04:55:21.320061 waagent[1988]: 2025-09-09T04:55:21.320024Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Sep 9 04:55:21.324792 waagent[1988]: 2025-09-09T04:55:21.324764Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler
Sep 9 04:55:21.330285 waagent[1988]: 2025-09-09T04:55:21.330259Z INFO Daemon Daemon Test for route to 168.63.129.16
Sep 9 04:55:21.334913 waagent[1988]: 2025-09-09T04:55:21.334880Z INFO Daemon Daemon Route to 168.63.129.16 exists
Sep 9 04:55:21.339360 waagent[1988]: 2025-09-09T04:55:21.339331Z INFO Daemon Daemon Wire server endpoint:168.63.129.16
Sep 9 04:55:21.381454 waagent[1988]: 2025-09-09T04:55:21.381376Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05
Sep 9 04:55:21.387402 waagent[1988]: 2025-09-09T04:55:21.387377Z INFO Daemon Daemon Wire protocol version:2012-11-30
Sep 9 04:55:21.391906 waagent[1988]: 2025-09-09T04:55:21.391881Z INFO Daemon Daemon Server preferred version:2015-04-05
Sep 9 04:55:21.508235 waagent[1988]: 2025-09-09T04:55:21.508152Z INFO Daemon Daemon Initializing goal state during protocol detection
Sep 9 04:55:21.513856 waagent[1988]: 2025-09-09T04:55:21.513821Z INFO Daemon Daemon Forcing an update of the goal state.
Sep 9 04:55:21.523613 waagent[1988]: 2025-09-09T04:55:21.523576Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1]
Sep 9 04:55:21.561584 waagent[1988]: 2025-09-09T04:55:21.561547Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.175
Sep 9 04:55:21.566659 waagent[1988]: 2025-09-09T04:55:21.566625Z INFO Daemon
Sep 9 04:55:21.569094 waagent[1988]: 2025-09-09T04:55:21.569068Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 9a09197f-f2c8-4610-b056-4ee8976d1fdd eTag: 16821965095433560132 source: Fabric]
Sep 9 04:55:21.579141 waagent[1988]: 2025-09-09T04:55:21.579112Z INFO Daemon The vmSettings originated via Fabric; will ignore them.
Sep 9 04:55:21.584929 waagent[1988]: 2025-09-09T04:55:21.584901Z INFO Daemon
Sep 9 04:55:21.587538 waagent[1988]: 2025-09-09T04:55:21.587514Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1]
Sep 9 04:55:21.596446 waagent[1988]: 2025-09-09T04:55:21.596420Z INFO Daemon Daemon Downloading artifacts profile blob
Sep 9 04:55:21.652315 waagent[1988]: 2025-09-09T04:55:21.652213Z INFO Daemon Downloaded certificate {'thumbprint': '4329B1B1A8F2CB3651158D6668040035B3601B1F', 'hasPrivateKey': True}
Sep 9 04:55:21.660892 waagent[1988]: 2025-09-09T04:55:21.660856Z INFO Daemon Fetch goal state completed
Sep 9 04:55:21.671459 waagent[1988]: 2025-09-09T04:55:21.671427Z INFO Daemon Daemon Starting provisioning
Sep 9 04:55:21.676170 waagent[1988]: 2025-09-09T04:55:21.676140Z INFO Daemon Daemon Handle ovf-env.xml.
Sep 9 04:55:21.680747 waagent[1988]: 2025-09-09T04:55:21.680725Z INFO Daemon Daemon Set hostname [ci-4452.0.0-n-7e0b6f01e2]
Sep 9 04:55:21.702997 waagent[1988]: 2025-09-09T04:55:21.700724Z INFO Daemon Daemon Publish hostname [ci-4452.0.0-n-7e0b6f01e2]
Sep 9 04:55:21.706488 waagent[1988]: 2025-09-09T04:55:21.706452Z INFO Daemon Daemon Examine /proc/net/route for primary interface
Sep 9 04:55:21.712320 waagent[1988]: 2025-09-09T04:55:21.712285Z INFO Daemon Daemon Primary interface is [eth0]
Sep 9 04:55:21.722712 systemd-networkd[1664]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 04:55:21.722718 systemd-networkd[1664]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 04:55:21.723447 waagent[1988]: 2025-09-09T04:55:21.723371Z INFO Daemon Daemon Create user account if not exists
Sep 9 04:55:21.722770 systemd-networkd[1664]: eth0: DHCP lease lost
Sep 9 04:55:21.728761 waagent[1988]: 2025-09-09T04:55:21.728717Z INFO Daemon Daemon User core already exists, skip useradd
Sep 9 04:55:21.734120 waagent[1988]: 2025-09-09T04:55:21.734081Z INFO Daemon Daemon Configure sudoer
Sep 9 04:55:21.746225 waagent[1988]: 2025-09-09T04:55:21.742276Z INFO Daemon Daemon Configure sshd
Sep 9 04:55:21.750668 waagent[1988]: 2025-09-09T04:55:21.750621Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive.
Sep 9 04:55:21.762659 waagent[1988]: 2025-09-09T04:55:21.762622Z INFO Daemon Daemon Deploy ssh public key.
Sep 9 04:55:21.770036 systemd-networkd[1664]: eth0: DHCPv4 address 10.200.20.14/24, gateway 10.200.20.1 acquired from 168.63.129.16
Sep 9 04:55:22.893370 waagent[1988]: 2025-09-09T04:55:22.893324Z INFO Daemon Daemon Provisioning complete
Sep 9 04:55:22.907124 waagent[1988]: 2025-09-09T04:55:22.907092Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping
Sep 9 04:55:22.913033 waagent[1988]: 2025-09-09T04:55:22.912999Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions.
Sep 9 04:55:22.921652 waagent[1988]: 2025-09-09T04:55:22.921627Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent
Sep 9 04:55:23.018394 waagent[2098]: 2025-09-09T04:55:23.018323Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4)
Sep 9 04:55:23.018665 waagent[2098]: 2025-09-09T04:55:23.018446Z INFO ExtHandler ExtHandler OS: flatcar 4452.0.0
Sep 9 04:55:23.018665 waagent[2098]: 2025-09-09T04:55:23.018482Z INFO ExtHandler ExtHandler Python: 3.11.13
Sep 9 04:55:23.018665 waagent[2098]: 2025-09-09T04:55:23.018515Z INFO ExtHandler ExtHandler CPU Arch: aarch64
Sep 9 04:55:23.111053 waagent[2098]: 2025-09-09T04:55:23.110969Z INFO ExtHandler ExtHandler Distro: flatcar-4452.0.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0;
Sep 9 04:55:23.111210 waagent[2098]: 2025-09-09T04:55:23.111186Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Sep 9 04:55:23.111247 waagent[2098]: 2025-09-09T04:55:23.111232Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16
Sep 9 04:55:23.116457 waagent[2098]: 2025-09-09T04:55:23.116413Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1]
Sep 9 04:55:23.129102 waagent[2098]: 2025-09-09T04:55:23.129073Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.175
Sep 9 04:55:23.129475 waagent[2098]: 2025-09-09T04:55:23.129445Z INFO ExtHandler
Sep 9 04:55:23.129523 waagent[2098]: 2025-09-09T04:55:23.129508Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: dd1fa00c-ccf5-4ccf-8730-ee4aebdbb309 eTag: 16821965095433560132 source: Fabric]
Sep 9 04:55:23.129729 waagent[2098]: 2025-09-09T04:55:23.129705Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them.
Sep 9 04:55:23.130136 waagent[2098]: 2025-09-09T04:55:23.130108Z INFO ExtHandler
Sep 9 04:55:23.130171 waagent[2098]: 2025-09-09T04:55:23.130157Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1]
Sep 9 04:55:23.132687 waagent[2098]: 2025-09-09T04:55:23.132662Z INFO ExtHandler ExtHandler Downloading artifacts profile blob
Sep 9 04:55:23.180300 waagent[2098]: 2025-09-09T04:55:23.180203Z INFO ExtHandler Downloaded certificate {'thumbprint': '4329B1B1A8F2CB3651158D6668040035B3601B1F', 'hasPrivateKey': True}
Sep 9 04:55:23.180602 waagent[2098]: 2025-09-09T04:55:23.180570Z INFO ExtHandler Fetch goal state completed
Sep 9 04:55:23.191364 waagent[2098]: 2025-09-09T04:55:23.191317Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.2 1 Jul 2025 (Library: OpenSSL 3.4.2 1 Jul 2025)
Sep 9 04:55:23.195042 waagent[2098]: 2025-09-09T04:55:23.194997Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2098
Sep 9 04:55:23.195136 waagent[2098]: 2025-09-09T04:55:23.195112Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ********
Sep 9 04:55:23.195367 waagent[2098]: 2025-09-09T04:55:23.195342Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ********
Sep 9 04:55:23.196436 waagent[2098]: 2025-09-09T04:55:23.196403Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4452.0.0', '', 'Flatcar Container Linux by Kinvolk']
Sep 9 04:55:23.196737 waagent[2098]: 2025-09-09T04:55:23.196711Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4452.0.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported
Sep 9 04:55:23.196840 waagent[2098]: 2025-09-09T04:55:23.196819Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False
Sep 9 04:55:23.197273 waagent[2098]: 2025-09-09T04:55:23.197242Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules
Sep 9 04:55:23.257642 waagent[2098]: 2025-09-09T04:55:23.257605Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service
Sep 9 04:55:23.257818 waagent[2098]: 2025-09-09T04:55:23.257792Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup
Sep 9 04:55:23.262527 waagent[2098]: 2025-09-09T04:55:23.262142Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now
Sep 9 04:55:23.266541 systemd[1]: Reload requested from client PID 2113 ('systemctl') (unit waagent.service)...
Sep 9 04:55:23.266554 systemd[1]: Reloading...
Sep 9 04:55:23.333004 zram_generator::config[2158]: No configuration found.
Sep 9 04:55:23.481137 systemd[1]: Reloading finished in 214 ms.
Sep 9 04:55:23.496276 waagent[2098]: 2025-09-09T04:55:23.495084Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service
Sep 9 04:55:23.496276 waagent[2098]: 2025-09-09T04:55:23.495220Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully
Sep 9 04:55:23.868821 waagent[2098]: 2025-09-09T04:55:23.868678Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up.
Sep 9 04:55:23.869035 waagent[2098]: 2025-09-09T04:55:23.869004Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True]
Sep 9 04:55:23.869674 waagent[2098]: 2025-09-09T04:55:23.869633Z INFO ExtHandler ExtHandler Starting env monitor service.
Sep 9 04:55:23.869930 waagent[2098]: 2025-09-09T04:55:23.869897Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service.
Sep 9 04:55:23.870775 waagent[2098]: 2025-09-09T04:55:23.870159Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Sep 9 04:55:23.870775 waagent[2098]: 2025-09-09T04:55:23.870239Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16
Sep 9 04:55:23.870775 waagent[2098]: 2025-09-09T04:55:23.870402Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled.
Sep 9 04:55:23.870775 waagent[2098]: 2025-09-09T04:55:23.870547Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route:
Sep 9 04:55:23.870775 waagent[2098]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT
Sep 9 04:55:23.870775 waagent[2098]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0
Sep 9 04:55:23.870775 waagent[2098]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0
Sep 9 04:55:23.870775 waagent[2098]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0
Sep 9 04:55:23.870775 waagent[2098]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Sep 9 04:55:23.870775 waagent[2098]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Sep 9 04:55:23.871068 waagent[2098]: 2025-09-09T04:55:23.871030Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread
Sep 9 04:55:23.871168 waagent[2098]: 2025-09-09T04:55:23.871147Z INFO ExtHandler ExtHandler Start Extension Telemetry service.
Sep 9 04:55:23.871506 waagent[2098]: 2025-09-09T04:55:23.871430Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True
Sep 9 04:55:23.871547 waagent[2098]: 2025-09-09T04:55:23.871503Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status.
Sep 9 04:55:23.871681 waagent[2098]: 2025-09-09T04:55:23.871650Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Sep 9 04:55:23.872008 waagent[2098]: 2025-09-09T04:55:23.871955Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread
Sep 9 04:55:23.872114 waagent[2098]: 2025-09-09T04:55:23.872092Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16
Sep 9 04:55:23.872310 waagent[2098]: 2025-09-09T04:55:23.872276Z INFO EnvHandler ExtHandler Configure routes
Sep 9 04:55:23.874734 waagent[2098]: 2025-09-09T04:55:23.874705Z INFO EnvHandler ExtHandler Gateway:None
Sep 9 04:55:23.875374 waagent[2098]: 2025-09-09T04:55:23.875351Z INFO EnvHandler ExtHandler Routes:None
Sep 9 04:55:23.876456 waagent[2098]: 2025-09-09T04:55:23.876423Z INFO ExtHandler ExtHandler
Sep 9 04:55:23.876774 waagent[2098]: 2025-09-09T04:55:23.876749Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 846d1dff-e291-427c-a401-bc0d29ffef4b correlation 3f3652f5-59b8-4128-89da-af4436073781 created: 2025-09-09T04:54:04.528885Z]
Sep 9 04:55:23.877396 waagent[2098]: 2025-09-09T04:55:23.877324Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything.
Sep 9 04:55:23.878273 waagent[2098]: 2025-09-09T04:55:23.878227Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 1 ms]
Sep 9 04:55:23.907508 waagent[2098]: 2025-09-09T04:55:23.907376Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command
Sep 9 04:55:23.907508 waagent[2098]: Try `iptables -h' or 'iptables --help' for more information.)
Sep 9 04:55:23.907963 waagent[2098]: 2025-09-09T04:55:23.907931Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: FCEB5DDF-05A1-44FE-B7CC-9240FA0D44E6;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;]
Sep 9 04:55:23.951011 waagent[2098]: 2025-09-09T04:55:23.950679Z INFO MonitorHandler ExtHandler Network interfaces:
Sep 9 04:55:23.951011 waagent[2098]: Executing ['ip', '-a', '-o', 'link']:
Sep 9 04:55:23.951011 waagent[2098]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
Sep 9 04:55:23.951011 waagent[2098]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:fc:c1:ec brd ff:ff:ff:ff:ff:ff
Sep 9 04:55:23.951011 waagent[2098]: 3: enP60669s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:fc:c1:ec brd ff:ff:ff:ff:ff:ff\ altname enP60669p0s2
Sep 9 04:55:23.951011 waagent[2098]: Executing ['ip', '-4', '-a', '-o', 'address']:
Sep 9 04:55:23.951011 waagent[2098]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever
Sep 9 04:55:23.951011 waagent[2098]: 2: eth0 inet 10.200.20.14/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever
Sep 9 04:55:23.951011 waagent[2098]: Executing ['ip', '-6', '-a', '-o', 'address']:
Sep 9 04:55:23.951011 waagent[2098]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever
Sep 9 04:55:23.951011 waagent[2098]: 2: eth0 inet6 fe80::20d:3aff:fefc:c1ec/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
Sep 9 04:55:24.019168 waagent[2098]: 2025-09-09T04:55:24.019123Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric:
Sep 9 04:55:24.019168 waagent[2098]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Sep 9 04:55:24.019168 waagent[2098]: pkts bytes target prot opt in out source destination
Sep 9 04:55:24.019168 waagent[2098]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Sep 9 04:55:24.019168 waagent[2098]: pkts bytes target prot opt in out source destination
Sep 9 04:55:24.019168 waagent[2098]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
Sep 9 04:55:24.019168 waagent[2098]: pkts bytes target prot opt in out source destination
Sep 9 04:55:24.019168 waagent[2098]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Sep 9 04:55:24.019168 waagent[2098]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Sep 9 04:55:24.019168 waagent[2098]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Sep 9 04:55:24.022011 waagent[2098]: 2025-09-09T04:55:24.021826Z INFO EnvHandler ExtHandler Current Firewall rules:
Sep 9 04:55:24.022011 waagent[2098]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Sep 9 04:55:24.022011 waagent[2098]: pkts bytes target prot opt in out source destination
Sep 9 04:55:24.022011 waagent[2098]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Sep 9 04:55:24.022011 waagent[2098]: pkts bytes target prot opt in out source destination
Sep 9 04:55:24.022011 waagent[2098]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
Sep 9 04:55:24.022011 waagent[2098]: pkts bytes target prot opt in out source destination
Sep 9 04:55:24.022011 waagent[2098]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Sep 9 04:55:24.022011 waagent[2098]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Sep 9 04:55:24.022011 waagent[2098]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Sep 9 04:55:24.022180 waagent[2098]: 2025-09-09T04:55:24.022055Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300
Sep 9 04:55:30.182812 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 9 04:55:30.184185 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:55:30.369991 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:55:30.372608 (kubelet)[2247]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 04:55:30.411346 kubelet[2247]: E0909 04:55:30.411294 2247 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 04:55:30.413714 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 04:55:30.413819 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 04:55:30.416462 systemd[1]: kubelet.service: Consumed 103ms CPU time, 105M memory peak.
Sep 9 04:55:40.432548 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 9 04:55:40.433797 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:55:40.778593 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:55:40.781045 (kubelet)[2262]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 04:55:40.805964 kubelet[2262]: E0909 04:55:40.805911 2262 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 04:55:40.807850 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 04:55:40.808054 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 04:55:40.808458 systemd[1]: kubelet.service: Consumed 99ms CPU time, 105.1M memory peak.
Sep 9 04:55:42.452884 chronyd[1828]: Selected source PHC0
Sep 9 04:55:50.932399 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Sep 9 04:55:50.934666 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:55:51.282381 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:55:51.285033 (kubelet)[2278]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 04:55:51.310148 kubelet[2278]: E0909 04:55:51.310100 2278 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 04:55:51.312448 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 04:55:51.312643 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 04:55:51.313206 systemd[1]: kubelet.service: Consumed 102ms CPU time, 107M memory peak.
Sep 9 04:55:51.326177 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 9 04:55:51.328176 systemd[1]: Started sshd@0-10.200.20.14:22-10.200.16.10:54802.service - OpenSSH per-connection server daemon (10.200.16.10:54802).
Sep 9 04:55:52.055200 sshd[2285]: Accepted publickey for core from 10.200.16.10 port 54802 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:55:52.056202 sshd-session[2285]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:55:52.059689 systemd-logind[1850]: New session 3 of user core.
Sep 9 04:55:52.070187 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 9 04:55:52.443074 systemd[1]: Started sshd@1-10.200.20.14:22-10.200.16.10:54808.service - OpenSSH per-connection server daemon (10.200.16.10:54808).
Sep 9 04:55:52.846334 sshd[2291]: Accepted publickey for core from 10.200.16.10 port 54808 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:55:52.847799 sshd-session[2291]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:55:52.851400 systemd-logind[1850]: New session 4 of user core.
Sep 9 04:55:52.853074 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 9 04:55:53.145517 sshd[2294]: Connection closed by 10.200.16.10 port 54808
Sep 9 04:55:53.147859 sshd-session[2291]: pam_unix(sshd:session): session closed for user core
Sep 9 04:55:53.150843 systemd-logind[1850]: Session 4 logged out. Waiting for processes to exit.
Sep 9 04:55:53.151050 systemd[1]: sshd@1-10.200.20.14:22-10.200.16.10:54808.service: Deactivated successfully.
Sep 9 04:55:53.152284 systemd[1]: session-4.scope: Deactivated successfully.
Sep 9 04:55:53.153931 systemd-logind[1850]: Removed session 4.
Sep 9 04:55:53.219165 systemd[1]: Started sshd@2-10.200.20.14:22-10.200.16.10:54820.service - OpenSSH per-connection server daemon (10.200.16.10:54820).
Sep 9 04:55:53.630289 sshd[2300]: Accepted publickey for core from 10.200.16.10 port 54820 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:55:53.631272 sshd-session[2300]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:55:53.634673 systemd-logind[1850]: New session 5 of user core.
Sep 9 04:55:53.642246 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 9 04:55:53.943566 sshd[2303]: Connection closed by 10.200.16.10 port 54820
Sep 9 04:55:53.943135 sshd-session[2300]: pam_unix(sshd:session): session closed for user core
Sep 9 04:55:53.946302 systemd-logind[1850]: Session 5 logged out. Waiting for processes to exit.
Sep 9 04:55:53.946858 systemd[1]: sshd@2-10.200.20.14:22-10.200.16.10:54820.service: Deactivated successfully.
Sep 9 04:55:53.948164 systemd[1]: session-5.scope: Deactivated successfully.
Sep 9 04:55:53.949160 systemd-logind[1850]: Removed session 5.
Sep 9 04:55:54.034422 systemd[1]: Started sshd@3-10.200.20.14:22-10.200.16.10:54836.service - OpenSSH per-connection server daemon (10.200.16.10:54836).
Sep 9 04:55:54.447040 sshd[2309]: Accepted publickey for core from 10.200.16.10 port 54836 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:55:54.448019 sshd-session[2309]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:55:54.451740 systemd-logind[1850]: New session 6 of user core.
Sep 9 04:55:54.458085 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 9 04:55:54.763913 sshd[2312]: Connection closed by 10.200.16.10 port 54836
Sep 9 04:55:54.764422 sshd-session[2309]: pam_unix(sshd:session): session closed for user core
Sep 9 04:55:54.767355 systemd[1]: sshd@3-10.200.20.14:22-10.200.16.10:54836.service: Deactivated successfully.
Sep 9 04:55:54.770199 systemd[1]: session-6.scope: Deactivated successfully. Sep 9 04:55:54.770759 systemd-logind[1850]: Session 6 logged out. Waiting for processes to exit. Sep 9 04:55:54.771887 systemd-logind[1850]: Removed session 6. Sep 9 04:55:54.843753 systemd[1]: Started sshd@4-10.200.20.14:22-10.200.16.10:54838.service - OpenSSH per-connection server daemon (10.200.16.10:54838). Sep 9 04:55:55.261267 sshd[2318]: Accepted publickey for core from 10.200.16.10 port 54838 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:55:55.262277 sshd-session[2318]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:55:55.265684 systemd-logind[1850]: New session 7 of user core. Sep 9 04:55:55.280084 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 9 04:55:55.710461 sudo[2322]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 9 04:55:55.710674 sudo[2322]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 04:55:55.737236 sudo[2322]: pam_unix(sudo:session): session closed for user root Sep 9 04:55:55.808129 sshd[2321]: Connection closed by 10.200.16.10 port 54838 Sep 9 04:55:55.808721 sshd-session[2318]: pam_unix(sshd:session): session closed for user core Sep 9 04:55:55.811866 systemd[1]: sshd@4-10.200.20.14:22-10.200.16.10:54838.service: Deactivated successfully. Sep 9 04:55:55.813441 systemd[1]: session-7.scope: Deactivated successfully. Sep 9 04:55:55.814620 systemd-logind[1850]: Session 7 logged out. Waiting for processes to exit. Sep 9 04:55:55.815637 systemd-logind[1850]: Removed session 7. Sep 9 04:55:55.882638 systemd[1]: Started sshd@5-10.200.20.14:22-10.200.16.10:54842.service - OpenSSH per-connection server daemon (10.200.16.10:54842). 
Sep 9 04:55:56.293521 sshd[2328]: Accepted publickey for core from 10.200.16.10 port 54842 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:55:56.294548 sshd-session[2328]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:55:56.297995 systemd-logind[1850]: New session 8 of user core. Sep 9 04:55:56.307095 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 9 04:55:56.529595 sudo[2333]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 9 04:55:56.530145 sudo[2333]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 04:55:56.535592 sudo[2333]: pam_unix(sudo:session): session closed for user root Sep 9 04:55:56.539228 sudo[2332]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 9 04:55:56.539421 sudo[2332]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 04:55:56.545649 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 04:55:56.584604 augenrules[2355]: No rules Sep 9 04:55:56.585722 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 04:55:56.585911 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 04:55:56.586933 sudo[2332]: pam_unix(sudo:session): session closed for user root Sep 9 04:55:56.669810 sshd[2331]: Connection closed by 10.200.16.10 port 54842 Sep 9 04:55:56.669720 sshd-session[2328]: pam_unix(sshd:session): session closed for user core Sep 9 04:55:56.673350 systemd-logind[1850]: Session 8 logged out. Waiting for processes to exit. Sep 9 04:55:56.673860 systemd[1]: sshd@5-10.200.20.14:22-10.200.16.10:54842.service: Deactivated successfully. Sep 9 04:55:56.675511 systemd[1]: session-8.scope: Deactivated successfully. Sep 9 04:55:56.677002 systemd-logind[1850]: Removed session 8. 
Sep 9 04:55:56.748176 systemd[1]: Started sshd@6-10.200.20.14:22-10.200.16.10:54850.service - OpenSSH per-connection server daemon (10.200.16.10:54850). Sep 9 04:55:57.158194 sshd[2364]: Accepted publickey for core from 10.200.16.10 port 54850 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:55:57.159224 sshd-session[2364]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:55:57.162630 systemd-logind[1850]: New session 9 of user core. Sep 9 04:55:57.173095 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 9 04:55:57.393622 sudo[2368]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 9 04:55:57.393833 sudo[2368]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 04:55:58.969250 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 9 04:55:58.975254 (dockerd)[2385]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 9 04:56:00.195045 dockerd[2385]: time="2025-09-09T04:56:00.194779523Z" level=info msg="Starting up" Sep 9 04:56:00.196742 dockerd[2385]: time="2025-09-09T04:56:00.196715881Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 9 04:56:00.204737 dockerd[2385]: time="2025-09-09T04:56:00.204649809Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 9 04:56:00.229518 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1166045949-merged.mount: Deactivated successfully. Sep 9 04:56:00.274770 dockerd[2385]: time="2025-09-09T04:56:00.274715378Z" level=info msg="Loading containers: start." 
Sep 9 04:56:00.340020 kernel: Initializing XFRM netlink socket Sep 9 04:56:00.789449 systemd-networkd[1664]: docker0: Link UP Sep 9 04:56:00.812008 dockerd[2385]: time="2025-09-09T04:56:00.811957124Z" level=info msg="Loading containers: done." Sep 9 04:56:00.830860 dockerd[2385]: time="2025-09-09T04:56:00.830580055Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 9 04:56:00.830860 dockerd[2385]: time="2025-09-09T04:56:00.830655410Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 9 04:56:00.830860 dockerd[2385]: time="2025-09-09T04:56:00.830727422Z" level=info msg="Initializing buildkit" Sep 9 04:56:00.875470 dockerd[2385]: time="2025-09-09T04:56:00.875400235Z" level=info msg="Completed buildkit initialization" Sep 9 04:56:00.881063 dockerd[2385]: time="2025-09-09T04:56:00.881018964Z" level=info msg="Daemon has completed initialization" Sep 9 04:56:00.882012 dockerd[2385]: time="2025-09-09T04:56:00.881082808Z" level=info msg="API listen on /run/docker.sock" Sep 9 04:56:00.882374 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 9 04:56:01.345492 kernel: hv_balloon: Max. dynamic memory size: 4096 MB Sep 9 04:56:01.432274 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Sep 9 04:56:01.434113 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:56:01.568152 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 9 04:56:01.574171 (kubelet)[2601]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 04:56:01.599868 kubelet[2601]: E0909 04:56:01.599738 2601 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 04:56:01.601812 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 04:56:01.601909 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 04:56:01.602369 systemd[1]: kubelet.service: Consumed 103ms CPU time, 104.1M memory peak. Sep 9 04:56:01.661661 containerd[1873]: time="2025-09-09T04:56:01.661625950Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\"" Sep 9 04:56:02.696509 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount395084867.mount: Deactivated successfully. 
Sep 9 04:56:03.688479 containerd[1873]: time="2025-09-09T04:56:03.688423423Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:03.690682 containerd[1873]: time="2025-09-09T04:56:03.690658347Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=25652441" Sep 9 04:56:03.694230 containerd[1873]: time="2025-09-09T04:56:03.694193894Z" level=info msg="ImageCreate event name:\"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:03.698449 containerd[1873]: time="2025-09-09T04:56:03.698400895Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:03.699018 containerd[1873]: time="2025-09-09T04:56:03.698841947Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id \"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"25649241\" in 2.037180031s" Sep 9 04:56:03.699018 containerd[1873]: time="2025-09-09T04:56:03.698870057Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\"" Sep 9 04:56:03.700471 containerd[1873]: time="2025-09-09T04:56:03.700448174Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\"" Sep 9 04:56:04.283337 update_engine[1854]: I20250909 04:56:04.283255 1854 update_attempter.cc:509] Updating boot flags... 
Sep 9 04:56:04.813853 containerd[1873]: time="2025-09-09T04:56:04.813270787Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:04.815424 containerd[1873]: time="2025-09-09T04:56:04.815399718Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=22460309" Sep 9 04:56:04.819020 containerd[1873]: time="2025-09-09T04:56:04.818996053Z" level=info msg="ImageCreate event name:\"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:04.823141 containerd[1873]: time="2025-09-09T04:56:04.823104788Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:04.823572 containerd[1873]: time="2025-09-09T04:56:04.823549241Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"23997423\" in 1.123075908s" Sep 9 04:56:04.823646 containerd[1873]: time="2025-09-09T04:56:04.823635315Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\"" Sep 9 04:56:04.824183 containerd[1873]: time="2025-09-09T04:56:04.824159091Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\"" Sep 9 04:56:05.845012 containerd[1873]: time="2025-09-09T04:56:05.844686373Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:05.847440 containerd[1873]: time="2025-09-09T04:56:05.847290386Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=17125903" Sep 9 04:56:05.849930 containerd[1873]: time="2025-09-09T04:56:05.849907814Z" level=info msg="ImageCreate event name:\"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:05.853825 containerd[1873]: time="2025-09-09T04:56:05.853793923Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:05.854575 containerd[1873]: time="2025-09-09T04:56:05.854389454Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id \"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"18663035\" in 1.030126298s" Sep 9 04:56:05.854575 containerd[1873]: time="2025-09-09T04:56:05.854412101Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\"" Sep 9 04:56:05.855223 containerd[1873]: time="2025-09-09T04:56:05.855203435Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\"" Sep 9 04:56:06.798060 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2465428367.mount: Deactivated successfully. 
Sep 9 04:56:07.104829 containerd[1873]: time="2025-09-09T04:56:07.104684907Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:07.107485 containerd[1873]: time="2025-09-09T04:56:07.107446171Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=26916095" Sep 9 04:56:07.110712 containerd[1873]: time="2025-09-09T04:56:07.110680794Z" level=info msg="ImageCreate event name:\"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:07.114944 containerd[1873]: time="2025-09-09T04:56:07.114912100Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:07.115807 containerd[1873]: time="2025-09-09T04:56:07.115780879Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\", repo tag \"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"26915114\" in 1.260552413s" Sep 9 04:56:07.115931 containerd[1873]: time="2025-09-09T04:56:07.115916424Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\"" Sep 9 04:56:07.116410 containerd[1873]: time="2025-09-09T04:56:07.116390023Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 9 04:56:08.348818 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount734088394.mount: Deactivated successfully. 
Sep 9 04:56:09.230659 containerd[1873]: time="2025-09-09T04:56:09.230601922Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:09.234113 containerd[1873]: time="2025-09-09T04:56:09.234074309Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951622" Sep 9 04:56:09.237078 containerd[1873]: time="2025-09-09T04:56:09.237045049Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:09.242088 containerd[1873]: time="2025-09-09T04:56:09.242054267Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:09.243314 containerd[1873]: time="2025-09-09T04:56:09.243280811Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 2.126688918s" Sep 9 04:56:09.243336 containerd[1873]: time="2025-09-09T04:56:09.243320985Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Sep 9 04:56:09.243846 containerd[1873]: time="2025-09-09T04:56:09.243821911Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 9 04:56:09.779064 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2283547709.mount: Deactivated successfully. 
Sep 9 04:56:09.796837 containerd[1873]: time="2025-09-09T04:56:09.796791358Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 04:56:09.799269 containerd[1873]: time="2025-09-09T04:56:09.799236302Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703" Sep 9 04:56:09.801948 containerd[1873]: time="2025-09-09T04:56:09.801906171Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 04:56:09.805766 containerd[1873]: time="2025-09-09T04:56:09.805725075Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 04:56:09.806817 containerd[1873]: time="2025-09-09T04:56:09.806649259Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 562.796566ms" Sep 9 04:56:09.806817 containerd[1873]: time="2025-09-09T04:56:09.806679273Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Sep 9 04:56:09.807389 containerd[1873]: time="2025-09-09T04:56:09.807156648Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Sep 9 04:56:10.434129 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1103879085.mount: Deactivated 
successfully. Sep 9 04:56:11.682338 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Sep 9 04:56:11.685156 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:56:11.808495 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 04:56:11.815356 (kubelet)[2857]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 04:56:11.899188 kubelet[2857]: E0909 04:56:11.899137 2857 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 04:56:11.901319 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 04:56:11.901427 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 04:56:11.902715 systemd[1]: kubelet.service: Consumed 114ms CPU time, 105.1M memory peak. 
Sep 9 04:56:12.985418 containerd[1873]: time="2025-09-09T04:56:12.984703048Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:12.988111 containerd[1873]: time="2025-09-09T04:56:12.988067648Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66537161" Sep 9 04:56:12.991304 containerd[1873]: time="2025-09-09T04:56:12.991269696Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:13.002306 containerd[1873]: time="2025-09-09T04:56:13.002241162Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:13.002929 containerd[1873]: time="2025-09-09T04:56:13.002895496Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 3.19571117s" Sep 9 04:56:13.002929 containerd[1873]: time="2025-09-09T04:56:13.002928894Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\"" Sep 9 04:56:16.407245 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 04:56:16.407360 systemd[1]: kubelet.service: Consumed 114ms CPU time, 105.1M memory peak. Sep 9 04:56:16.409048 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:56:16.429358 systemd[1]: Reload requested from client PID 2894 ('systemctl') (unit session-9.scope)... 
Sep 9 04:56:16.429462 systemd[1]: Reloading... Sep 9 04:56:16.504009 zram_generator::config[2944]: No configuration found. Sep 9 04:56:16.651175 systemd[1]: Reloading finished in 221 ms. Sep 9 04:56:16.737826 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 9 04:56:16.737915 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 9 04:56:16.739035 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 04:56:16.739098 systemd[1]: kubelet.service: Consumed 61ms CPU time, 89.2M memory peak. Sep 9 04:56:16.740622 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:56:17.231258 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 04:56:17.240829 (kubelet)[3005]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 04:56:17.273002 kubelet[3005]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 04:56:17.273002 kubelet[3005]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 9 04:56:17.273002 kubelet[3005]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 9 04:56:17.273002 kubelet[3005]: I0909 04:56:17.272118 3005 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 04:56:17.565841 kubelet[3005]: I0909 04:56:17.565589 3005 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 9 04:56:17.566100 kubelet[3005]: I0909 04:56:17.566086 3005 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 04:56:17.566666 kubelet[3005]: I0909 04:56:17.566646 3005 server.go:934] "Client rotation is on, will bootstrap in background" Sep 9 04:56:17.584992 kubelet[3005]: E0909 04:56:17.584953 3005 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.20.14:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.14:6443: connect: connection refused" logger="UnhandledError" Sep 9 04:56:17.585953 kubelet[3005]: I0909 04:56:17.585582 3005 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 04:56:17.591522 kubelet[3005]: I0909 04:56:17.591507 3005 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 04:56:17.594661 kubelet[3005]: I0909 04:56:17.594640 3005 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 9 04:56:17.595287 kubelet[3005]: I0909 04:56:17.595268 3005 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 9 04:56:17.595484 kubelet[3005]: I0909 04:56:17.595457 3005 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 04:56:17.595664 kubelet[3005]: I0909 04:56:17.595534 3005 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4452.0.0-n-7e0b6f01e2","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyMa
nagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 04:56:17.595777 kubelet[3005]: I0909 04:56:17.595765 3005 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 04:56:17.595827 kubelet[3005]: I0909 04:56:17.595820 3005 container_manager_linux.go:300] "Creating device plugin manager"
Sep 9 04:56:17.595967 kubelet[3005]: I0909 04:56:17.595955 3005 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:56:17.597784 kubelet[3005]: I0909 04:56:17.597766 3005 kubelet.go:408] "Attempting to sync node with API server"
Sep 9 04:56:17.597864 kubelet[3005]: I0909 04:56:17.597855 3005 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 04:56:17.597916 kubelet[3005]: I0909 04:56:17.597910 3005 kubelet.go:314] "Adding apiserver pod source"
Sep 9 04:56:17.597964 kubelet[3005]: I0909 04:56:17.597957 3005 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 04:56:17.601533 kubelet[3005]: W0909 04:56:17.601488 3005 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.20.14:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4452.0.0-n-7e0b6f01e2&limit=500&resourceVersion=0": dial tcp 10.200.20.14:6443: connect: connection refused
Sep 9 04:56:17.601533 kubelet[3005]: E0909 04:56:17.601535 3005 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.20.14:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4452.0.0-n-7e0b6f01e2&limit=500&resourceVersion=0\": dial tcp 10.200.20.14:6443: connect: connection refused" logger="UnhandledError"
Sep 9 04:56:17.601917 kubelet[3005]: W0909 04:56:17.601884 3005 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.20.14:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.20.14:6443: connect: connection refused
Sep 9 04:56:17.601917 kubelet[3005]: E0909 04:56:17.601918 3005 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.20.14:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.14:6443: connect: connection refused" logger="UnhandledError"
Sep 9 04:56:17.602194 kubelet[3005]: I0909 04:56:17.602165 3005 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 04:56:17.602843 kubelet[3005]: I0909 04:56:17.602479 3005 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 04:56:17.602843 kubelet[3005]: W0909 04:56:17.602517 3005 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 9 04:56:17.603669 kubelet[3005]: I0909 04:56:17.603650 3005 server.go:1274] "Started kubelet"
Sep 9 04:56:17.608394 kubelet[3005]: E0909 04:56:17.607584 3005 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.14:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.14:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4452.0.0-n-7e0b6f01e2.18638454648bda68 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4452.0.0-n-7e0b6f01e2,UID:ci-4452.0.0-n-7e0b6f01e2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4452.0.0-n-7e0b6f01e2,},FirstTimestamp:2025-09-09 04:56:17.60363172 +0000 UTC m=+0.359058086,LastTimestamp:2025-09-09 04:56:17.60363172 +0000 UTC m=+0.359058086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4452.0.0-n-7e0b6f01e2,}"
Sep 9 04:56:17.609154 kubelet[3005]: I0909 04:56:17.609123 3005 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 04:56:17.611150 kubelet[3005]: E0909 04:56:17.611134 3005 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 04:56:17.613047 kubelet[3005]: I0909 04:56:17.613029 3005 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 9 04:56:17.613137 kubelet[3005]: I0909 04:56:17.613112 3005 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 04:56:17.613195 kubelet[3005]: E0909 04:56:17.613179 3005 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4452.0.0-n-7e0b6f01e2\" not found"
Sep 9 04:56:17.613366 kubelet[3005]: I0909 04:56:17.613348 3005 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 9 04:56:17.613416 kubelet[3005]: I0909 04:56:17.613407 3005 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 04:56:17.613909 kubelet[3005]: I0909 04:56:17.613894 3005 server.go:449] "Adding debug handlers to kubelet server"
Sep 9 04:56:17.614585 kubelet[3005]: I0909 04:56:17.614549 3005 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 04:56:17.614870 kubelet[3005]: I0909 04:56:17.614856 3005 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 04:56:17.615093 kubelet[3005]: I0909 04:56:17.615078 3005 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 04:56:17.616385 kubelet[3005]: W0909 04:56:17.616356 3005 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.20.14:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.14:6443: connect: connection refused
Sep 9 04:56:17.616480 kubelet[3005]: E0909 04:56:17.616467 3005 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.20.14:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.14:6443: connect: connection refused" logger="UnhandledError"
Sep 9 04:56:17.616658 kubelet[3005]: I0909 04:56:17.616646 3005 factory.go:221] Registration of the systemd container factory successfully
Sep 9 04:56:17.616784 kubelet[3005]: I0909 04:56:17.616772 3005 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 04:56:17.617199 kubelet[3005]: E0909 04:56:17.617179 3005 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.14:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4452.0.0-n-7e0b6f01e2?timeout=10s\": dial tcp 10.200.20.14:6443: connect: connection refused" interval="200ms"
Sep 9 04:56:17.618323 kubelet[3005]: I0909 04:56:17.618306 3005 factory.go:221] Registration of the containerd container factory successfully
Sep 9 04:56:17.641202 kubelet[3005]: I0909 04:56:17.641177 3005 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 9 04:56:17.641202 kubelet[3005]: I0909 04:56:17.641188 3005 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 9 04:56:17.641202 kubelet[3005]: I0909 04:56:17.641203 3005 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:56:17.651092 kubelet[3005]: I0909 04:56:17.651070 3005 policy_none.go:49] "None policy: Start"
Sep 9 04:56:17.651614 kubelet[3005]: I0909 04:56:17.651594 3005 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 9 04:56:17.651950 kubelet[3005]: I0909 04:56:17.651744 3005 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 04:56:17.659351 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 9 04:56:17.668587 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 9 04:56:17.671244 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 9 04:56:17.675615 kubelet[3005]: I0909 04:56:17.675579 3005 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 04:56:17.677067 kubelet[3005]: I0909 04:56:17.677047 3005 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 04:56:17.677067 kubelet[3005]: I0909 04:56:17.677069 3005 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 9 04:56:17.677067 kubelet[3005]: I0909 04:56:17.677084 3005 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 9 04:56:17.677067 kubelet[3005]: E0909 04:56:17.677117 3005 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 04:56:17.679506 kubelet[3005]: I0909 04:56:17.679479 3005 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 9 04:56:17.679651 kubelet[3005]: I0909 04:56:17.679636 3005 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 04:56:17.679673 kubelet[3005]: I0909 04:56:17.679650 3005 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 04:56:17.680187 kubelet[3005]: W0909 04:56:17.680032 3005 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.20.14:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.14:6443: connect: connection refused
Sep 9 04:56:17.680187 kubelet[3005]: E0909 04:56:17.680079 3005 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.20.14:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.14:6443: connect: connection refused" logger="UnhandledError"
Sep 9 04:56:17.680297 kubelet[3005]: I0909 04:56:17.680203 3005 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 9 04:56:17.683517 kubelet[3005]: E0909 04:56:17.683454 3005 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4452.0.0-n-7e0b6f01e2\" not found"
Sep 9 04:56:17.781647 kubelet[3005]: I0909 04:56:17.781041 3005 kubelet_node_status.go:72] "Attempting to register node" node="ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:17.781647 kubelet[3005]: E0909 04:56:17.781559 3005 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.20.14:6443/api/v1/nodes\": dial tcp 10.200.20.14:6443: connect: connection refused" node="ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:17.787416 systemd[1]: Created slice kubepods-burstable-podd19edae541aa0b0f7e65edddf6b965a4.slice - libcontainer container kubepods-burstable-podd19edae541aa0b0f7e65edddf6b965a4.slice.
Sep 9 04:56:17.804008 systemd[1]: Created slice kubepods-burstable-poddb74a5afe0418303cab7a1a4701132e2.slice - libcontainer container kubepods-burstable-poddb74a5afe0418303cab7a1a4701132e2.slice.
Sep 9 04:56:17.811102 systemd[1]: Created slice kubepods-burstable-podef750e0200571935cddf989173dfc5fe.slice - libcontainer container kubepods-burstable-podef750e0200571935cddf989173dfc5fe.slice.
Sep 9 04:56:17.817894 kubelet[3005]: E0909 04:56:17.817754 3005 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.14:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4452.0.0-n-7e0b6f01e2?timeout=10s\": dial tcp 10.200.20.14:6443: connect: connection refused" interval="400ms"
Sep 9 04:56:17.915206 kubelet[3005]: I0909 04:56:17.915031 3005 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d19edae541aa0b0f7e65edddf6b965a4-k8s-certs\") pod \"kube-apiserver-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"d19edae541aa0b0f7e65edddf6b965a4\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:17.915206 kubelet[3005]: I0909 04:56:17.915071 3005 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d19edae541aa0b0f7e65edddf6b965a4-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"d19edae541aa0b0f7e65edddf6b965a4\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:17.915206 kubelet[3005]: I0909 04:56:17.915084 3005 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/db74a5afe0418303cab7a1a4701132e2-ca-certs\") pod \"kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"db74a5afe0418303cab7a1a4701132e2\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:17.915206 kubelet[3005]: I0909 04:56:17.915097 3005 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/db74a5afe0418303cab7a1a4701132e2-flexvolume-dir\") pod \"kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"db74a5afe0418303cab7a1a4701132e2\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:17.915206 kubelet[3005]: I0909 04:56:17.915108 3005 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/db74a5afe0418303cab7a1a4701132e2-k8s-certs\") pod \"kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"db74a5afe0418303cab7a1a4701132e2\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:17.915414 kubelet[3005]: I0909 04:56:17.915117 3005 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ef750e0200571935cddf989173dfc5fe-kubeconfig\") pod \"kube-scheduler-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"ef750e0200571935cddf989173dfc5fe\") " pod="kube-system/kube-scheduler-ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:17.915414 kubelet[3005]: I0909 04:56:17.915126 3005 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d19edae541aa0b0f7e65edddf6b965a4-ca-certs\") pod \"kube-apiserver-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"d19edae541aa0b0f7e65edddf6b965a4\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:17.915414 kubelet[3005]: I0909 04:56:17.915135 3005 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/db74a5afe0418303cab7a1a4701132e2-kubeconfig\") pod \"kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"db74a5afe0418303cab7a1a4701132e2\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:17.915414 kubelet[3005]: I0909 04:56:17.915145 3005 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/db74a5afe0418303cab7a1a4701132e2-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"db74a5afe0418303cab7a1a4701132e2\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:17.983661 kubelet[3005]: I0909 04:56:17.983614 3005 kubelet_node_status.go:72] "Attempting to register node" node="ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:17.984236 kubelet[3005]: E0909 04:56:17.984210 3005 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.20.14:6443/api/v1/nodes\": dial tcp 10.200.20.14:6443: connect: connection refused" node="ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:18.102519 containerd[1873]: time="2025-09-09T04:56:18.102390532Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4452.0.0-n-7e0b6f01e2,Uid:d19edae541aa0b0f7e65edddf6b965a4,Namespace:kube-system,Attempt:0,}"
Sep 9 04:56:18.110031 containerd[1873]: time="2025-09-09T04:56:18.109999634Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2,Uid:db74a5afe0418303cab7a1a4701132e2,Namespace:kube-system,Attempt:0,}"
Sep 9 04:56:18.112897 containerd[1873]: time="2025-09-09T04:56:18.112865752Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4452.0.0-n-7e0b6f01e2,Uid:ef750e0200571935cddf989173dfc5fe,Namespace:kube-system,Attempt:0,}"
Sep 9 04:56:18.181689 containerd[1873]: time="2025-09-09T04:56:18.181159187Z" level=info msg="connecting to shim bcb0a02692892e5824b73d1eb16f327d0a8024da4bda4d95a1a1b81259656616" address="unix:///run/containerd/s/2c27a104bd29d08a88d057cfe02b1aeae4d2a04c2c22ee94e3d87fe2ff5fe300" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:56:18.200170 systemd[1]: Started cri-containerd-bcb0a02692892e5824b73d1eb16f327d0a8024da4bda4d95a1a1b81259656616.scope - libcontainer container bcb0a02692892e5824b73d1eb16f327d0a8024da4bda4d95a1a1b81259656616.
Sep 9 04:56:18.216766 containerd[1873]: time="2025-09-09T04:56:18.216213042Z" level=info msg="connecting to shim f2d8b772552abcd5e7a40236a910db1d9b69ad890061a1288b5f53c3bdd4d374" address="unix:///run/containerd/s/2690f08f5d8ea0f729181541697e671c249e30a068ec3aba9490ada60b140dcc" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:56:18.217414 containerd[1873]: time="2025-09-09T04:56:18.217140852Z" level=info msg="connecting to shim 5736562533e95d34f2f4efda20ec2e9f120257529b440903b34cf5cd2e8626de" address="unix:///run/containerd/s/1a100e3c1df341f351e6fc4e70cf8bd51d422259de06ac4c712f665453f4aa70" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:56:18.218798 kubelet[3005]: E0909 04:56:18.218747 3005 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.14:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4452.0.0-n-7e0b6f01e2?timeout=10s\": dial tcp 10.200.20.14:6443: connect: connection refused" interval="800ms"
Sep 9 04:56:18.243128 systemd[1]: Started cri-containerd-5736562533e95d34f2f4efda20ec2e9f120257529b440903b34cf5cd2e8626de.scope - libcontainer container 5736562533e95d34f2f4efda20ec2e9f120257529b440903b34cf5cd2e8626de.
Sep 9 04:56:18.246233 systemd[1]: Started cri-containerd-f2d8b772552abcd5e7a40236a910db1d9b69ad890061a1288b5f53c3bdd4d374.scope - libcontainer container f2d8b772552abcd5e7a40236a910db1d9b69ad890061a1288b5f53c3bdd4d374.
Sep 9 04:56:18.284740 containerd[1873]: time="2025-09-09T04:56:18.284556346Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2,Uid:db74a5afe0418303cab7a1a4701132e2,Namespace:kube-system,Attempt:0,} returns sandbox id \"bcb0a02692892e5824b73d1eb16f327d0a8024da4bda4d95a1a1b81259656616\""
Sep 9 04:56:18.291785 containerd[1873]: time="2025-09-09T04:56:18.291752577Z" level=info msg="CreateContainer within sandbox \"bcb0a02692892e5824b73d1eb16f327d0a8024da4bda4d95a1a1b81259656616\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Sep 9 04:56:18.295506 containerd[1873]: time="2025-09-09T04:56:18.295443839Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4452.0.0-n-7e0b6f01e2,Uid:ef750e0200571935cddf989173dfc5fe,Namespace:kube-system,Attempt:0,} returns sandbox id \"5736562533e95d34f2f4efda20ec2e9f120257529b440903b34cf5cd2e8626de\""
Sep 9 04:56:18.297425 containerd[1873]: time="2025-09-09T04:56:18.297365941Z" level=info msg="CreateContainer within sandbox \"5736562533e95d34f2f4efda20ec2e9f120257529b440903b34cf5cd2e8626de\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Sep 9 04:56:18.306922 containerd[1873]: time="2025-09-09T04:56:18.306883978Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4452.0.0-n-7e0b6f01e2,Uid:d19edae541aa0b0f7e65edddf6b965a4,Namespace:kube-system,Attempt:0,} returns sandbox id \"f2d8b772552abcd5e7a40236a910db1d9b69ad890061a1288b5f53c3bdd4d374\""
Sep 9 04:56:18.309291 containerd[1873]: time="2025-09-09T04:56:18.308757748Z" level=info msg="CreateContainer within sandbox \"f2d8b772552abcd5e7a40236a910db1d9b69ad890061a1288b5f53c3bdd4d374\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Sep 9 04:56:18.317558 containerd[1873]: time="2025-09-09T04:56:18.317526933Z" level=info msg="Container 5564aab578da110d39731a80fa40872a1102bc358c49f52cb68fc23011cd5afd: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:56:18.327048 containerd[1873]: time="2025-09-09T04:56:18.327011628Z" level=info msg="Container 0461759bcc6430713d2b6b3420e4cc06274cc67e4ff45fe1cbbe334f15d77ac6: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:56:18.353030 containerd[1873]: time="2025-09-09T04:56:18.352923785Z" level=info msg="CreateContainer within sandbox \"bcb0a02692892e5824b73d1eb16f327d0a8024da4bda4d95a1a1b81259656616\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"5564aab578da110d39731a80fa40872a1102bc358c49f52cb68fc23011cd5afd\""
Sep 9 04:56:18.353292 containerd[1873]: time="2025-09-09T04:56:18.353271556Z" level=info msg="Container 37b4daa79a5b929b240ec84b10ee8a2ac6542ab0b1b78bf284a24d7c33e5ff25: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:56:18.353871 containerd[1873]: time="2025-09-09T04:56:18.353764359Z" level=info msg="StartContainer for \"5564aab578da110d39731a80fa40872a1102bc358c49f52cb68fc23011cd5afd\""
Sep 9 04:56:18.354929 containerd[1873]: time="2025-09-09T04:56:18.354899700Z" level=info msg="connecting to shim 5564aab578da110d39731a80fa40872a1102bc358c49f52cb68fc23011cd5afd" address="unix:///run/containerd/s/2c27a104bd29d08a88d057cfe02b1aeae4d2a04c2c22ee94e3d87fe2ff5fe300" protocol=ttrpc version=3
Sep 9 04:56:18.361559 containerd[1873]: time="2025-09-09T04:56:18.361510765Z" level=info msg="CreateContainer within sandbox \"5736562533e95d34f2f4efda20ec2e9f120257529b440903b34cf5cd2e8626de\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"0461759bcc6430713d2b6b3420e4cc06274cc67e4ff45fe1cbbe334f15d77ac6\""
Sep 9 04:56:18.362361 containerd[1873]: time="2025-09-09T04:56:18.362328445Z" level=info msg="StartContainer for \"0461759bcc6430713d2b6b3420e4cc06274cc67e4ff45fe1cbbe334f15d77ac6\""
Sep 9 04:56:18.364030 containerd[1873]: time="2025-09-09T04:56:18.363881769Z" level=info msg="connecting to shim 0461759bcc6430713d2b6b3420e4cc06274cc67e4ff45fe1cbbe334f15d77ac6" address="unix:///run/containerd/s/1a100e3c1df341f351e6fc4e70cf8bd51d422259de06ac4c712f665453f4aa70" protocol=ttrpc version=3
Sep 9 04:56:18.371064 systemd[1]: Started cri-containerd-5564aab578da110d39731a80fa40872a1102bc358c49f52cb68fc23011cd5afd.scope - libcontainer container 5564aab578da110d39731a80fa40872a1102bc358c49f52cb68fc23011cd5afd.
Sep 9 04:56:18.373672 containerd[1873]: time="2025-09-09T04:56:18.373631360Z" level=info msg="CreateContainer within sandbox \"f2d8b772552abcd5e7a40236a910db1d9b69ad890061a1288b5f53c3bdd4d374\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"37b4daa79a5b929b240ec84b10ee8a2ac6542ab0b1b78bf284a24d7c33e5ff25\""
Sep 9 04:56:18.374480 containerd[1873]: time="2025-09-09T04:56:18.374428985Z" level=info msg="StartContainer for \"37b4daa79a5b929b240ec84b10ee8a2ac6542ab0b1b78bf284a24d7c33e5ff25\""
Sep 9 04:56:18.375873 containerd[1873]: time="2025-09-09T04:56:18.375834894Z" level=info msg="connecting to shim 37b4daa79a5b929b240ec84b10ee8a2ac6542ab0b1b78bf284a24d7c33e5ff25" address="unix:///run/containerd/s/2690f08f5d8ea0f729181541697e671c249e30a068ec3aba9490ada60b140dcc" protocol=ttrpc version=3
Sep 9 04:56:18.386470 kubelet[3005]: I0909 04:56:18.386383 3005 kubelet_node_status.go:72] "Attempting to register node" node="ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:18.387105 kubelet[3005]: E0909 04:56:18.387036 3005 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.200.20.14:6443/api/v1/nodes\": dial tcp 10.200.20.14:6443: connect: connection refused" node="ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:18.387139 systemd[1]: Started cri-containerd-0461759bcc6430713d2b6b3420e4cc06274cc67e4ff45fe1cbbe334f15d77ac6.scope - libcontainer container 0461759bcc6430713d2b6b3420e4cc06274cc67e4ff45fe1cbbe334f15d77ac6.
Sep 9 04:56:18.406175 systemd[1]: Started cri-containerd-37b4daa79a5b929b240ec84b10ee8a2ac6542ab0b1b78bf284a24d7c33e5ff25.scope - libcontainer container 37b4daa79a5b929b240ec84b10ee8a2ac6542ab0b1b78bf284a24d7c33e5ff25.
Sep 9 04:56:18.425406 containerd[1873]: time="2025-09-09T04:56:18.425342591Z" level=info msg="StartContainer for \"5564aab578da110d39731a80fa40872a1102bc358c49f52cb68fc23011cd5afd\" returns successfully"
Sep 9 04:56:18.466743 containerd[1873]: time="2025-09-09T04:56:18.466697363Z" level=info msg="StartContainer for \"37b4daa79a5b929b240ec84b10ee8a2ac6542ab0b1b78bf284a24d7c33e5ff25\" returns successfully"
Sep 9 04:56:18.467183 containerd[1873]: time="2025-09-09T04:56:18.467070765Z" level=info msg="StartContainer for \"0461759bcc6430713d2b6b3420e4cc06274cc67e4ff45fe1cbbe334f15d77ac6\" returns successfully"
Sep 9 04:56:19.188557 kubelet[3005]: I0909 04:56:19.188526 3005 kubelet_node_status.go:72] "Attempting to register node" node="ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:19.722590 kubelet[3005]: E0909 04:56:19.722548 3005 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4452.0.0-n-7e0b6f01e2\" not found" node="ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:19.803553 kubelet[3005]: I0909 04:56:19.803369 3005 kubelet_node_status.go:75] "Successfully registered node" node="ci-4452.0.0-n-7e0b6f01e2"
Sep 9 04:56:20.604388 kubelet[3005]: I0909 04:56:20.604116 3005 apiserver.go:52] "Watching apiserver"
Sep 9 04:56:20.614322 kubelet[3005]: I0909 04:56:20.614289 3005 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Sep 9 04:56:20.713047 kubelet[3005]: W0909 04:56:20.712932 3005 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 9 04:56:22.173149 systemd[1]: Reload requested from client PID 3276 ('systemctl') (unit session-9.scope)...
Sep 9 04:56:22.173164 systemd[1]: Reloading...
Sep 9 04:56:22.257027 zram_generator::config[3323]: No configuration found.
Sep 9 04:56:22.413735 systemd[1]: Reloading finished in 240 ms.
Sep 9 04:56:22.450837 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:56:22.459429 systemd[1]: kubelet.service: Deactivated successfully.
Sep 9 04:56:22.461019 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:56:22.461075 systemd[1]: kubelet.service: Consumed 616ms CPU time, 127.5M memory peak.
Sep 9 04:56:22.463525 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:56:22.576544 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:56:22.584312 (kubelet)[3387]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 04:56:22.618941 kubelet[3387]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:56:22.619829 kubelet[3387]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 9 04:56:22.619829 kubelet[3387]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:56:22.619829 kubelet[3387]: I0909 04:56:22.619406 3387 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 04:56:22.627484 kubelet[3387]: I0909 04:56:22.627447 3387 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 9 04:56:22.627484 kubelet[3387]: I0909 04:56:22.627477 3387 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 04:56:22.627679 kubelet[3387]: I0909 04:56:22.627661 3387 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 9 04:56:22.628719 kubelet[3387]: I0909 04:56:22.628693 3387 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Sep 9 04:56:22.630508 kubelet[3387]: I0909 04:56:22.630282 3387 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 04:56:22.633833 kubelet[3387]: I0909 04:56:22.633802 3387 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 04:56:22.637437 kubelet[3387]: I0909 04:56:22.636759 3387 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 04:56:22.637437 kubelet[3387]: I0909 04:56:22.636883 3387 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 9 04:56:22.637437 kubelet[3387]: I0909 04:56:22.636961 3387 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 04:56:22.637437 kubelet[3387]: I0909 04:56:22.637004 3387 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4452.0.0-n-7e0b6f01e2","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 04:56:22.637648 kubelet[3387]: I0909 04:56:22.637192 3387 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 04:56:22.637648 kubelet[3387]: I0909 04:56:22.637200 3387 container_manager_linux.go:300] "Creating device plugin manager"
Sep 9 04:56:22.637648 kubelet[3387]: I0909 04:56:22.637230 3387 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:56:22.637648 kubelet[3387]: I0909 04:56:22.637331 3387 kubelet.go:408] "Attempting to sync node with API server"
Sep 9 04:56:22.637648 kubelet[3387]: I0909 04:56:22.637340 3387 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 04:56:22.637648 kubelet[3387]: I0909 04:56:22.637356 3387 kubelet.go:314] "Adding apiserver pod source"
Sep 9 04:56:22.637648 kubelet[3387]: I0909 04:56:22.637366 3387 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 04:56:22.643638 kubelet[3387]: I0909 04:56:22.643537 3387 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 04:56:22.643940 kubelet[3387]: I0909 04:56:22.643874 3387 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 04:56:22.645091 kubelet[3387]: I0909 04:56:22.645061 3387 server.go:1274] "Started kubelet"
Sep 9 04:56:22.647200 kubelet[3387]: I0909 04:56:22.647135 3387 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 04:56:22.652819 kubelet[3387]: I0909 04:56:22.650851 3387 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 04:56:22.655489 kubelet[3387]: I0909 04:56:22.654346 3387 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 04:56:22.655918 kubelet[3387]: I0909 04:56:22.655878 3387 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 04:56:22.656571 kubelet[3387]: I0909 04:56:22.656385 3387 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 04:56:22.657102 kubelet[3387]: I0909 04:56:22.657075 3387 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 9 04:56:22.657374 kubelet[3387]: E0909 04:56:22.657269 3387 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4452.0.0-n-7e0b6f01e2\" not found"
Sep 9 04:56:22.658485 kubelet[3387]: I0909 04:56:22.658313 3387 server.go:449] "Adding debug handlers to kubelet server"
Sep 9 04:56:22.661140 kubelet[3387]: I0909 04:56:22.660948 3387 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 9 04:56:22.661140 kubelet[3387]: I0909 04:56:22.661098 3387 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 04:56:22.663194 kubelet[3387]: I0909 04:56:22.662951 3387 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 04:56:22.664888 kubelet[3387]: I0909 04:56:22.664771 3387 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 04:56:22.664888 kubelet[3387]: I0909 04:56:22.664800 3387 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 9 04:56:22.664888 kubelet[3387]: I0909 04:56:22.664818 3387 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 9 04:56:22.664888 kubelet[3387]: E0909 04:56:22.664865 3387 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 04:56:22.681634 kubelet[3387]: I0909 04:56:22.681597 3387 factory.go:221] Registration of the containerd container factory successfully
Sep 9 04:56:22.681634 kubelet[3387]: I0909 04:56:22.681619 3387 factory.go:221] Registration of the systemd container factory successfully
Sep 9 04:56:22.681803 kubelet[3387]: I0909 04:56:22.681708 3387 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 04:56:22.682155 kubelet[3387]: E0909 04:56:22.682135 3387 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 04:56:22.722931 kubelet[3387]: I0909 04:56:22.721753 3387 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 9 04:56:22.722931 kubelet[3387]: I0909 04:56:22.721773 3387 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 9 04:56:22.722931 kubelet[3387]: I0909 04:56:22.721797 3387 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:56:22.722931 kubelet[3387]: I0909 04:56:22.721945 3387 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 9 04:56:22.722931 kubelet[3387]: I0909 04:56:22.721953 3387 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 9 04:56:22.722931 kubelet[3387]: I0909 04:56:22.721969 3387 policy_none.go:49] "None policy: Start"
Sep 9 04:56:22.722931 kubelet[3387]: I0909 04:56:22.722672 3387 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 9 04:56:22.722931 kubelet[3387]: I0909 04:56:22.722695 3387 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 04:56:22.722931 kubelet[3387]: I0909 04:56:22.722851 3387 state_mem.go:75] "Updated machine memory state"
Sep 9 04:56:22.727247 kubelet[3387]: I0909 04:56:22.727221 3387 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 9 04:56:22.728707 kubelet[3387]: I0909 04:56:22.728681 3387 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 04:56:22.728902 kubelet[3387]: I0909 04:56:22.728704 3387 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 04:56:22.729550 kubelet[3387]: I0909 04:56:22.729511 3387 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 9 04:56:22.784320 kubelet[3387]: W0909 04:56:22.784282 3387 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 9
04:56:22.784941 kubelet[3387]: W0909 04:56:22.784591 3387 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 04:56:22.784941 kubelet[3387]: W0909 04:56:22.784782 3387 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 04:56:22.784941 kubelet[3387]: E0909 04:56:22.784821 3387 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4452.0.0-n-7e0b6f01e2\" already exists" pod="kube-system/kube-apiserver-ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:22.832329 kubelet[3387]: I0909 04:56:22.832308 3387 kubelet_node_status.go:72] "Attempting to register node" node="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:22.845761 kubelet[3387]: I0909 04:56:22.845725 3387 kubelet_node_status.go:111] "Node was previously registered" node="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:22.845910 kubelet[3387]: I0909 04:56:22.845821 3387 kubelet_node_status.go:75] "Successfully registered node" node="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:22.962123 kubelet[3387]: I0909 04:56:22.961910 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ef750e0200571935cddf989173dfc5fe-kubeconfig\") pod \"kube-scheduler-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"ef750e0200571935cddf989173dfc5fe\") " pod="kube-system/kube-scheduler-ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:22.962123 kubelet[3387]: I0909 04:56:22.961938 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d19edae541aa0b0f7e65edddf6b965a4-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"d19edae541aa0b0f7e65edddf6b965a4\") " 
pod="kube-system/kube-apiserver-ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:22.962123 kubelet[3387]: I0909 04:56:22.961956 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/db74a5afe0418303cab7a1a4701132e2-flexvolume-dir\") pod \"kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"db74a5afe0418303cab7a1a4701132e2\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:22.962123 kubelet[3387]: I0909 04:56:22.961966 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/db74a5afe0418303cab7a1a4701132e2-k8s-certs\") pod \"kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"db74a5afe0418303cab7a1a4701132e2\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:22.962123 kubelet[3387]: I0909 04:56:22.961990 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d19edae541aa0b0f7e65edddf6b965a4-ca-certs\") pod \"kube-apiserver-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"d19edae541aa0b0f7e65edddf6b965a4\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:22.962321 kubelet[3387]: I0909 04:56:22.962002 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d19edae541aa0b0f7e65edddf6b965a4-k8s-certs\") pod \"kube-apiserver-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"d19edae541aa0b0f7e65edddf6b965a4\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:22.962321 kubelet[3387]: I0909 04:56:22.962011 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/db74a5afe0418303cab7a1a4701132e2-ca-certs\") 
pod \"kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"db74a5afe0418303cab7a1a4701132e2\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:22.962321 kubelet[3387]: I0909 04:56:22.962021 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/db74a5afe0418303cab7a1a4701132e2-kubeconfig\") pod \"kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"db74a5afe0418303cab7a1a4701132e2\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:22.962321 kubelet[3387]: I0909 04:56:22.962031 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/db74a5afe0418303cab7a1a4701132e2-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2\" (UID: \"db74a5afe0418303cab7a1a4701132e2\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:23.642081 kubelet[3387]: I0909 04:56:23.642004 3387 apiserver.go:52] "Watching apiserver" Sep 9 04:56:23.661666 kubelet[3387]: I0909 04:56:23.661623 3387 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 9 04:56:23.724066 kubelet[3387]: W0909 04:56:23.724029 3387 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 04:56:23.724350 kubelet[3387]: E0909 04:56:23.724244 3387 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-4452.0.0-n-7e0b6f01e2\" already exists" pod="kube-system/kube-scheduler-ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:23.724797 kubelet[3387]: W0909 04:56:23.724688 3387 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not 
contain dots] Sep 9 04:56:23.724797 kubelet[3387]: E0909 04:56:23.724727 3387 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4452.0.0-n-7e0b6f01e2\" already exists" pod="kube-system/kube-apiserver-ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:23.743731 kubelet[3387]: I0909 04:56:23.743611 3387 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4452.0.0-n-7e0b6f01e2" podStartSLOduration=3.743562669 podStartE2EDuration="3.743562669s" podCreationTimestamp="2025-09-09 04:56:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:56:23.743459187 +0000 UTC m=+1.155599539" watchObservedRunningTime="2025-09-09 04:56:23.743562669 +0000 UTC m=+1.155703029" Sep 9 04:56:23.770263 kubelet[3387]: I0909 04:56:23.770193 3387 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7e0b6f01e2" podStartSLOduration=1.770176515 podStartE2EDuration="1.770176515s" podCreationTimestamp="2025-09-09 04:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:56:23.757633502 +0000 UTC m=+1.169773854" watchObservedRunningTime="2025-09-09 04:56:23.770176515 +0000 UTC m=+1.182316867" Sep 9 04:56:28.104262 kubelet[3387]: I0909 04:56:28.104225 3387 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 9 04:56:28.104607 containerd[1873]: time="2025-09-09T04:56:28.104472939Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Sep 9 04:56:28.104752 kubelet[3387]: I0909 04:56:28.104619 3387 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 9 04:56:28.557535 kubelet[3387]: I0909 04:56:28.557481 3387 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4452.0.0-n-7e0b6f01e2" podStartSLOduration=6.557465389 podStartE2EDuration="6.557465389s" podCreationTimestamp="2025-09-09 04:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:56:23.770748801 +0000 UTC m=+1.182889153" watchObservedRunningTime="2025-09-09 04:56:28.557465389 +0000 UTC m=+5.969605741" Sep 9 04:56:28.565671 systemd[1]: Created slice kubepods-besteffort-pod231ba4aa_0d48_4970_a4f8_20fa954f207d.slice - libcontainer container kubepods-besteffort-pod231ba4aa_0d48_4970_a4f8_20fa954f207d.slice. Sep 9 04:56:28.594067 kubelet[3387]: I0909 04:56:28.594009 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/231ba4aa-0d48-4970-a4f8-20fa954f207d-xtables-lock\") pod \"kube-proxy-2rct9\" (UID: \"231ba4aa-0d48-4970-a4f8-20fa954f207d\") " pod="kube-system/kube-proxy-2rct9" Sep 9 04:56:28.594067 kubelet[3387]: I0909 04:56:28.594055 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/231ba4aa-0d48-4970-a4f8-20fa954f207d-lib-modules\") pod \"kube-proxy-2rct9\" (UID: \"231ba4aa-0d48-4970-a4f8-20fa954f207d\") " pod="kube-system/kube-proxy-2rct9" Sep 9 04:56:28.594264 kubelet[3387]: I0909 04:56:28.594207 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzdsz\" (UniqueName: \"kubernetes.io/projected/231ba4aa-0d48-4970-a4f8-20fa954f207d-kube-api-access-hzdsz\") pod \"kube-proxy-2rct9\" 
(UID: \"231ba4aa-0d48-4970-a4f8-20fa954f207d\") " pod="kube-system/kube-proxy-2rct9" Sep 9 04:56:28.594324 kubelet[3387]: I0909 04:56:28.594249 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/231ba4aa-0d48-4970-a4f8-20fa954f207d-kube-proxy\") pod \"kube-proxy-2rct9\" (UID: \"231ba4aa-0d48-4970-a4f8-20fa954f207d\") " pod="kube-system/kube-proxy-2rct9" Sep 9 04:56:28.882133 containerd[1873]: time="2025-09-09T04:56:28.881154542Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-2rct9,Uid:231ba4aa-0d48-4970-a4f8-20fa954f207d,Namespace:kube-system,Attempt:0,}" Sep 9 04:56:28.945429 containerd[1873]: time="2025-09-09T04:56:28.945376140Z" level=info msg="connecting to shim e09b7a88df02e5584a87579d73dd416ddc782386b4ac2dd3fdc30aff6ced6cb9" address="unix:///run/containerd/s/55ff50b43aa86f9b17b55bf8006f878d42e377ae94ddd2b505fa60cf4a6c4237" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:56:28.968255 systemd[1]: Started cri-containerd-e09b7a88df02e5584a87579d73dd416ddc782386b4ac2dd3fdc30aff6ced6cb9.scope - libcontainer container e09b7a88df02e5584a87579d73dd416ddc782386b4ac2dd3fdc30aff6ced6cb9. 
Sep 9 04:56:28.991224 containerd[1873]: time="2025-09-09T04:56:28.991123726Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-2rct9,Uid:231ba4aa-0d48-4970-a4f8-20fa954f207d,Namespace:kube-system,Attempt:0,} returns sandbox id \"e09b7a88df02e5584a87579d73dd416ddc782386b4ac2dd3fdc30aff6ced6cb9\"" Sep 9 04:56:28.994636 containerd[1873]: time="2025-09-09T04:56:28.994602847Z" level=info msg="CreateContainer within sandbox \"e09b7a88df02e5584a87579d73dd416ddc782386b4ac2dd3fdc30aff6ced6cb9\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 9 04:56:29.020011 containerd[1873]: time="2025-09-09T04:56:29.019357212Z" level=info msg="Container e726dab29f54344d22e539063dbcee2cb1ea80f62e91513fcc8963904a22d457: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:29.021235 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount473242520.mount: Deactivated successfully. Sep 9 04:56:29.036318 containerd[1873]: time="2025-09-09T04:56:29.036287595Z" level=info msg="CreateContainer within sandbox \"e09b7a88df02e5584a87579d73dd416ddc782386b4ac2dd3fdc30aff6ced6cb9\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"e726dab29f54344d22e539063dbcee2cb1ea80f62e91513fcc8963904a22d457\"" Sep 9 04:56:29.037100 containerd[1873]: time="2025-09-09T04:56:29.036999177Z" level=info msg="StartContainer for \"e726dab29f54344d22e539063dbcee2cb1ea80f62e91513fcc8963904a22d457\"" Sep 9 04:56:29.038207 containerd[1873]: time="2025-09-09T04:56:29.038163331Z" level=info msg="connecting to shim e726dab29f54344d22e539063dbcee2cb1ea80f62e91513fcc8963904a22d457" address="unix:///run/containerd/s/55ff50b43aa86f9b17b55bf8006f878d42e377ae94ddd2b505fa60cf4a6c4237" protocol=ttrpc version=3 Sep 9 04:56:29.056082 systemd[1]: Started cri-containerd-e726dab29f54344d22e539063dbcee2cb1ea80f62e91513fcc8963904a22d457.scope - libcontainer container e726dab29f54344d22e539063dbcee2cb1ea80f62e91513fcc8963904a22d457. 
Sep 9 04:56:29.084435 containerd[1873]: time="2025-09-09T04:56:29.084335924Z" level=info msg="StartContainer for \"e726dab29f54344d22e539063dbcee2cb1ea80f62e91513fcc8963904a22d457\" returns successfully" Sep 9 04:56:29.140846 systemd[1]: Created slice kubepods-besteffort-pod742b762c_7506_458d_9e0b_bb380a73913c.slice - libcontainer container kubepods-besteffort-pod742b762c_7506_458d_9e0b_bb380a73913c.slice. Sep 9 04:56:29.197563 kubelet[3387]: I0909 04:56:29.197527 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/742b762c-7506-458d-9e0b-bb380a73913c-var-lib-calico\") pod \"tigera-operator-58fc44c59b-6bx8x\" (UID: \"742b762c-7506-458d-9e0b-bb380a73913c\") " pod="tigera-operator/tigera-operator-58fc44c59b-6bx8x" Sep 9 04:56:29.197563 kubelet[3387]: I0909 04:56:29.197563 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2nzh\" (UniqueName: \"kubernetes.io/projected/742b762c-7506-458d-9e0b-bb380a73913c-kube-api-access-k2nzh\") pod \"tigera-operator-58fc44c59b-6bx8x\" (UID: \"742b762c-7506-458d-9e0b-bb380a73913c\") " pod="tigera-operator/tigera-operator-58fc44c59b-6bx8x" Sep 9 04:56:29.447130 containerd[1873]: time="2025-09-09T04:56:29.447090791Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-6bx8x,Uid:742b762c-7506-458d-9e0b-bb380a73913c,Namespace:tigera-operator,Attempt:0,}" Sep 9 04:56:29.498335 containerd[1873]: time="2025-09-09T04:56:29.498299531Z" level=info msg="connecting to shim 94c4a04c1ae64db27bc9aeb5cb18eb6af7bef1a22b4360999de69a1c1d4766ed" address="unix:///run/containerd/s/64a11aa37504db5e102f4be98791ece0d8f5b7babfc2270571acfea6ea3d1e69" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:56:29.517099 systemd[1]: Started cri-containerd-94c4a04c1ae64db27bc9aeb5cb18eb6af7bef1a22b4360999de69a1c1d4766ed.scope - libcontainer container 
94c4a04c1ae64db27bc9aeb5cb18eb6af7bef1a22b4360999de69a1c1d4766ed. Sep 9 04:56:29.554081 containerd[1873]: time="2025-09-09T04:56:29.554039074Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-6bx8x,Uid:742b762c-7506-458d-9e0b-bb380a73913c,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"94c4a04c1ae64db27bc9aeb5cb18eb6af7bef1a22b4360999de69a1c1d4766ed\"" Sep 9 04:56:29.556255 containerd[1873]: time="2025-09-09T04:56:29.556070977Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 9 04:56:31.294356 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount709978769.mount: Deactivated successfully. Sep 9 04:56:31.604132 containerd[1873]: time="2025-09-09T04:56:31.603890768Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:31.606219 containerd[1873]: time="2025-09-09T04:56:31.606093072Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 9 04:56:31.608674 containerd[1873]: time="2025-09-09T04:56:31.608651013Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:31.613931 containerd[1873]: time="2025-09-09T04:56:31.613906648Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:31.614328 containerd[1873]: time="2025-09-09T04:56:31.614215015Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size 
\"22148360\" in 2.058118583s" Sep 9 04:56:31.614328 containerd[1873]: time="2025-09-09T04:56:31.614240246Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 9 04:56:31.616889 containerd[1873]: time="2025-09-09T04:56:31.616830505Z" level=info msg="CreateContainer within sandbox \"94c4a04c1ae64db27bc9aeb5cb18eb6af7bef1a22b4360999de69a1c1d4766ed\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 9 04:56:31.641541 containerd[1873]: time="2025-09-09T04:56:31.641188061Z" level=info msg="Container 46090cd92e0b3febac506ae20e166bc8e1866d2d4beab1e06503c91693d072a5: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:31.653450 containerd[1873]: time="2025-09-09T04:56:31.653422484Z" level=info msg="CreateContainer within sandbox \"94c4a04c1ae64db27bc9aeb5cb18eb6af7bef1a22b4360999de69a1c1d4766ed\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"46090cd92e0b3febac506ae20e166bc8e1866d2d4beab1e06503c91693d072a5\"" Sep 9 04:56:31.653943 containerd[1873]: time="2025-09-09T04:56:31.653835317Z" level=info msg="StartContainer for \"46090cd92e0b3febac506ae20e166bc8e1866d2d4beab1e06503c91693d072a5\"" Sep 9 04:56:31.654483 containerd[1873]: time="2025-09-09T04:56:31.654450844Z" level=info msg="connecting to shim 46090cd92e0b3febac506ae20e166bc8e1866d2d4beab1e06503c91693d072a5" address="unix:///run/containerd/s/64a11aa37504db5e102f4be98791ece0d8f5b7babfc2270571acfea6ea3d1e69" protocol=ttrpc version=3 Sep 9 04:56:31.671094 systemd[1]: Started cri-containerd-46090cd92e0b3febac506ae20e166bc8e1866d2d4beab1e06503c91693d072a5.scope - libcontainer container 46090cd92e0b3febac506ae20e166bc8e1866d2d4beab1e06503c91693d072a5. 
Sep 9 04:56:31.697620 containerd[1873]: time="2025-09-09T04:56:31.697597539Z" level=info msg="StartContainer for \"46090cd92e0b3febac506ae20e166bc8e1866d2d4beab1e06503c91693d072a5\" returns successfully" Sep 9 04:56:31.734493 kubelet[3387]: I0909 04:56:31.734359 3387 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-2rct9" podStartSLOduration=3.734343662 podStartE2EDuration="3.734343662s" podCreationTimestamp="2025-09-09 04:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:56:29.731566401 +0000 UTC m=+7.143706753" watchObservedRunningTime="2025-09-09 04:56:31.734343662 +0000 UTC m=+9.146484014" Sep 9 04:56:34.893757 kubelet[3387]: I0909 04:56:34.893057 3387 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-6bx8x" podStartSLOduration=3.833119907 podStartE2EDuration="5.893043339s" podCreationTimestamp="2025-09-09 04:56:29 +0000 UTC" firstStartedPulling="2025-09-09 04:56:29.555099827 +0000 UTC m=+6.967240179" lastFinishedPulling="2025-09-09 04:56:31.615023259 +0000 UTC m=+9.027163611" observedRunningTime="2025-09-09 04:56:31.735635624 +0000 UTC m=+9.147775984" watchObservedRunningTime="2025-09-09 04:56:34.893043339 +0000 UTC m=+12.305183691" Sep 9 04:56:36.842170 sudo[2368]: pam_unix(sudo:session): session closed for user root Sep 9 04:56:36.912068 sshd[2367]: Connection closed by 10.200.16.10 port 54850 Sep 9 04:56:36.914154 sshd-session[2364]: pam_unix(sshd:session): session closed for user core Sep 9 04:56:36.917204 systemd-logind[1850]: Session 9 logged out. Waiting for processes to exit. Sep 9 04:56:36.918249 systemd[1]: sshd@6-10.200.20.14:22-10.200.16.10:54850.service: Deactivated successfully. Sep 9 04:56:36.921439 systemd[1]: session-9.scope: Deactivated successfully. 
Sep 9 04:56:36.921938 systemd[1]: session-9.scope: Consumed 3.255s CPU time, 217.7M memory peak. Sep 9 04:56:36.924928 systemd-logind[1850]: Removed session 9. Sep 9 04:56:40.303957 systemd[1]: Created slice kubepods-besteffort-podad67c92b_1b5e_4d3c_aa98_1d8d12a207ff.slice - libcontainer container kubepods-besteffort-podad67c92b_1b5e_4d3c_aa98_1d8d12a207ff.slice. Sep 9 04:56:40.354501 kubelet[3387]: I0909 04:56:40.354371 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad67c92b-1b5e-4d3c-aa98-1d8d12a207ff-tigera-ca-bundle\") pod \"calico-typha-5bb8b545-k77vj\" (UID: \"ad67c92b-1b5e-4d3c-aa98-1d8d12a207ff\") " pod="calico-system/calico-typha-5bb8b545-k77vj" Sep 9 04:56:40.354501 kubelet[3387]: I0909 04:56:40.354413 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbr7k\" (UniqueName: \"kubernetes.io/projected/ad67c92b-1b5e-4d3c-aa98-1d8d12a207ff-kube-api-access-sbr7k\") pod \"calico-typha-5bb8b545-k77vj\" (UID: \"ad67c92b-1b5e-4d3c-aa98-1d8d12a207ff\") " pod="calico-system/calico-typha-5bb8b545-k77vj" Sep 9 04:56:40.354501 kubelet[3387]: I0909 04:56:40.354425 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/ad67c92b-1b5e-4d3c-aa98-1d8d12a207ff-typha-certs\") pod \"calico-typha-5bb8b545-k77vj\" (UID: \"ad67c92b-1b5e-4d3c-aa98-1d8d12a207ff\") " pod="calico-system/calico-typha-5bb8b545-k77vj" Sep 9 04:56:40.425960 systemd[1]: Created slice kubepods-besteffort-pod8efd1199_c04b_4e98_8b19_23a3251529a7.slice - libcontainer container kubepods-besteffort-pod8efd1199_c04b_4e98_8b19_23a3251529a7.slice. 
Sep 9 04:56:40.456640 kubelet[3387]: I0909 04:56:40.456539 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8efd1199-c04b-4e98-8b19-23a3251529a7-var-lib-calico\") pod \"calico-node-v4xzh\" (UID: \"8efd1199-c04b-4e98-8b19-23a3251529a7\") " pod="calico-system/calico-node-v4xzh" Sep 9 04:56:40.456640 kubelet[3387]: I0909 04:56:40.456580 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8efd1199-c04b-4e98-8b19-23a3251529a7-lib-modules\") pod \"calico-node-v4xzh\" (UID: \"8efd1199-c04b-4e98-8b19-23a3251529a7\") " pod="calico-system/calico-node-v4xzh" Sep 9 04:56:40.456640 kubelet[3387]: I0909 04:56:40.456594 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/8efd1199-c04b-4e98-8b19-23a3251529a7-flexvol-driver-host\") pod \"calico-node-v4xzh\" (UID: \"8efd1199-c04b-4e98-8b19-23a3251529a7\") " pod="calico-system/calico-node-v4xzh" Sep 9 04:56:40.456640 kubelet[3387]: I0909 04:56:40.456605 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/8efd1199-c04b-4e98-8b19-23a3251529a7-cni-log-dir\") pod \"calico-node-v4xzh\" (UID: \"8efd1199-c04b-4e98-8b19-23a3251529a7\") " pod="calico-system/calico-node-v4xzh" Sep 9 04:56:40.456640 kubelet[3387]: I0909 04:56:40.456617 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/8efd1199-c04b-4e98-8b19-23a3251529a7-cni-bin-dir\") pod \"calico-node-v4xzh\" (UID: \"8efd1199-c04b-4e98-8b19-23a3251529a7\") " pod="calico-system/calico-node-v4xzh" Sep 9 04:56:40.457079 kubelet[3387]: I0909 04:56:40.456627 3387 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/8efd1199-c04b-4e98-8b19-23a3251529a7-node-certs\") pod \"calico-node-v4xzh\" (UID: \"8efd1199-c04b-4e98-8b19-23a3251529a7\") " pod="calico-system/calico-node-v4xzh" Sep 9 04:56:40.457079 kubelet[3387]: I0909 04:56:40.456647 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/8efd1199-c04b-4e98-8b19-23a3251529a7-cni-net-dir\") pod \"calico-node-v4xzh\" (UID: \"8efd1199-c04b-4e98-8b19-23a3251529a7\") " pod="calico-system/calico-node-v4xzh" Sep 9 04:56:40.457079 kubelet[3387]: I0909 04:56:40.456662 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8efd1199-c04b-4e98-8b19-23a3251529a7-tigera-ca-bundle\") pod \"calico-node-v4xzh\" (UID: \"8efd1199-c04b-4e98-8b19-23a3251529a7\") " pod="calico-system/calico-node-v4xzh" Sep 9 04:56:40.457079 kubelet[3387]: I0909 04:56:40.456674 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/8efd1199-c04b-4e98-8b19-23a3251529a7-var-run-calico\") pod \"calico-node-v4xzh\" (UID: \"8efd1199-c04b-4e98-8b19-23a3251529a7\") " pod="calico-system/calico-node-v4xzh" Sep 9 04:56:40.457079 kubelet[3387]: I0909 04:56:40.456684 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8efd1199-c04b-4e98-8b19-23a3251529a7-xtables-lock\") pod \"calico-node-v4xzh\" (UID: \"8efd1199-c04b-4e98-8b19-23a3251529a7\") " pod="calico-system/calico-node-v4xzh" Sep 9 04:56:40.457159 kubelet[3387]: I0909 04:56:40.456698 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/8efd1199-c04b-4e98-8b19-23a3251529a7-policysync\") pod \"calico-node-v4xzh\" (UID: \"8efd1199-c04b-4e98-8b19-23a3251529a7\") " pod="calico-system/calico-node-v4xzh" Sep 9 04:56:40.457159 kubelet[3387]: I0909 04:56:40.456708 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rnb6\" (UniqueName: \"kubernetes.io/projected/8efd1199-c04b-4e98-8b19-23a3251529a7-kube-api-access-9rnb6\") pod \"calico-node-v4xzh\" (UID: \"8efd1199-c04b-4e98-8b19-23a3251529a7\") " pod="calico-system/calico-node-v4xzh" Sep 9 04:56:40.559731 kubelet[3387]: E0909 04:56:40.559632 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.560366 kubelet[3387]: W0909 04:56:40.560166 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.560366 kubelet[3387]: E0909 04:56:40.560192 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.564727 kubelet[3387]: E0909 04:56:40.564530 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.564727 kubelet[3387]: W0909 04:56:40.564547 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.564727 kubelet[3387]: E0909 04:56:40.564565 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.565302 kubelet[3387]: E0909 04:56:40.564751 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.565302 kubelet[3387]: W0909 04:56:40.564765 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.565302 kubelet[3387]: E0909 04:56:40.564841 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.565302 kubelet[3387]: E0909 04:56:40.565250 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.565302 kubelet[3387]: W0909 04:56:40.565262 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.566179 kubelet[3387]: E0909 04:56:40.565342 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.566179 kubelet[3387]: E0909 04:56:40.565671 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.566179 kubelet[3387]: W0909 04:56:40.565685 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.566179 kubelet[3387]: E0909 04:56:40.565771 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.567105 kubelet[3387]: E0909 04:56:40.567076 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.567105 kubelet[3387]: W0909 04:56:40.567097 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.567181 kubelet[3387]: E0909 04:56:40.567110 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.574197 kubelet[3387]: E0909 04:56:40.574164 3387 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-v9nht" podUID="2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19" Sep 9 04:56:40.585266 kubelet[3387]: E0909 04:56:40.585246 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.585384 kubelet[3387]: W0909 04:56:40.585341 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.585384 kubelet[3387]: E0909 04:56:40.585360 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.610554 containerd[1873]: time="2025-09-09T04:56:40.610515741Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5bb8b545-k77vj,Uid:ad67c92b-1b5e-4d3c-aa98-1d8d12a207ff,Namespace:calico-system,Attempt:0,}" Sep 9 04:56:40.656274 kubelet[3387]: E0909 04:56:40.656193 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.656274 kubelet[3387]: W0909 04:56:40.656222 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.656274 kubelet[3387]: E0909 04:56:40.656241 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.656571 kubelet[3387]: E0909 04:56:40.656548 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.656571 kubelet[3387]: W0909 04:56:40.656559 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.656762 kubelet[3387]: E0909 04:56:40.656647 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.659103 kubelet[3387]: E0909 04:56:40.659039 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.659103 kubelet[3387]: W0909 04:56:40.659053 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.659274 kubelet[3387]: E0909 04:56:40.659203 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.659767 kubelet[3387]: E0909 04:56:40.659492 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.659767 kubelet[3387]: W0909 04:56:40.659504 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.659767 kubelet[3387]: E0909 04:56:40.659514 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.660114 kubelet[3387]: E0909 04:56:40.660022 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.661032 kubelet[3387]: W0909 04:56:40.660189 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.661032 kubelet[3387]: E0909 04:56:40.660206 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.661396 kubelet[3387]: E0909 04:56:40.661370 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.662130 kubelet[3387]: W0909 04:56:40.661556 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.662130 kubelet[3387]: E0909 04:56:40.661576 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.662789 kubelet[3387]: E0909 04:56:40.662609 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.662789 kubelet[3387]: W0909 04:56:40.662718 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.662789 kubelet[3387]: E0909 04:56:40.662732 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.663601 kubelet[3387]: E0909 04:56:40.663558 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.663601 kubelet[3387]: W0909 04:56:40.663575 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.663803 kubelet[3387]: E0909 04:56:40.663681 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.664864 kubelet[3387]: E0909 04:56:40.664044 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.665075 kubelet[3387]: W0909 04:56:40.664961 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.665075 kubelet[3387]: E0909 04:56:40.665012 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.665255 kubelet[3387]: E0909 04:56:40.665243 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.665396 kubelet[3387]: W0909 04:56:40.665316 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.665396 kubelet[3387]: E0909 04:56:40.665329 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.665555 kubelet[3387]: E0909 04:56:40.665543 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.665694 kubelet[3387]: W0909 04:56:40.665614 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.665694 kubelet[3387]: E0909 04:56:40.665626 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.665851 kubelet[3387]: E0909 04:56:40.665839 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.665938 kubelet[3387]: W0909 04:56:40.665894 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.665938 kubelet[3387]: E0909 04:56:40.665921 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.667462 containerd[1873]: time="2025-09-09T04:56:40.667109633Z" level=info msg="connecting to shim aad042e69264b076bbfe2ea5d1d798e180033100002970dec67d12af86057e9d" address="unix:///run/containerd/s/82b461f04d2a86356388d8c162669dc2887f76f9d8fc0297a55ceb979693220a" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:56:40.667718 kubelet[3387]: E0909 04:56:40.667604 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.667718 kubelet[3387]: W0909 04:56:40.667634 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.667718 kubelet[3387]: E0909 04:56:40.667651 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.667943 kubelet[3387]: E0909 04:56:40.667914 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.668069 kubelet[3387]: W0909 04:56:40.668005 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.668069 kubelet[3387]: E0909 04:56:40.668020 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.668969 kubelet[3387]: E0909 04:56:40.668880 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.668969 kubelet[3387]: W0909 04:56:40.668892 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.668969 kubelet[3387]: E0909 04:56:40.668910 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.670060 kubelet[3387]: E0909 04:56:40.670044 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.670221 kubelet[3387]: W0909 04:56:40.670128 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.670221 kubelet[3387]: E0909 04:56:40.670143 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.670426 kubelet[3387]: E0909 04:56:40.670380 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.670426 kubelet[3387]: W0909 04:56:40.670390 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.670426 kubelet[3387]: E0909 04:56:40.670399 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.670723 kubelet[3387]: E0909 04:56:40.670675 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.670723 kubelet[3387]: W0909 04:56:40.670685 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.670723 kubelet[3387]: E0909 04:56:40.670695 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.671861 kubelet[3387]: E0909 04:56:40.671847 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.672028 kubelet[3387]: W0909 04:56:40.671890 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.672028 kubelet[3387]: E0909 04:56:40.671903 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.673332 kubelet[3387]: E0909 04:56:40.673226 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.673332 kubelet[3387]: W0909 04:56:40.673237 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.673332 kubelet[3387]: E0909 04:56:40.673247 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.673670 kubelet[3387]: E0909 04:56:40.673642 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.673670 kubelet[3387]: W0909 04:56:40.673654 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.673818 kubelet[3387]: E0909 04:56:40.673746 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.673818 kubelet[3387]: I0909 04:56:40.673780 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19-kubelet-dir\") pod \"csi-node-driver-v9nht\" (UID: \"2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19\") " pod="calico-system/csi-node-driver-v9nht" Sep 9 04:56:40.675176 kubelet[3387]: E0909 04:56:40.675132 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.675176 kubelet[3387]: W0909 04:56:40.675147 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.675659 kubelet[3387]: E0909 04:56:40.675161 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.675799 kubelet[3387]: I0909 04:56:40.675488 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19-socket-dir\") pod \"csi-node-driver-v9nht\" (UID: \"2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19\") " pod="calico-system/csi-node-driver-v9nht" Sep 9 04:56:40.676656 kubelet[3387]: E0909 04:56:40.676640 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.677117 kubelet[3387]: W0909 04:56:40.676747 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.677288 kubelet[3387]: E0909 04:56:40.677216 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.677495 kubelet[3387]: I0909 04:56:40.677364 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n77j\" (UniqueName: \"kubernetes.io/projected/2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19-kube-api-access-5n77j\") pod \"csi-node-driver-v9nht\" (UID: \"2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19\") " pod="calico-system/csi-node-driver-v9nht" Sep 9 04:56:40.677738 kubelet[3387]: E0909 04:56:40.677725 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.678288 kubelet[3387]: W0909 04:56:40.677991 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.678776 kubelet[3387]: E0909 04:56:40.678717 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.679727 kubelet[3387]: E0909 04:56:40.679523 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.679727 kubelet[3387]: W0909 04:56:40.679543 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.679727 kubelet[3387]: E0909 04:56:40.679668 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.680128 kubelet[3387]: E0909 04:56:40.680108 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.680473 kubelet[3387]: W0909 04:56:40.680296 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.680473 kubelet[3387]: E0909 04:56:40.680427 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.680610 kubelet[3387]: I0909 04:56:40.680537 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19-varrun\") pod \"csi-node-driver-v9nht\" (UID: \"2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19\") " pod="calico-system/csi-node-driver-v9nht" Sep 9 04:56:40.681049 kubelet[3387]: E0909 04:56:40.681034 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.681208 kubelet[3387]: W0909 04:56:40.681125 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.681208 kubelet[3387]: E0909 04:56:40.681165 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.681666 kubelet[3387]: E0909 04:56:40.681649 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.682113 kubelet[3387]: W0909 04:56:40.681770 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.682260 kubelet[3387]: E0909 04:56:40.682198 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.684592 kubelet[3387]: E0909 04:56:40.684122 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.684592 kubelet[3387]: W0909 04:56:40.684136 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.684592 kubelet[3387]: E0909 04:56:40.684159 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.684592 kubelet[3387]: E0909 04:56:40.684276 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.684592 kubelet[3387]: W0909 04:56:40.684282 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.684592 kubelet[3387]: E0909 04:56:40.684290 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.684592 kubelet[3387]: E0909 04:56:40.684505 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.684592 kubelet[3387]: W0909 04:56:40.684513 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.684592 kubelet[3387]: E0909 04:56:40.684522 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.684891 kubelet[3387]: E0909 04:56:40.684834 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.684891 kubelet[3387]: W0909 04:56:40.684844 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.684891 kubelet[3387]: E0909 04:56:40.684853 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.685796 kubelet[3387]: E0909 04:56:40.685621 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.685796 kubelet[3387]: W0909 04:56:40.685634 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.685796 kubelet[3387]: E0909 04:56:40.685650 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.685796 kubelet[3387]: I0909 04:56:40.685669 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19-registration-dir\") pod \"csi-node-driver-v9nht\" (UID: \"2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19\") " pod="calico-system/csi-node-driver-v9nht" Sep 9 04:56:40.685899 kubelet[3387]: E0909 04:56:40.685803 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.685899 kubelet[3387]: W0909 04:56:40.685811 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.685899 kubelet[3387]: E0909 04:56:40.685845 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.686578 kubelet[3387]: E0909 04:56:40.686562 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.686578 kubelet[3387]: W0909 04:56:40.686575 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.686669 kubelet[3387]: E0909 04:56:40.686585 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.702177 systemd[1]: Started cri-containerd-aad042e69264b076bbfe2ea5d1d798e180033100002970dec67d12af86057e9d.scope - libcontainer container aad042e69264b076bbfe2ea5d1d798e180033100002970dec67d12af86057e9d. Sep 9 04:56:40.732727 containerd[1873]: time="2025-09-09T04:56:40.732660512Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-v4xzh,Uid:8efd1199-c04b-4e98-8b19-23a3251529a7,Namespace:calico-system,Attempt:0,}" Sep 9 04:56:40.747831 containerd[1873]: time="2025-09-09T04:56:40.747751276Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5bb8b545-k77vj,Uid:ad67c92b-1b5e-4d3c-aa98-1d8d12a207ff,Namespace:calico-system,Attempt:0,} returns sandbox id \"aad042e69264b076bbfe2ea5d1d798e180033100002970dec67d12af86057e9d\"" Sep 9 04:56:40.750065 containerd[1873]: time="2025-09-09T04:56:40.750019063Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 9 04:56:40.784997 containerd[1873]: time="2025-09-09T04:56:40.784962319Z" level=info msg="connecting to shim 8476308f26cbceeb155d447a344a9dba7fab104e72b4756b61be2a57ffb611de" address="unix:///run/containerd/s/1e3ce8f890cda1ac543dcc2450bda33cdb9a93c4bdcb8db79b59f5f88eb19418" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:56:40.787319 kubelet[3387]: E0909 04:56:40.787161 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.787319 kubelet[3387]: W0909 04:56:40.787180 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.787595 kubelet[3387]: E0909 04:56:40.787474 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.787735 kubelet[3387]: E0909 04:56:40.787717 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.787735 kubelet[3387]: W0909 04:56:40.787732 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.787815 kubelet[3387]: E0909 04:56:40.787749 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.788215 kubelet[3387]: E0909 04:56:40.787902 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.788215 kubelet[3387]: W0909 04:56:40.787910 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.788215 kubelet[3387]: E0909 04:56:40.787918 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.788215 kubelet[3387]: E0909 04:56:40.788030 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.788215 kubelet[3387]: W0909 04:56:40.788037 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.788215 kubelet[3387]: E0909 04:56:40.788050 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.789293 kubelet[3387]: E0909 04:56:40.788254 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.789293 kubelet[3387]: W0909 04:56:40.788261 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.789293 kubelet[3387]: E0909 04:56:40.788268 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.789293 kubelet[3387]: E0909 04:56:40.789101 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.789293 kubelet[3387]: W0909 04:56:40.789113 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.789293 kubelet[3387]: E0909 04:56:40.789124 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.790546 kubelet[3387]: E0909 04:56:40.789480 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.790546 kubelet[3387]: W0909 04:56:40.789490 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.790546 kubelet[3387]: E0909 04:56:40.789505 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.792988 kubelet[3387]: E0909 04:56:40.791417 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.793140 kubelet[3387]: W0909 04:56:40.793069 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.793140 kubelet[3387]: E0909 04:56:40.793095 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.793287 kubelet[3387]: E0909 04:56:40.793255 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.793287 kubelet[3387]: W0909 04:56:40.793268 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.793287 kubelet[3387]: E0909 04:56:40.793282 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.793411 kubelet[3387]: E0909 04:56:40.793388 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.793411 kubelet[3387]: W0909 04:56:40.793395 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.793411 kubelet[3387]: E0909 04:56:40.793404 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.793529 kubelet[3387]: E0909 04:56:40.793520 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.793529 kubelet[3387]: W0909 04:56:40.793528 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.793619 kubelet[3387]: E0909 04:56:40.793537 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.793658 kubelet[3387]: E0909 04:56:40.793632 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.793658 kubelet[3387]: W0909 04:56:40.793637 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.793658 kubelet[3387]: E0909 04:56:40.793645 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.793799 kubelet[3387]: E0909 04:56:40.793730 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.793799 kubelet[3387]: W0909 04:56:40.793735 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.793799 kubelet[3387]: E0909 04:56:40.793742 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.794110 kubelet[3387]: E0909 04:56:40.794012 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.794110 kubelet[3387]: W0909 04:56:40.794024 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.794110 kubelet[3387]: E0909 04:56:40.794040 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.794549 kubelet[3387]: E0909 04:56:40.794439 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.794549 kubelet[3387]: W0909 04:56:40.794451 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.794885 kubelet[3387]: E0909 04:56:40.794818 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.795246 kubelet[3387]: E0909 04:56:40.794921 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.795246 kubelet[3387]: W0909 04:56:40.794933 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.795246 kubelet[3387]: E0909 04:56:40.794947 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.795829 kubelet[3387]: E0909 04:56:40.795730 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.796224 kubelet[3387]: W0909 04:56:40.795916 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.796224 kubelet[3387]: E0909 04:56:40.795941 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.798401 kubelet[3387]: E0909 04:56:40.798105 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.798401 kubelet[3387]: W0909 04:56:40.798123 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.799422 kubelet[3387]: E0909 04:56:40.799089 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.799422 kubelet[3387]: W0909 04:56:40.799102 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.799422 kubelet[3387]: E0909 04:56:40.799237 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.799422 kubelet[3387]: W0909 04:56:40.799243 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.799422 kubelet[3387]: E0909 04:56:40.799253 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.801155 kubelet[3387]: E0909 04:56:40.800958 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.801155 kubelet[3387]: W0909 04:56:40.801026 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.801155 kubelet[3387]: E0909 04:56:40.801044 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.802860 kubelet[3387]: E0909 04:56:40.802756 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.802860 kubelet[3387]: W0909 04:56:40.802777 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.802860 kubelet[3387]: E0909 04:56:40.802792 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.802860 kubelet[3387]: E0909 04:56:40.802825 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.804401 kubelet[3387]: E0909 04:56:40.804330 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.804401 kubelet[3387]: W0909 04:56:40.804353 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.804401 kubelet[3387]: E0909 04:56:40.804379 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.805271 kubelet[3387]: E0909 04:56:40.805251 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.805271 kubelet[3387]: W0909 04:56:40.805267 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.805673 kubelet[3387]: E0909 04:56:40.805279 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.805673 kubelet[3387]: E0909 04:56:40.805303 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:40.806105 kubelet[3387]: E0909 04:56:40.806036 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.806105 kubelet[3387]: W0909 04:56:40.806048 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.806105 kubelet[3387]: E0909 04:56:40.806057 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.808608 kubelet[3387]: E0909 04:56:40.807962 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:40.808608 kubelet[3387]: W0909 04:56:40.808092 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:40.808608 kubelet[3387]: E0909 04:56:40.808106 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:40.808122 systemd[1]: Started cri-containerd-8476308f26cbceeb155d447a344a9dba7fab104e72b4756b61be2a57ffb611de.scope - libcontainer container 8476308f26cbceeb155d447a344a9dba7fab104e72b4756b61be2a57ffb611de. 
Sep 9 04:56:40.837029 containerd[1873]: time="2025-09-09T04:56:40.835449672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-v4xzh,Uid:8efd1199-c04b-4e98-8b19-23a3251529a7,Namespace:calico-system,Attempt:0,} returns sandbox id \"8476308f26cbceeb155d447a344a9dba7fab104e72b4756b61be2a57ffb611de\"" Sep 9 04:56:42.138489 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3747323998.mount: Deactivated successfully. Sep 9 04:56:42.478687 containerd[1873]: time="2025-09-09T04:56:42.478221639Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:42.480574 containerd[1873]: time="2025-09-09T04:56:42.480550230Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 9 04:56:42.483185 containerd[1873]: time="2025-09-09T04:56:42.483165307Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:42.486992 containerd[1873]: time="2025-09-09T04:56:42.486943952Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:42.487354 containerd[1873]: time="2025-09-09T04:56:42.487257892Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 1.737191864s" Sep 9 04:56:42.487354 containerd[1873]: time="2025-09-09T04:56:42.487284331Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image 
reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 9 04:56:42.490029 containerd[1873]: time="2025-09-09T04:56:42.488665045Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 9 04:56:42.500703 containerd[1873]: time="2025-09-09T04:56:42.500677329Z" level=info msg="CreateContainer within sandbox \"aad042e69264b076bbfe2ea5d1d798e180033100002970dec67d12af86057e9d\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 9 04:56:42.524184 containerd[1873]: time="2025-09-09T04:56:42.524149547Z" level=info msg="Container 5f8340d1e5a1af4b1dacdddc82d158bf36d664a186e674d1a2010e9e1fc34b70: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:42.526421 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3607632712.mount: Deactivated successfully. Sep 9 04:56:42.541812 containerd[1873]: time="2025-09-09T04:56:42.541210925Z" level=info msg="CreateContainer within sandbox \"aad042e69264b076bbfe2ea5d1d798e180033100002970dec67d12af86057e9d\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"5f8340d1e5a1af4b1dacdddc82d158bf36d664a186e674d1a2010e9e1fc34b70\"" Sep 9 04:56:42.543311 containerd[1873]: time="2025-09-09T04:56:42.543289636Z" level=info msg="StartContainer for \"5f8340d1e5a1af4b1dacdddc82d158bf36d664a186e674d1a2010e9e1fc34b70\"" Sep 9 04:56:42.544935 containerd[1873]: time="2025-09-09T04:56:42.544908255Z" level=info msg="connecting to shim 5f8340d1e5a1af4b1dacdddc82d158bf36d664a186e674d1a2010e9e1fc34b70" address="unix:///run/containerd/s/82b461f04d2a86356388d8c162669dc2887f76f9d8fc0297a55ceb979693220a" protocol=ttrpc version=3 Sep 9 04:56:42.564088 systemd[1]: Started cri-containerd-5f8340d1e5a1af4b1dacdddc82d158bf36d664a186e674d1a2010e9e1fc34b70.scope - libcontainer container 5f8340d1e5a1af4b1dacdddc82d158bf36d664a186e674d1a2010e9e1fc34b70. 
Sep 9 04:56:42.597118 containerd[1873]: time="2025-09-09T04:56:42.597083974Z" level=info msg="StartContainer for \"5f8340d1e5a1af4b1dacdddc82d158bf36d664a186e674d1a2010e9e1fc34b70\" returns successfully" Sep 9 04:56:42.666850 kubelet[3387]: E0909 04:56:42.665992 3387 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-v9nht" podUID="2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19" Sep 9 04:56:42.773080 kubelet[3387]: I0909 04:56:42.772880 3387 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5bb8b545-k77vj" podStartSLOduration=1.034150723 podStartE2EDuration="2.772866645s" podCreationTimestamp="2025-09-09 04:56:40 +0000 UTC" firstStartedPulling="2025-09-09 04:56:40.749344625 +0000 UTC m=+18.161484977" lastFinishedPulling="2025-09-09 04:56:42.488060547 +0000 UTC m=+19.900200899" observedRunningTime="2025-09-09 04:56:42.769720953 +0000 UTC m=+20.181861305" watchObservedRunningTime="2025-09-09 04:56:42.772866645 +0000 UTC m=+20.185006997" Sep 9 04:56:42.786119 kubelet[3387]: E0909 04:56:42.786046 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.786119 kubelet[3387]: W0909 04:56:42.786066 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.786119 kubelet[3387]: E0909 04:56:42.786082 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.786473 kubelet[3387]: E0909 04:56:42.786420 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.786473 kubelet[3387]: W0909 04:56:42.786432 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.786473 kubelet[3387]: E0909 04:56:42.786442 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.787505 kubelet[3387]: E0909 04:56:42.787449 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.787505 kubelet[3387]: W0909 04:56:42.787463 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.787505 kubelet[3387]: E0909 04:56:42.787474 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.787835 kubelet[3387]: E0909 04:56:42.787766 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.787835 kubelet[3387]: W0909 04:56:42.787778 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.787835 kubelet[3387]: E0909 04:56:42.787787 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.788760 kubelet[3387]: E0909 04:56:42.788728 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.788760 kubelet[3387]: W0909 04:56:42.788742 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.788995 kubelet[3387]: E0909 04:56:42.788926 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.789304 kubelet[3387]: E0909 04:56:42.789242 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.789683 kubelet[3387]: W0909 04:56:42.789422 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.789683 kubelet[3387]: E0909 04:56:42.789444 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.790611 kubelet[3387]: E0909 04:56:42.790544 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.790611 kubelet[3387]: W0909 04:56:42.790563 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.790611 kubelet[3387]: E0909 04:56:42.790574 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.791939 kubelet[3387]: E0909 04:56:42.790846 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.791939 kubelet[3387]: W0909 04:56:42.790855 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.791939 kubelet[3387]: E0909 04:56:42.790865 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.791939 kubelet[3387]: E0909 04:56:42.791034 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.791939 kubelet[3387]: W0909 04:56:42.791042 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.791939 kubelet[3387]: E0909 04:56:42.791051 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.791939 kubelet[3387]: E0909 04:56:42.791188 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.791939 kubelet[3387]: W0909 04:56:42.791197 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.791939 kubelet[3387]: E0909 04:56:42.791206 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.791939 kubelet[3387]: E0909 04:56:42.791707 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.792783 kubelet[3387]: W0909 04:56:42.791717 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.792783 kubelet[3387]: E0909 04:56:42.791727 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.792783 kubelet[3387]: E0909 04:56:42.792351 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.792783 kubelet[3387]: W0909 04:56:42.792362 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.792783 kubelet[3387]: E0909 04:56:42.792373 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.793890 kubelet[3387]: E0909 04:56:42.793424 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.793890 kubelet[3387]: W0909 04:56:42.793435 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.794073 kubelet[3387]: E0909 04:56:42.793446 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.795079 kubelet[3387]: E0909 04:56:42.795058 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.796710 kubelet[3387]: W0909 04:56:42.795159 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.796710 kubelet[3387]: E0909 04:56:42.795175 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.796710 kubelet[3387]: E0909 04:56:42.795380 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.796710 kubelet[3387]: W0909 04:56:42.795388 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.796710 kubelet[3387]: E0909 04:56:42.795398 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.809206 kubelet[3387]: E0909 04:56:42.809159 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.809206 kubelet[3387]: W0909 04:56:42.809174 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.809206 kubelet[3387]: E0909 04:56:42.809190 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.809513 kubelet[3387]: E0909 04:56:42.809501 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.809648 kubelet[3387]: W0909 04:56:42.809544 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.809648 kubelet[3387]: E0909 04:56:42.809561 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.809711 kubelet[3387]: E0909 04:56:42.809693 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.809744 kubelet[3387]: W0909 04:56:42.809720 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.809744 kubelet[3387]: E0909 04:56:42.809738 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.809862 kubelet[3387]: E0909 04:56:42.809852 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.809902 kubelet[3387]: W0909 04:56:42.809889 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.809956 kubelet[3387]: E0909 04:56:42.809906 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.810078 kubelet[3387]: E0909 04:56:42.810066 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.810078 kubelet[3387]: W0909 04:56:42.810076 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.810166 kubelet[3387]: E0909 04:56:42.810088 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.810242 kubelet[3387]: E0909 04:56:42.810232 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.810242 kubelet[3387]: W0909 04:56:42.810241 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.810299 kubelet[3387]: E0909 04:56:42.810251 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.810732 kubelet[3387]: E0909 04:56:42.810697 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.810732 kubelet[3387]: W0909 04:56:42.810710 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.810894 kubelet[3387]: E0909 04:56:42.810819 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.811032 kubelet[3387]: E0909 04:56:42.810930 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.811032 kubelet[3387]: W0909 04:56:42.810953 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.811032 kubelet[3387]: E0909 04:56:42.810967 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.811699 kubelet[3387]: E0909 04:56:42.811097 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.811699 kubelet[3387]: W0909 04:56:42.811103 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.811699 kubelet[3387]: E0909 04:56:42.811111 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.811699 kubelet[3387]: E0909 04:56:42.811222 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.811699 kubelet[3387]: W0909 04:56:42.811230 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.811699 kubelet[3387]: E0909 04:56:42.811239 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.811877 kubelet[3387]: E0909 04:56:42.811864 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.811877 kubelet[3387]: W0909 04:56:42.811876 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.811950 kubelet[3387]: E0909 04:56:42.811890 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.812219 kubelet[3387]: E0909 04:56:42.812151 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.812219 kubelet[3387]: W0909 04:56:42.812162 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.812219 kubelet[3387]: E0909 04:56:42.812178 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.812600 kubelet[3387]: E0909 04:56:42.812472 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.812600 kubelet[3387]: W0909 04:56:42.812484 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.812600 kubelet[3387]: E0909 04:56:42.812501 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.813101 kubelet[3387]: E0909 04:56:42.812963 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.813356 kubelet[3387]: W0909 04:56:42.813262 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.813356 kubelet[3387]: E0909 04:56:42.813284 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.815013 kubelet[3387]: E0909 04:56:42.814170 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.815013 kubelet[3387]: W0909 04:56:42.814186 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.815013 kubelet[3387]: E0909 04:56:42.814219 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.815344 kubelet[3387]: E0909 04:56:42.815278 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.815344 kubelet[3387]: W0909 04:56:42.815290 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.815344 kubelet[3387]: E0909 04:56:42.815307 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:42.815666 kubelet[3387]: E0909 04:56:42.815655 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.815756 kubelet[3387]: W0909 04:56:42.815732 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.815876 kubelet[3387]: E0909 04:56:42.815747 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:42.816380 kubelet[3387]: E0909 04:56:42.816345 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:42.816890 kubelet[3387]: W0909 04:56:42.816868 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:42.816997 kubelet[3387]: E0909 04:56:42.816941 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.702674 containerd[1873]: time="2025-09-09T04:56:43.702585995Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:43.704854 containerd[1873]: time="2025-09-09T04:56:43.704809521Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 9 04:56:43.707751 containerd[1873]: time="2025-09-09T04:56:43.707716004Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:43.711793 containerd[1873]: time="2025-09-09T04:56:43.711247816Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:43.711793 containerd[1873]: time="2025-09-09T04:56:43.711602706Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.222913558s" Sep 9 04:56:43.711793 containerd[1873]: time="2025-09-09T04:56:43.711622536Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 9 04:56:43.714620 containerd[1873]: time="2025-09-09T04:56:43.714592528Z" level=info msg="CreateContainer within sandbox \"8476308f26cbceeb155d447a344a9dba7fab104e72b4756b61be2a57ffb611de\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 04:56:43.732149 containerd[1873]: time="2025-09-09T04:56:43.732096806Z" level=info msg="Container ac6c6138a7141943ca69cc21e761c668a79ea78a01a2cd9c79fc98bd96d9e3f3: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:43.735004 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1190717198.mount: Deactivated successfully. Sep 9 04:56:43.746636 containerd[1873]: time="2025-09-09T04:56:43.746603486Z" level=info msg="CreateContainer within sandbox \"8476308f26cbceeb155d447a344a9dba7fab104e72b4756b61be2a57ffb611de\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ac6c6138a7141943ca69cc21e761c668a79ea78a01a2cd9c79fc98bd96d9e3f3\"" Sep 9 04:56:43.747059 containerd[1873]: time="2025-09-09T04:56:43.746995294Z" level=info msg="StartContainer for \"ac6c6138a7141943ca69cc21e761c668a79ea78a01a2cd9c79fc98bd96d9e3f3\"" Sep 9 04:56:43.749065 containerd[1873]: time="2025-09-09T04:56:43.749034063Z" level=info msg="connecting to shim ac6c6138a7141943ca69cc21e761c668a79ea78a01a2cd9c79fc98bd96d9e3f3" address="unix:///run/containerd/s/1e3ce8f890cda1ac543dcc2450bda33cdb9a93c4bdcb8db79b59f5f88eb19418" protocol=ttrpc version=3 Sep 9 04:56:43.781097 systemd[1]: Started cri-containerd-ac6c6138a7141943ca69cc21e761c668a79ea78a01a2cd9c79fc98bd96d9e3f3.scope - libcontainer container ac6c6138a7141943ca69cc21e761c668a79ea78a01a2cd9c79fc98bd96d9e3f3. 
Sep 9 04:56:43.800197 kubelet[3387]: E0909 04:56:43.800103 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.800197 kubelet[3387]: W0909 04:56:43.800122 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.800197 kubelet[3387]: E0909 04:56:43.800140 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.801048 kubelet[3387]: E0909 04:56:43.800821 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.801048 kubelet[3387]: W0909 04:56:43.800833 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.801048 kubelet[3387]: E0909 04:56:43.800844 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.801412 kubelet[3387]: E0909 04:56:43.801294 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.801412 kubelet[3387]: W0909 04:56:43.801342 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.801412 kubelet[3387]: E0909 04:56:43.801354 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.801931 kubelet[3387]: E0909 04:56:43.801816 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.801931 kubelet[3387]: W0909 04:56:43.801829 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.801931 kubelet[3387]: E0909 04:56:43.801839 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.802710 kubelet[3387]: E0909 04:56:43.802547 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.802710 kubelet[3387]: W0909 04:56:43.802561 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.802710 kubelet[3387]: E0909 04:56:43.802571 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.802983 kubelet[3387]: E0909 04:56:43.802960 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.803091 kubelet[3387]: W0909 04:56:43.803028 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.803091 kubelet[3387]: E0909 04:56:43.803047 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.803361 kubelet[3387]: E0909 04:56:43.803348 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.803740 kubelet[3387]: W0909 04:56:43.803442 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.803740 kubelet[3387]: E0909 04:56:43.803458 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.803939 kubelet[3387]: E0909 04:56:43.803928 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.804080 kubelet[3387]: W0909 04:56:43.804000 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.804080 kubelet[3387]: E0909 04:56:43.804014 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.804254 kubelet[3387]: E0909 04:56:43.804244 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.804363 kubelet[3387]: W0909 04:56:43.804309 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.804363 kubelet[3387]: E0909 04:56:43.804330 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.804642 kubelet[3387]: E0909 04:56:43.804624 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.804797 kubelet[3387]: W0909 04:56:43.804722 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.804797 kubelet[3387]: E0909 04:56:43.804749 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.805222 kubelet[3387]: E0909 04:56:43.805210 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.805371 kubelet[3387]: W0909 04:56:43.805300 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.805371 kubelet[3387]: E0909 04:56:43.805316 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.805607 kubelet[3387]: E0909 04:56:43.805569 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.805607 kubelet[3387]: W0909 04:56:43.805579 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.805731 kubelet[3387]: E0909 04:56:43.805588 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.805947 kubelet[3387]: E0909 04:56:43.805938 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.806104 kubelet[3387]: W0909 04:56:43.806015 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.806104 kubelet[3387]: E0909 04:56:43.806029 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.806369 kubelet[3387]: E0909 04:56:43.806297 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.806369 kubelet[3387]: W0909 04:56:43.806325 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.806369 kubelet[3387]: E0909 04:56:43.806336 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.806667 kubelet[3387]: E0909 04:56:43.806658 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.806810 kubelet[3387]: W0909 04:56:43.806724 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.806810 kubelet[3387]: E0909 04:56:43.806736 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.816066 kubelet[3387]: E0909 04:56:43.815966 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.816066 kubelet[3387]: W0909 04:56:43.815989 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.816066 kubelet[3387]: E0909 04:56:43.815999 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.816659 kubelet[3387]: E0909 04:56:43.816646 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.816775 kubelet[3387]: W0909 04:56:43.816706 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.816775 kubelet[3387]: E0909 04:56:43.816728 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.816943 kubelet[3387]: E0909 04:56:43.816933 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.817155 kubelet[3387]: W0909 04:56:43.817017 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.817155 kubelet[3387]: E0909 04:56:43.817077 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.817568 kubelet[3387]: E0909 04:56:43.817555 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.817711 kubelet[3387]: W0909 04:56:43.817631 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.817778 kubelet[3387]: E0909 04:56:43.817759 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.818114 kubelet[3387]: E0909 04:56:43.818033 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.818114 kubelet[3387]: W0909 04:56:43.818045 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.818114 kubelet[3387]: E0909 04:56:43.818077 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.818942 kubelet[3387]: E0909 04:56:43.818387 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.818942 kubelet[3387]: W0909 04:56:43.818398 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.818942 kubelet[3387]: E0909 04:56:43.818423 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.819181 kubelet[3387]: E0909 04:56:43.819170 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.819320 kubelet[3387]: W0909 04:56:43.819239 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.819320 kubelet[3387]: E0909 04:56:43.819256 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.819461 kubelet[3387]: E0909 04:56:43.819452 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.819519 kubelet[3387]: W0909 04:56:43.819504 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.819612 kubelet[3387]: E0909 04:56:43.819557 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.819767 kubelet[3387]: E0909 04:56:43.819757 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.819901 kubelet[3387]: W0909 04:56:43.819819 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.819901 kubelet[3387]: E0909 04:56:43.819845 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.820228 kubelet[3387]: E0909 04:56:43.820218 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.820345 kubelet[3387]: W0909 04:56:43.820278 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.820345 kubelet[3387]: E0909 04:56:43.820298 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.820464 kubelet[3387]: E0909 04:56:43.820450 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.820464 kubelet[3387]: W0909 04:56:43.820462 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.820635 kubelet[3387]: E0909 04:56:43.820476 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.820780 kubelet[3387]: E0909 04:56:43.820765 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.820780 kubelet[3387]: W0909 04:56:43.820778 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.820838 kubelet[3387]: E0909 04:56:43.820792 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.820838 kubelet[3387]: E0909 04:56:43.820924 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.820838 kubelet[3387]: W0909 04:56:43.820931 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.820838 kubelet[3387]: E0909 04:56:43.820938 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.821387 kubelet[3387]: E0909 04:56:43.821292 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.821387 kubelet[3387]: W0909 04:56:43.821301 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.821387 kubelet[3387]: E0909 04:56:43.821324 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.821458 kubelet[3387]: E0909 04:56:43.821443 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.821458 kubelet[3387]: W0909 04:56:43.821454 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.821534 kubelet[3387]: E0909 04:56:43.821473 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.821623 kubelet[3387]: E0909 04:56:43.821592 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.821623 kubelet[3387]: W0909 04:56:43.821622 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.821672 kubelet[3387]: E0909 04:56:43.821633 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:43.822053 kubelet[3387]: E0909 04:56:43.822039 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.822053 kubelet[3387]: W0909 04:56:43.822050 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.822150 kubelet[3387]: E0909 04:56:43.822061 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.822891 kubelet[3387]: E0909 04:56:43.822875 3387 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:43.822891 kubelet[3387]: W0909 04:56:43.822886 3387 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:43.822956 kubelet[3387]: E0909 04:56:43.822896 3387 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:43.841062 containerd[1873]: time="2025-09-09T04:56:43.841028079Z" level=info msg="StartContainer for \"ac6c6138a7141943ca69cc21e761c668a79ea78a01a2cd9c79fc98bd96d9e3f3\" returns successfully" Sep 9 04:56:43.850105 systemd[1]: cri-containerd-ac6c6138a7141943ca69cc21e761c668a79ea78a01a2cd9c79fc98bd96d9e3f3.scope: Deactivated successfully. 
Sep 9 04:56:43.853521 containerd[1873]: time="2025-09-09T04:56:43.853482151Z" level=info msg="received exit event container_id:\"ac6c6138a7141943ca69cc21e761c668a79ea78a01a2cd9c79fc98bd96d9e3f3\" id:\"ac6c6138a7141943ca69cc21e761c668a79ea78a01a2cd9c79fc98bd96d9e3f3\" pid:4049 exited_at:{seconds:1757393803 nanos:853198593}" Sep 9 04:56:43.853820 containerd[1873]: time="2025-09-09T04:56:43.853507550Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ac6c6138a7141943ca69cc21e761c668a79ea78a01a2cd9c79fc98bd96d9e3f3\" id:\"ac6c6138a7141943ca69cc21e761c668a79ea78a01a2cd9c79fc98bd96d9e3f3\" pid:4049 exited_at:{seconds:1757393803 nanos:853198593}" Sep 9 04:56:43.868090 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ac6c6138a7141943ca69cc21e761c668a79ea78a01a2cd9c79fc98bd96d9e3f3-rootfs.mount: Deactivated successfully. Sep 9 04:56:44.666794 kubelet[3387]: E0909 04:56:44.666489 3387 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-v9nht" podUID="2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19" Sep 9 04:56:45.768855 containerd[1873]: time="2025-09-09T04:56:45.768578114Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 04:56:46.666401 kubelet[3387]: E0909 04:56:46.666088 3387 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-v9nht" podUID="2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19" Sep 9 04:56:47.969013 containerd[1873]: time="2025-09-09T04:56:47.968856794Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 
04:56:47.972074 containerd[1873]: time="2025-09-09T04:56:47.972046764Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 9 04:56:47.975280 containerd[1873]: time="2025-09-09T04:56:47.975254268Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:47.979311 containerd[1873]: time="2025-09-09T04:56:47.979276381Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:47.979693 containerd[1873]: time="2025-09-09T04:56:47.979670007Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 2.211059087s" Sep 9 04:56:47.979754 containerd[1873]: time="2025-09-09T04:56:47.979695333Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 9 04:56:47.982921 containerd[1873]: time="2025-09-09T04:56:47.982885726Z" level=info msg="CreateContainer within sandbox \"8476308f26cbceeb155d447a344a9dba7fab104e72b4756b61be2a57ffb611de\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 04:56:48.002931 containerd[1873]: time="2025-09-09T04:56:48.002903828Z" level=info msg="Container 776f041335ff00dac17d7dc24f0222c90be1039b34538e0fbc10f7a76ff1fe2f: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:48.005524 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount623292820.mount: Deactivated successfully. 
Sep 9 04:56:48.020454 containerd[1873]: time="2025-09-09T04:56:48.020422448Z" level=info msg="CreateContainer within sandbox \"8476308f26cbceeb155d447a344a9dba7fab104e72b4756b61be2a57ffb611de\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"776f041335ff00dac17d7dc24f0222c90be1039b34538e0fbc10f7a76ff1fe2f\"" Sep 9 04:56:48.022080 containerd[1873]: time="2025-09-09T04:56:48.020926779Z" level=info msg="StartContainer for \"776f041335ff00dac17d7dc24f0222c90be1039b34538e0fbc10f7a76ff1fe2f\"" Sep 9 04:56:48.022080 containerd[1873]: time="2025-09-09T04:56:48.021844470Z" level=info msg="connecting to shim 776f041335ff00dac17d7dc24f0222c90be1039b34538e0fbc10f7a76ff1fe2f" address="unix:///run/containerd/s/1e3ce8f890cda1ac543dcc2450bda33cdb9a93c4bdcb8db79b59f5f88eb19418" protocol=ttrpc version=3 Sep 9 04:56:48.039110 systemd[1]: Started cri-containerd-776f041335ff00dac17d7dc24f0222c90be1039b34538e0fbc10f7a76ff1fe2f.scope - libcontainer container 776f041335ff00dac17d7dc24f0222c90be1039b34538e0fbc10f7a76ff1fe2f. 
Sep 9 04:56:48.082297 containerd[1873]: time="2025-09-09T04:56:48.082214363Z" level=info msg="StartContainer for \"776f041335ff00dac17d7dc24f0222c90be1039b34538e0fbc10f7a76ff1fe2f\" returns successfully" Sep 9 04:56:48.666065 kubelet[3387]: E0909 04:56:48.665478 3387 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-v9nht" podUID="2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19" Sep 9 04:56:49.523190 containerd[1873]: time="2025-09-09T04:56:49.523137700Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 04:56:49.525444 systemd[1]: cri-containerd-776f041335ff00dac17d7dc24f0222c90be1039b34538e0fbc10f7a76ff1fe2f.scope: Deactivated successfully. Sep 9 04:56:49.525802 systemd[1]: cri-containerd-776f041335ff00dac17d7dc24f0222c90be1039b34538e0fbc10f7a76ff1fe2f.scope: Consumed 312ms CPU time, 188.7M memory peak, 165.8M written to disk. 
Sep 9 04:56:49.526915 containerd[1873]: time="2025-09-09T04:56:49.526575343Z" level=info msg="TaskExit event in podsandbox handler container_id:\"776f041335ff00dac17d7dc24f0222c90be1039b34538e0fbc10f7a76ff1fe2f\" id:\"776f041335ff00dac17d7dc24f0222c90be1039b34538e0fbc10f7a76ff1fe2f\" pid:4142 exited_at:{seconds:1757393809 nanos:526269337}" Sep 9 04:56:49.527401 kubelet[3387]: I0909 04:56:49.527341 3387 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 9 04:56:49.527541 containerd[1873]: time="2025-09-09T04:56:49.527055868Z" level=info msg="received exit event container_id:\"776f041335ff00dac17d7dc24f0222c90be1039b34538e0fbc10f7a76ff1fe2f\" id:\"776f041335ff00dac17d7dc24f0222c90be1039b34538e0fbc10f7a76ff1fe2f\" pid:4142 exited_at:{seconds:1757393809 nanos:526269337}" Sep 9 04:56:49.550690 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-776f041335ff00dac17d7dc24f0222c90be1039b34538e0fbc10f7a76ff1fe2f-rootfs.mount: Deactivated successfully. Sep 9 04:56:49.576041 systemd[1]: Created slice kubepods-burstable-pode7cb1906_b53a_45cd_a94c_8e7d1fc0c6f8.slice - libcontainer container kubepods-burstable-pode7cb1906_b53a_45cd_a94c_8e7d1fc0c6f8.slice. Sep 9 04:56:49.593495 systemd[1]: Created slice kubepods-burstable-podfe83b551_6b9a_4438_9381_a5d918caf80a.slice - libcontainer container kubepods-burstable-podfe83b551_6b9a_4438_9381_a5d918caf80a.slice. 
Sep 9 04:56:49.803817 kubelet[3387]: I0909 04:56:49.654800 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/693e272e-6eb8-4ead-a6ff-fc7821a57b3b-calico-apiserver-certs\") pod \"calico-apiserver-56c46cdc74-xwbfc\" (UID: \"693e272e-6eb8-4ead-a6ff-fc7821a57b3b\") " pod="calico-apiserver/calico-apiserver-56c46cdc74-xwbfc" Sep 9 04:56:49.803817 kubelet[3387]: I0909 04:56:49.654827 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66brm\" (UniqueName: \"kubernetes.io/projected/cfb6d47c-68ad-4986-818d-92b496c6dc6d-kube-api-access-66brm\") pod \"calico-kube-controllers-bc47587f9-bvp5k\" (UID: \"cfb6d47c-68ad-4986-818d-92b496c6dc6d\") " pod="calico-system/calico-kube-controllers-bc47587f9-bvp5k" Sep 9 04:56:49.803817 kubelet[3387]: I0909 04:56:49.654842 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c950c8a-2ffb-4b00-901f-b69f23e1fce8-goldmane-ca-bundle\") pod \"goldmane-7988f88666-8gzsh\" (UID: \"6c950c8a-2ffb-4b00-901f-b69f23e1fce8\") " pod="calico-system/goldmane-7988f88666-8gzsh" Sep 9 04:56:49.803817 kubelet[3387]: I0909 04:56:49.654865 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/6c950c8a-2ffb-4b00-901f-b69f23e1fce8-goldmane-key-pair\") pod \"goldmane-7988f88666-8gzsh\" (UID: \"6c950c8a-2ffb-4b00-901f-b69f23e1fce8\") " pod="calico-system/goldmane-7988f88666-8gzsh" Sep 9 04:56:49.803817 kubelet[3387]: I0909 04:56:49.654877 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/411faeac-2159-4d6a-b25d-bb2781b3f626-calico-apiserver-certs\") pod 
\"calico-apiserver-56c46cdc74-flmnt\" (UID: \"411faeac-2159-4d6a-b25d-bb2781b3f626\") " pod="calico-apiserver/calico-apiserver-56c46cdc74-flmnt" Sep 9 04:56:49.600711 systemd[1]: Created slice kubepods-besteffort-pod6c950c8a_2ffb_4b00_901f_b69f23e1fce8.slice - libcontainer container kubepods-besteffort-pod6c950c8a_2ffb_4b00_901f_b69f23e1fce8.slice. Sep 9 04:56:49.804207 kubelet[3387]: I0909 04:56:49.654888 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqfsc\" (UniqueName: \"kubernetes.io/projected/411faeac-2159-4d6a-b25d-bb2781b3f626-kube-api-access-jqfsc\") pod \"calico-apiserver-56c46cdc74-flmnt\" (UID: \"411faeac-2159-4d6a-b25d-bb2781b3f626\") " pod="calico-apiserver/calico-apiserver-56c46cdc74-flmnt" Sep 9 04:56:49.804207 kubelet[3387]: I0909 04:56:49.654898 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fe83b551-6b9a-4438-9381-a5d918caf80a-config-volume\") pod \"coredns-7c65d6cfc9-z29nv\" (UID: \"fe83b551-6b9a-4438-9381-a5d918caf80a\") " pod="kube-system/coredns-7c65d6cfc9-z29nv" Sep 9 04:56:49.804207 kubelet[3387]: I0909 04:56:49.654912 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l5r5\" (UniqueName: \"kubernetes.io/projected/693e272e-6eb8-4ead-a6ff-fc7821a57b3b-kube-api-access-7l5r5\") pod \"calico-apiserver-56c46cdc74-xwbfc\" (UID: \"693e272e-6eb8-4ead-a6ff-fc7821a57b3b\") " pod="calico-apiserver/calico-apiserver-56c46cdc74-xwbfc" Sep 9 04:56:49.804207 kubelet[3387]: I0909 04:56:49.654924 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/18b4c698-66c7-4999-a24d-e70e80598cef-whisker-backend-key-pair\") pod \"whisker-86b6b76f5c-tsdrt\" (UID: \"18b4c698-66c7-4999-a24d-e70e80598cef\") " 
pod="calico-system/whisker-86b6b76f5c-tsdrt" Sep 9 04:56:49.804207 kubelet[3387]: I0909 04:56:49.654934 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjwxp\" (UniqueName: \"kubernetes.io/projected/fe83b551-6b9a-4438-9381-a5d918caf80a-kube-api-access-hjwxp\") pod \"coredns-7c65d6cfc9-z29nv\" (UID: \"fe83b551-6b9a-4438-9381-a5d918caf80a\") " pod="kube-system/coredns-7c65d6cfc9-z29nv" Sep 9 04:56:49.606666 systemd[1]: Created slice kubepods-besteffort-pod411faeac_2159_4d6a_b25d_bb2781b3f626.slice - libcontainer container kubepods-besteffort-pod411faeac_2159_4d6a_b25d_bb2781b3f626.slice. Sep 9 04:56:49.804320 kubelet[3387]: I0909 04:56:49.654946 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfb6d47c-68ad-4986-818d-92b496c6dc6d-tigera-ca-bundle\") pod \"calico-kube-controllers-bc47587f9-bvp5k\" (UID: \"cfb6d47c-68ad-4986-818d-92b496c6dc6d\") " pod="calico-system/calico-kube-controllers-bc47587f9-bvp5k" Sep 9 04:56:49.804320 kubelet[3387]: I0909 04:56:49.654957 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18b4c698-66c7-4999-a24d-e70e80598cef-whisker-ca-bundle\") pod \"whisker-86b6b76f5c-tsdrt\" (UID: \"18b4c698-66c7-4999-a24d-e70e80598cef\") " pod="calico-system/whisker-86b6b76f5c-tsdrt" Sep 9 04:56:49.804320 kubelet[3387]: I0909 04:56:49.654968 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6fpf\" (UniqueName: \"kubernetes.io/projected/18b4c698-66c7-4999-a24d-e70e80598cef-kube-api-access-s6fpf\") pod \"whisker-86b6b76f5c-tsdrt\" (UID: \"18b4c698-66c7-4999-a24d-e70e80598cef\") " pod="calico-system/whisker-86b6b76f5c-tsdrt" Sep 9 04:56:49.804320 kubelet[3387]: I0909 04:56:49.655510 3387 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5q7j\" (UniqueName: \"kubernetes.io/projected/e7cb1906-b53a-45cd-a94c-8e7d1fc0c6f8-kube-api-access-v5q7j\") pod \"coredns-7c65d6cfc9-xxf7j\" (UID: \"e7cb1906-b53a-45cd-a94c-8e7d1fc0c6f8\") " pod="kube-system/coredns-7c65d6cfc9-xxf7j" Sep 9 04:56:49.804320 kubelet[3387]: I0909 04:56:49.655533 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c950c8a-2ffb-4b00-901f-b69f23e1fce8-config\") pod \"goldmane-7988f88666-8gzsh\" (UID: \"6c950c8a-2ffb-4b00-901f-b69f23e1fce8\") " pod="calico-system/goldmane-7988f88666-8gzsh" Sep 9 04:56:49.611576 systemd[1]: Created slice kubepods-besteffort-pod693e272e_6eb8_4ead_a6ff_fc7821a57b3b.slice - libcontainer container kubepods-besteffort-pod693e272e_6eb8_4ead_a6ff_fc7821a57b3b.slice. Sep 9 04:56:49.804422 kubelet[3387]: I0909 04:56:49.655549 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gjl9\" (UniqueName: \"kubernetes.io/projected/6c950c8a-2ffb-4b00-901f-b69f23e1fce8-kube-api-access-6gjl9\") pod \"goldmane-7988f88666-8gzsh\" (UID: \"6c950c8a-2ffb-4b00-901f-b69f23e1fce8\") " pod="calico-system/goldmane-7988f88666-8gzsh" Sep 9 04:56:49.804422 kubelet[3387]: I0909 04:56:49.655559 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7cb1906-b53a-45cd-a94c-8e7d1fc0c6f8-config-volume\") pod \"coredns-7c65d6cfc9-xxf7j\" (UID: \"e7cb1906-b53a-45cd-a94c-8e7d1fc0c6f8\") " pod="kube-system/coredns-7c65d6cfc9-xxf7j" Sep 9 04:56:49.618787 systemd[1]: Created slice kubepods-besteffort-pod18b4c698_66c7_4999_a24d_e70e80598cef.slice - libcontainer container kubepods-besteffort-pod18b4c698_66c7_4999_a24d_e70e80598cef.slice. 
Sep 9 04:56:49.624101 systemd[1]: Created slice kubepods-besteffort-podcfb6d47c_68ad_4986_818d_92b496c6dc6d.slice - libcontainer container kubepods-besteffort-podcfb6d47c_68ad_4986_818d_92b496c6dc6d.slice. Sep 9 04:56:50.107063 containerd[1873]: time="2025-09-09T04:56:50.106924420Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xxf7j,Uid:e7cb1906-b53a-45cd-a94c-8e7d1fc0c6f8,Namespace:kube-system,Attempt:0,}" Sep 9 04:56:50.111356 containerd[1873]: time="2025-09-09T04:56:50.111240029Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56c46cdc74-xwbfc,Uid:693e272e-6eb8-4ead-a6ff-fc7821a57b3b,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:56:50.111815 containerd[1873]: time="2025-09-09T04:56:50.111484095Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-86b6b76f5c-tsdrt,Uid:18b4c698-66c7-4999-a24d-e70e80598cef,Namespace:calico-system,Attempt:0,}" Sep 9 04:56:50.113019 containerd[1873]: time="2025-09-09T04:56:50.111930381Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-z29nv,Uid:fe83b551-6b9a-4438-9381-a5d918caf80a,Namespace:kube-system,Attempt:0,}" Sep 9 04:56:50.119535 containerd[1873]: time="2025-09-09T04:56:50.119443975Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-8gzsh,Uid:6c950c8a-2ffb-4b00-901f-b69f23e1fce8,Namespace:calico-system,Attempt:0,}" Sep 9 04:56:50.125997 containerd[1873]: time="2025-09-09T04:56:50.125800227Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56c46cdc74-flmnt,Uid:411faeac-2159-4d6a-b25d-bb2781b3f626,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:56:50.125997 containerd[1873]: time="2025-09-09T04:56:50.125891669Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-bc47587f9-bvp5k,Uid:cfb6d47c-68ad-4986-818d-92b496c6dc6d,Namespace:calico-system,Attempt:0,}" Sep 9 04:56:50.190548 containerd[1873]: time="2025-09-09T04:56:50.190239991Z" 
level=error msg="Failed to destroy network for sandbox \"3310a56294709c5066ce6585244b9fd4ac7c9cf67c193b8916a7cc0a1fea64ec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.196130 containerd[1873]: time="2025-09-09T04:56:50.195944424Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xxf7j,Uid:e7cb1906-b53a-45cd-a94c-8e7d1fc0c6f8,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3310a56294709c5066ce6585244b9fd4ac7c9cf67c193b8916a7cc0a1fea64ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.196713 kubelet[3387]: E0909 04:56:50.196368 3387 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3310a56294709c5066ce6585244b9fd4ac7c9cf67c193b8916a7cc0a1fea64ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.196713 kubelet[3387]: E0909 04:56:50.196446 3387 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3310a56294709c5066ce6585244b9fd4ac7c9cf67c193b8916a7cc0a1fea64ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-xxf7j" Sep 9 04:56:50.196713 kubelet[3387]: E0909 04:56:50.196462 3387 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"3310a56294709c5066ce6585244b9fd4ac7c9cf67c193b8916a7cc0a1fea64ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-xxf7j" Sep 9 04:56:50.197792 kubelet[3387]: E0909 04:56:50.196494 3387 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-xxf7j_kube-system(e7cb1906-b53a-45cd-a94c-8e7d1fc0c6f8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-xxf7j_kube-system(e7cb1906-b53a-45cd-a94c-8e7d1fc0c6f8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3310a56294709c5066ce6585244b9fd4ac7c9cf67c193b8916a7cc0a1fea64ec\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-xxf7j" podUID="e7cb1906-b53a-45cd-a94c-8e7d1fc0c6f8" Sep 9 04:56:50.238476 containerd[1873]: time="2025-09-09T04:56:50.238434901Z" level=error msg="Failed to destroy network for sandbox \"e0331bba98f4ac27ac658a381ad431df6d82145ee708a7838b21af7ffc81f10d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.241386 containerd[1873]: time="2025-09-09T04:56:50.241344998Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-z29nv,Uid:fe83b551-6b9a-4438-9381-a5d918caf80a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0331bba98f4ac27ac658a381ad431df6d82145ee708a7838b21af7ffc81f10d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.242627 kubelet[3387]: E0909 04:56:50.241958 3387 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0331bba98f4ac27ac658a381ad431df6d82145ee708a7838b21af7ffc81f10d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.242627 kubelet[3387]: E0909 04:56:50.242048 3387 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0331bba98f4ac27ac658a381ad431df6d82145ee708a7838b21af7ffc81f10d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-z29nv" Sep 9 04:56:50.242627 kubelet[3387]: E0909 04:56:50.242066 3387 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0331bba98f4ac27ac658a381ad431df6d82145ee708a7838b21af7ffc81f10d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-z29nv" Sep 9 04:56:50.242750 kubelet[3387]: E0909 04:56:50.242111 3387 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-z29nv_kube-system(fe83b551-6b9a-4438-9381-a5d918caf80a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-z29nv_kube-system(fe83b551-6b9a-4438-9381-a5d918caf80a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"e0331bba98f4ac27ac658a381ad431df6d82145ee708a7838b21af7ffc81f10d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-z29nv" podUID="fe83b551-6b9a-4438-9381-a5d918caf80a" Sep 9 04:56:50.249243 containerd[1873]: time="2025-09-09T04:56:50.249186477Z" level=error msg="Failed to destroy network for sandbox \"9e3848044f13350937515ee73a793033d2f7e49589a048be0999e0bf695d890c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.252893 containerd[1873]: time="2025-09-09T04:56:50.252848531Z" level=error msg="Failed to destroy network for sandbox \"a6cb964ed075e439e2673393e718776ca10969019314402ec4d673b45ef81bba\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.253049 containerd[1873]: time="2025-09-09T04:56:50.253031625Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56c46cdc74-xwbfc,Uid:693e272e-6eb8-4ead-a6ff-fc7821a57b3b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e3848044f13350937515ee73a793033d2f7e49589a048be0999e0bf695d890c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.253716 kubelet[3387]: E0909 04:56:50.253372 3387 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e3848044f13350937515ee73a793033d2f7e49589a048be0999e0bf695d890c\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.253716 kubelet[3387]: E0909 04:56:50.253622 3387 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e3848044f13350937515ee73a793033d2f7e49589a048be0999e0bf695d890c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56c46cdc74-xwbfc" Sep 9 04:56:50.253716 kubelet[3387]: E0909 04:56:50.253640 3387 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e3848044f13350937515ee73a793033d2f7e49589a048be0999e0bf695d890c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56c46cdc74-xwbfc" Sep 9 04:56:50.253854 kubelet[3387]: E0909 04:56:50.253689 3387 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-56c46cdc74-xwbfc_calico-apiserver(693e272e-6eb8-4ead-a6ff-fc7821a57b3b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-56c46cdc74-xwbfc_calico-apiserver(693e272e-6eb8-4ead-a6ff-fc7821a57b3b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9e3848044f13350937515ee73a793033d2f7e49589a048be0999e0bf695d890c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-56c46cdc74-xwbfc" 
podUID="693e272e-6eb8-4ead-a6ff-fc7821a57b3b" Sep 9 04:56:50.256357 containerd[1873]: time="2025-09-09T04:56:50.256091569Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-8gzsh,Uid:6c950c8a-2ffb-4b00-901f-b69f23e1fce8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6cb964ed075e439e2673393e718776ca10969019314402ec4d673b45ef81bba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.256435 kubelet[3387]: E0909 04:56:50.256292 3387 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6cb964ed075e439e2673393e718776ca10969019314402ec4d673b45ef81bba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.256435 kubelet[3387]: E0909 04:56:50.256324 3387 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6cb964ed075e439e2673393e718776ca10969019314402ec4d673b45ef81bba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-8gzsh" Sep 9 04:56:50.256435 kubelet[3387]: E0909 04:56:50.256336 3387 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6cb964ed075e439e2673393e718776ca10969019314402ec4d673b45ef81bba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/goldmane-7988f88666-8gzsh" Sep 9 04:56:50.256567 kubelet[3387]: E0909 04:56:50.256546 3387 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-8gzsh_calico-system(6c950c8a-2ffb-4b00-901f-b69f23e1fce8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-8gzsh_calico-system(6c950c8a-2ffb-4b00-901f-b69f23e1fce8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a6cb964ed075e439e2673393e718776ca10969019314402ec4d673b45ef81bba\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-8gzsh" podUID="6c950c8a-2ffb-4b00-901f-b69f23e1fce8" Sep 9 04:56:50.258369 containerd[1873]: time="2025-09-09T04:56:50.258075736Z" level=error msg="Failed to destroy network for sandbox \"7f6867182042ee9d8bbb14cbc33d25c772e2f77729d02068a89829619a9cacb4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.261370 containerd[1873]: time="2025-09-09T04:56:50.261332861Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-86b6b76f5c-tsdrt,Uid:18b4c698-66c7-4999-a24d-e70e80598cef,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f6867182042ee9d8bbb14cbc33d25c772e2f77729d02068a89829619a9cacb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.262222 kubelet[3387]: E0909 04:56:50.262200 3387 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"7f6867182042ee9d8bbb14cbc33d25c772e2f77729d02068a89829619a9cacb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.262501 kubelet[3387]: E0909 04:56:50.262482 3387 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f6867182042ee9d8bbb14cbc33d25c772e2f77729d02068a89829619a9cacb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-86b6b76f5c-tsdrt" Sep 9 04:56:50.262669 kubelet[3387]: E0909 04:56:50.262597 3387 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f6867182042ee9d8bbb14cbc33d25c772e2f77729d02068a89829619a9cacb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-86b6b76f5c-tsdrt" Sep 9 04:56:50.262669 kubelet[3387]: E0909 04:56:50.262643 3387 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-86b6b76f5c-tsdrt_calico-system(18b4c698-66c7-4999-a24d-e70e80598cef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-86b6b76f5c-tsdrt_calico-system(18b4c698-66c7-4999-a24d-e70e80598cef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7f6867182042ee9d8bbb14cbc33d25c772e2f77729d02068a89829619a9cacb4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/whisker-86b6b76f5c-tsdrt" podUID="18b4c698-66c7-4999-a24d-e70e80598cef" Sep 9 04:56:50.266317 containerd[1873]: time="2025-09-09T04:56:50.265742936Z" level=error msg="Failed to destroy network for sandbox \"f9c9d209c34a2d02f7e6dc3c4b72551dd5e6e25a40387e1b9a78099086c60ea8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.268989 containerd[1873]: time="2025-09-09T04:56:50.268925674Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-bc47587f9-bvp5k,Uid:cfb6d47c-68ad-4986-818d-92b496c6dc6d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9c9d209c34a2d02f7e6dc3c4b72551dd5e6e25a40387e1b9a78099086c60ea8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.269542 kubelet[3387]: E0909 04:56:50.269326 3387 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9c9d209c34a2d02f7e6dc3c4b72551dd5e6e25a40387e1b9a78099086c60ea8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.269638 kubelet[3387]: E0909 04:56:50.269621 3387 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9c9d209c34a2d02f7e6dc3c4b72551dd5e6e25a40387e1b9a78099086c60ea8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-bc47587f9-bvp5k" Sep 9 04:56:50.270556 kubelet[3387]: E0909 04:56:50.270246 3387 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9c9d209c34a2d02f7e6dc3c4b72551dd5e6e25a40387e1b9a78099086c60ea8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-bc47587f9-bvp5k" Sep 9 04:56:50.270556 kubelet[3387]: E0909 04:56:50.270295 3387 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-bc47587f9-bvp5k_calico-system(cfb6d47c-68ad-4986-818d-92b496c6dc6d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-bc47587f9-bvp5k_calico-system(cfb6d47c-68ad-4986-818d-92b496c6dc6d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f9c9d209c34a2d02f7e6dc3c4b72551dd5e6e25a40387e1b9a78099086c60ea8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-bc47587f9-bvp5k" podUID="cfb6d47c-68ad-4986-818d-92b496c6dc6d" Sep 9 04:56:50.278549 containerd[1873]: time="2025-09-09T04:56:50.278414674Z" level=error msg="Failed to destroy network for sandbox \"03984f1a48366d052fe1684c77831e206637f35a19d0a37154c81ca3ba8cb331\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.281315 containerd[1873]: time="2025-09-09T04:56:50.281283182Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-56c46cdc74-flmnt,Uid:411faeac-2159-4d6a-b25d-bb2781b3f626,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"03984f1a48366d052fe1684c77831e206637f35a19d0a37154c81ca3ba8cb331\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.281718 kubelet[3387]: E0909 04:56:50.281448 3387 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"03984f1a48366d052fe1684c77831e206637f35a19d0a37154c81ca3ba8cb331\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.281718 kubelet[3387]: E0909 04:56:50.281480 3387 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"03984f1a48366d052fe1684c77831e206637f35a19d0a37154c81ca3ba8cb331\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56c46cdc74-flmnt" Sep 9 04:56:50.281718 kubelet[3387]: E0909 04:56:50.281496 3387 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"03984f1a48366d052fe1684c77831e206637f35a19d0a37154c81ca3ba8cb331\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56c46cdc74-flmnt" Sep 9 04:56:50.281819 kubelet[3387]: E0909 04:56:50.281519 3387 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-56c46cdc74-flmnt_calico-apiserver(411faeac-2159-4d6a-b25d-bb2781b3f626)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-56c46cdc74-flmnt_calico-apiserver(411faeac-2159-4d6a-b25d-bb2781b3f626)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"03984f1a48366d052fe1684c77831e206637f35a19d0a37154c81ca3ba8cb331\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-56c46cdc74-flmnt" podUID="411faeac-2159-4d6a-b25d-bb2781b3f626" Sep 9 04:56:50.670085 systemd[1]: Created slice kubepods-besteffort-pod2b9bdbbb_719b_4b79_8d23_89b6ba8a1c19.slice - libcontainer container kubepods-besteffort-pod2b9bdbbb_719b_4b79_8d23_89b6ba8a1c19.slice. Sep 9 04:56:50.671743 containerd[1873]: time="2025-09-09T04:56:50.671710333Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-v9nht,Uid:2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19,Namespace:calico-system,Attempt:0,}" Sep 9 04:56:50.710859 containerd[1873]: time="2025-09-09T04:56:50.710818244Z" level=error msg="Failed to destroy network for sandbox \"8396a03d2bfc440abd3f0ef54b38f2d9752d4f97b52c1b22d7cafba03c30f719\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.712354 systemd[1]: run-netns-cni\x2dd5278994\x2d7d47\x2ddf1d\x2de3f3\x2d9e88c0a91067.mount: Deactivated successfully. 
Sep 9 04:56:50.714221 containerd[1873]: time="2025-09-09T04:56:50.714187043Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-v9nht,Uid:2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8396a03d2bfc440abd3f0ef54b38f2d9752d4f97b52c1b22d7cafba03c30f719\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.714523 kubelet[3387]: E0909 04:56:50.714357 3387 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8396a03d2bfc440abd3f0ef54b38f2d9752d4f97b52c1b22d7cafba03c30f719\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.714523 kubelet[3387]: E0909 04:56:50.714411 3387 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8396a03d2bfc440abd3f0ef54b38f2d9752d4f97b52c1b22d7cafba03c30f719\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-v9nht" Sep 9 04:56:50.714523 kubelet[3387]: E0909 04:56:50.714424 3387 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8396a03d2bfc440abd3f0ef54b38f2d9752d4f97b52c1b22d7cafba03c30f719\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-v9nht" Sep 9 
04:56:50.714669 kubelet[3387]: E0909 04:56:50.714647 3387 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-v9nht_calico-system(2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-v9nht_calico-system(2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8396a03d2bfc440abd3f0ef54b38f2d9752d4f97b52c1b22d7cafba03c30f719\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-v9nht" podUID="2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19" Sep 9 04:56:50.784990 containerd[1873]: time="2025-09-09T04:56:50.784950636Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 04:56:54.622402 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3085996808.mount: Deactivated successfully. 
Sep 9 04:56:55.217363 containerd[1873]: time="2025-09-09T04:56:55.217241847Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:55.219498 containerd[1873]: time="2025-09-09T04:56:55.219477124Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 9 04:56:55.222456 containerd[1873]: time="2025-09-09T04:56:55.222406209Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:55.226316 containerd[1873]: time="2025-09-09T04:56:55.225901725Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:55.226316 containerd[1873]: time="2025-09-09T04:56:55.226216498Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 4.441172667s" Sep 9 04:56:55.226316 containerd[1873]: time="2025-09-09T04:56:55.226236713Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 9 04:56:55.237562 containerd[1873]: time="2025-09-09T04:56:55.237527221Z" level=info msg="CreateContainer within sandbox \"8476308f26cbceeb155d447a344a9dba7fab104e72b4756b61be2a57ffb611de\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 04:56:55.296208 containerd[1873]: time="2025-09-09T04:56:55.296171335Z" level=info msg="Container 
9ceded8b679c281657263adcad2c52be4c1594c4a51e021df865515346721c37: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:55.299554 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4174936676.mount: Deactivated successfully. Sep 9 04:56:55.317405 containerd[1873]: time="2025-09-09T04:56:55.317372295Z" level=info msg="CreateContainer within sandbox \"8476308f26cbceeb155d447a344a9dba7fab104e72b4756b61be2a57ffb611de\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"9ceded8b679c281657263adcad2c52be4c1594c4a51e021df865515346721c37\"" Sep 9 04:56:55.318313 containerd[1873]: time="2025-09-09T04:56:55.318247948Z" level=info msg="StartContainer for \"9ceded8b679c281657263adcad2c52be4c1594c4a51e021df865515346721c37\"" Sep 9 04:56:55.319367 containerd[1873]: time="2025-09-09T04:56:55.319317197Z" level=info msg="connecting to shim 9ceded8b679c281657263adcad2c52be4c1594c4a51e021df865515346721c37" address="unix:///run/containerd/s/1e3ce8f890cda1ac543dcc2450bda33cdb9a93c4bdcb8db79b59f5f88eb19418" protocol=ttrpc version=3 Sep 9 04:56:55.333102 systemd[1]: Started cri-containerd-9ceded8b679c281657263adcad2c52be4c1594c4a51e021df865515346721c37.scope - libcontainer container 9ceded8b679c281657263adcad2c52be4c1594c4a51e021df865515346721c37. Sep 9 04:56:55.367214 containerd[1873]: time="2025-09-09T04:56:55.367173583Z" level=info msg="StartContainer for \"9ceded8b679c281657263adcad2c52be4c1594c4a51e021df865515346721c37\" returns successfully" Sep 9 04:56:55.724808 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 04:56:55.724920 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 9 04:56:55.829431 kubelet[3387]: I0909 04:56:55.829368 3387 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-v4xzh" podStartSLOduration=1.440795732 podStartE2EDuration="15.82935357s" podCreationTimestamp="2025-09-09 04:56:40 +0000 UTC" firstStartedPulling="2025-09-09 04:56:40.838352771 +0000 UTC m=+18.250493123" lastFinishedPulling="2025-09-09 04:56:55.226910609 +0000 UTC m=+32.639050961" observedRunningTime="2025-09-09 04:56:55.829067067 +0000 UTC m=+33.241207427" watchObservedRunningTime="2025-09-09 04:56:55.82935357 +0000 UTC m=+33.241493922" Sep 9 04:56:55.883998 containerd[1873]: time="2025-09-09T04:56:55.883218972Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ceded8b679c281657263adcad2c52be4c1594c4a51e021df865515346721c37\" id:\"3ab3ef33a803c3072da943aa9c5cdc089fe665d308fe1de51234ebde44122286\" pid:4468 exit_status:1 exited_at:{seconds:1757393815 nanos:882923142}" Sep 9 04:56:55.897280 kubelet[3387]: I0909 04:56:55.897243 3387 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6fpf\" (UniqueName: \"kubernetes.io/projected/18b4c698-66c7-4999-a24d-e70e80598cef-kube-api-access-s6fpf\") pod \"18b4c698-66c7-4999-a24d-e70e80598cef\" (UID: \"18b4c698-66c7-4999-a24d-e70e80598cef\") " Sep 9 04:56:55.897280 kubelet[3387]: I0909 04:56:55.897288 3387 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18b4c698-66c7-4999-a24d-e70e80598cef-whisker-ca-bundle\") pod \"18b4c698-66c7-4999-a24d-e70e80598cef\" (UID: \"18b4c698-66c7-4999-a24d-e70e80598cef\") " Sep 9 04:56:55.897450 kubelet[3387]: I0909 04:56:55.897304 3387 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/18b4c698-66c7-4999-a24d-e70e80598cef-whisker-backend-key-pair\") pod \"18b4c698-66c7-4999-a24d-e70e80598cef\" 
(UID: \"18b4c698-66c7-4999-a24d-e70e80598cef\") " Sep 9 04:56:55.900668 kubelet[3387]: I0909 04:56:55.899994 3387 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b4c698-66c7-4999-a24d-e70e80598cef-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "18b4c698-66c7-4999-a24d-e70e80598cef" (UID: "18b4c698-66c7-4999-a24d-e70e80598cef"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 9 04:56:55.903649 systemd[1]: var-lib-kubelet-pods-18b4c698\x2d66c7\x2d4999\x2da24d\x2de70e80598cef-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 04:56:55.905111 systemd[1]: var-lib-kubelet-pods-18b4c698\x2d66c7\x2d4999\x2da24d\x2de70e80598cef-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ds6fpf.mount: Deactivated successfully. Sep 9 04:56:55.906793 kubelet[3387]: I0909 04:56:55.906759 3387 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18b4c698-66c7-4999-a24d-e70e80598cef-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "18b4c698-66c7-4999-a24d-e70e80598cef" (UID: "18b4c698-66c7-4999-a24d-e70e80598cef"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 9 04:56:55.907507 kubelet[3387]: I0909 04:56:55.907465 3387 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b4c698-66c7-4999-a24d-e70e80598cef-kube-api-access-s6fpf" (OuterVolumeSpecName: "kube-api-access-s6fpf") pod "18b4c698-66c7-4999-a24d-e70e80598cef" (UID: "18b4c698-66c7-4999-a24d-e70e80598cef"). InnerVolumeSpecName "kube-api-access-s6fpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 9 04:56:55.997854 kubelet[3387]: I0909 04:56:55.997565 3387 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6fpf\" (UniqueName: \"kubernetes.io/projected/18b4c698-66c7-4999-a24d-e70e80598cef-kube-api-access-s6fpf\") on node \"ci-4452.0.0-n-7e0b6f01e2\" DevicePath \"\"" Sep 9 04:56:55.997854 kubelet[3387]: I0909 04:56:55.997606 3387 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18b4c698-66c7-4999-a24d-e70e80598cef-whisker-ca-bundle\") on node \"ci-4452.0.0-n-7e0b6f01e2\" DevicePath \"\"" Sep 9 04:56:55.997854 kubelet[3387]: I0909 04:56:55.997616 3387 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/18b4c698-66c7-4999-a24d-e70e80598cef-whisker-backend-key-pair\") on node \"ci-4452.0.0-n-7e0b6f01e2\" DevicePath \"\"" Sep 9 04:56:56.672046 systemd[1]: Removed slice kubepods-besteffort-pod18b4c698_66c7_4999_a24d_e70e80598cef.slice - libcontainer container kubepods-besteffort-pod18b4c698_66c7_4999_a24d_e70e80598cef.slice. Sep 9 04:56:56.889044 systemd[1]: Created slice kubepods-besteffort-pod58366981_7df8_4025_91f1_81364c38fccf.slice - libcontainer container kubepods-besteffort-pod58366981_7df8_4025_91f1_81364c38fccf.slice. 
Sep 9 04:56:56.904418 kubelet[3387]: I0909 04:56:56.903875 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58366981-7df8-4025-91f1-81364c38fccf-whisker-ca-bundle\") pod \"whisker-75c4b4cbcb-jslnb\" (UID: \"58366981-7df8-4025-91f1-81364c38fccf\") " pod="calico-system/whisker-75c4b4cbcb-jslnb" Sep 9 04:56:56.904418 kubelet[3387]: I0909 04:56:56.904373 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/58366981-7df8-4025-91f1-81364c38fccf-whisker-backend-key-pair\") pod \"whisker-75c4b4cbcb-jslnb\" (UID: \"58366981-7df8-4025-91f1-81364c38fccf\") " pod="calico-system/whisker-75c4b4cbcb-jslnb" Sep 9 04:56:56.904418 kubelet[3387]: I0909 04:56:56.904394 3387 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6kpj\" (UniqueName: \"kubernetes.io/projected/58366981-7df8-4025-91f1-81364c38fccf-kube-api-access-k6kpj\") pod \"whisker-75c4b4cbcb-jslnb\" (UID: \"58366981-7df8-4025-91f1-81364c38fccf\") " pod="calico-system/whisker-75c4b4cbcb-jslnb" Sep 9 04:56:56.904798 containerd[1873]: time="2025-09-09T04:56:56.904320402Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ceded8b679c281657263adcad2c52be4c1594c4a51e021df865515346721c37\" id:\"52e16b75fa562e46d22b70df4d2c5cd31bebc1c7b8a9efdccf153811fb7b356a\" pid:4513 exit_status:1 exited_at:{seconds:1757393816 nanos:904003925}" Sep 9 04:56:57.197804 containerd[1873]: time="2025-09-09T04:56:57.197764810Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-75c4b4cbcb-jslnb,Uid:58366981-7df8-4025-91f1-81364c38fccf,Namespace:calico-system,Attempt:0,}" Sep 9 04:56:57.359584 systemd-networkd[1664]: calie9bab569063: Link UP Sep 9 04:56:57.360003 systemd-networkd[1664]: calie9bab569063: Gained carrier Sep 9 
04:56:57.378923 containerd[1873]: 2025-09-09 04:56:57.230 [INFO][4630] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:56:57.378923 containerd[1873]: 2025-09-09 04:56:57.263 [INFO][4630] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7e0b6f01e2-k8s-whisker--75c4b4cbcb--jslnb-eth0 whisker-75c4b4cbcb- calico-system 58366981-7df8-4025-91f1-81364c38fccf 860 0 2025-09-09 04:56:56 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:75c4b4cbcb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4452.0.0-n-7e0b6f01e2 whisker-75c4b4cbcb-jslnb eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calie9bab569063 [] [] }} ContainerID="7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" Namespace="calico-system" Pod="whisker-75c4b4cbcb-jslnb" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-whisker--75c4b4cbcb--jslnb-" Sep 9 04:56:57.378923 containerd[1873]: 2025-09-09 04:56:57.264 [INFO][4630] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" Namespace="calico-system" Pod="whisker-75c4b4cbcb-jslnb" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-whisker--75c4b4cbcb--jslnb-eth0" Sep 9 04:56:57.378923 containerd[1873]: 2025-09-09 04:56:57.298 [INFO][4652] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" HandleID="k8s-pod-network.7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-whisker--75c4b4cbcb--jslnb-eth0" Sep 9 04:56:57.379479 containerd[1873]: 2025-09-09 04:56:57.298 [INFO][4652] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" HandleID="k8s-pod-network.7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-whisker--75c4b4cbcb--jslnb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3930), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452.0.0-n-7e0b6f01e2", "pod":"whisker-75c4b4cbcb-jslnb", "timestamp":"2025-09-09 04:56:57.29857293 +0000 UTC"}, Hostname:"ci-4452.0.0-n-7e0b6f01e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:56:57.379479 containerd[1873]: 2025-09-09 04:56:57.298 [INFO][4652] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:56:57.379479 containerd[1873]: 2025-09-09 04:56:57.298 [INFO][4652] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:56:57.379479 containerd[1873]: 2025-09-09 04:56:57.298 [INFO][4652] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7e0b6f01e2' Sep 9 04:56:57.379479 containerd[1873]: 2025-09-09 04:56:57.304 [INFO][4652] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:57.379479 containerd[1873]: 2025-09-09 04:56:57.310 [INFO][4652] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:57.379479 containerd[1873]: 2025-09-09 04:56:57.314 [INFO][4652] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:57.379479 containerd[1873]: 2025-09-09 04:56:57.316 [INFO][4652] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:57.379479 containerd[1873]: 2025-09-09 04:56:57.319 [INFO][4652] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:57.379615 containerd[1873]: 2025-09-09 04:56:57.319 [INFO][4652] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:57.379615 containerd[1873]: 2025-09-09 04:56:57.321 [INFO][4652] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143 Sep 9 04:56:57.379615 containerd[1873]: 2025-09-09 04:56:57.331 [INFO][4652] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 handle="k8s-pod-network.7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:57.379615 containerd[1873]: 2025-09-09 04:56:57.336 [INFO][4652] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.55.1/26] block=192.168.55.0/26 handle="k8s-pod-network.7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:57.379615 containerd[1873]: 2025-09-09 04:56:57.336 [INFO][4652] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.1/26] handle="k8s-pod-network.7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:56:57.379615 containerd[1873]: 2025-09-09 04:56:57.336 [INFO][4652] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:56:57.379615 containerd[1873]: 2025-09-09 04:56:57.336 [INFO][4652] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.1/26] IPv6=[] ContainerID="7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" HandleID="k8s-pod-network.7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-whisker--75c4b4cbcb--jslnb-eth0" Sep 9 04:56:57.379713 containerd[1873]: 2025-09-09 04:56:57.339 [INFO][4630] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" Namespace="calico-system" Pod="whisker-75c4b4cbcb-jslnb" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-whisker--75c4b4cbcb--jslnb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-whisker--75c4b4cbcb--jslnb-eth0", GenerateName:"whisker-75c4b4cbcb-", Namespace:"calico-system", SelfLink:"", UID:"58366981-7df8-4025-91f1-81364c38fccf", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"75c4b4cbcb", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"", Pod:"whisker-75c4b4cbcb-jslnb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.55.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie9bab569063", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:56:57.379713 containerd[1873]: 2025-09-09 04:56:57.339 [INFO][4630] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.1/32] ContainerID="7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" Namespace="calico-system" Pod="whisker-75c4b4cbcb-jslnb" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-whisker--75c4b4cbcb--jslnb-eth0" Sep 9 04:56:57.379761 containerd[1873]: 2025-09-09 04:56:57.339 [INFO][4630] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie9bab569063 ContainerID="7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" Namespace="calico-system" Pod="whisker-75c4b4cbcb-jslnb" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-whisker--75c4b4cbcb--jslnb-eth0" Sep 9 04:56:57.379761 containerd[1873]: 2025-09-09 04:56:57.359 [INFO][4630] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" Namespace="calico-system" Pod="whisker-75c4b4cbcb-jslnb" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-whisker--75c4b4cbcb--jslnb-eth0" Sep 9 04:56:57.379788 containerd[1873]: 2025-09-09 04:56:57.360 [INFO][4630] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" Namespace="calico-system" Pod="whisker-75c4b4cbcb-jslnb" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-whisker--75c4b4cbcb--jslnb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-whisker--75c4b4cbcb--jslnb-eth0", GenerateName:"whisker-75c4b4cbcb-", Namespace:"calico-system", SelfLink:"", UID:"58366981-7df8-4025-91f1-81364c38fccf", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"75c4b4cbcb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143", Pod:"whisker-75c4b4cbcb-jslnb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.55.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie9bab569063", MAC:"52:8e:3d:36:36:b4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:56:57.379822 containerd[1873]: 2025-09-09 04:56:57.376 [INFO][4630] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" Namespace="calico-system" 
Pod="whisker-75c4b4cbcb-jslnb" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-whisker--75c4b4cbcb--jslnb-eth0" Sep 9 04:56:57.419558 containerd[1873]: time="2025-09-09T04:56:57.419526737Z" level=info msg="connecting to shim 7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143" address="unix:///run/containerd/s/39667d4454d88fa9266b6cedb94882b7df653fa82dca443538e75a916c2fa1da" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:56:57.439100 systemd[1]: Started cri-containerd-7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143.scope - libcontainer container 7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143. Sep 9 04:56:57.482359 containerd[1873]: time="2025-09-09T04:56:57.482310881Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-75c4b4cbcb-jslnb,Uid:58366981-7df8-4025-91f1-81364c38fccf,Namespace:calico-system,Attempt:0,} returns sandbox id \"7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143\"" Sep 9 04:56:57.484079 containerd[1873]: time="2025-09-09T04:56:57.484048036Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 04:56:57.610578 systemd-networkd[1664]: vxlan.calico: Link UP Sep 9 04:56:57.610584 systemd-networkd[1664]: vxlan.calico: Gained carrier Sep 9 04:56:58.670337 kubelet[3387]: I0909 04:56:58.670247 3387 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18b4c698-66c7-4999-a24d-e70e80598cef" path="/var/lib/kubelet/pods/18b4c698-66c7-4999-a24d-e70e80598cef/volumes" Sep 9 04:56:58.676610 containerd[1873]: time="2025-09-09T04:56:58.676555050Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:58.679298 containerd[1873]: time="2025-09-09T04:56:58.679252404Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 9 04:56:58.682730 containerd[1873]: 
time="2025-09-09T04:56:58.682658797Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:58.686128 containerd[1873]: time="2025-09-09T04:56:58.686037815Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:58.686534 containerd[1873]: time="2025-09-09T04:56:58.686386939Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.202295577s" Sep 9 04:56:58.686534 containerd[1873]: time="2025-09-09T04:56:58.686416593Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 9 04:56:58.688481 containerd[1873]: time="2025-09-09T04:56:58.688450530Z" level=info msg="CreateContainer within sandbox \"7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 04:56:58.705423 containerd[1873]: time="2025-09-09T04:56:58.705133539Z" level=info msg="Container 4c71212a4cfe5aa6866414f025e39b35035b788f20c83efca25fd084343b4aab: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:58.722128 containerd[1873]: time="2025-09-09T04:56:58.722095027Z" level=info msg="CreateContainer within sandbox \"7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"4c71212a4cfe5aa6866414f025e39b35035b788f20c83efca25fd084343b4aab\"" Sep 9 
04:56:58.723019 containerd[1873]: time="2025-09-09T04:56:58.722993510Z" level=info msg="StartContainer for \"4c71212a4cfe5aa6866414f025e39b35035b788f20c83efca25fd084343b4aab\"" Sep 9 04:56:58.723946 containerd[1873]: time="2025-09-09T04:56:58.723919400Z" level=info msg="connecting to shim 4c71212a4cfe5aa6866414f025e39b35035b788f20c83efca25fd084343b4aab" address="unix:///run/containerd/s/39667d4454d88fa9266b6cedb94882b7df653fa82dca443538e75a916c2fa1da" protocol=ttrpc version=3 Sep 9 04:56:58.745110 systemd[1]: Started cri-containerd-4c71212a4cfe5aa6866414f025e39b35035b788f20c83efca25fd084343b4aab.scope - libcontainer container 4c71212a4cfe5aa6866414f025e39b35035b788f20c83efca25fd084343b4aab. Sep 9 04:56:58.775360 containerd[1873]: time="2025-09-09T04:56:58.775323010Z" level=info msg="StartContainer for \"4c71212a4cfe5aa6866414f025e39b35035b788f20c83efca25fd084343b4aab\" returns successfully" Sep 9 04:56:58.776339 containerd[1873]: time="2025-09-09T04:56:58.776312064Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 04:56:58.961102 systemd-networkd[1664]: calie9bab569063: Gained IPv6LL Sep 9 04:56:59.601151 systemd-networkd[1664]: vxlan.calico: Gained IPv6LL Sep 9 04:57:00.667449 containerd[1873]: time="2025-09-09T04:57:00.667187894Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56c46cdc74-flmnt,Uid:411faeac-2159-4d6a-b25d-bb2781b3f626,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:57:00.771505 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3257324736.mount: Deactivated successfully. 
Sep 9 04:57:00.836232 containerd[1873]: time="2025-09-09T04:57:00.836048059Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:00.838432 containerd[1873]: time="2025-09-09T04:57:00.838396418Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 9 04:57:00.845115 containerd[1873]: time="2025-09-09T04:57:00.845051732Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:00.849995 containerd[1873]: time="2025-09-09T04:57:00.849583131Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:00.850136 containerd[1873]: time="2025-09-09T04:57:00.850114916Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 2.073773382s" Sep 9 04:57:00.850210 containerd[1873]: time="2025-09-09T04:57:00.850198543Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 9 04:57:00.853229 containerd[1873]: time="2025-09-09T04:57:00.853207520Z" level=info msg="CreateContainer within sandbox \"7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 04:57:00.862128 
systemd-networkd[1664]: calib4b5ccefb12: Link UP Sep 9 04:57:00.863075 systemd-networkd[1664]: calib4b5ccefb12: Gained carrier Sep 9 04:57:00.879589 containerd[1873]: 2025-09-09 04:57:00.793 [INFO][4835] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--flmnt-eth0 calico-apiserver-56c46cdc74- calico-apiserver 411faeac-2159-4d6a-b25d-bb2781b3f626 793 0 2025-09-09 04:56:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:56c46cdc74 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4452.0.0-n-7e0b6f01e2 calico-apiserver-56c46cdc74-flmnt eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib4b5ccefb12 [] [] }} ContainerID="87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-flmnt" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--flmnt-" Sep 9 04:57:00.879589 containerd[1873]: 2025-09-09 04:57:00.793 [INFO][4835] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-flmnt" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--flmnt-eth0" Sep 9 04:57:00.879589 containerd[1873]: 2025-09-09 04:57:00.814 [INFO][4847] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" HandleID="k8s-pod-network.87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--flmnt-eth0" Sep 9 04:57:00.880736 containerd[1873]: 
2025-09-09 04:57:00.815 [INFO][4847] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" HandleID="k8s-pod-network.87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--flmnt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b610), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4452.0.0-n-7e0b6f01e2", "pod":"calico-apiserver-56c46cdc74-flmnt", "timestamp":"2025-09-09 04:57:00.814876865 +0000 UTC"}, Hostname:"ci-4452.0.0-n-7e0b6f01e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:57:00.880736 containerd[1873]: 2025-09-09 04:57:00.816 [INFO][4847] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:57:00.880736 containerd[1873]: 2025-09-09 04:57:00.816 [INFO][4847] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:57:00.880736 containerd[1873]: 2025-09-09 04:57:00.816 [INFO][4847] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7e0b6f01e2' Sep 9 04:57:00.880736 containerd[1873]: 2025-09-09 04:57:00.823 [INFO][4847] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:00.880736 containerd[1873]: 2025-09-09 04:57:00.828 [INFO][4847] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:00.880736 containerd[1873]: 2025-09-09 04:57:00.832 [INFO][4847] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:00.880736 containerd[1873]: 2025-09-09 04:57:00.835 [INFO][4847] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:00.880736 containerd[1873]: 2025-09-09 04:57:00.839 [INFO][4847] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:00.880894 containerd[1873]: 2025-09-09 04:57:00.839 [INFO][4847] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:00.880894 containerd[1873]: 2025-09-09 04:57:00.840 [INFO][4847] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569 Sep 9 04:57:00.880894 containerd[1873]: 2025-09-09 04:57:00.848 [INFO][4847] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 handle="k8s-pod-network.87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:00.880894 containerd[1873]: 2025-09-09 04:57:00.855 [INFO][4847] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.55.2/26] block=192.168.55.0/26 handle="k8s-pod-network.87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:00.880894 containerd[1873]: 2025-09-09 04:57:00.855 [INFO][4847] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.2/26] handle="k8s-pod-network.87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:00.880894 containerd[1873]: 2025-09-09 04:57:00.855 [INFO][4847] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:57:00.880894 containerd[1873]: 2025-09-09 04:57:00.855 [INFO][4847] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.2/26] IPv6=[] ContainerID="87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" HandleID="k8s-pod-network.87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--flmnt-eth0" Sep 9 04:57:00.882475 containerd[1873]: 2025-09-09 04:57:00.859 [INFO][4835] cni-plugin/k8s.go 418: Populated endpoint ContainerID="87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-flmnt" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--flmnt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--flmnt-eth0", GenerateName:"calico-apiserver-56c46cdc74-", Namespace:"calico-apiserver", SelfLink:"", UID:"411faeac-2159-4d6a-b25d-bb2781b3f626", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"56c46cdc74", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"", Pod:"calico-apiserver-56c46cdc74-flmnt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.55.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib4b5ccefb12", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:00.882532 containerd[1873]: 2025-09-09 04:57:00.859 [INFO][4835] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.2/32] ContainerID="87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-flmnt" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--flmnt-eth0" Sep 9 04:57:00.882532 containerd[1873]: 2025-09-09 04:57:00.859 [INFO][4835] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib4b5ccefb12 ContainerID="87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-flmnt" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--flmnt-eth0" Sep 9 04:57:00.882532 containerd[1873]: 2025-09-09 04:57:00.863 [INFO][4835] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-flmnt" 
WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--flmnt-eth0" Sep 9 04:57:00.882577 containerd[1873]: 2025-09-09 04:57:00.863 [INFO][4835] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-flmnt" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--flmnt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--flmnt-eth0", GenerateName:"calico-apiserver-56c46cdc74-", Namespace:"calico-apiserver", SelfLink:"", UID:"411faeac-2159-4d6a-b25d-bb2781b3f626", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56c46cdc74", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569", Pod:"calico-apiserver-56c46cdc74-flmnt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.55.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib4b5ccefb12", MAC:"de:a2:13:62:30:3b", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:00.882618 containerd[1873]: 2025-09-09 04:57:00.877 [INFO][4835] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-flmnt" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--flmnt-eth0" Sep 9 04:57:00.886640 containerd[1873]: time="2025-09-09T04:57:00.885339568Z" level=info msg="Container 1a96a38252bdeb39d9457479051b3a7a1d83f8a65d0e2264105a914322d4baf2: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:00.890687 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount539824686.mount: Deactivated successfully. Sep 9 04:57:00.904109 containerd[1873]: time="2025-09-09T04:57:00.904062865Z" level=info msg="CreateContainer within sandbox \"7b854c57b2fc6004f0cc49aa196973b52f9b679900c7122747191cf354abf143\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"1a96a38252bdeb39d9457479051b3a7a1d83f8a65d0e2264105a914322d4baf2\"" Sep 9 04:57:00.904792 containerd[1873]: time="2025-09-09T04:57:00.904762201Z" level=info msg="StartContainer for \"1a96a38252bdeb39d9457479051b3a7a1d83f8a65d0e2264105a914322d4baf2\"" Sep 9 04:57:00.906363 containerd[1873]: time="2025-09-09T04:57:00.906338236Z" level=info msg="connecting to shim 1a96a38252bdeb39d9457479051b3a7a1d83f8a65d0e2264105a914322d4baf2" address="unix:///run/containerd/s/39667d4454d88fa9266b6cedb94882b7df653fa82dca443538e75a916c2fa1da" protocol=ttrpc version=3 Sep 9 04:57:00.923143 systemd[1]: Started cri-containerd-1a96a38252bdeb39d9457479051b3a7a1d83f8a65d0e2264105a914322d4baf2.scope - libcontainer container 1a96a38252bdeb39d9457479051b3a7a1d83f8a65d0e2264105a914322d4baf2. 
Sep 9 04:57:00.939331 containerd[1873]: time="2025-09-09T04:57:00.939282822Z" level=info msg="connecting to shim 87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569" address="unix:///run/containerd/s/c5e87cab60ce6f98f29917ef1bb37804eb7c914adf61cdc1df891e2e3190ba2e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:57:00.968123 systemd[1]: Started cri-containerd-87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569.scope - libcontainer container 87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569. Sep 9 04:57:00.970911 containerd[1873]: time="2025-09-09T04:57:00.970871871Z" level=info msg="StartContainer for \"1a96a38252bdeb39d9457479051b3a7a1d83f8a65d0e2264105a914322d4baf2\" returns successfully" Sep 9 04:57:01.019074 containerd[1873]: time="2025-09-09T04:57:01.019010559Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56c46cdc74-flmnt,Uid:411faeac-2159-4d6a-b25d-bb2781b3f626,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569\"" Sep 9 04:57:01.021196 containerd[1873]: time="2025-09-09T04:57:01.021161258Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 04:57:01.828246 kubelet[3387]: I0909 04:57:01.827671 3387 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-75c4b4cbcb-jslnb" podStartSLOduration=2.460234813 podStartE2EDuration="5.827659271s" podCreationTimestamp="2025-09-09 04:56:56 +0000 UTC" firstStartedPulling="2025-09-09 04:56:57.483403169 +0000 UTC m=+34.895543521" lastFinishedPulling="2025-09-09 04:57:00.850827627 +0000 UTC m=+38.262967979" observedRunningTime="2025-09-09 04:57:01.82678621 +0000 UTC m=+39.238926570" watchObservedRunningTime="2025-09-09 04:57:01.827659271 +0000 UTC m=+39.239799623" Sep 9 04:57:02.481227 systemd-networkd[1664]: calib4b5ccefb12: Gained IPv6LL Sep 9 04:57:02.666721 containerd[1873]: 
time="2025-09-09T04:57:02.666591624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xxf7j,Uid:e7cb1906-b53a-45cd-a94c-8e7d1fc0c6f8,Namespace:kube-system,Attempt:0,}" Sep 9 04:57:03.004609 systemd-networkd[1664]: cali515f462f8b3: Link UP Sep 9 04:57:03.005714 systemd-networkd[1664]: cali515f462f8b3: Gained carrier Sep 9 04:57:03.024412 containerd[1873]: 2025-09-09 04:57:02.716 [INFO][4949] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--xxf7j-eth0 coredns-7c65d6cfc9- kube-system e7cb1906-b53a-45cd-a94c-8e7d1fc0c6f8 783 0 2025-09-09 04:56:29 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4452.0.0-n-7e0b6f01e2 coredns-7c65d6cfc9-xxf7j eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali515f462f8b3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xxf7j" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--xxf7j-" Sep 9 04:57:03.024412 containerd[1873]: 2025-09-09 04:57:02.716 [INFO][4949] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xxf7j" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--xxf7j-eth0" Sep 9 04:57:03.024412 containerd[1873]: 2025-09-09 04:57:02.751 [INFO][4961] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" HandleID="k8s-pod-network.c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" 
Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--xxf7j-eth0" Sep 9 04:57:03.024708 containerd[1873]: 2025-09-09 04:57:02.752 [INFO][4961] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" HandleID="k8s-pod-network.c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--xxf7j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3870), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4452.0.0-n-7e0b6f01e2", "pod":"coredns-7c65d6cfc9-xxf7j", "timestamp":"2025-09-09 04:57:02.751837414 +0000 UTC"}, Hostname:"ci-4452.0.0-n-7e0b6f01e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:57:03.024708 containerd[1873]: 2025-09-09 04:57:02.752 [INFO][4961] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:57:03.024708 containerd[1873]: 2025-09-09 04:57:02.752 [INFO][4961] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:57:03.024708 containerd[1873]: 2025-09-09 04:57:02.752 [INFO][4961] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7e0b6f01e2' Sep 9 04:57:03.024708 containerd[1873]: 2025-09-09 04:57:02.759 [INFO][4961] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.024708 containerd[1873]: 2025-09-09 04:57:02.764 [INFO][4961] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.024708 containerd[1873]: 2025-09-09 04:57:02.770 [INFO][4961] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.024708 containerd[1873]: 2025-09-09 04:57:02.773 [INFO][4961] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.024708 containerd[1873]: 2025-09-09 04:57:02.775 [INFO][4961] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.024952 containerd[1873]: 2025-09-09 04:57:02.775 [INFO][4961] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.024952 containerd[1873]: 2025-09-09 04:57:02.779 [INFO][4961] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031 Sep 9 04:57:03.024952 containerd[1873]: 2025-09-09 04:57:02.785 [INFO][4961] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 handle="k8s-pod-network.c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.024952 containerd[1873]: 2025-09-09 04:57:02.799 [INFO][4961] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.55.3/26] block=192.168.55.0/26 handle="k8s-pod-network.c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.024952 containerd[1873]: 2025-09-09 04:57:02.799 [INFO][4961] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.3/26] handle="k8s-pod-network.c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.024952 containerd[1873]: 2025-09-09 04:57:02.799 [INFO][4961] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:57:03.024952 containerd[1873]: 2025-09-09 04:57:02.799 [INFO][4961] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.3/26] IPv6=[] ContainerID="c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" HandleID="k8s-pod-network.c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--xxf7j-eth0" Sep 9 04:57:03.025242 containerd[1873]: 2025-09-09 04:57:03.001 [INFO][4949] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xxf7j" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--xxf7j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--xxf7j-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"e7cb1906-b53a-45cd-a94c-8e7d1fc0c6f8", ResourceVersion:"783", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"", Pod:"coredns-7c65d6cfc9-xxf7j", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.55.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali515f462f8b3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:03.025242 containerd[1873]: 2025-09-09 04:57:03.001 [INFO][4949] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.3/32] ContainerID="c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xxf7j" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--xxf7j-eth0" Sep 9 04:57:03.025242 containerd[1873]: 2025-09-09 04:57:03.001 [INFO][4949] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali515f462f8b3 ContainerID="c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xxf7j" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--xxf7j-eth0" Sep 9 04:57:03.025242 containerd[1873]: 2025-09-09 04:57:03.008 [INFO][4949] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xxf7j" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--xxf7j-eth0" Sep 9 04:57:03.025242 containerd[1873]: 2025-09-09 04:57:03.008 [INFO][4949] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xxf7j" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--xxf7j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--xxf7j-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"e7cb1906-b53a-45cd-a94c-8e7d1fc0c6f8", ResourceVersion:"783", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031", Pod:"coredns-7c65d6cfc9-xxf7j", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.55.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali515f462f8b3", MAC:"f6:a2:12:0c:f4:55", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:03.025242 containerd[1873]: 2025-09-09 04:57:03.021 [INFO][4949] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xxf7j" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--xxf7j-eth0" Sep 9 04:57:03.394693 containerd[1873]: time="2025-09-09T04:57:03.394190934Z" level=info msg="connecting to shim c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031" address="unix:///run/containerd/s/1ef9a8cc8242bc75629c44ee53e9ccc4404900344467ff5ee3199776fa5d6658" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:57:03.413371 containerd[1873]: time="2025-09-09T04:57:03.413312473Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:03.415774 containerd[1873]: time="2025-09-09T04:57:03.415680597Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 9 04:57:03.416111 systemd[1]: Started cri-containerd-c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031.scope - libcontainer container c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031. 
Sep 9 04:57:03.419376 containerd[1873]: time="2025-09-09T04:57:03.418537765Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:03.423733 containerd[1873]: time="2025-09-09T04:57:03.423689126Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:03.424025 containerd[1873]: time="2025-09-09T04:57:03.423999828Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 2.402805564s" Sep 9 04:57:03.424025 containerd[1873]: time="2025-09-09T04:57:03.424024874Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 9 04:57:03.427835 containerd[1873]: time="2025-09-09T04:57:03.427790429Z" level=info msg="CreateContainer within sandbox \"87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 04:57:03.454477 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2144847307.mount: Deactivated successfully. 
Sep 9 04:57:03.456109 containerd[1873]: time="2025-09-09T04:57:03.455510621Z" level=info msg="Container c33fad9df91ffde470ad91a229c9485b16be83b461169166e5783bd4b88fd941: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:03.456109 containerd[1873]: time="2025-09-09T04:57:03.455769254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xxf7j,Uid:e7cb1906-b53a-45cd-a94c-8e7d1fc0c6f8,Namespace:kube-system,Attempt:0,} returns sandbox id \"c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031\"" Sep 9 04:57:03.460133 containerd[1873]: time="2025-09-09T04:57:03.460107758Z" level=info msg="CreateContainer within sandbox \"c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 04:57:03.472886 containerd[1873]: time="2025-09-09T04:57:03.472851408Z" level=info msg="CreateContainer within sandbox \"87875e22bd8322f07059b513db793fbb6b79d4fb2f457a5047bee030ccc1e569\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c33fad9df91ffde470ad91a229c9485b16be83b461169166e5783bd4b88fd941\"" Sep 9 04:57:03.473811 containerd[1873]: time="2025-09-09T04:57:03.473344067Z" level=info msg="StartContainer for \"c33fad9df91ffde470ad91a229c9485b16be83b461169166e5783bd4b88fd941\"" Sep 9 04:57:03.474426 containerd[1873]: time="2025-09-09T04:57:03.474383934Z" level=info msg="connecting to shim c33fad9df91ffde470ad91a229c9485b16be83b461169166e5783bd4b88fd941" address="unix:///run/containerd/s/c5e87cab60ce6f98f29917ef1bb37804eb7c914adf61cdc1df891e2e3190ba2e" protocol=ttrpc version=3 Sep 9 04:57:03.492692 containerd[1873]: time="2025-09-09T04:57:03.492660690Z" level=info msg="Container 88f6cb98d0c694a3421838d8ae97d41ba9a4878117abbdcd47413b27737bcf14: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:03.496115 systemd[1]: Started cri-containerd-c33fad9df91ffde470ad91a229c9485b16be83b461169166e5783bd4b88fd941.scope - libcontainer container 
c33fad9df91ffde470ad91a229c9485b16be83b461169166e5783bd4b88fd941. Sep 9 04:57:03.514476 containerd[1873]: time="2025-09-09T04:57:03.514436217Z" level=info msg="CreateContainer within sandbox \"c564e907fa59a06569a0ece91a8f228357adb7cac7744c5713fa0d9eb38f7031\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"88f6cb98d0c694a3421838d8ae97d41ba9a4878117abbdcd47413b27737bcf14\"" Sep 9 04:57:03.516195 containerd[1873]: time="2025-09-09T04:57:03.516166283Z" level=info msg="StartContainer for \"88f6cb98d0c694a3421838d8ae97d41ba9a4878117abbdcd47413b27737bcf14\"" Sep 9 04:57:03.516774 containerd[1873]: time="2025-09-09T04:57:03.516747241Z" level=info msg="connecting to shim 88f6cb98d0c694a3421838d8ae97d41ba9a4878117abbdcd47413b27737bcf14" address="unix:///run/containerd/s/1ef9a8cc8242bc75629c44ee53e9ccc4404900344467ff5ee3199776fa5d6658" protocol=ttrpc version=3 Sep 9 04:57:03.542218 systemd[1]: Started cri-containerd-88f6cb98d0c694a3421838d8ae97d41ba9a4878117abbdcd47413b27737bcf14.scope - libcontainer container 88f6cb98d0c694a3421838d8ae97d41ba9a4878117abbdcd47413b27737bcf14. 
Sep 9 04:57:03.553935 containerd[1873]: time="2025-09-09T04:57:03.553903702Z" level=info msg="StartContainer for \"c33fad9df91ffde470ad91a229c9485b16be83b461169166e5783bd4b88fd941\" returns successfully" Sep 9 04:57:03.585045 containerd[1873]: time="2025-09-09T04:57:03.585010535Z" level=info msg="StartContainer for \"88f6cb98d0c694a3421838d8ae97d41ba9a4878117abbdcd47413b27737bcf14\" returns successfully" Sep 9 04:57:03.666543 containerd[1873]: time="2025-09-09T04:57:03.666038157Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-z29nv,Uid:fe83b551-6b9a-4438-9381-a5d918caf80a,Namespace:kube-system,Attempt:0,}" Sep 9 04:57:03.774834 systemd-networkd[1664]: cali9f396b56f28: Link UP Sep 9 04:57:03.777595 systemd-networkd[1664]: cali9f396b56f28: Gained carrier Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.706 [INFO][5099] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--z29nv-eth0 coredns-7c65d6cfc9- kube-system fe83b551-6b9a-4438-9381-a5d918caf80a 794 0 2025-09-09 04:56:29 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4452.0.0-n-7e0b6f01e2 coredns-7c65d6cfc9-z29nv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9f396b56f28 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-z29nv" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--z29nv-" Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.706 [INFO][5099] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" Namespace="kube-system" 
Pod="coredns-7c65d6cfc9-z29nv" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--z29nv-eth0" Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.728 [INFO][5111] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" HandleID="k8s-pod-network.c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--z29nv-eth0" Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.729 [INFO][5111] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" HandleID="k8s-pod-network.c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--z29nv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b610), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4452.0.0-n-7e0b6f01e2", "pod":"coredns-7c65d6cfc9-z29nv", "timestamp":"2025-09-09 04:57:03.728953094 +0000 UTC"}, Hostname:"ci-4452.0.0-n-7e0b6f01e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.729 [INFO][5111] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.729 [INFO][5111] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.729 [INFO][5111] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7e0b6f01e2' Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.735 [INFO][5111] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.740 [INFO][5111] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.746 [INFO][5111] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.748 [INFO][5111] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.751 [INFO][5111] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.751 [INFO][5111] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.752 [INFO][5111] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.760 [INFO][5111] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 handle="k8s-pod-network.c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.766 [INFO][5111] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.55.4/26] block=192.168.55.0/26 handle="k8s-pod-network.c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.766 [INFO][5111] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.4/26] handle="k8s-pod-network.c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.766 [INFO][5111] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:57:03.792398 containerd[1873]: 2025-09-09 04:57:03.766 [INFO][5111] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.4/26] IPv6=[] ContainerID="c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" HandleID="k8s-pod-network.c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--z29nv-eth0" Sep 9 04:57:03.792947 containerd[1873]: 2025-09-09 04:57:03.769 [INFO][5099] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-z29nv" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--z29nv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--z29nv-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"fe83b551-6b9a-4438-9381-a5d918caf80a", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"", Pod:"coredns-7c65d6cfc9-z29nv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.55.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9f396b56f28", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:03.792947 containerd[1873]: 2025-09-09 04:57:03.771 [INFO][5099] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.4/32] ContainerID="c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-z29nv" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--z29nv-eth0" Sep 9 04:57:03.792947 containerd[1873]: 2025-09-09 04:57:03.771 [INFO][5099] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9f396b56f28 ContainerID="c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-z29nv" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--z29nv-eth0" Sep 9 04:57:03.792947 containerd[1873]: 2025-09-09 04:57:03.775 [INFO][5099] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-z29nv" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--z29nv-eth0" Sep 9 04:57:03.792947 containerd[1873]: 2025-09-09 04:57:03.775 [INFO][5099] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-z29nv" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--z29nv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--z29nv-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"fe83b551-6b9a-4438-9381-a5d918caf80a", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f", Pod:"coredns-7c65d6cfc9-z29nv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.55.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9f396b56f28", MAC:"4a:d9:eb:12:44:1d", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:03.792947 containerd[1873]: 2025-09-09 04:57:03.790 [INFO][5099] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-z29nv" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-coredns--7c65d6cfc9--z29nv-eth0" Sep 9 04:57:03.846071 containerd[1873]: time="2025-09-09T04:57:03.845996869Z" level=info msg="connecting to shim c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f" address="unix:///run/containerd/s/ecb84d0c3f101723ab1b3d585ed7ba1c479a4281a3cda48b48359c9871feb65b" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:57:03.860308 kubelet[3387]: I0909 04:57:03.860161 3387 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-56c46cdc74-flmnt" podStartSLOduration=24.455327622 podStartE2EDuration="26.860145884s" podCreationTimestamp="2025-09-09 04:56:37 +0000 UTC" firstStartedPulling="2025-09-09 04:57:01.020305868 +0000 UTC m=+38.432446220" lastFinishedPulling="2025-09-09 04:57:03.42512413 +0000 UTC m=+40.837264482" observedRunningTime="2025-09-09 04:57:03.839278225 +0000 UTC m=+41.251418577" watchObservedRunningTime="2025-09-09 04:57:03.860145884 +0000 UTC m=+41.272286244" Sep 9 04:57:03.880210 systemd[1]: Started cri-containerd-c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f.scope - libcontainer container 
c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f. Sep 9 04:57:03.925609 containerd[1873]: time="2025-09-09T04:57:03.925034321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-z29nv,Uid:fe83b551-6b9a-4438-9381-a5d918caf80a,Namespace:kube-system,Attempt:0,} returns sandbox id \"c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f\"" Sep 9 04:57:03.929721 containerd[1873]: time="2025-09-09T04:57:03.929654657Z" level=info msg="CreateContainer within sandbox \"c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 04:57:03.949913 containerd[1873]: time="2025-09-09T04:57:03.949524152Z" level=info msg="Container 19fbf4e7787751e2baf973f2a954b29d2142acd1fce62704a32baf30025ae628: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:03.965689 containerd[1873]: time="2025-09-09T04:57:03.965655986Z" level=info msg="CreateContainer within sandbox \"c1f9212702327efbb56ea8c93d43cc8f2e12e354885d19730fedba708024a98f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"19fbf4e7787751e2baf973f2a954b29d2142acd1fce62704a32baf30025ae628\"" Sep 9 04:57:03.966404 containerd[1873]: time="2025-09-09T04:57:03.966366040Z" level=info msg="StartContainer for \"19fbf4e7787751e2baf973f2a954b29d2142acd1fce62704a32baf30025ae628\"" Sep 9 04:57:03.967911 containerd[1873]: time="2025-09-09T04:57:03.967886039Z" level=info msg="connecting to shim 19fbf4e7787751e2baf973f2a954b29d2142acd1fce62704a32baf30025ae628" address="unix:///run/containerd/s/ecb84d0c3f101723ab1b3d585ed7ba1c479a4281a3cda48b48359c9871feb65b" protocol=ttrpc version=3 Sep 9 04:57:03.987115 systemd[1]: Started cri-containerd-19fbf4e7787751e2baf973f2a954b29d2142acd1fce62704a32baf30025ae628.scope - libcontainer container 19fbf4e7787751e2baf973f2a954b29d2142acd1fce62704a32baf30025ae628. 
Sep 9 04:57:04.030856 containerd[1873]: time="2025-09-09T04:57:04.030800072Z" level=info msg="StartContainer for \"19fbf4e7787751e2baf973f2a954b29d2142acd1fce62704a32baf30025ae628\" returns successfully" Sep 9 04:57:04.337144 systemd-networkd[1664]: cali515f462f8b3: Gained IPv6LL Sep 9 04:57:04.667300 containerd[1873]: time="2025-09-09T04:57:04.667234588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-bc47587f9-bvp5k,Uid:cfb6d47c-68ad-4986-818d-92b496c6dc6d,Namespace:calico-system,Attempt:0,}" Sep 9 04:57:04.834381 kubelet[3387]: I0909 04:57:04.833578 3387 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:57:04.845507 kubelet[3387]: I0909 04:57:04.845458 3387 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-xxf7j" podStartSLOduration=35.845445138 podStartE2EDuration="35.845445138s" podCreationTimestamp="2025-09-09 04:56:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:57:03.862751963 +0000 UTC m=+41.274892315" watchObservedRunningTime="2025-09-09 04:57:04.845445138 +0000 UTC m=+42.257585490" Sep 9 04:57:04.863911 kubelet[3387]: I0909 04:57:04.862831 3387 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-z29nv" podStartSLOduration=35.862817676 podStartE2EDuration="35.862817676s" podCreationTimestamp="2025-09-09 04:56:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:57:04.846654867 +0000 UTC m=+42.258795219" watchObservedRunningTime="2025-09-09 04:57:04.862817676 +0000 UTC m=+42.274958028" Sep 9 04:57:05.049652 systemd-networkd[1664]: calicf8d5fd2422: Link UP Sep 9 04:57:05.050239 systemd-networkd[1664]: calicf8d5fd2422: Gained carrier Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 
04:57:04.987 [INFO][5214] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7e0b6f01e2-k8s-calico--kube--controllers--bc47587f9--bvp5k-eth0 calico-kube-controllers-bc47587f9- calico-system cfb6d47c-68ad-4986-818d-92b496c6dc6d 795 0 2025-09-09 04:56:40 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:bc47587f9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4452.0.0-n-7e0b6f01e2 calico-kube-controllers-bc47587f9-bvp5k eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calicf8d5fd2422 [] [] }} ContainerID="8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" Namespace="calico-system" Pod="calico-kube-controllers-bc47587f9-bvp5k" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--kube--controllers--bc47587f9--bvp5k-" Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:04.987 [INFO][5214] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" Namespace="calico-system" Pod="calico-kube-controllers-bc47587f9-bvp5k" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--kube--controllers--bc47587f9--bvp5k-eth0" Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.009 [INFO][5226] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" HandleID="k8s-pod-network.8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--kube--controllers--bc47587f9--bvp5k-eth0" Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.010 [INFO][5226] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" HandleID="k8s-pod-network.8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--kube--controllers--bc47587f9--bvp5k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024afd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452.0.0-n-7e0b6f01e2", "pod":"calico-kube-controllers-bc47587f9-bvp5k", "timestamp":"2025-09-09 04:57:05.00984815 +0000 UTC"}, Hostname:"ci-4452.0.0-n-7e0b6f01e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.010 [INFO][5226] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.010 [INFO][5226] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.010 [INFO][5226] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7e0b6f01e2' Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.015 [INFO][5226] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.020 [INFO][5226] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.024 [INFO][5226] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.025 [INFO][5226] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.029 [INFO][5226] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.029 [INFO][5226] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.030 [INFO][5226] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0 Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.035 [INFO][5226] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 handle="k8s-pod-network.8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.045 [INFO][5226] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.55.5/26] block=192.168.55.0/26 handle="k8s-pod-network.8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.045 [INFO][5226] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.5/26] handle="k8s-pod-network.8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.045 [INFO][5226] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:57:05.067584 containerd[1873]: 2025-09-09 04:57:05.045 [INFO][5226] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.5/26] IPv6=[] ContainerID="8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" HandleID="k8s-pod-network.8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--kube--controllers--bc47587f9--bvp5k-eth0" Sep 9 04:57:05.068675 containerd[1873]: 2025-09-09 04:57:05.047 [INFO][5214] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" Namespace="calico-system" Pod="calico-kube-controllers-bc47587f9-bvp5k" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--kube--controllers--bc47587f9--bvp5k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-calico--kube--controllers--bc47587f9--bvp5k-eth0", GenerateName:"calico-kube-controllers-bc47587f9-", Namespace:"calico-system", SelfLink:"", UID:"cfb6d47c-68ad-4986-818d-92b496c6dc6d", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", 
"k8s-app":"calico-kube-controllers", "pod-template-hash":"bc47587f9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"", Pod:"calico-kube-controllers-bc47587f9-bvp5k", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.55.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calicf8d5fd2422", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:05.068675 containerd[1873]: 2025-09-09 04:57:05.047 [INFO][5214] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.5/32] ContainerID="8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" Namespace="calico-system" Pod="calico-kube-controllers-bc47587f9-bvp5k" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--kube--controllers--bc47587f9--bvp5k-eth0" Sep 9 04:57:05.068675 containerd[1873]: 2025-09-09 04:57:05.047 [INFO][5214] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicf8d5fd2422 ContainerID="8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" Namespace="calico-system" Pod="calico-kube-controllers-bc47587f9-bvp5k" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--kube--controllers--bc47587f9--bvp5k-eth0" Sep 9 04:57:05.068675 containerd[1873]: 2025-09-09 04:57:05.050 [INFO][5214] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" Namespace="calico-system" 
Pod="calico-kube-controllers-bc47587f9-bvp5k" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--kube--controllers--bc47587f9--bvp5k-eth0" Sep 9 04:57:05.068675 containerd[1873]: 2025-09-09 04:57:05.050 [INFO][5214] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" Namespace="calico-system" Pod="calico-kube-controllers-bc47587f9-bvp5k" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--kube--controllers--bc47587f9--bvp5k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-calico--kube--controllers--bc47587f9--bvp5k-eth0", GenerateName:"calico-kube-controllers-bc47587f9-", Namespace:"calico-system", SelfLink:"", UID:"cfb6d47c-68ad-4986-818d-92b496c6dc6d", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"bc47587f9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0", Pod:"calico-kube-controllers-bc47587f9-bvp5k", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.55.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calicf8d5fd2422", MAC:"e2:6f:1b:b9:e8:6e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:05.068675 containerd[1873]: 2025-09-09 04:57:05.062 [INFO][5214] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" Namespace="calico-system" Pod="calico-kube-controllers-bc47587f9-bvp5k" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--kube--controllers--bc47587f9--bvp5k-eth0" Sep 9 04:57:05.105090 systemd-networkd[1664]: cali9f396b56f28: Gained IPv6LL Sep 9 04:57:05.116020 containerd[1873]: time="2025-09-09T04:57:05.115959728Z" level=info msg="connecting to shim 8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0" address="unix:///run/containerd/s/14eac813932262becec842d1743aae331e25d09d52f4c91bd7381f127b7fbe4b" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:57:05.137110 systemd[1]: Started cri-containerd-8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0.scope - libcontainer container 8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0. 
Sep 9 04:57:05.170868 containerd[1873]: time="2025-09-09T04:57:05.170832050Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-bc47587f9-bvp5k,Uid:cfb6d47c-68ad-4986-818d-92b496c6dc6d,Namespace:calico-system,Attempt:0,} returns sandbox id \"8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0\"" Sep 9 04:57:05.172634 containerd[1873]: time="2025-09-09T04:57:05.172541525Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 04:57:05.665675 containerd[1873]: time="2025-09-09T04:57:05.665634550Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56c46cdc74-xwbfc,Uid:693e272e-6eb8-4ead-a6ff-fc7821a57b3b,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:57:05.665953 containerd[1873]: time="2025-09-09T04:57:05.665908814Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-8gzsh,Uid:6c950c8a-2ffb-4b00-901f-b69f23e1fce8,Namespace:calico-system,Attempt:0,}" Sep 9 04:57:05.666067 containerd[1873]: time="2025-09-09T04:57:05.665680812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-v9nht,Uid:2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19,Namespace:calico-system,Attempt:0,}" Sep 9 04:57:05.806926 systemd-networkd[1664]: calife998d69743: Link UP Sep 9 04:57:05.809575 systemd-networkd[1664]: calife998d69743: Gained carrier Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.720 [INFO][5287] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7e0b6f01e2-k8s-goldmane--7988f88666--8gzsh-eth0 goldmane-7988f88666- calico-system 6c950c8a-2ffb-4b00-901f-b69f23e1fce8 789 0 2025-09-09 04:56:40 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4452.0.0-n-7e0b6f01e2 
goldmane-7988f88666-8gzsh eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calife998d69743 [] [] }} ContainerID="fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" Namespace="calico-system" Pod="goldmane-7988f88666-8gzsh" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-goldmane--7988f88666--8gzsh-" Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.720 [INFO][5287] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" Namespace="calico-system" Pod="goldmane-7988f88666-8gzsh" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-goldmane--7988f88666--8gzsh-eth0" Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.758 [INFO][5322] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" HandleID="k8s-pod-network.fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-goldmane--7988f88666--8gzsh-eth0" Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.758 [INFO][5322] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" HandleID="k8s-pod-network.fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-goldmane--7988f88666--8gzsh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000255770), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452.0.0-n-7e0b6f01e2", "pod":"goldmane-7988f88666-8gzsh", "timestamp":"2025-09-09 04:57:05.758795267 +0000 UTC"}, Hostname:"ci-4452.0.0-n-7e0b6f01e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 
04:57:05.758 [INFO][5322] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.759 [INFO][5322] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.759 [INFO][5322] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7e0b6f01e2' Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.768 [INFO][5322] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.773 [INFO][5322] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.779 [INFO][5322] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.782 [INFO][5322] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.784 [INFO][5322] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.784 [INFO][5322] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.786 [INFO][5322] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2 Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.792 [INFO][5322] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 
handle="k8s-pod-network.fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.800 [INFO][5322] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.55.6/26] block=192.168.55.0/26 handle="k8s-pod-network.fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.800 [INFO][5322] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.6/26] handle="k8s-pod-network.fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.800 [INFO][5322] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:57:05.835008 containerd[1873]: 2025-09-09 04:57:05.800 [INFO][5322] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.6/26] IPv6=[] ContainerID="fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" HandleID="k8s-pod-network.fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-goldmane--7988f88666--8gzsh-eth0" Sep 9 04:57:05.835408 containerd[1873]: 2025-09-09 04:57:05.803 [INFO][5287] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" Namespace="calico-system" Pod="goldmane-7988f88666-8gzsh" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-goldmane--7988f88666--8gzsh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-goldmane--7988f88666--8gzsh-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"6c950c8a-2ffb-4b00-901f-b69f23e1fce8", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 40, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"", Pod:"goldmane-7988f88666-8gzsh", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.55.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calife998d69743", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:05.835408 containerd[1873]: 2025-09-09 04:57:05.803 [INFO][5287] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.6/32] ContainerID="fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" Namespace="calico-system" Pod="goldmane-7988f88666-8gzsh" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-goldmane--7988f88666--8gzsh-eth0" Sep 9 04:57:05.835408 containerd[1873]: 2025-09-09 04:57:05.803 [INFO][5287] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calife998d69743 ContainerID="fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" Namespace="calico-system" Pod="goldmane-7988f88666-8gzsh" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-goldmane--7988f88666--8gzsh-eth0" Sep 9 04:57:05.835408 containerd[1873]: 2025-09-09 04:57:05.811 [INFO][5287] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" Namespace="calico-system" 
Pod="goldmane-7988f88666-8gzsh" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-goldmane--7988f88666--8gzsh-eth0" Sep 9 04:57:05.835408 containerd[1873]: 2025-09-09 04:57:05.812 [INFO][5287] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" Namespace="calico-system" Pod="goldmane-7988f88666-8gzsh" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-goldmane--7988f88666--8gzsh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-goldmane--7988f88666--8gzsh-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"6c950c8a-2ffb-4b00-901f-b69f23e1fce8", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2", Pod:"goldmane-7988f88666-8gzsh", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.55.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calife998d69743", MAC:"76:cf:27:2d:fa:ce", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), 
QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:05.835408 containerd[1873]: 2025-09-09 04:57:05.832 [INFO][5287] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" Namespace="calico-system" Pod="goldmane-7988f88666-8gzsh" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-goldmane--7988f88666--8gzsh-eth0" Sep 9 04:57:05.895221 containerd[1873]: time="2025-09-09T04:57:05.895177808Z" level=info msg="connecting to shim fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2" address="unix:///run/containerd/s/43577247195c95d4206aa902ab6c319ea73178496f1de8161e165088eab62022" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:57:05.919563 systemd-networkd[1664]: calia75a141d1bb: Link UP Sep 9 04:57:05.921444 systemd-networkd[1664]: calia75a141d1bb: Gained carrier Sep 9 04:57:05.940190 systemd[1]: Started cri-containerd-fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2.scope - libcontainer container fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2. 
Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.741 [INFO][5297] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--xwbfc-eth0 calico-apiserver-56c46cdc74- calico-apiserver 693e272e-6eb8-4ead-a6ff-fc7821a57b3b 792 0 2025-09-09 04:56:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:56c46cdc74 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4452.0.0-n-7e0b6f01e2 calico-apiserver-56c46cdc74-xwbfc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia75a141d1bb [] [] }} ContainerID="e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-xwbfc" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--xwbfc-" Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.741 [INFO][5297] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-xwbfc" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--xwbfc-eth0" Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.790 [INFO][5331] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" HandleID="k8s-pod-network.e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--xwbfc-eth0" Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.790 [INFO][5331] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" HandleID="k8s-pod-network.e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--xwbfc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb020), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4452.0.0-n-7e0b6f01e2", "pod":"calico-apiserver-56c46cdc74-xwbfc", "timestamp":"2025-09-09 04:57:05.790510208 +0000 UTC"}, Hostname:"ci-4452.0.0-n-7e0b6f01e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.790 [INFO][5331] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.801 [INFO][5331] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.801 [INFO][5331] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7e0b6f01e2' Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.869 [INFO][5331] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.874 [INFO][5331] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.878 [INFO][5331] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.881 [INFO][5331] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.889 [INFO][5331] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.889 [INFO][5331] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.892 [INFO][5331] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6 Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.896 [INFO][5331] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 handle="k8s-pod-network.e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.909 [INFO][5331] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.55.7/26] block=192.168.55.0/26 handle="k8s-pod-network.e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.909 [INFO][5331] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.7/26] handle="k8s-pod-network.e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.909 [INFO][5331] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:57:05.945305 containerd[1873]: 2025-09-09 04:57:05.909 [INFO][5331] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.7/26] IPv6=[] ContainerID="e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" HandleID="k8s-pod-network.e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--xwbfc-eth0" Sep 9 04:57:05.946361 containerd[1873]: 2025-09-09 04:57:05.913 [INFO][5297] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-xwbfc" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--xwbfc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--xwbfc-eth0", GenerateName:"calico-apiserver-56c46cdc74-", Namespace:"calico-apiserver", SelfLink:"", UID:"693e272e-6eb8-4ead-a6ff-fc7821a57b3b", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"56c46cdc74", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"", Pod:"calico-apiserver-56c46cdc74-xwbfc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.55.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia75a141d1bb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:05.946361 containerd[1873]: 2025-09-09 04:57:05.913 [INFO][5297] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.7/32] ContainerID="e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-xwbfc" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--xwbfc-eth0" Sep 9 04:57:05.946361 containerd[1873]: 2025-09-09 04:57:05.913 [INFO][5297] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia75a141d1bb ContainerID="e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-xwbfc" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--xwbfc-eth0" Sep 9 04:57:05.946361 containerd[1873]: 2025-09-09 04:57:05.922 [INFO][5297] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-xwbfc" 
WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--xwbfc-eth0" Sep 9 04:57:05.946361 containerd[1873]: 2025-09-09 04:57:05.925 [INFO][5297] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-xwbfc" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--xwbfc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--xwbfc-eth0", GenerateName:"calico-apiserver-56c46cdc74-", Namespace:"calico-apiserver", SelfLink:"", UID:"693e272e-6eb8-4ead-a6ff-fc7821a57b3b", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56c46cdc74", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6", Pod:"calico-apiserver-56c46cdc74-xwbfc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.55.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia75a141d1bb", MAC:"16:97:7a:7c:88:25", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:05.946361 containerd[1873]: 2025-09-09 04:57:05.940 [INFO][5297] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" Namespace="calico-apiserver" Pod="calico-apiserver-56c46cdc74-xwbfc" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-calico--apiserver--56c46cdc74--xwbfc-eth0" Sep 9 04:57:05.990832 containerd[1873]: time="2025-09-09T04:57:05.990795276Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-8gzsh,Uid:6c950c8a-2ffb-4b00-901f-b69f23e1fce8,Namespace:calico-system,Attempt:0,} returns sandbox id \"fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2\"" Sep 9 04:57:06.004903 containerd[1873]: time="2025-09-09T04:57:06.004827394Z" level=info msg="connecting to shim e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6" address="unix:///run/containerd/s/654674c168836ae50d3d4a2f4faa3bf73220f9a6f1ae641063e9ed947f65bc85" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:57:06.018055 systemd-networkd[1664]: cali66a7b6d3f56: Link UP Sep 9 04:57:06.019258 systemd-networkd[1664]: cali66a7b6d3f56: Gained carrier Sep 9 04:57:06.036117 systemd[1]: Started cri-containerd-e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6.scope - libcontainer container e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6. 
Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:05.755 [INFO][5308] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7e0b6f01e2-k8s-csi--node--driver--v9nht-eth0 csi-node-driver- calico-system 2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19 682 0 2025-09-09 04:56:40 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4452.0.0-n-7e0b6f01e2 csi-node-driver-v9nht eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali66a7b6d3f56 [] [] }} ContainerID="1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" Namespace="calico-system" Pod="csi-node-driver-v9nht" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-csi--node--driver--v9nht-" Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:05.755 [INFO][5308] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" Namespace="calico-system" Pod="csi-node-driver-v9nht" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-csi--node--driver--v9nht-eth0" Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:05.792 [INFO][5337] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" HandleID="k8s-pod-network.1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-csi--node--driver--v9nht-eth0" Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:05.792 [INFO][5337] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" 
HandleID="k8s-pod-network.1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-csi--node--driver--v9nht-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452.0.0-n-7e0b6f01e2", "pod":"csi-node-driver-v9nht", "timestamp":"2025-09-09 04:57:05.792445214 +0000 UTC"}, Hostname:"ci-4452.0.0-n-7e0b6f01e2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:05.792 [INFO][5337] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:05.910 [INFO][5337] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:05.911 [INFO][5337] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7e0b6f01e2' Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:05.970 [INFO][5337] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:05.978 [INFO][5337] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:05.982 [INFO][5337] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:05.984 [INFO][5337] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:05.986 [INFO][5337] ipam/ipam.go 235: Affinity is confirmed and block has been loaded 
cidr=192.168.55.0/26 host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:05.987 [INFO][5337] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:05.988 [INFO][5337] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0 Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:05.996 [INFO][5337] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 handle="k8s-pod-network.1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:06.006 [INFO][5337] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.55.8/26] block=192.168.55.0/26 handle="k8s-pod-network.1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:06.006 [INFO][5337] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.8/26] handle="k8s-pod-network.1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" host="ci-4452.0.0-n-7e0b6f01e2" Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:06.006 [INFO][5337] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 04:57:06.049387 containerd[1873]: 2025-09-09 04:57:06.006 [INFO][5337] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.8/26] IPv6=[] ContainerID="1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" HandleID="k8s-pod-network.1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" Workload="ci--4452.0.0--n--7e0b6f01e2-k8s-csi--node--driver--v9nht-eth0" Sep 9 04:57:06.049772 containerd[1873]: 2025-09-09 04:57:06.009 [INFO][5308] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" Namespace="calico-system" Pod="csi-node-driver-v9nht" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-csi--node--driver--v9nht-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-csi--node--driver--v9nht-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19", ResourceVersion:"682", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"", Pod:"csi-node-driver-v9nht", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.55.8/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali66a7b6d3f56", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:06.049772 containerd[1873]: 2025-09-09 04:57:06.011 [INFO][5308] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.8/32] ContainerID="1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" Namespace="calico-system" Pod="csi-node-driver-v9nht" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-csi--node--driver--v9nht-eth0" Sep 9 04:57:06.049772 containerd[1873]: 2025-09-09 04:57:06.011 [INFO][5308] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali66a7b6d3f56 ContainerID="1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" Namespace="calico-system" Pod="csi-node-driver-v9nht" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-csi--node--driver--v9nht-eth0" Sep 9 04:57:06.049772 containerd[1873]: 2025-09-09 04:57:06.019 [INFO][5308] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" Namespace="calico-system" Pod="csi-node-driver-v9nht" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-csi--node--driver--v9nht-eth0" Sep 9 04:57:06.049772 containerd[1873]: 2025-09-09 04:57:06.020 [INFO][5308] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" Namespace="calico-system" Pod="csi-node-driver-v9nht" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-csi--node--driver--v9nht-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7e0b6f01e2-k8s-csi--node--driver--v9nht-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", 
SelfLink:"", UID:"2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19", ResourceVersion:"682", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7e0b6f01e2", ContainerID:"1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0", Pod:"csi-node-driver-v9nht", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.55.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali66a7b6d3f56", MAC:"fe:48:aa:cd:c0:8c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:06.049772 containerd[1873]: 2025-09-09 04:57:06.045 [INFO][5308] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" Namespace="calico-system" Pod="csi-node-driver-v9nht" WorkloadEndpoint="ci--4452.0.0--n--7e0b6f01e2-k8s-csi--node--driver--v9nht-eth0" Sep 9 04:57:06.080420 containerd[1873]: time="2025-09-09T04:57:06.080383923Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56c46cdc74-xwbfc,Uid:693e272e-6eb8-4ead-a6ff-fc7821a57b3b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id 
\"e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6\"" Sep 9 04:57:06.083999 containerd[1873]: time="2025-09-09T04:57:06.083958640Z" level=info msg="CreateContainer within sandbox \"e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 04:57:06.101412 containerd[1873]: time="2025-09-09T04:57:06.101371695Z" level=info msg="connecting to shim 1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0" address="unix:///run/containerd/s/c0e610c9d524dde847b264a51b42c0cb08b124a82692383a114425ca079dacad" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:57:06.117110 systemd[1]: Started cri-containerd-1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0.scope - libcontainer container 1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0. Sep 9 04:57:06.134032 containerd[1873]: time="2025-09-09T04:57:06.133992095Z" level=info msg="Container ca365091706a56c4f42077cd49dbf4f47ed428fa0978f0bd00c3efc8868a2f0e: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:06.153609 containerd[1873]: time="2025-09-09T04:57:06.153577383Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-v9nht,Uid:2b9bdbbb-719b-4b79-8d23-89b6ba8a1c19,Namespace:calico-system,Attempt:0,} returns sandbox id \"1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0\"" Sep 9 04:57:06.172572 containerd[1873]: time="2025-09-09T04:57:06.172420825Z" level=info msg="CreateContainer within sandbox \"e06fb66d85998bd3981243d6b82a93e990507d68dad187cc217088987b9a5af6\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ca365091706a56c4f42077cd49dbf4f47ed428fa0978f0bd00c3efc8868a2f0e\"" Sep 9 04:57:06.173991 containerd[1873]: time="2025-09-09T04:57:06.173714149Z" level=info msg="StartContainer for \"ca365091706a56c4f42077cd49dbf4f47ed428fa0978f0bd00c3efc8868a2f0e\"" Sep 9 04:57:06.176326 containerd[1873]: 
time="2025-09-09T04:57:06.176301677Z" level=info msg="connecting to shim ca365091706a56c4f42077cd49dbf4f47ed428fa0978f0bd00c3efc8868a2f0e" address="unix:///run/containerd/s/654674c168836ae50d3d4a2f4faa3bf73220f9a6f1ae641063e9ed947f65bc85" protocol=ttrpc version=3 Sep 9 04:57:06.195108 systemd[1]: Started cri-containerd-ca365091706a56c4f42077cd49dbf4f47ed428fa0978f0bd00c3efc8868a2f0e.scope - libcontainer container ca365091706a56c4f42077cd49dbf4f47ed428fa0978f0bd00c3efc8868a2f0e. Sep 9 04:57:06.231634 containerd[1873]: time="2025-09-09T04:57:06.231592287Z" level=info msg="StartContainer for \"ca365091706a56c4f42077cd49dbf4f47ed428fa0978f0bd00c3efc8868a2f0e\" returns successfully" Sep 9 04:57:06.962524 systemd-networkd[1664]: calicf8d5fd2422: Gained IPv6LL Sep 9 04:57:07.153275 systemd-networkd[1664]: calia75a141d1bb: Gained IPv6LL Sep 9 04:57:07.276705 containerd[1873]: time="2025-09-09T04:57:07.276591330Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:07.279662 containerd[1873]: time="2025-09-09T04:57:07.279630111Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 9 04:57:07.282808 containerd[1873]: time="2025-09-09T04:57:07.282758111Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:07.288945 containerd[1873]: time="2025-09-09T04:57:07.288677891Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:07.289537 containerd[1873]: time="2025-09-09T04:57:07.289504218Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id 
\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 2.116666022s" Sep 9 04:57:07.289705 containerd[1873]: time="2025-09-09T04:57:07.289614436Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 9 04:57:07.290889 containerd[1873]: time="2025-09-09T04:57:07.290864818Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 04:57:07.298138 containerd[1873]: time="2025-09-09T04:57:07.298115383Z" level=info msg="CreateContainer within sandbox \"8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 04:57:07.330078 containerd[1873]: time="2025-09-09T04:57:07.329730251Z" level=info msg="Container e9b0e0ed909cbcc93dfc89fe33d916c7a0f0329e226f83e1960e93188fc59116: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:07.348740 containerd[1873]: time="2025-09-09T04:57:07.348612995Z" level=info msg="CreateContainer within sandbox \"8ae65df0337f842fc69f3a1b0dcb46021de90c7c86e25c71fbb0a6567e9fdfc0\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"e9b0e0ed909cbcc93dfc89fe33d916c7a0f0329e226f83e1960e93188fc59116\"" Sep 9 04:57:07.349659 containerd[1873]: time="2025-09-09T04:57:07.349507510Z" level=info msg="StartContainer for \"e9b0e0ed909cbcc93dfc89fe33d916c7a0f0329e226f83e1960e93188fc59116\"" Sep 9 04:57:07.350707 containerd[1873]: time="2025-09-09T04:57:07.350685545Z" level=info msg="connecting to shim e9b0e0ed909cbcc93dfc89fe33d916c7a0f0329e226f83e1960e93188fc59116" 
address="unix:///run/containerd/s/14eac813932262becec842d1743aae331e25d09d52f4c91bd7381f127b7fbe4b" protocol=ttrpc version=3 Sep 9 04:57:07.394851 systemd[1]: Started cri-containerd-e9b0e0ed909cbcc93dfc89fe33d916c7a0f0329e226f83e1960e93188fc59116.scope - libcontainer container e9b0e0ed909cbcc93dfc89fe33d916c7a0f0329e226f83e1960e93188fc59116. Sep 9 04:57:07.409613 systemd-networkd[1664]: calife998d69743: Gained IPv6LL Sep 9 04:57:07.443242 containerd[1873]: time="2025-09-09T04:57:07.443145127Z" level=info msg="StartContainer for \"e9b0e0ed909cbcc93dfc89fe33d916c7a0f0329e226f83e1960e93188fc59116\" returns successfully" Sep 9 04:57:07.854231 kubelet[3387]: I0909 04:57:07.854197 3387 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:57:07.857429 systemd-networkd[1664]: cali66a7b6d3f56: Gained IPv6LL Sep 9 04:57:07.871185 kubelet[3387]: I0909 04:57:07.871106 3387 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-56c46cdc74-xwbfc" podStartSLOduration=30.871045117 podStartE2EDuration="30.871045117s" podCreationTimestamp="2025-09-09 04:56:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:57:06.869289424 +0000 UTC m=+44.281429776" watchObservedRunningTime="2025-09-09 04:57:07.871045117 +0000 UTC m=+45.283185477" Sep 9 04:57:07.872268 kubelet[3387]: I0909 04:57:07.872077 3387 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-bc47587f9-bvp5k" podStartSLOduration=25.753673321 podStartE2EDuration="27.872068905s" podCreationTimestamp="2025-09-09 04:56:40 +0000 UTC" firstStartedPulling="2025-09-09 04:57:05.17223412 +0000 UTC m=+42.584374472" lastFinishedPulling="2025-09-09 04:57:07.290629696 +0000 UTC m=+44.702770056" observedRunningTime="2025-09-09 04:57:07.870215918 +0000 UTC m=+45.282356318" watchObservedRunningTime="2025-09-09 
04:57:07.872068905 +0000 UTC m=+45.284209265" Sep 9 04:57:07.888328 containerd[1873]: time="2025-09-09T04:57:07.888242137Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e9b0e0ed909cbcc93dfc89fe33d916c7a0f0329e226f83e1960e93188fc59116\" id:\"345da2757a655f29073ab8022dd47d8c211ad2123493ce6701f1711166b9288c\" pid:5614 exited_at:{seconds:1757393827 nanos:888007511}" Sep 9 04:57:10.748117 kubelet[3387]: I0909 04:57:10.747959 3387 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:57:11.330936 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3219709195.mount: Deactivated successfully. Sep 9 04:57:12.235659 containerd[1873]: time="2025-09-09T04:57:12.235610734Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:12.239282 containerd[1873]: time="2025-09-09T04:57:12.239254699Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 9 04:57:12.242442 containerd[1873]: time="2025-09-09T04:57:12.242418858Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:12.247154 containerd[1873]: time="2025-09-09T04:57:12.247106540Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:12.247524 containerd[1873]: time="2025-09-09T04:57:12.247359166Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 4.956467789s" Sep 9 04:57:12.247524 containerd[1873]: time="2025-09-09T04:57:12.247387364Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 9 04:57:12.248419 containerd[1873]: time="2025-09-09T04:57:12.248391396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 04:57:12.250037 containerd[1873]: time="2025-09-09T04:57:12.250015553Z" level=info msg="CreateContainer within sandbox \"fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 04:57:12.267143 containerd[1873]: time="2025-09-09T04:57:12.267115421Z" level=info msg="Container a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:12.283338 containerd[1873]: time="2025-09-09T04:57:12.283307564Z" level=info msg="CreateContainer within sandbox \"fb98a02b0ec3b0999a9cb6487bfa194f36df992ec263f376ac7f602e2ca9e5c2\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c\"" Sep 9 04:57:12.285103 containerd[1873]: time="2025-09-09T04:57:12.284682231Z" level=info msg="StartContainer for \"a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c\"" Sep 9 04:57:12.285609 containerd[1873]: time="2025-09-09T04:57:12.285518544Z" level=info msg="connecting to shim a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c" address="unix:///run/containerd/s/43577247195c95d4206aa902ab6c319ea73178496f1de8161e165088eab62022" protocol=ttrpc version=3 Sep 9 04:57:12.304084 systemd[1]: Started cri-containerd-a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c.scope - libcontainer container 
a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c. Sep 9 04:57:12.335522 containerd[1873]: time="2025-09-09T04:57:12.335497014Z" level=info msg="StartContainer for \"a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c\" returns successfully" Sep 9 04:57:12.884069 kubelet[3387]: I0909 04:57:12.882889 3387 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-8gzsh" podStartSLOduration=26.627376858 podStartE2EDuration="32.882876917s" podCreationTimestamp="2025-09-09 04:56:40 +0000 UTC" firstStartedPulling="2025-09-09 04:57:05.992716587 +0000 UTC m=+43.404856939" lastFinishedPulling="2025-09-09 04:57:12.248216638 +0000 UTC m=+49.660356998" observedRunningTime="2025-09-09 04:57:12.88235937 +0000 UTC m=+50.294499730" watchObservedRunningTime="2025-09-09 04:57:12.882876917 +0000 UTC m=+50.295017269" Sep 9 04:57:12.924688 containerd[1873]: time="2025-09-09T04:57:12.924649445Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c\" id:\"46f7396f2d0755c8bbcb875e4b30e4b9f3bf8b37592ae184fdb04a5c136781f9\" pid:5691 exit_status:1 exited_at:{seconds:1757393832 nanos:924179720}" Sep 9 04:57:13.580866 containerd[1873]: time="2025-09-09T04:57:13.580811762Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:13.584216 containerd[1873]: time="2025-09-09T04:57:13.584173126Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 9 04:57:13.587076 containerd[1873]: time="2025-09-09T04:57:13.586938604Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:13.594062 containerd[1873]: time="2025-09-09T04:57:13.594031575Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:13.595731 containerd[1873]: time="2025-09-09T04:57:13.595704866Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.347285439s" Sep 9 04:57:13.595764 containerd[1873]: time="2025-09-09T04:57:13.595733984Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 9 04:57:13.601126 containerd[1873]: time="2025-09-09T04:57:13.601091132Z" level=info msg="CreateContainer within sandbox \"1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 04:57:13.623965 containerd[1873]: time="2025-09-09T04:57:13.623935639Z" level=info msg="Container 6c17ec8e730f8b4c89f34894c743da5fb3b39ad6410b9c6d0451df8f1e26c6d2: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:13.653940 containerd[1873]: time="2025-09-09T04:57:13.653845599Z" level=info msg="CreateContainer within sandbox \"1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"6c17ec8e730f8b4c89f34894c743da5fb3b39ad6410b9c6d0451df8f1e26c6d2\"" Sep 9 04:57:13.657088 containerd[1873]: time="2025-09-09T04:57:13.657062659Z" level=info msg="StartContainer for \"6c17ec8e730f8b4c89f34894c743da5fb3b39ad6410b9c6d0451df8f1e26c6d2\"" Sep 9 04:57:13.659641 containerd[1873]: time="2025-09-09T04:57:13.659613589Z" level=info msg="connecting to shim 
6c17ec8e730f8b4c89f34894c743da5fb3b39ad6410b9c6d0451df8f1e26c6d2" address="unix:///run/containerd/s/c0e610c9d524dde847b264a51b42c0cb08b124a82692383a114425ca079dacad" protocol=ttrpc version=3 Sep 9 04:57:13.673137 systemd[1]: Started cri-containerd-6c17ec8e730f8b4c89f34894c743da5fb3b39ad6410b9c6d0451df8f1e26c6d2.scope - libcontainer container 6c17ec8e730f8b4c89f34894c743da5fb3b39ad6410b9c6d0451df8f1e26c6d2. Sep 9 04:57:13.703894 containerd[1873]: time="2025-09-09T04:57:13.703858347Z" level=info msg="StartContainer for \"6c17ec8e730f8b4c89f34894c743da5fb3b39ad6410b9c6d0451df8f1e26c6d2\" returns successfully" Sep 9 04:57:13.705142 containerd[1873]: time="2025-09-09T04:57:13.705119645Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 04:57:13.925460 containerd[1873]: time="2025-09-09T04:57:13.925300503Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c\" id:\"4c62225438afff61b796df65f6a00006d0eafd0a629af99d7a3529d8ded3a6c9\" pid:5752 exit_status:1 exited_at:{seconds:1757393833 nanos:924453167}" Sep 9 04:57:14.937919 containerd[1873]: time="2025-09-09T04:57:14.937866911Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:14.941440 containerd[1873]: time="2025-09-09T04:57:14.941392658Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 9 04:57:14.944696 containerd[1873]: time="2025-09-09T04:57:14.944643076Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:14.949908 containerd[1873]: time="2025-09-09T04:57:14.949849025Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:14.950381 containerd[1873]: time="2025-09-09T04:57:14.950123290Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.244977134s" Sep 9 04:57:14.950381 containerd[1873]: time="2025-09-09T04:57:14.950149976Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 9 04:57:14.952255 containerd[1873]: time="2025-09-09T04:57:14.952141241Z" level=info msg="CreateContainer within sandbox \"1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 04:57:14.969154 containerd[1873]: time="2025-09-09T04:57:14.969125027Z" level=info msg="Container 392b1b67b859bb5f58664f940338e48330f54727bcaee5c919956955f57525dd: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:14.985633 containerd[1873]: time="2025-09-09T04:57:14.985599482Z" level=info msg="CreateContainer within sandbox \"1382cb85b6096951816faa3ee70fe67e5a34ee9bfcce7f7fb9c1b61a849c88e0\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"392b1b67b859bb5f58664f940338e48330f54727bcaee5c919956955f57525dd\"" Sep 9 04:57:14.986217 containerd[1873]: time="2025-09-09T04:57:14.986187194Z" level=info msg="StartContainer for \"392b1b67b859bb5f58664f940338e48330f54727bcaee5c919956955f57525dd\"" Sep 9 04:57:14.987533 containerd[1873]: 
time="2025-09-09T04:57:14.987510688Z" level=info msg="connecting to shim 392b1b67b859bb5f58664f940338e48330f54727bcaee5c919956955f57525dd" address="unix:///run/containerd/s/c0e610c9d524dde847b264a51b42c0cb08b124a82692383a114425ca079dacad" protocol=ttrpc version=3 Sep 9 04:57:15.019104 systemd[1]: Started cri-containerd-392b1b67b859bb5f58664f940338e48330f54727bcaee5c919956955f57525dd.scope - libcontainer container 392b1b67b859bb5f58664f940338e48330f54727bcaee5c919956955f57525dd. Sep 9 04:57:15.052724 containerd[1873]: time="2025-09-09T04:57:15.052644334Z" level=info msg="StartContainer for \"392b1b67b859bb5f58664f940338e48330f54727bcaee5c919956955f57525dd\" returns successfully" Sep 9 04:57:15.774541 kubelet[3387]: I0909 04:57:15.774466 3387 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 04:57:15.778374 kubelet[3387]: I0909 04:57:15.778303 3387 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 04:57:15.894349 kubelet[3387]: I0909 04:57:15.894292 3387 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-v9nht" podStartSLOduration=27.098287092 podStartE2EDuration="35.894277074s" podCreationTimestamp="2025-09-09 04:56:40 +0000 UTC" firstStartedPulling="2025-09-09 04:57:06.15473633 +0000 UTC m=+43.566876682" lastFinishedPulling="2025-09-09 04:57:14.950726312 +0000 UTC m=+52.362866664" observedRunningTime="2025-09-09 04:57:15.893928133 +0000 UTC m=+53.306068485" watchObservedRunningTime="2025-09-09 04:57:15.894277074 +0000 UTC m=+53.306417426" Sep 9 04:57:21.186520 containerd[1873]: time="2025-09-09T04:57:21.186478983Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e9b0e0ed909cbcc93dfc89fe33d916c7a0f0329e226f83e1960e93188fc59116\" 
id:\"f60f942d7a0f2abc42850c836383ab78399019f7b79bfd26443f42acf9988199\" pid:5818 exited_at:{seconds:1757393841 nanos:186085710}"
Sep 9 04:57:21.501393 containerd[1873]: time="2025-09-09T04:57:21.501192216Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c\" id:\"58c3311e612e192251eab1e32c59c1362c6f36eb79748b2eb6422b84cd1e4607\" pid:5837 exited_at:{seconds:1757393841 nanos:500901065}"
Sep 9 04:57:21.856000 containerd[1873]: time="2025-09-09T04:57:21.855774630Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ceded8b679c281657263adcad2c52be4c1594c4a51e021df865515346721c37\" id:\"94f3b2b950cb4419ba47ed46a899c0284cf8ad981418e678cfaa2b892343a363\" pid:5859 exited_at:{seconds:1757393841 nanos:855562762}"
Sep 9 04:57:24.537690 containerd[1873]: time="2025-09-09T04:57:24.537649492Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c\" id:\"be1bb830d8b5d2f02fbbcecbb75e1f36ee3632d9734e6c5eb70703f8ec5c76d2\" pid:5886 exited_at:{seconds:1757393844 nanos:536531638}"
Sep 9 04:57:32.829232 kubelet[3387]: I0909 04:57:32.829195 3387 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 04:57:51.219876 containerd[1873]: time="2025-09-09T04:57:51.219695527Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e9b0e0ed909cbcc93dfc89fe33d916c7a0f0329e226f83e1960e93188fc59116\" id:\"4960e76aa6bdf1e179de67e475fbc41acf9ef64cbcf4d2e32770d4d95bb82076\" pid:5926 exited_at:{seconds:1757393871 nanos:219415606}"
Sep 9 04:57:51.862750 containerd[1873]: time="2025-09-09T04:57:51.862705464Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ceded8b679c281657263adcad2c52be4c1594c4a51e021df865515346721c37\" id:\"852fb4fbd35180f6c404063adaa0982f85180670a3ea27191449018d827ad101\" pid:5948 exited_at:{seconds:1757393871 nanos:862395520}"
Sep 9 04:57:54.592584 containerd[1873]: time="2025-09-09T04:57:54.592536376Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c\" id:\"316b456ad5b7da2b52afc4c622d48e88f9da68e36ed5cbb1229941bc471654a1\" pid:5973 exited_at:{seconds:1757393874 nanos:592224968}"
Sep 9 04:58:06.532764 containerd[1873]: time="2025-09-09T04:58:06.532721729Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e9b0e0ed909cbcc93dfc89fe33d916c7a0f0329e226f83e1960e93188fc59116\" id:\"baeb3f592554faf5bb77ae062f220cf8805c65188608aa2d923f1785f98732f7\" pid:5999 exited_at:{seconds:1757393886 nanos:532397004}"
Sep 9 04:58:09.235353 systemd[1]: Started sshd@7-10.200.20.14:22-10.200.16.10:53570.service - OpenSSH per-connection server daemon (10.200.16.10:53570).
Sep 9 04:58:09.661835 sshd[6015]: Accepted publickey for core from 10.200.16.10 port 53570 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:58:09.663672 sshd-session[6015]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:58:09.667723 systemd-logind[1850]: New session 10 of user core.
Sep 9 04:58:09.676077 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 9 04:58:10.012003 sshd[6018]: Connection closed by 10.200.16.10 port 53570
Sep 9 04:58:10.012504 sshd-session[6015]: pam_unix(sshd:session): session closed for user core
Sep 9 04:58:10.015192 systemd-logind[1850]: Session 10 logged out. Waiting for processes to exit.
Sep 9 04:58:10.016645 systemd[1]: sshd@7-10.200.20.14:22-10.200.16.10:53570.service: Deactivated successfully.
Sep 9 04:58:10.018489 systemd[1]: session-10.scope: Deactivated successfully.
Sep 9 04:58:10.020375 systemd-logind[1850]: Removed session 10.
Sep 9 04:58:15.090145 systemd[1]: Started sshd@8-10.200.20.14:22-10.200.16.10:55850.service - OpenSSH per-connection server daemon (10.200.16.10:55850).
Sep 9 04:58:15.506787 sshd[6031]: Accepted publickey for core from 10.200.16.10 port 55850 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:58:15.507903 sshd-session[6031]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:58:15.511901 systemd-logind[1850]: New session 11 of user core.
Sep 9 04:58:15.518099 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 9 04:58:15.856084 sshd[6034]: Connection closed by 10.200.16.10 port 55850
Sep 9 04:58:15.856051 sshd-session[6031]: pam_unix(sshd:session): session closed for user core
Sep 9 04:58:15.858957 systemd-logind[1850]: Session 11 logged out. Waiting for processes to exit.
Sep 9 04:58:15.859224 systemd[1]: sshd@8-10.200.20.14:22-10.200.16.10:55850.service: Deactivated successfully.
Sep 9 04:58:15.860851 systemd[1]: session-11.scope: Deactivated successfully.
Sep 9 04:58:15.863292 systemd-logind[1850]: Removed session 11.
Sep 9 04:58:20.942179 systemd[1]: Started sshd@9-10.200.20.14:22-10.200.16.10:38334.service - OpenSSH per-connection server daemon (10.200.16.10:38334).
Sep 9 04:58:21.186785 containerd[1873]: time="2025-09-09T04:58:21.186750916Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e9b0e0ed909cbcc93dfc89fe33d916c7a0f0329e226f83e1960e93188fc59116\" id:\"6c50fd0e57da12d7e4004d92516f3cb309e7b82f8281e39b1fd5e285661f2102\" pid:6067 exited_at:{seconds:1757393901 nanos:186522866}"
Sep 9 04:58:21.360051 sshd[6053]: Accepted publickey for core from 10.200.16.10 port 38334 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:58:21.360939 sshd-session[6053]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:58:21.364316 systemd-logind[1850]: New session 12 of user core.
Sep 9 04:58:21.371118 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 9 04:58:21.506111 containerd[1873]: time="2025-09-09T04:58:21.506076010Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c\" id:\"4df659149272172e9cf595342bedcdb81c7721be428350066622d33cee7dd2c0\" pid:6090 exited_at:{seconds:1757393901 nanos:505648187}"
Sep 9 04:58:21.706287 sshd[6077]: Connection closed by 10.200.16.10 port 38334
Sep 9 04:58:21.705308 sshd-session[6053]: pam_unix(sshd:session): session closed for user core
Sep 9 04:58:21.708769 systemd[1]: sshd@9-10.200.20.14:22-10.200.16.10:38334.service: Deactivated successfully.
Sep 9 04:58:21.710859 systemd[1]: session-12.scope: Deactivated successfully.
Sep 9 04:58:21.711630 systemd-logind[1850]: Session 12 logged out. Waiting for processes to exit.
Sep 9 04:58:21.712880 systemd-logind[1850]: Removed session 12.
Sep 9 04:58:21.782490 systemd[1]: Started sshd@10-10.200.20.14:22-10.200.16.10:38340.service - OpenSSH per-connection server daemon (10.200.16.10:38340).
Sep 9 04:58:21.855783 containerd[1873]: time="2025-09-09T04:58:21.855720798Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ceded8b679c281657263adcad2c52be4c1594c4a51e021df865515346721c37\" id:\"2824db5a0b8eac78dc410e6b7c12ce0f2af0449b21b1e37f22d3cd3b7c4cfcb6\" pid:6127 exited_at:{seconds:1757393901 nanos:855512946}"
Sep 9 04:58:22.201286 sshd[6110]: Accepted publickey for core from 10.200.16.10 port 38340 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:58:22.202444 sshd-session[6110]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:58:22.206166 systemd-logind[1850]: New session 13 of user core.
Sep 9 04:58:22.214095 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 9 04:58:22.575535 sshd[6138]: Connection closed by 10.200.16.10 port 38340
Sep 9 04:58:22.576182 sshd-session[6110]: pam_unix(sshd:session): session closed for user core
Sep 9 04:58:22.579576 systemd[1]: sshd@10-10.200.20.14:22-10.200.16.10:38340.service: Deactivated successfully.
Sep 9 04:58:22.581338 systemd[1]: session-13.scope: Deactivated successfully.
Sep 9 04:58:22.582464 systemd-logind[1850]: Session 13 logged out. Waiting for processes to exit.
Sep 9 04:58:22.584157 systemd-logind[1850]: Removed session 13.
Sep 9 04:58:22.658162 systemd[1]: Started sshd@11-10.200.20.14:22-10.200.16.10:38346.service - OpenSSH per-connection server daemon (10.200.16.10:38346).
Sep 9 04:58:23.074691 sshd[6147]: Accepted publickey for core from 10.200.16.10 port 38346 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:58:23.075811 sshd-session[6147]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:58:23.079525 systemd-logind[1850]: New session 14 of user core.
Sep 9 04:58:23.084111 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 9 04:58:23.435172 sshd[6152]: Connection closed by 10.200.16.10 port 38346
Sep 9 04:58:23.435720 sshd-session[6147]: pam_unix(sshd:session): session closed for user core
Sep 9 04:58:23.438901 systemd[1]: sshd@11-10.200.20.14:22-10.200.16.10:38346.service: Deactivated successfully.
Sep 9 04:58:23.440831 systemd[1]: session-14.scope: Deactivated successfully.
Sep 9 04:58:23.441717 systemd-logind[1850]: Session 14 logged out. Waiting for processes to exit.
Sep 9 04:58:23.443291 systemd-logind[1850]: Removed session 14.
Sep 9 04:58:24.551993 containerd[1873]: time="2025-09-09T04:58:24.551946328Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c\" id:\"b804241dd5ba49fc629805412176f96a1e2045bf9af13c14bb7aaec095f75bbf\" pid:6175 exited_at:{seconds:1757393904 nanos:551653993}"
Sep 9 04:58:28.511120 systemd[1]: Started sshd@12-10.200.20.14:22-10.200.16.10:38354.service - OpenSSH per-connection server daemon (10.200.16.10:38354).
Sep 9 04:58:28.918036 sshd[6191]: Accepted publickey for core from 10.200.16.10 port 38354 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:58:28.919121 sshd-session[6191]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:58:28.922528 systemd-logind[1850]: New session 15 of user core.
Sep 9 04:58:28.935098 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 9 04:58:29.272872 sshd[6194]: Connection closed by 10.200.16.10 port 38354
Sep 9 04:58:29.273434 sshd-session[6191]: pam_unix(sshd:session): session closed for user core
Sep 9 04:58:29.276525 systemd[1]: sshd@12-10.200.20.14:22-10.200.16.10:38354.service: Deactivated successfully.
Sep 9 04:58:29.278315 systemd[1]: session-15.scope: Deactivated successfully.
Sep 9 04:58:29.279155 systemd-logind[1850]: Session 15 logged out. Waiting for processes to exit.
Sep 9 04:58:29.280701 systemd-logind[1850]: Removed session 15.
Sep 9 04:58:34.352838 systemd[1]: Started sshd@13-10.200.20.14:22-10.200.16.10:40944.service - OpenSSH per-connection server daemon (10.200.16.10:40944).
Sep 9 04:58:34.766502 sshd[6221]: Accepted publickey for core from 10.200.16.10 port 40944 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:58:34.767581 sshd-session[6221]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:58:34.771056 systemd-logind[1850]: New session 16 of user core.
Sep 9 04:58:34.781198 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 9 04:58:35.123518 sshd[6224]: Connection closed by 10.200.16.10 port 40944
Sep 9 04:58:35.124059 sshd-session[6221]: pam_unix(sshd:session): session closed for user core
Sep 9 04:58:35.127721 systemd[1]: sshd@13-10.200.20.14:22-10.200.16.10:40944.service: Deactivated successfully.
Sep 9 04:58:35.129614 systemd[1]: session-16.scope: Deactivated successfully.
Sep 9 04:58:35.130313 systemd-logind[1850]: Session 16 logged out. Waiting for processes to exit.
Sep 9 04:58:35.131363 systemd-logind[1850]: Removed session 16.
Sep 9 04:58:40.209889 systemd[1]: Started sshd@14-10.200.20.14:22-10.200.16.10:51822.service - OpenSSH per-connection server daemon (10.200.16.10:51822).
Sep 9 04:58:40.659190 sshd[6243]: Accepted publickey for core from 10.200.16.10 port 51822 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:58:40.660168 sshd-session[6243]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:58:40.663793 systemd-logind[1850]: New session 17 of user core.
Sep 9 04:58:40.669090 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 9 04:58:41.032232 sshd[6246]: Connection closed by 10.200.16.10 port 51822
Sep 9 04:58:41.032806 sshd-session[6243]: pam_unix(sshd:session): session closed for user core
Sep 9 04:58:41.036682 systemd[1]: sshd@14-10.200.20.14:22-10.200.16.10:51822.service: Deactivated successfully.
Sep 9 04:58:41.038827 systemd[1]: session-17.scope: Deactivated successfully.
Sep 9 04:58:41.039638 systemd-logind[1850]: Session 17 logged out. Waiting for processes to exit.
Sep 9 04:58:41.041426 systemd-logind[1850]: Removed session 17.
Sep 9 04:58:41.120643 systemd[1]: Started sshd@15-10.200.20.14:22-10.200.16.10:51828.service - OpenSSH per-connection server daemon (10.200.16.10:51828).
Sep 9 04:58:41.579074 sshd[6258]: Accepted publickey for core from 10.200.16.10 port 51828 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:58:41.580618 sshd-session[6258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:58:41.590043 systemd-logind[1850]: New session 18 of user core.
Sep 9 04:58:41.594154 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 9 04:58:42.111384 sshd[6261]: Connection closed by 10.200.16.10 port 51828
Sep 9 04:58:42.114551 sshd-session[6258]: pam_unix(sshd:session): session closed for user core
Sep 9 04:58:42.119440 systemd[1]: sshd@15-10.200.20.14:22-10.200.16.10:51828.service: Deactivated successfully.
Sep 9 04:58:42.124397 systemd[1]: session-18.scope: Deactivated successfully.
Sep 9 04:58:42.127231 systemd-logind[1850]: Session 18 logged out. Waiting for processes to exit.
Sep 9 04:58:42.129152 systemd-logind[1850]: Removed session 18.
Sep 9 04:58:42.199379 systemd[1]: Started sshd@16-10.200.20.14:22-10.200.16.10:51830.service - OpenSSH per-connection server daemon (10.200.16.10:51830).
Sep 9 04:58:42.661653 sshd[6271]: Accepted publickey for core from 10.200.16.10 port 51830 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:58:42.663446 sshd-session[6271]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:58:42.668690 systemd-logind[1850]: New session 19 of user core.
Sep 9 04:58:42.677162 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 9 04:58:44.392726 sshd[6274]: Connection closed by 10.200.16.10 port 51830
Sep 9 04:58:44.391480 sshd-session[6271]: pam_unix(sshd:session): session closed for user core
Sep 9 04:58:44.395780 systemd-logind[1850]: Session 19 logged out. Waiting for processes to exit.
Sep 9 04:58:44.398390 systemd[1]: sshd@16-10.200.20.14:22-10.200.16.10:51830.service: Deactivated successfully.
Sep 9 04:58:44.400560 systemd[1]: session-19.scope: Deactivated successfully.
Sep 9 04:58:44.401338 systemd[1]: session-19.scope: Consumed 351ms CPU time, 76M memory peak.
Sep 9 04:58:44.406200 systemd-logind[1850]: Removed session 19.
Sep 9 04:58:44.468240 systemd[1]: Started sshd@17-10.200.20.14:22-10.200.16.10:51836.service - OpenSSH per-connection server daemon (10.200.16.10:51836).
Sep 9 04:58:44.887040 sshd[6291]: Accepted publickey for core from 10.200.16.10 port 51836 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:58:44.888570 sshd-session[6291]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:58:44.892458 systemd-logind[1850]: New session 20 of user core.
Sep 9 04:58:44.901098 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 9 04:58:45.340870 sshd[6294]: Connection closed by 10.200.16.10 port 51836
Sep 9 04:58:45.341184 sshd-session[6291]: pam_unix(sshd:session): session closed for user core
Sep 9 04:58:45.346053 systemd[1]: sshd@17-10.200.20.14:22-10.200.16.10:51836.service: Deactivated successfully.
Sep 9 04:58:45.347909 systemd[1]: session-20.scope: Deactivated successfully.
Sep 9 04:58:45.348868 systemd-logind[1850]: Session 20 logged out. Waiting for processes to exit.
Sep 9 04:58:45.351456 systemd-logind[1850]: Removed session 20.
Sep 9 04:58:45.417408 systemd[1]: Started sshd@18-10.200.20.14:22-10.200.16.10:51838.service - OpenSSH per-connection server daemon (10.200.16.10:51838).
Sep 9 04:58:45.829892 sshd[6304]: Accepted publickey for core from 10.200.16.10 port 51838 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:58:45.831198 sshd-session[6304]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:58:45.834794 systemd-logind[1850]: New session 21 of user core.
Sep 9 04:58:45.847217 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 9 04:58:46.187188 sshd[6307]: Connection closed by 10.200.16.10 port 51838
Sep 9 04:58:46.187095 sshd-session[6304]: pam_unix(sshd:session): session closed for user core
Sep 9 04:58:46.190730 systemd[1]: sshd@18-10.200.20.14:22-10.200.16.10:51838.service: Deactivated successfully.
Sep 9 04:58:46.192822 systemd[1]: session-21.scope: Deactivated successfully.
Sep 9 04:58:46.193804 systemd-logind[1850]: Session 21 logged out. Waiting for processes to exit.
Sep 9 04:58:46.197052 systemd-logind[1850]: Removed session 21.
Sep 9 04:58:51.187878 containerd[1873]: time="2025-09-09T04:58:51.187816759Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e9b0e0ed909cbcc93dfc89fe33d916c7a0f0329e226f83e1960e93188fc59116\" id:\"f48e619b8f4af96c83515765570d2160ad057697b20a8bd0d34cf4121e031764\" pid:6333 exited_at:{seconds:1757393931 nanos:187615475}"
Sep 9 04:58:51.261328 systemd[1]: Started sshd@19-10.200.20.14:22-10.200.16.10:53852.service - OpenSSH per-connection server daemon (10.200.16.10:53852).
Sep 9 04:58:51.675068 sshd[6344]: Accepted publickey for core from 10.200.16.10 port 53852 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:58:51.677532 sshd-session[6344]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:58:51.684147 systemd-logind[1850]: New session 22 of user core.
Sep 9 04:58:51.690089 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 9 04:58:51.871214 containerd[1873]: time="2025-09-09T04:58:51.871170826Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ceded8b679c281657263adcad2c52be4c1594c4a51e021df865515346721c37\" id:\"b39401384eae8d1df80308f91cb3b32f6a78731b24490818b1b40d8af243217c\" pid:6362 exited_at:{seconds:1757393931 nanos:870836854}"
Sep 9 04:58:52.040804 sshd[6347]: Connection closed by 10.200.16.10 port 53852
Sep 9 04:58:52.042129 sshd-session[6344]: pam_unix(sshd:session): session closed for user core
Sep 9 04:58:52.047799 systemd[1]: sshd@19-10.200.20.14:22-10.200.16.10:53852.service: Deactivated successfully.
Sep 9 04:58:52.051935 systemd[1]: session-22.scope: Deactivated successfully.
Sep 9 04:58:52.053659 systemd-logind[1850]: Session 22 logged out. Waiting for processes to exit.
Sep 9 04:58:52.055372 systemd-logind[1850]: Removed session 22.
Sep 9 04:58:54.544951 containerd[1873]: time="2025-09-09T04:58:54.544890704Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c\" id:\"7db376ea7460efe617acdf321a4ecee9bae59b97aae6fc73fe93727360a17d03\" pid:6395 exited_at:{seconds:1757393934 nanos:544654078}"
Sep 9 04:58:57.128160 systemd[1]: Started sshd@20-10.200.20.14:22-10.200.16.10:53858.service - OpenSSH per-connection server daemon (10.200.16.10:53858).
Sep 9 04:58:57.545237 sshd[6405]: Accepted publickey for core from 10.200.16.10 port 53858 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:58:57.546217 sshd-session[6405]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:58:57.549812 systemd-logind[1850]: New session 23 of user core.
Sep 9 04:58:57.560086 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 9 04:58:57.896756 sshd[6408]: Connection closed by 10.200.16.10 port 53858
Sep 9 04:58:57.897369 sshd-session[6405]: pam_unix(sshd:session): session closed for user core
Sep 9 04:58:57.900730 systemd-logind[1850]: Session 23 logged out. Waiting for processes to exit.
Sep 9 04:58:57.901985 systemd[1]: sshd@20-10.200.20.14:22-10.200.16.10:53858.service: Deactivated successfully.
Sep 9 04:58:57.904572 systemd[1]: session-23.scope: Deactivated successfully.
Sep 9 04:58:57.907561 systemd-logind[1850]: Removed session 23.
Sep 9 04:59:02.973359 systemd[1]: Started sshd@21-10.200.20.14:22-10.200.16.10:54402.service - OpenSSH per-connection server daemon (10.200.16.10:54402).
Sep 9 04:59:03.388598 sshd[6422]: Accepted publickey for core from 10.200.16.10 port 54402 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:59:03.389632 sshd-session[6422]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:59:03.393391 systemd-logind[1850]: New session 24 of user core.
Sep 9 04:59:03.401085 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 9 04:59:03.744216 sshd[6425]: Connection closed by 10.200.16.10 port 54402
Sep 9 04:59:03.744772 sshd-session[6422]: pam_unix(sshd:session): session closed for user core
Sep 9 04:59:03.747534 systemd-logind[1850]: Session 24 logged out. Waiting for processes to exit.
Sep 9 04:59:03.748484 systemd[1]: sshd@21-10.200.20.14:22-10.200.16.10:54402.service: Deactivated successfully.
Sep 9 04:59:03.750329 systemd[1]: session-24.scope: Deactivated successfully.
Sep 9 04:59:03.752904 systemd-logind[1850]: Removed session 24.
Sep 9 04:59:06.530741 containerd[1873]: time="2025-09-09T04:59:06.530698956Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e9b0e0ed909cbcc93dfc89fe33d916c7a0f0329e226f83e1960e93188fc59116\" id:\"9adc1567d97d752ffd8bd0f029ba8e675a840b3cdb0bcdd0224f998631c7d830\" pid:6448 exited_at:{seconds:1757393946 nanos:530438547}"
Sep 9 04:59:08.824445 systemd[1]: Started sshd@22-10.200.20.14:22-10.200.16.10:54414.service - OpenSSH per-connection server daemon (10.200.16.10:54414).
Sep 9 04:59:09.233159 sshd[6458]: Accepted publickey for core from 10.200.16.10 port 54414 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:59:09.234233 sshd-session[6458]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:59:09.237840 systemd-logind[1850]: New session 25 of user core.
Sep 9 04:59:09.246253 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 9 04:59:09.582248 sshd[6461]: Connection closed by 10.200.16.10 port 54414
Sep 9 04:59:09.582092 sshd-session[6458]: pam_unix(sshd:session): session closed for user core
Sep 9 04:59:09.585753 systemd[1]: sshd@22-10.200.20.14:22-10.200.16.10:54414.service: Deactivated successfully.
Sep 9 04:59:09.587572 systemd[1]: session-25.scope: Deactivated successfully.
Sep 9 04:59:09.588476 systemd-logind[1850]: Session 25 logged out. Waiting for processes to exit.
Sep 9 04:59:09.590755 systemd-logind[1850]: Removed session 25.
Sep 9 04:59:14.666615 systemd[1]: Started sshd@23-10.200.20.14:22-10.200.16.10:41550.service - OpenSSH per-connection server daemon (10.200.16.10:41550).
Sep 9 04:59:15.122702 sshd[6473]: Accepted publickey for core from 10.200.16.10 port 41550 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:59:15.124189 sshd-session[6473]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:59:15.128303 systemd-logind[1850]: New session 26 of user core.
Sep 9 04:59:15.137079 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 9 04:59:15.501583 sshd[6476]: Connection closed by 10.200.16.10 port 41550
Sep 9 04:59:15.501451 sshd-session[6473]: pam_unix(sshd:session): session closed for user core
Sep 9 04:59:15.505969 systemd-logind[1850]: Session 26 logged out. Waiting for processes to exit.
Sep 9 04:59:15.507606 systemd[1]: sshd@23-10.200.20.14:22-10.200.16.10:41550.service: Deactivated successfully.
Sep 9 04:59:15.510292 systemd[1]: session-26.scope: Deactivated successfully.
Sep 9 04:59:15.512467 systemd-logind[1850]: Removed session 26.
Sep 9 04:59:20.579181 systemd[1]: Started sshd@24-10.200.20.14:22-10.200.16.10:53394.service - OpenSSH per-connection server daemon (10.200.16.10:53394).
Sep 9 04:59:20.988066 sshd[6487]: Accepted publickey for core from 10.200.16.10 port 53394 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:59:20.989143 sshd-session[6487]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:59:20.996962 systemd-logind[1850]: New session 27 of user core.
Sep 9 04:59:21.001101 systemd[1]: Started session-27.scope - Session 27 of User core.
Sep 9 04:59:21.188537 containerd[1873]: time="2025-09-09T04:59:21.188468960Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e9b0e0ed909cbcc93dfc89fe33d916c7a0f0329e226f83e1960e93188fc59116\" id:\"b032fa1a556046dfa985bd013a81f45d27be264766585f243ccc4ee954d4131d\" pid:6504 exited_at:{seconds:1757393961 nanos:188152132}"
Sep 9 04:59:21.340708 sshd[6490]: Connection closed by 10.200.16.10 port 53394
Sep 9 04:59:21.341350 sshd-session[6487]: pam_unix(sshd:session): session closed for user core
Sep 9 04:59:21.344434 systemd[1]: sshd@24-10.200.20.14:22-10.200.16.10:53394.service: Deactivated successfully.
Sep 9 04:59:21.346614 systemd[1]: session-27.scope: Deactivated successfully.
Sep 9 04:59:21.347578 systemd-logind[1850]: Session 27 logged out. Waiting for processes to exit.
Sep 9 04:59:21.349857 systemd-logind[1850]: Removed session 27.
Sep 9 04:59:21.505011 containerd[1873]: time="2025-09-09T04:59:21.504960793Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a133b18d39bcade2fa25f3abab932026fd495efccf1b04223d50ce2818e7685c\" id:\"2674740b75c24770244c35fd8f2f17700be1feb8d2ceb7b80122e2ea6423cfaa\" pid:6535 exited_at:{seconds:1757393961 nanos:504750638}"
Sep 9 04:59:21.844066 containerd[1873]: time="2025-09-09T04:59:21.844023075Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ceded8b679c281657263adcad2c52be4c1594c4a51e021df865515346721c37\" id:\"9aab8e6eae4e8d51f39267cb4d9db2ca1586619e3e0037093a06323aecfd61a2\" pid:6557 exited_at:{seconds:1757393961 nanos:843777523}"