Sep 9 04:54:48.090098 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
Sep 9 04:54:48.090116 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Tue Sep 9 03:38:34 -00 2025
Sep 9 04:54:48.090122 kernel: KASLR enabled
Sep 9 04:54:48.090126 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Sep 9 04:54:48.090131 kernel: printk: legacy bootconsole [pl11] enabled
Sep 9 04:54:48.090134 kernel: efi: EFI v2.7 by EDK II
Sep 9 04:54:48.090139 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3f20e698 RNG=0x3fd5f998 MEMRESERVE=0x3e477598
Sep 9 04:54:48.090143 kernel: random: crng init done
Sep 9 04:54:48.090148 kernel: secureboot: Secure boot disabled
Sep 9 04:54:48.090151 kernel: ACPI: Early table checksum verification disabled
Sep 9 04:54:48.090155 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL)
Sep 9 04:54:48.090159 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:48.090163 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:48.090168 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Sep 9 04:54:48.090173 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:48.090177 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:48.090181 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:48.090185 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:48.090190 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:48.090194 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:48.090198 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Sep 9 04:54:48.090202 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 9 04:54:48.090207 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Sep 9 04:54:48.090211 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 9 04:54:48.090215 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Sep 9 04:54:48.090219 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
Sep 9 04:54:48.090223 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
Sep 9 04:54:48.090227 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Sep 9 04:54:48.090231 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Sep 9 04:54:48.090237 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Sep 9 04:54:48.090241 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Sep 9 04:54:48.090245 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Sep 9 04:54:48.090249 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Sep 9 04:54:48.090253 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Sep 9 04:54:48.090257 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Sep 9 04:54:48.090261 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Sep 9 04:54:48.090265 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
Sep 9 04:54:48.090269 kernel: NODE_DATA(0) allocated [mem 0x1bf7fda00-0x1bf804fff]
Sep 9 04:54:48.090273 kernel: Zone ranges:
Sep 9 04:54:48.090277 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Sep 9 04:54:48.090284 kernel: DMA32 empty
Sep 9 04:54:48.090288 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Sep 9 04:54:48.090293 kernel: Device empty
Sep 9 04:54:48.090297 kernel: Movable zone start for each node
Sep 9 04:54:48.090301 kernel: Early memory node ranges
Sep 9 04:54:48.090307 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Sep 9 04:54:48.090311 kernel: node 0: [mem 0x0000000000824000-0x000000003e45ffff]
Sep 9 04:54:48.090315 kernel: node 0: [mem 0x000000003e460000-0x000000003e46ffff]
Sep 9 04:54:48.090320 kernel: node 0: [mem 0x000000003e470000-0x000000003e54ffff]
Sep 9 04:54:48.090324 kernel: node 0: [mem 0x000000003e550000-0x000000003e87ffff]
Sep 9 04:54:48.090328 kernel: node 0: [mem 0x000000003e880000-0x000000003fc7ffff]
Sep 9 04:54:48.090333 kernel: node 0: [mem 0x000000003fc80000-0x000000003fcfffff]
Sep 9 04:54:48.090337 kernel: node 0: [mem 0x000000003fd00000-0x000000003fffffff]
Sep 9 04:54:48.090341 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Sep 9 04:54:48.090345 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Sep 9 04:54:48.090350 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Sep 9 04:54:48.090354 kernel: cma: Reserved 16 MiB at 0x000000003d400000 on node -1
Sep 9 04:54:48.090359 kernel: psci: probing for conduit method from ACPI.
Sep 9 04:54:48.090364 kernel: psci: PSCIv1.1 detected in firmware.
Sep 9 04:54:48.090368 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 9 04:54:48.090372 kernel: psci: MIGRATE_INFO_TYPE not supported.
Sep 9 04:54:48.090377 kernel: psci: SMC Calling Convention v1.4
Sep 9 04:54:48.090381 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Sep 9 04:54:48.090385 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Sep 9 04:54:48.090389 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 9 04:54:48.090394 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 9 04:54:48.090398 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 9 04:54:48.090403 kernel: Detected PIPT I-cache on CPU0
Sep 9 04:54:48.090408 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
Sep 9 04:54:48.090412 kernel: CPU features: detected: GIC system register CPU interface
Sep 9 04:54:48.090416 kernel: CPU features: detected: Spectre-v4
Sep 9 04:54:48.090421 kernel: CPU features: detected: Spectre-BHB
Sep 9 04:54:48.090425 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 9 04:54:48.090429 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 9 04:54:48.090434 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
Sep 9 04:54:48.090438 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 9 04:54:48.090442 kernel: alternatives: applying boot alternatives
Sep 9 04:54:48.090457 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=1e9320fd787e27d01e3b8a1acb67e0c640346112c469b7a652e9dcfc9271bf90
Sep 9 04:54:48.090462 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 04:54:48.090467 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 9 04:54:48.090472 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 9 04:54:48.090476 kernel: Fallback order for Node 0: 0
Sep 9 04:54:48.090481 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
Sep 9 04:54:48.090485 kernel: Policy zone: Normal
Sep 9 04:54:48.090489 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 04:54:48.090493 kernel: software IO TLB: area num 2.
Sep 9 04:54:48.090498 kernel: software IO TLB: mapped [mem 0x0000000036280000-0x000000003a280000] (64MB)
Sep 9 04:54:48.090502 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 9 04:54:48.090506 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 04:54:48.090511 kernel: rcu: RCU event tracing is enabled.
Sep 9 04:54:48.090517 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 9 04:54:48.090521 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 04:54:48.090526 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 04:54:48.090530 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 04:54:48.090534 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 9 04:54:48.090539 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 04:54:48.090543 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 04:54:48.090547 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 9 04:54:48.090552 kernel: GICv3: 960 SPIs implemented
Sep 9 04:54:48.090556 kernel: GICv3: 0 Extended SPIs implemented
Sep 9 04:54:48.090560 kernel: Root IRQ handler: gic_handle_irq
Sep 9 04:54:48.090565 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Sep 9 04:54:48.090570 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
Sep 9 04:54:48.090574 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Sep 9 04:54:48.090578 kernel: ITS: No ITS available, not enabling LPIs
Sep 9 04:54:48.090583 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 9 04:54:48.090587 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
Sep 9 04:54:48.090592 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 9 04:54:48.090596 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
Sep 9 04:54:48.090601 kernel: Console: colour dummy device 80x25
Sep 9 04:54:48.090605 kernel: printk: legacy console [tty1] enabled
Sep 9 04:54:48.090610 kernel: ACPI: Core revision 20240827
Sep 9 04:54:48.090614 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
Sep 9 04:54:48.090620 kernel: pid_max: default: 32768 minimum: 301
Sep 9 04:54:48.090624 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 9 04:54:48.090629 kernel: landlock: Up and running.
Sep 9 04:54:48.090633 kernel: SELinux: Initializing.
Sep 9 04:54:48.090638 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 04:54:48.090646 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 04:54:48.090651 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0x1a0000e, misc 0x31e1
Sep 9 04:54:48.090656 kernel: Hyper-V: Host Build 10.0.26100.1261-1-0
Sep 9 04:54:48.090660 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Sep 9 04:54:48.090665 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 04:54:48.090670 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 04:54:48.090675 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 9 04:54:48.090680 kernel: Remapping and enabling EFI services.
Sep 9 04:54:48.090685 kernel: smp: Bringing up secondary CPUs ...
Sep 9 04:54:48.090690 kernel: Detected PIPT I-cache on CPU1
Sep 9 04:54:48.090694 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Sep 9 04:54:48.090700 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
Sep 9 04:54:48.090705 kernel: smp: Brought up 1 node, 2 CPUs
Sep 9 04:54:48.090709 kernel: SMP: Total of 2 processors activated.
Sep 9 04:54:48.090714 kernel: CPU: All CPU(s) started at EL1
Sep 9 04:54:48.090719 kernel: CPU features: detected: 32-bit EL0 Support
Sep 9 04:54:48.090723 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Sep 9 04:54:48.090728 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 9 04:54:48.090733 kernel: CPU features: detected: Common not Private translations
Sep 9 04:54:48.090738 kernel: CPU features: detected: CRC32 instructions
Sep 9 04:54:48.090743 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
Sep 9 04:54:48.090748 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 9 04:54:48.090753 kernel: CPU features: detected: LSE atomic instructions
Sep 9 04:54:48.090757 kernel: CPU features: detected: Privileged Access Never
Sep 9 04:54:48.090762 kernel: CPU features: detected: Speculation barrier (SB)
Sep 9 04:54:48.090767 kernel: CPU features: detected: TLB range maintenance instructions
Sep 9 04:54:48.090772 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 9 04:54:48.090777 kernel: CPU features: detected: Scalable Vector Extension
Sep 9 04:54:48.090781 kernel: alternatives: applying system-wide alternatives
Sep 9 04:54:48.090787 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Sep 9 04:54:48.090791 kernel: SVE: maximum available vector length 16 bytes per vector
Sep 9 04:54:48.090796 kernel: SVE: default vector length 16 bytes per vector
Sep 9 04:54:48.090801 kernel: Memory: 3959604K/4194160K available (11136K kernel code, 2436K rwdata, 9060K rodata, 38976K init, 1038K bss, 213368K reserved, 16384K cma-reserved)
Sep 9 04:54:48.090806 kernel: devtmpfs: initialized
Sep 9 04:54:48.090811 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 04:54:48.090816 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 9 04:54:48.090820 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 9 04:54:48.090825 kernel: 0 pages in range for non-PLT usage
Sep 9 04:54:48.090830 kernel: 508560 pages in range for PLT usage
Sep 9 04:54:48.090835 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 04:54:48.090840 kernel: SMBIOS 3.1.0 present.
Sep 9 04:54:48.090845 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024
Sep 9 04:54:48.090849 kernel: DMI: Memory slots populated: 2/2
Sep 9 04:54:48.090854 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 04:54:48.090859 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 9 04:54:48.090864 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 9 04:54:48.090869 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 9 04:54:48.090874 kernel: audit: initializing netlink subsys (disabled)
Sep 9 04:54:48.090879 kernel: audit: type=2000 audit(0.059:1): state=initialized audit_enabled=0 res=1
Sep 9 04:54:48.090883 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 04:54:48.090888 kernel: cpuidle: using governor menu
Sep 9 04:54:48.090893 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 9 04:54:48.090898 kernel: ASID allocator initialised with 32768 entries
Sep 9 04:54:48.090902 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 04:54:48.090907 kernel: Serial: AMBA PL011 UART driver
Sep 9 04:54:48.090912 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 04:54:48.090917 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 04:54:48.090922 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 9 04:54:48.090927 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 9 04:54:48.090932 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 04:54:48.090936 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 04:54:48.090941 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 9 04:54:48.090946 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 9 04:54:48.090951 kernel: ACPI: Added _OSI(Module Device)
Sep 9 04:54:48.090955 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 04:54:48.090961 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 04:54:48.090965 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 9 04:54:48.090970 kernel: ACPI: Interpreter enabled
Sep 9 04:54:48.090975 kernel: ACPI: Using GIC for interrupt routing
Sep 9 04:54:48.090979 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Sep 9 04:54:48.090984 kernel: printk: legacy console [ttyAMA0] enabled
Sep 9 04:54:48.090989 kernel: printk: legacy bootconsole [pl11] disabled
Sep 9 04:54:48.090994 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Sep 9 04:54:48.090998 kernel: ACPI: CPU0 has been hot-added
Sep 9 04:54:48.091004 kernel: ACPI: CPU1 has been hot-added
Sep 9 04:54:48.091009 kernel: iommu: Default domain type: Translated
Sep 9 04:54:48.091013 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 9 04:54:48.091018 kernel: efivars: Registered efivars operations
Sep 9 04:54:48.091023 kernel: vgaarb: loaded
Sep 9 04:54:48.091027 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 9 04:54:48.091032 kernel: VFS: Disk quotas dquot_6.6.0
Sep 9 04:54:48.091037 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 9 04:54:48.091042 kernel: pnp: PnP ACPI init
Sep 9 04:54:48.091047 kernel: pnp: PnP ACPI: found 0 devices
Sep 9 04:54:48.091052 kernel: NET: Registered PF_INET protocol family
Sep 9 04:54:48.091056 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 9 04:54:48.091061 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 9 04:54:48.091066 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 9 04:54:48.091071 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 9 04:54:48.091076 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 9 04:54:48.091080 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 9 04:54:48.091085 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 04:54:48.091091 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 04:54:48.091095 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 9 04:54:48.091100 kernel: PCI: CLS 0 bytes, default 64
Sep 9 04:54:48.091105 kernel: kvm [1]: HYP mode not available
Sep 9 04:54:48.091110 kernel: Initialise system trusted keyrings
Sep 9 04:54:48.091114 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 9 04:54:48.091119 kernel: Key type asymmetric registered
Sep 9 04:54:48.091124 kernel: Asymmetric key parser 'x509' registered
Sep 9 04:54:48.091128 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Sep 9 04:54:48.091134 kernel: io scheduler mq-deadline registered
Sep 9 04:54:48.091139 kernel: io scheduler kyber registered
Sep 9 04:54:48.091143 kernel: io scheduler bfq registered
Sep 9 04:54:48.091148 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 9 04:54:48.091153 kernel: thunder_xcv, ver 1.0
Sep 9 04:54:48.091157 kernel: thunder_bgx, ver 1.0
Sep 9 04:54:48.091162 kernel: nicpf, ver 1.0
Sep 9 04:54:48.091167 kernel: nicvf, ver 1.0
Sep 9 04:54:48.091277 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 9 04:54:48.091329 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-09T04:54:47 UTC (1757393687)
Sep 9 04:54:48.091335 kernel: efifb: probing for efifb
Sep 9 04:54:48.091340 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Sep 9 04:54:48.091345 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Sep 9 04:54:48.091349 kernel: efifb: scrolling: redraw
Sep 9 04:54:48.091354 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Sep 9 04:54:48.091359 kernel: Console: switching to colour frame buffer device 128x48
Sep 9 04:54:48.091364 kernel: fb0: EFI VGA frame buffer device
Sep 9 04:54:48.091369 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Sep 9 04:54:48.091374 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 9 04:54:48.091379 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Sep 9 04:54:48.091384 kernel: NET: Registered PF_INET6 protocol family
Sep 9 04:54:48.091388 kernel: watchdog: NMI not fully supported
Sep 9 04:54:48.091393 kernel: watchdog: Hard watchdog permanently disabled
Sep 9 04:54:48.091398 kernel: Segment Routing with IPv6
Sep 9 04:54:48.091403 kernel: In-situ OAM (IOAM) with IPv6
Sep 9 04:54:48.091407 kernel: NET: Registered PF_PACKET protocol family
Sep 9 04:54:48.091413 kernel: Key type dns_resolver registered
Sep 9 04:54:48.091417 kernel: registered taskstats version 1
Sep 9 04:54:48.091422 kernel: Loading compiled-in X.509 certificates
Sep 9 04:54:48.091427 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 44d1e8b5c5ffbaa3cedd99c03d41580671fabec5'
Sep 9 04:54:48.091432 kernel: Demotion targets for Node 0: null
Sep 9 04:54:48.091436 kernel: Key type .fscrypt registered
Sep 9 04:54:48.091441 kernel: Key type fscrypt-provisioning registered
Sep 9 04:54:48.091446 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 9 04:54:48.091459 kernel: ima: Allocated hash algorithm: sha1
Sep 9 04:54:48.091465 kernel: ima: No architecture policies found
Sep 9 04:54:48.091470 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 9 04:54:48.091474 kernel: clk: Disabling unused clocks
Sep 9 04:54:48.091479 kernel: PM: genpd: Disabling unused power domains
Sep 9 04:54:48.091484 kernel: Warning: unable to open an initial console.
Sep 9 04:54:48.091489 kernel: Freeing unused kernel memory: 38976K
Sep 9 04:54:48.091493 kernel: Run /init as init process
Sep 9 04:54:48.091498 kernel: with arguments:
Sep 9 04:54:48.091503 kernel: /init
Sep 9 04:54:48.091508 kernel: with environment:
Sep 9 04:54:48.091513 kernel: HOME=/
Sep 9 04:54:48.091517 kernel: TERM=linux
Sep 9 04:54:48.091522 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 9 04:54:48.091528 systemd[1]: Successfully made /usr/ read-only.
Sep 9 04:54:48.091535 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 04:54:48.091540 systemd[1]: Detected virtualization microsoft.
Sep 9 04:54:48.091546 systemd[1]: Detected architecture arm64.
Sep 9 04:54:48.091551 systemd[1]: Running in initrd.
Sep 9 04:54:48.091556 systemd[1]: No hostname configured, using default hostname.
Sep 9 04:54:48.091562 systemd[1]: Hostname set to .
Sep 9 04:54:48.091567 systemd[1]: Initializing machine ID from random generator.
Sep 9 04:54:48.091572 systemd[1]: Queued start job for default target initrd.target.
Sep 9 04:54:48.091577 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 04:54:48.091582 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:54:48.091588 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 9 04:54:48.091594 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 04:54:48.091599 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 9 04:54:48.091605 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 9 04:54:48.091610 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 9 04:54:48.091616 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 9 04:54:48.091621 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 04:54:48.091627 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 04:54:48.091632 systemd[1]: Reached target paths.target - Path Units.
Sep 9 04:54:48.091637 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 04:54:48.091642 systemd[1]: Reached target swap.target - Swaps.
Sep 9 04:54:48.091648 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 04:54:48.091653 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 04:54:48.091658 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 04:54:48.091663 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 9 04:54:48.091668 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 9 04:54:48.091674 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:54:48.091679 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:54:48.091685 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:54:48.091690 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 04:54:48.091695 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 9 04:54:48.091700 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 04:54:48.091705 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 9 04:54:48.091711 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 9 04:54:48.091717 systemd[1]: Starting systemd-fsck-usr.service...
Sep 9 04:54:48.091722 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 04:54:48.091727 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 04:54:48.091743 systemd-journald[224]: Collecting audit messages is disabled.
Sep 9 04:54:48.091758 systemd-journald[224]: Journal started
Sep 9 04:54:48.091771 systemd-journald[224]: Runtime Journal (/run/log/journal/e926c9ffa8794f6aac5be8982166669a) is 8M, max 78.5M, 70.5M free.
Sep 9 04:54:48.102147 systemd-modules-load[226]: Inserted module 'overlay'
Sep 9 04:54:48.108725 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:54:48.120462 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 9 04:54:48.120486 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 04:54:48.130924 kernel: Bridge firewalling registered
Sep 9 04:54:48.130998 systemd-modules-load[226]: Inserted module 'br_netfilter'
Sep 9 04:54:48.131709 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 9 04:54:48.144674 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:54:48.155550 systemd[1]: Finished systemd-fsck-usr.service.
Sep 9 04:54:48.159076 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:54:48.166512 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:54:48.178399 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 9 04:54:48.199088 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 04:54:48.213287 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 04:54:48.228844 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 04:54:48.244106 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 04:54:48.260562 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:54:48.266974 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 04:54:48.279922 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 9 04:54:48.285393 systemd-tmpfiles[249]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 9 04:54:48.292493 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 04:54:48.318843 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:54:48.332599 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 04:54:48.346774 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:54:48.361031 dracut-cmdline[259]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=1e9320fd787e27d01e3b8a1acb67e0c640346112c469b7a652e9dcfc9271bf90
Sep 9 04:54:48.406096 systemd-resolved[265]: Positive Trust Anchors:
Sep 9 04:54:48.406109 systemd-resolved[265]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 04:54:48.406128 systemd-resolved[265]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 04:54:48.407898 systemd-resolved[265]: Defaulting to hostname 'linux'.
Sep 9 04:54:48.408641 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 04:54:48.422168 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 04:54:48.503472 kernel: SCSI subsystem initialized
Sep 9 04:54:48.509461 kernel: Loading iSCSI transport class v2.0-870.
Sep 9 04:54:48.516458 kernel: iscsi: registered transport (tcp)
Sep 9 04:54:48.530529 kernel: iscsi: registered transport (qla4xxx)
Sep 9 04:54:48.530574 kernel: QLogic iSCSI HBA Driver
Sep 9 04:54:48.543597 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 04:54:48.562719 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:54:48.569547 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 04:54:48.617101 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 9 04:54:48.622740 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 9 04:54:48.685470 kernel: raid6: neonx8 gen() 18561 MB/s
Sep 9 04:54:48.704478 kernel: raid6: neonx4 gen() 18552 MB/s
Sep 9 04:54:48.723458 kernel: raid6: neonx2 gen() 17072 MB/s
Sep 9 04:54:48.743455 kernel: raid6: neonx1 gen() 15013 MB/s
Sep 9 04:54:48.762550 kernel: raid6: int64x8 gen() 10532 MB/s
Sep 9 04:54:48.781473 kernel: raid6: int64x4 gen() 10601 MB/s
Sep 9 04:54:48.801455 kernel: raid6: int64x2 gen() 8978 MB/s
Sep 9 04:54:48.824541 kernel: raid6: int64x1 gen() 7009 MB/s
Sep 9 04:54:48.824548 kernel: raid6: using algorithm neonx8 gen() 18561 MB/s
Sep 9 04:54:48.848332 kernel: raid6: .... xor() 14911 MB/s, rmw enabled
Sep 9 04:54:48.848340 kernel: raid6: using neon recovery algorithm
Sep 9 04:54:48.856317 kernel: xor: measuring software checksum speed
Sep 9 04:54:48.856323 kernel: 8regs : 28633 MB/sec
Sep 9 04:54:48.859092 kernel: 32regs : 28783 MB/sec
Sep 9 04:54:48.861704 kernel: arm64_neon : 37559 MB/sec
Sep 9 04:54:48.864517 kernel: xor: using function: arm64_neon (37559 MB/sec)
Sep 9 04:54:48.902463 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 9 04:54:48.907864 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 04:54:48.914846 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:54:48.944473 systemd-udevd[473]: Using default interface naming scheme 'v255'.
Sep 9 04:54:48.948630 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:54:48.962718 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 9 04:54:48.999958 dracut-pre-trigger[486]: rd.md=0: removing MD RAID activation
Sep 9 04:54:49.021496 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 04:54:49.028223 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 04:54:49.072795 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 04:54:49.089375 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 9 04:54:49.172476 kernel: hv_vmbus: Vmbus version:5.3 Sep 9 04:54:49.182354 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 04:54:49.182471 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 04:54:49.223214 kernel: pps_core: LinuxPPS API ver. 1 registered Sep 9 04:54:49.223237 kernel: hv_vmbus: registering driver hid_hyperv Sep 9 04:54:49.223244 kernel: hv_vmbus: registering driver hv_netvsc Sep 9 04:54:49.223260 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Sep 9 04:54:49.223268 kernel: hv_vmbus: registering driver hyperv_keyboard Sep 9 04:54:49.223274 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input0 Sep 9 04:54:49.223280 kernel: hv_vmbus: registering driver hv_storvsc Sep 9 04:54:49.204282 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 04:54:49.254298 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on Sep 9 04:54:49.254479 kernel: scsi host1: storvsc_host_t Sep 9 04:54:49.254571 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input1 Sep 9 04:54:49.254579 kernel: scsi host0: storvsc_host_t Sep 9 04:54:49.254648 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5 Sep 9 04:54:49.254664 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 Sep 9 04:54:49.264256 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Sep 9 04:54:49.439965 kernel: PTP clock support registered Sep 9 04:54:49.439983 kernel: hv_utils: Registering HyperV Utility Driver Sep 9 04:54:49.439990 kernel: hv_vmbus: registering driver hv_utils Sep 9 04:54:49.439996 kernel: hv_utils: Heartbeat IC version 3.0 Sep 9 04:54:49.440002 kernel: hv_utils: Shutdown IC version 3.2 Sep 9 04:54:49.440009 kernel: hv_utils: TimeSync IC version 4.0 Sep 9 04:54:49.288420 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 9 04:54:49.439799 systemd-resolved[265]: Clock change detected. Flushing caches. Sep 9 04:54:49.472628 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB) Sep 9 04:54:49.472776 kernel: hv_netvsc 002248b7-5712-0022-48b7-5712002248b7 eth0: VF slot 1 added Sep 9 04:54:49.472856 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Sep 9 04:54:49.447248 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 04:54:49.480205 kernel: sd 0:0:0:0: [sda] Write Protect is off Sep 9 04:54:49.447319 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 04:54:49.492905 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00 Sep 9 04:54:49.493069 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA Sep 9 04:54:49.456119 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 04:54:49.504482 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#125 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001 Sep 9 04:54:49.504609 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#68 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001 Sep 9 04:54:49.523365 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 9 04:54:49.523411 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Sep 9 04:54:49.521107 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 9 04:54:49.538429 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Sep 9 04:54:49.538577 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 9 04:54:49.542614 kernel: hv_vmbus: registering driver hv_pci
Sep 9 04:54:49.542890 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Sep 9 04:54:49.548901 kernel: hv_pci 336485e8-da74-4f7c-a6f1-19b3f65c67ee: PCI VMBus probing: Using version 0x10004
Sep 9 04:54:49.558408 kernel: hv_pci 336485e8-da74-4f7c-a6f1-19b3f65c67ee: PCI host bridge to bus da74:00
Sep 9 04:54:49.558547 kernel: pci_bus da74:00: root bus resource [mem 0xfc0000000-0xfc00fffff window]
Sep 9 04:54:49.562699 kernel: pci_bus da74:00: No busn resource found for root bus, will use [bus 00-ff]
Sep 9 04:54:49.568949 kernel: pci da74:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint
Sep 9 04:54:49.574906 kernel: pci da74:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]
Sep 9 04:54:49.574927 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#176 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Sep 9 04:54:49.583273 kernel: pci da74:00:02.0: enabling Extended Tags
Sep 9 04:54:49.600971 kernel: pci da74:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at da74:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link)
Sep 9 04:54:49.612660 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#149 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Sep 9 04:54:49.612818 kernel: pci_bus da74:00: busn_res: [bus 00-ff] end is updated to 00
Sep 9 04:54:49.617980 kernel: pci da74:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned
Sep 9 04:54:49.677087 kernel: mlx5_core da74:00:02.0: enabling device (0000 -> 0002)
Sep 9 04:54:49.685041 kernel: mlx5_core da74:00:02.0: PTM is not supported by PCIe
Sep 9 04:54:49.685255 kernel: mlx5_core da74:00:02.0: firmware version: 16.30.5006
Sep 9 04:54:49.860844 kernel: hv_netvsc 002248b7-5712-0022-48b7-5712002248b7 eth0: VF registering: eth1
Sep 9 04:54:49.861422 kernel: mlx5_core da74:00:02.0 eth1: joined to eth0
Sep 9 04:54:49.867496 kernel: mlx5_core da74:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic)
Sep 9 04:54:49.879910 kernel: mlx5_core da74:00:02.0 enP55924s1: renamed from eth1
Sep 9 04:54:50.067296 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Sep 9 04:54:50.117546 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Sep 9 04:54:50.136088 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Sep 9 04:54:50.141132 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Sep 9 04:54:50.164045 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Sep 9 04:54:50.168752 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 9 04:54:50.184449 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 04:54:50.189381 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 04:54:50.200142 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 04:54:50.214063 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 9 04:54:50.229485 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 9 04:54:50.253353 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#75 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 9 04:54:50.253603 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 04:54:50.268937 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 9 04:54:51.281771 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#132 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 9 04:54:51.295638 disk-uuid[660]: The operation has completed successfully.
Sep 9 04:54:51.300391 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 9 04:54:51.364101 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 9 04:54:51.364198 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 9 04:54:51.387848 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 9 04:54:51.412089 sh[824]: Success Sep 9 04:54:51.444350 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 9 04:54:51.444398 kernel: device-mapper: uevent: version 1.0.3 Sep 9 04:54:51.449344 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 9 04:54:51.460912 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Sep 9 04:54:51.785322 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 9 04:54:51.793497 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 9 04:54:51.812022 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 9 04:54:51.834898 kernel: BTRFS: device fsid 72a0ff35-b4e8-4772-9a8d-d0e90c3fb364 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (842) Sep 9 04:54:51.845192 kernel: BTRFS info (device dm-0): first mount of filesystem 72a0ff35-b4e8-4772-9a8d-d0e90c3fb364 Sep 9 04:54:51.845225 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Sep 9 04:54:52.262820 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 9 04:54:52.262909 kernel: BTRFS info (device dm-0): enabling free space tree Sep 9 04:54:52.302762 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 9 04:54:52.306872 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 9 04:54:52.313907 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. 
Sep 9 04:54:52.314637 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 9 04:54:52.337494 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 9 04:54:52.367903 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (871) Sep 9 04:54:52.378283 kernel: BTRFS info (device sda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 04:54:52.378319 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 9 04:54:52.432351 kernel: BTRFS info (device sda6): turning on async discard Sep 9 04:54:52.432405 kernel: BTRFS info (device sda6): enabling free space tree Sep 9 04:54:52.440888 kernel: BTRFS info (device sda6): last unmount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 04:54:52.440872 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 04:54:52.454343 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 04:54:52.461339 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 9 04:54:52.470338 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 9 04:54:52.500058 systemd-networkd[1009]: lo: Link UP Sep 9 04:54:52.500065 systemd-networkd[1009]: lo: Gained carrier Sep 9 04:54:52.500736 systemd-networkd[1009]: Enumeration completed Sep 9 04:54:52.501037 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 04:54:52.503938 systemd-networkd[1009]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 04:54:52.503941 systemd-networkd[1009]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 04:54:52.508123 systemd[1]: Reached target network.target - Network. 
Sep 9 04:54:52.582559 kernel: mlx5_core da74:00:02.0 enP55924s1: Link up Sep 9 04:54:52.582815 kernel: buffer_size[0]=0 is not enough for lossless buffer Sep 9 04:54:52.618919 kernel: hv_netvsc 002248b7-5712-0022-48b7-5712002248b7 eth0: Data path switched to VF: enP55924s1 Sep 9 04:54:52.619398 systemd-networkd[1009]: enP55924s1: Link UP Sep 9 04:54:52.619609 systemd-networkd[1009]: eth0: Link UP Sep 9 04:54:52.619924 systemd-networkd[1009]: eth0: Gained carrier Sep 9 04:54:52.619939 systemd-networkd[1009]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 04:54:52.628014 systemd-networkd[1009]: enP55924s1: Gained carrier Sep 9 04:54:52.648903 systemd-networkd[1009]: eth0: DHCPv4 address 10.200.20.4/24, gateway 10.200.20.1 acquired from 168.63.129.16 Sep 9 04:54:53.690990 ignition[1011]: Ignition 2.22.0 Sep 9 04:54:53.691002 ignition[1011]: Stage: fetch-offline Sep 9 04:54:53.691097 ignition[1011]: no configs at "/usr/lib/ignition/base.d" Sep 9 04:54:53.697304 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 04:54:53.691103 ignition[1011]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 9 04:54:53.709482 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Sep 9 04:54:53.691181 ignition[1011]: parsed url from cmdline: ""
Sep 9 04:54:53.691183 ignition[1011]: no config URL provided
Sep 9 04:54:53.691186 ignition[1011]: reading system config file "/usr/lib/ignition/user.ign"
Sep 9 04:54:53.691192 ignition[1011]: no config at "/usr/lib/ignition/user.ign"
Sep 9 04:54:53.691195 ignition[1011]: failed to fetch config: resource requires networking
Sep 9 04:54:53.691315 ignition[1011]: Ignition finished successfully
Sep 9 04:54:53.746396 ignition[1021]: Ignition 2.22.0
Sep 9 04:54:53.746402 ignition[1021]: Stage: fetch
Sep 9 04:54:53.746578 ignition[1021]: no configs at "/usr/lib/ignition/base.d"
Sep 9 04:54:53.746585 ignition[1021]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 9 04:54:53.746642 ignition[1021]: parsed url from cmdline: ""
Sep 9 04:54:53.746644 ignition[1021]: no config URL provided
Sep 9 04:54:53.746648 ignition[1021]: reading system config file "/usr/lib/ignition/user.ign"
Sep 9 04:54:53.746652 ignition[1021]: no config at "/usr/lib/ignition/user.ign"
Sep 9 04:54:53.746667 ignition[1021]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Sep 9 04:54:53.813113 ignition[1021]: GET result: OK
Sep 9 04:54:53.813298 ignition[1021]: config has been read from IMDS userdata
Sep 9 04:54:53.813329 ignition[1021]: parsing config with SHA512: 59c2f9b3c88e3c4279a0e03b6b92a1632160d1f73fe1cc1dfc491ebfab282230b49ad34cc39f12b46b20fc46354b07d4bf907f604a873e29a5ff309e7cd5ca6a
Sep 9 04:54:53.816614 unknown[1021]: fetched base config from "system"
Sep 9 04:54:53.816824 ignition[1021]: fetch: fetch complete
Sep 9 04:54:53.816619 unknown[1021]: fetched base config from "system"
Sep 9 04:54:53.816827 ignition[1021]: fetch: fetch passed
Sep 9 04:54:53.816623 unknown[1021]: fetched user config from "azure"
Sep 9 04:54:53.816882 ignition[1021]: Ignition finished successfully
Sep 9 04:54:53.820605 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 9 04:54:53.826959 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 9 04:54:53.867454 ignition[1028]: Ignition 2.22.0
Sep 9 04:54:53.867468 ignition[1028]: Stage: kargs
Sep 9 04:54:53.871711 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 9 04:54:53.867643 ignition[1028]: no configs at "/usr/lib/ignition/base.d"
Sep 9 04:54:53.877335 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 9 04:54:53.867650 ignition[1028]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 9 04:54:53.868142 ignition[1028]: kargs: kargs passed
Sep 9 04:54:53.868182 ignition[1028]: Ignition finished successfully
Sep 9 04:54:53.912830 ignition[1034]: Ignition 2.22.0
Sep 9 04:54:53.912835 ignition[1034]: Stage: disks
Sep 9 04:54:53.913112 ignition[1034]: no configs at "/usr/lib/ignition/base.d"
Sep 9 04:54:53.920060 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 9 04:54:53.913120 ignition[1034]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 9 04:54:53.928154 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 9 04:54:53.914295 ignition[1034]: disks: disks passed
Sep 9 04:54:53.936563 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 9 04:54:53.914355 ignition[1034]: Ignition finished successfully
Sep 9 04:54:53.947415 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 04:54:53.956884 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 04:54:53.963957 systemd[1]: Reached target basic.target - Basic System.
Sep 9 04:54:53.975060 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 9 04:54:54.067933 systemd-fsck[1043]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks
Sep 9 04:54:54.074502 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 9 04:54:54.086584 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 9 04:54:54.526079 systemd-networkd[1009]: eth0: Gained IPv6LL Sep 9 04:54:56.013901 kernel: EXT4-fs (sda9): mounted filesystem 88574756-967d-44b3-be66-46689c8baf27 r/w with ordered data mode. Quota mode: none. Sep 9 04:54:56.014546 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 9 04:54:56.017797 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 9 04:54:56.053933 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 04:54:56.073451 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 9 04:54:56.081002 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Sep 9 04:54:56.095958 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1057) Sep 9 04:54:56.095982 kernel: BTRFS info (device sda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 04:54:56.101483 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 9 04:54:56.117781 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 9 04:54:56.111755 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 04:54:56.129968 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 9 04:54:56.135346 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 9 04:54:56.155827 kernel: BTRFS info (device sda6): turning on async discard Sep 9 04:54:56.155855 kernel: BTRFS info (device sda6): enabling free space tree Sep 9 04:54:56.158345 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 9 04:54:56.736812 coreos-metadata[1059]: Sep 09 04:54:56.736 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Sep 9 04:54:56.743339 coreos-metadata[1059]: Sep 09 04:54:56.743 INFO Fetch successful Sep 9 04:54:56.743339 coreos-metadata[1059]: Sep 09 04:54:56.743 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 Sep 9 04:54:56.754515 coreos-metadata[1059]: Sep 09 04:54:56.754 INFO Fetch successful Sep 9 04:54:56.766540 coreos-metadata[1059]: Sep 09 04:54:56.766 INFO wrote hostname ci-4452.0.0-n-e60618bb0b to /sysroot/etc/hostname Sep 9 04:54:56.773910 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 9 04:54:56.927850 initrd-setup-root[1090]: cut: /sysroot/etc/passwd: No such file or directory Sep 9 04:54:56.984990 initrd-setup-root[1097]: cut: /sysroot/etc/group: No such file or directory Sep 9 04:54:57.016184 initrd-setup-root[1104]: cut: /sysroot/etc/shadow: No such file or directory Sep 9 04:54:57.021590 initrd-setup-root[1111]: cut: /sysroot/etc/gshadow: No such file or directory Sep 9 04:54:58.219497 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 9 04:54:58.224512 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 9 04:54:58.239507 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 9 04:54:58.249494 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 9 04:54:58.258957 kernel: BTRFS info (device sda6): last unmount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 04:54:58.279946 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Sep 9 04:54:58.289497 ignition[1179]: INFO : Ignition 2.22.0 Sep 9 04:54:58.289497 ignition[1179]: INFO : Stage: mount Sep 9 04:54:58.295671 ignition[1179]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 04:54:58.295671 ignition[1179]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 9 04:54:58.295671 ignition[1179]: INFO : mount: mount passed Sep 9 04:54:58.295671 ignition[1179]: INFO : Ignition finished successfully Sep 9 04:54:58.294473 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 9 04:54:58.300378 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 9 04:54:58.324377 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 04:54:58.356889 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1190) Sep 9 04:54:58.366952 kernel: BTRFS info (device sda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 04:54:58.366986 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 9 04:54:58.376774 kernel: BTRFS info (device sda6): turning on async discard Sep 9 04:54:58.376803 kernel: BTRFS info (device sda6): enabling free space tree Sep 9 04:54:58.378268 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 9 04:54:58.410916 ignition[1207]: INFO : Ignition 2.22.0
Sep 9 04:54:58.410916 ignition[1207]: INFO : Stage: files
Sep 9 04:54:58.410916 ignition[1207]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 04:54:58.410916 ignition[1207]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 9 04:54:58.425244 ignition[1207]: DEBUG : files: compiled without relabeling support, skipping
Sep 9 04:54:58.442037 ignition[1207]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 9 04:54:58.442037 ignition[1207]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 9 04:54:58.511002 ignition[1207]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 9 04:54:58.516972 ignition[1207]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 9 04:54:58.516972 ignition[1207]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 9 04:54:58.511369 unknown[1207]: wrote ssh authorized keys file for user: core
Sep 9 04:54:58.560951 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Sep 9 04:54:58.568370 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Sep 9 04:54:58.754500 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 9 04:54:58.846801 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Sep 9 04:54:58.846801 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 9 04:54:58.846801 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 9 04:54:58.846801 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 04:54:58.846801 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 04:54:58.846801 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 04:54:58.846801 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 04:54:58.846801 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 04:54:58.846801 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 04:54:58.909200 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 04:54:58.909200 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 04:54:58.909200 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 9 04:54:58.909200 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 9 04:54:58.909200 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 9 04:54:58.909200 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1
Sep 9 04:54:59.415445 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 9 04:54:59.687998 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 9 04:54:59.687998 ignition[1207]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 9 04:54:59.728749 ignition[1207]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 04:54:59.739439 ignition[1207]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 04:54:59.739439 ignition[1207]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 9 04:54:59.752234 ignition[1207]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Sep 9 04:54:59.752234 ignition[1207]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Sep 9 04:54:59.752234 ignition[1207]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 04:54:59.752234 ignition[1207]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 04:54:59.752234 ignition[1207]: INFO : files: files passed
Sep 9 04:54:59.752234 ignition[1207]: INFO : Ignition finished successfully
Sep 9 04:54:59.747987 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 9 04:54:59.756644 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 9 04:54:59.780328 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 9 04:54:59.799773 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 9 04:54:59.799847 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 9 04:54:59.827855 initrd-setup-root-after-ignition[1236]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 04:54:59.827855 initrd-setup-root-after-ignition[1236]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 9 04:54:59.840229 initrd-setup-root-after-ignition[1240]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 04:54:59.834797 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 04:54:59.844922 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 9 04:54:59.855211 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 9 04:54:59.894396 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 9 04:54:59.894502 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 9 04:54:59.902782 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 9 04:54:59.912119 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 9 04:54:59.922411 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 9 04:54:59.923145 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 9 04:54:59.955855 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 04:54:59.962456 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 9 04:54:59.983910 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 9 04:54:59.988869 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Sep 9 04:54:59.998073 systemd[1]: Stopped target timers.target - Timer Units. Sep 9 04:55:00.006146 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 9 04:55:00.006280 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 04:55:00.018288 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 9 04:55:00.027721 systemd[1]: Stopped target basic.target - Basic System. Sep 9 04:55:00.035598 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 9 04:55:00.043830 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 04:55:00.052099 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 9 04:55:00.061264 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 9 04:55:00.070400 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 9 04:55:00.079367 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 04:55:00.088088 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 9 04:55:00.097328 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 9 04:55:00.105628 systemd[1]: Stopped target swap.target - Swaps. Sep 9 04:55:00.113842 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 9 04:55:00.114008 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 9 04:55:00.125054 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 9 04:55:00.133917 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 04:55:00.142817 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 9 04:55:00.142919 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 04:55:00.152941 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Sep 9 04:55:00.153095 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 9 04:55:00.166256 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 9 04:55:00.166404 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 04:55:00.175500 systemd[1]: ignition-files.service: Deactivated successfully. Sep 9 04:55:00.175618 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 9 04:55:00.183260 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 9 04:55:00.183364 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 9 04:55:00.195997 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 9 04:55:00.242645 ignition[1260]: INFO : Ignition 2.22.0 Sep 9 04:55:00.242645 ignition[1260]: INFO : Stage: umount Sep 9 04:55:00.242645 ignition[1260]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 04:55:00.242645 ignition[1260]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Sep 9 04:55:00.242645 ignition[1260]: INFO : umount: umount passed Sep 9 04:55:00.242645 ignition[1260]: INFO : Ignition finished successfully Sep 9 04:55:00.209077 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 9 04:55:00.217315 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 9 04:55:00.218705 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 04:55:00.237825 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 9 04:55:00.237926 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 04:55:00.248169 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 9 04:55:00.248250 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 9 04:55:00.255358 systemd[1]: initrd-cleanup.service: Deactivated successfully. 
Sep 9 04:55:00.255625 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 9 04:55:00.263446 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 9 04:55:00.263488 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 9 04:55:00.272521 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 9 04:55:00.272559 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 9 04:55:00.279843 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 9 04:55:00.279882 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 9 04:55:00.286796 systemd[1]: Stopped target network.target - Network.
Sep 9 04:55:00.293273 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 9 04:55:00.293311 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 04:55:00.303017 systemd[1]: Stopped target paths.target - Path Units.
Sep 9 04:55:00.306762 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 9 04:55:00.309896 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:55:00.315450 systemd[1]: Stopped target slices.target - Slice Units.
Sep 9 04:55:00.323055 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 9 04:55:00.331181 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 9 04:55:00.331216 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 04:55:00.338457 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 9 04:55:00.338479 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 04:55:00.346445 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 9 04:55:00.346495 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 9 04:55:00.354081 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 9 04:55:00.354108 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 9 04:55:00.363173 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 9 04:55:00.370759 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 9 04:55:00.379269 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 9 04:55:00.379697 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 9 04:55:00.379764 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 9 04:55:00.386489 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 9 04:55:00.386572 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 9 04:55:00.403181 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 9 04:55:00.403305 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 9 04:55:00.416005 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 9 04:55:00.588903 kernel: hv_netvsc 002248b7-5712-0022-48b7-5712002248b7 eth0: Data path switched from VF: enP55924s1
Sep 9 04:55:00.416203 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 9 04:55:00.416277 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 9 04:55:00.427218 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 9 04:55:00.427639 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 9 04:55:00.435087 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 9 04:55:00.435122 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:55:00.449066 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 9 04:55:00.461066 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 9 04:55:00.461123 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 04:55:00.470626 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 9 04:55:00.470669 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:55:00.483305 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 9 04:55:00.483351 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:55:00.487708 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 9 04:55:00.487734 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:55:00.501753 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:55:00.510635 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 9 04:55:00.510686 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 9 04:55:00.543478 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 9 04:55:00.543654 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:55:00.552532 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 9 04:55:00.552567 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:55:00.561235 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 9 04:55:00.561255 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:55:00.576121 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 9 04:55:00.576170 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 04:55:00.589012 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 9 04:55:00.589062 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 9 04:55:00.601349 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 9 04:55:00.601394 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 04:55:00.615401 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 9 04:55:00.629211 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 9 04:55:00.629262 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:55:00.641983 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 9 04:55:00.642024 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:55:00.650932 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 9 04:55:00.650976 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 04:55:00.661640 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 9 04:55:00.831436 systemd-journald[224]: Received SIGTERM from PID 1 (systemd).
Sep 9 04:55:00.661681 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:55:00.666858 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 04:55:00.666921 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:55:00.681046 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 9 04:55:00.681093 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Sep 9 04:55:00.681114 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 9 04:55:00.681135 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 9 04:55:00.681371 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 9 04:55:00.681456 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 9 04:55:00.689206 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 9 04:55:00.689282 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 9 04:55:00.697396 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 9 04:55:00.706481 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 9 04:55:00.743543 systemd[1]: Switching root.
Sep 9 04:55:00.897375 systemd-journald[224]: Journal stopped
Sep 9 04:55:08.406733 kernel: SELinux: policy capability network_peer_controls=1
Sep 9 04:55:08.406752 kernel: SELinux: policy capability open_perms=1
Sep 9 04:55:08.406759 kernel: SELinux: policy capability extended_socket_class=1
Sep 9 04:55:08.406765 kernel: SELinux: policy capability always_check_network=0
Sep 9 04:55:08.406771 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 9 04:55:08.406776 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 9 04:55:08.406782 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 9 04:55:08.406787 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 9 04:55:08.406792 kernel: SELinux: policy capability userspace_initial_context=0
Sep 9 04:55:08.406798 kernel: audit: type=1403 audit(1757393702.311:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 9 04:55:08.406804 systemd[1]: Successfully loaded SELinux policy in 181.281ms.
Sep 9 04:55:08.406812 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.290ms.
Sep 9 04:55:08.406818 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 04:55:08.406824 systemd[1]: Detected virtualization microsoft.
Sep 9 04:55:08.406831 systemd[1]: Detected architecture arm64.
Sep 9 04:55:08.406837 systemd[1]: Detected first boot.
Sep 9 04:55:08.406843 systemd[1]: Hostname set to .
Sep 9 04:55:08.406849 systemd[1]: Initializing machine ID from random generator.
Sep 9 04:55:08.406857 zram_generator::config[1303]: No configuration found.
Sep 9 04:55:08.406863 kernel: NET: Registered PF_VSOCK protocol family
Sep 9 04:55:08.406869 systemd[1]: Populated /etc with preset unit settings.
Sep 9 04:55:08.406886 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 9 04:55:08.406893 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 9 04:55:08.406899 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 9 04:55:08.406904 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 9 04:55:08.406910 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 9 04:55:08.406917 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 9 04:55:08.406923 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 9 04:55:08.406929 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 9 04:55:08.406935 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 9 04:55:08.406942 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 9 04:55:08.406948 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 9 04:55:08.406953 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 9 04:55:08.406959 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 04:55:08.406965 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:55:08.406971 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 9 04:55:08.406977 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 9 04:55:08.406983 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 9 04:55:08.406990 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 04:55:08.406997 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Sep 9 04:55:08.407004 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 04:55:08.407010 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 04:55:08.407016 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 9 04:55:08.407023 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 9 04:55:08.407029 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 9 04:55:08.407035 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 9 04:55:08.407042 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 04:55:08.407048 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 04:55:08.407054 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 04:55:08.407060 systemd[1]: Reached target swap.target - Swaps.
Sep 9 04:55:08.407066 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 9 04:55:08.407072 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 9 04:55:08.407079 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 9 04:55:08.407085 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:55:08.407092 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:55:08.407098 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:55:08.407104 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 9 04:55:08.407110 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 9 04:55:08.407117 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 9 04:55:08.407123 systemd[1]: Mounting media.mount - External Media Directory...
Sep 9 04:55:08.407130 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 9 04:55:08.407136 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 9 04:55:08.407142 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 9 04:55:08.407149 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 9 04:55:08.407155 systemd[1]: Reached target machines.target - Containers.
Sep 9 04:55:08.407161 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 9 04:55:08.407168 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:55:08.407174 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 04:55:08.407181 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 9 04:55:08.407187 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:55:08.407193 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 04:55:08.407199 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:55:08.407205 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 9 04:55:08.407211 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:55:08.407218 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 9 04:55:08.407225 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 9 04:55:08.407231 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 9 04:55:08.407237 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 9 04:55:08.407243 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 9 04:55:08.407250 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:55:08.407256 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 04:55:08.407262 kernel: fuse: init (API version 7.41)
Sep 9 04:55:08.407268 kernel: loop: module loaded
Sep 9 04:55:08.407275 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 04:55:08.407280 kernel: ACPI: bus type drm_connector registered
Sep 9 04:55:08.407286 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 04:55:08.407293 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 9 04:55:08.407311 systemd-journald[1407]: Collecting audit messages is disabled.
Sep 9 04:55:08.407326 systemd-journald[1407]: Journal started
Sep 9 04:55:08.407341 systemd-journald[1407]: Runtime Journal (/run/log/journal/ba064823451f418ab6db64c026e72d65) is 8M, max 78.5M, 70.5M free.
Sep 9 04:55:07.637920 systemd[1]: Queued start job for default target multi-user.target.
Sep 9 04:55:07.642309 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 9 04:55:07.642665 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 9 04:55:07.642911 systemd[1]: systemd-journald.service: Consumed 2.499s CPU time.
Sep 9 04:55:08.418171 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 9 04:55:08.438118 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 04:55:08.447373 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 9 04:55:08.447425 systemd[1]: Stopped verity-setup.service.
Sep 9 04:55:08.461572 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 04:55:08.462998 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 9 04:55:08.467481 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 9 04:55:08.472387 systemd[1]: Mounted media.mount - External Media Directory.
Sep 9 04:55:08.476432 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 9 04:55:08.482323 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 9 04:55:08.486984 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 9 04:55:08.490811 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 9 04:55:08.497317 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:55:08.502870 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 9 04:55:08.503011 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 9 04:55:08.507857 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:55:08.508079 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:55:08.513328 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 04:55:08.513441 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 04:55:08.517796 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:55:08.517949 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:55:08.523345 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 9 04:55:08.523465 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 9 04:55:08.528621 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:55:08.528757 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:55:08.535407 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:55:08.541898 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:55:08.547043 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 9 04:55:08.551847 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 9 04:55:08.565037 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 04:55:08.570429 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 9 04:55:08.583793 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 9 04:55:08.589018 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 9 04:55:08.589044 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 04:55:08.593854 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 9 04:55:08.599690 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 9 04:55:08.604255 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:55:08.626710 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 9 04:55:08.636484 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 9 04:55:08.641867 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 04:55:08.642578 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 9 04:55:08.649206 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 04:55:08.650264 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 04:55:08.656941 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 9 04:55:08.663749 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 04:55:08.672018 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 04:55:08.678051 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 9 04:55:08.683910 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 9 04:55:08.695670 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 9 04:55:08.702290 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 9 04:55:08.709996 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 9 04:55:08.723958 systemd-journald[1407]: Time spent on flushing to /var/log/journal/ba064823451f418ab6db64c026e72d65 is 13.094ms for 945 entries.
Sep 9 04:55:08.723958 systemd-journald[1407]: System Journal (/var/log/journal/ba064823451f418ab6db64c026e72d65) is 8M, max 2.6G, 2.6G free.
Sep 9 04:55:08.756366 systemd-journald[1407]: Received client request to flush runtime journal.
Sep 9 04:55:08.756399 kernel: loop0: detected capacity change from 0 to 119368
Sep 9 04:55:08.758164 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 9 04:55:08.812990 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 9 04:55:08.813598 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 9 04:55:08.857974 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:55:08.903124 systemd-tmpfiles[1443]: ACLs are not supported, ignoring.
Sep 9 04:55:08.903136 systemd-tmpfiles[1443]: ACLs are not supported, ignoring.
Sep 9 04:55:08.906569 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 04:55:08.912731 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 9 04:55:09.287903 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 9 04:55:09.358903 kernel: loop1: detected capacity change from 0 to 27936
Sep 9 04:55:09.460142 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 9 04:55:09.465964 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 04:55:09.483366 systemd-tmpfiles[1461]: ACLs are not supported, ignoring.
Sep 9 04:55:09.483380 systemd-tmpfiles[1461]: ACLs are not supported, ignoring.
Sep 9 04:55:09.485605 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:55:09.908905 kernel: loop2: detected capacity change from 0 to 211168
Sep 9 04:55:09.965154 kernel: loop3: detected capacity change from 0 to 100632
Sep 9 04:55:10.338146 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 9 04:55:10.344429 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:55:10.369951 systemd-udevd[1467]: Using default interface naming scheme 'v255'.
Sep 9 04:55:10.481906 kernel: loop4: detected capacity change from 0 to 119368
Sep 9 04:55:10.492892 kernel: loop5: detected capacity change from 0 to 27936
Sep 9 04:55:10.503905 kernel: loop6: detected capacity change from 0 to 211168
Sep 9 04:55:10.519892 kernel: loop7: detected capacity change from 0 to 100632
Sep 9 04:55:10.528465 (sd-merge)[1469]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Sep 9 04:55:10.528820 (sd-merge)[1469]: Merged extensions into '/usr'.
Sep 9 04:55:10.532259 systemd[1]: Reload requested from client PID 1441 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 9 04:55:10.532451 systemd[1]: Reloading...
Sep 9 04:55:10.584905 zram_generator::config[1494]: No configuration found.
Sep 9 04:55:10.772244 systemd[1]: Reloading finished in 239 ms.
Sep 9 04:55:10.801028 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 9 04:55:10.809769 systemd[1]: Starting ensure-sysext.service...
Sep 9 04:55:10.814999 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 04:55:10.859587 systemd[1]: Reload requested from client PID 1550 ('systemctl') (unit ensure-sysext.service)...
Sep 9 04:55:10.859601 systemd[1]: Reloading...
Sep 9 04:55:10.887644 systemd-tmpfiles[1551]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 9 04:55:10.887688 systemd-tmpfiles[1551]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 9 04:55:10.888694 systemd-tmpfiles[1551]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 9 04:55:10.888887 systemd-tmpfiles[1551]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 9 04:55:10.889321 systemd-tmpfiles[1551]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 9 04:55:10.889462 systemd-tmpfiles[1551]: ACLs are not supported, ignoring.
Sep 9 04:55:10.889489 systemd-tmpfiles[1551]: ACLs are not supported, ignoring.
Sep 9 04:55:10.910898 zram_generator::config[1579]: No configuration found.
Sep 9 04:55:10.960133 systemd-tmpfiles[1551]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 04:55:10.960144 systemd-tmpfiles[1551]: Skipping /boot
Sep 9 04:55:10.965267 systemd-tmpfiles[1551]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 04:55:10.965369 systemd-tmpfiles[1551]: Skipping /boot
Sep 9 04:55:11.056632 systemd[1]: Reloading finished in 196 ms.
Sep 9 04:55:11.070254 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:55:11.084894 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:55:11.108125 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 04:55:11.180437 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 9 04:55:11.190842 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 9 04:55:11.201032 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 04:55:11.208858 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 04:55:11.218046 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 9 04:55:11.229942 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Sep 9 04:55:11.232288 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:55:11.234134 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:55:11.251040 kernel: hv_vmbus: registering driver hv_balloon
Sep 9 04:55:11.251120 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#171 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Sep 9 04:55:11.261984 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Sep 9 04:55:11.262063 kernel: hv_balloon: Memory hot add disabled on ARM64
Sep 9 04:55:11.262086 kernel: mousedev: PS/2 mouse device common for all mice
Sep 9 04:55:11.262506 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:55:11.280999 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:55:11.287270 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:55:11.287371 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:55:11.290295 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:55:11.290454 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:55:11.298933 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:55:11.299066 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:55:11.306565 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:55:11.306715 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:55:11.316600 systemd[1]: Expecting device dev-ptp_hyperv.device - /dev/ptp_hyperv...
Sep 9 04:55:11.321867 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:55:11.325114 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:55:11.331696 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 04:55:11.339364 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:55:11.348013 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:55:11.355417 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:55:11.355552 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:55:11.355718 systemd[1]: Reached target time-set.target - System Time Set.
Sep 9 04:55:11.370907 kernel: hv_vmbus: registering driver hyperv_fb
Sep 9 04:55:11.374723 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 9 04:55:11.378801 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Sep 9 04:55:11.378942 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Sep 9 04:55:11.387368 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:55:11.390945 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:55:11.396032 kernel: Console: switching to colour dummy device 80x25
Sep 9 04:55:11.399796 kernel: Console: switching to colour frame buffer device 128x48
Sep 9 04:55:11.401391 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 04:55:11.402101 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 9 04:55:11.412636 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 04:55:11.413030 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 04:55:11.424864 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 04:55:11.425500 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 04:55:11.432440 systemd[1]: Condition check resulted in dev-ptp_hyperv.device - /dev/ptp_hyperv being skipped. Sep 9 04:55:11.438290 systemd[1]: Finished ensure-sysext.service. Sep 9 04:55:11.442488 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 9 04:55:11.459606 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 04:55:11.459649 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 9 04:55:11.460712 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 04:55:11.472391 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 04:55:11.472678 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 04:55:11.477584 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 9 04:55:11.486049 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 04:55:11.563002 augenrules[1805]: No rules Sep 9 04:55:11.565593 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 04:55:11.566047 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 04:55:11.587078 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 9 04:55:11.597607 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. 
Sep 9 04:55:11.603943 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 9 04:55:11.614400 systemd-resolved[1685]: Positive Trust Anchors: Sep 9 04:55:11.614625 systemd-resolved[1685]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 04:55:11.614648 systemd-resolved[1685]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 04:55:11.634867 systemd-networkd[1682]: lo: Link UP Sep 9 04:55:11.634885 systemd-networkd[1682]: lo: Gained carrier Sep 9 04:55:11.635809 systemd-networkd[1682]: Enumeration completed Sep 9 04:55:11.635957 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 04:55:11.636080 systemd-networkd[1682]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 04:55:11.636083 systemd-networkd[1682]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 04:55:11.643348 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 9 04:55:11.643888 kernel: MACsec IEEE 802.1AE Sep 9 04:55:11.650218 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 9 04:55:11.671092 systemd-resolved[1685]: Using system hostname 'ci-4452.0.0-n-e60618bb0b'. 
Sep 9 04:55:11.693893 kernel: mlx5_core da74:00:02.0 enP55924s1: Link up Sep 9 04:55:11.694229 kernel: buffer_size[0]=0 is not enough for lossless buffer Sep 9 04:55:11.713748 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 9 04:55:11.725109 kernel: hv_netvsc 002248b7-5712-0022-48b7-5712002248b7 eth0: Data path switched to VF: enP55924s1 Sep 9 04:55:11.725416 systemd-networkd[1682]: enP55924s1: Link UP Sep 9 04:55:11.725531 systemd-networkd[1682]: eth0: Link UP Sep 9 04:55:11.725534 systemd-networkd[1682]: eth0: Gained carrier Sep 9 04:55:11.725553 systemd-networkd[1682]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 04:55:11.726519 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 04:55:11.731591 systemd-networkd[1682]: enP55924s1: Gained carrier Sep 9 04:55:11.732272 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 9 04:55:11.738571 systemd[1]: Reached target network.target - Network. Sep 9 04:55:11.742372 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 04:55:11.752932 systemd-networkd[1682]: eth0: DHCPv4 address 10.200.20.4/24, gateway 10.200.20.1 acquired from 168.63.129.16 Sep 9 04:55:12.830044 systemd-networkd[1682]: eth0: Gained IPv6LL Sep 9 04:55:12.832117 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 9 04:55:12.837448 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 04:55:12.842679 systemd[1]: Reached target network-online.target - Network is Online. Sep 9 04:55:13.545853 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. 
Sep 9 04:55:13.551310 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 9 04:55:17.022561 ldconfig[1436]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 9 04:55:17.029496 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 9 04:55:17.035838 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 9 04:55:17.066373 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 9 04:55:17.071549 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 04:55:17.075749 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 9 04:55:17.081166 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 9 04:55:17.087442 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 9 04:55:17.091701 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 9 04:55:17.096589 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 9 04:55:17.102751 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 9 04:55:17.102783 systemd[1]: Reached target paths.target - Path Units. Sep 9 04:55:17.106282 systemd[1]: Reached target timers.target - Timer Units. Sep 9 04:55:17.123158 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 9 04:55:17.128558 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 9 04:55:17.133725 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). 
Sep 9 04:55:17.138524 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 9 04:55:17.143155 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 9 04:55:17.149404 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 9 04:55:17.166095 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 9 04:55:17.171497 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 9 04:55:17.175722 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 04:55:17.180014 systemd[1]: Reached target basic.target - Basic System. Sep 9 04:55:17.183919 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 9 04:55:17.183942 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 9 04:55:17.199288 systemd[1]: Starting chronyd.service - NTP client/server... Sep 9 04:55:17.211800 systemd[1]: Starting containerd.service - containerd container runtime... Sep 9 04:55:17.217999 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 9 04:55:17.222578 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 9 04:55:17.230638 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 9 04:55:17.239000 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 9 04:55:17.256012 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 9 04:55:17.259767 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 9 04:55:17.260989 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon. 
Sep 9 04:55:17.265544 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss). Sep 9 04:55:17.266327 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:55:17.271347 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 9 04:55:17.274156 jq[1843]: false Sep 9 04:55:17.277017 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 9 04:55:17.282929 KVP[1845]: KVP starting; pid is:1845 Sep 9 04:55:17.284124 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 9 04:55:17.286983 chronyd[1835]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Sep 9 04:55:17.290889 kernel: hv_utils: KVP IC version 4.0 Sep 9 04:55:17.291628 KVP[1845]: KVP LIC Version: 3.1 Sep 9 04:55:17.292572 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 9 04:55:17.298825 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 9 04:55:17.305384 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 9 04:55:17.310482 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 9 04:55:17.311261 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 9 04:55:17.312124 systemd[1]: Starting update-engine.service - Update Engine... Sep 9 04:55:17.319125 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 9 04:55:17.327868 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 9 04:55:17.332861 jq[1857]: true Sep 9 04:55:17.335581 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. 
Sep 9 04:55:17.336034 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 9 04:55:17.340698 extend-filesystems[1844]: Found /dev/sda6 Sep 9 04:55:17.344007 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 9 04:55:17.345605 chronyd[1835]: Timezone right/UTC failed leap second check, ignoring Sep 9 04:55:17.345926 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 9 04:55:17.345757 chronyd[1835]: Loaded seccomp filter (level 2) Sep 9 04:55:17.355205 systemd[1]: Started chronyd.service - NTP client/server. Sep 9 04:55:17.362381 extend-filesystems[1844]: Found /dev/sda9 Sep 9 04:55:17.370224 extend-filesystems[1844]: Checking size of /dev/sda9 Sep 9 04:55:17.379865 systemd[1]: motdgen.service: Deactivated successfully. Sep 9 04:55:17.383056 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 9 04:55:17.388774 jq[1868]: true Sep 9 04:55:17.390145 (ntainerd)[1873]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 9 04:55:17.398495 update_engine[1856]: I20250909 04:55:17.397506 1856 main.cc:92] Flatcar Update Engine starting Sep 9 04:55:17.412901 extend-filesystems[1844]: Old size kept for /dev/sda9 Sep 9 04:55:17.417583 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 9 04:55:17.417786 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 9 04:55:17.427423 systemd-logind[1855]: New seat seat0. Sep 9 04:55:17.429520 systemd-logind[1855]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard) Sep 9 04:55:17.430113 systemd[1]: Started systemd-logind.service - User Login Management. Sep 9 04:55:17.444915 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Sep 9 04:55:17.469456 tar[1866]: linux-arm64/LICENSE Sep 9 04:55:17.469819 tar[1866]: linux-arm64/helm Sep 9 04:55:17.477359 bash[1902]: Updated "/home/core/.ssh/authorized_keys" Sep 9 04:55:17.479525 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 9 04:55:17.488862 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 9 04:55:17.724231 dbus-daemon[1838]: [system] SELinux support is enabled Sep 9 04:55:17.724617 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 9 04:55:17.734358 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 9 04:55:17.735079 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 9 04:55:17.740480 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 9 04:55:17.740499 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 9 04:55:17.748445 dbus-daemon[1838]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 9 04:55:17.750720 update_engine[1856]: I20250909 04:55:17.750666 1856 update_check_scheduler.cc:74] Next update check in 4m29s Sep 9 04:55:17.750906 systemd[1]: Started update-engine.service - Update Engine. Sep 9 04:55:17.758081 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Sep 9 04:55:17.831888 coreos-metadata[1837]: Sep 09 04:55:17.831 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Sep 9 04:55:17.836695 coreos-metadata[1837]: Sep 09 04:55:17.836 INFO Fetch successful Sep 9 04:55:17.836695 coreos-metadata[1837]: Sep 09 04:55:17.836 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1 Sep 9 04:55:17.843415 coreos-metadata[1837]: Sep 09 04:55:17.842 INFO Fetch successful Sep 9 04:55:17.845585 coreos-metadata[1837]: Sep 09 04:55:17.845 INFO Fetching http://168.63.129.16/machine/8ccf6dd4-3087-4918-9660-81a98fd4e3cf/ec790dbf%2D49f8%2D4d1f%2Db865%2D6bda531715e4.%5Fci%2D4452.0.0%2Dn%2De60618bb0b?comp=config&type=sharedConfig&incarnation=1: Attempt #1 Sep 9 04:55:17.848045 coreos-metadata[1837]: Sep 09 04:55:17.847 INFO Fetch successful Sep 9 04:55:17.848045 coreos-metadata[1837]: Sep 09 04:55:17.848 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1 Sep 9 04:55:17.857506 coreos-metadata[1837]: Sep 09 04:55:17.857 INFO Fetch successful Sep 9 04:55:17.886223 sshd_keygen[1885]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 9 04:55:17.896146 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 9 04:55:17.901568 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 9 04:55:17.923114 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 9 04:55:17.933116 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 9 04:55:17.942829 locksmithd[1975]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 9 04:55:17.944974 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... Sep 9 04:55:17.959203 systemd[1]: issuegen.service: Deactivated successfully. Sep 9 04:55:17.959391 systemd[1]: Finished issuegen.service - Generate /run/issue. 
Sep 9 04:55:17.978138 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 9 04:55:17.995784 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. Sep 9 04:55:18.016300 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 9 04:55:18.024174 tar[1866]: linux-arm64/README.md Sep 9 04:55:18.028137 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 9 04:55:18.039126 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Sep 9 04:55:18.046838 systemd[1]: Reached target getty.target - Login Prompts. Sep 9 04:55:18.053373 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 9 04:55:18.154545 containerd[1873]: time="2025-09-09T04:55:18Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 9 04:55:18.157225 containerd[1873]: time="2025-09-09T04:55:18.156236472Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 9 04:55:18.161300 containerd[1873]: time="2025-09-09T04:55:18.161269976Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.656µs" Sep 9 04:55:18.161300 containerd[1873]: time="2025-09-09T04:55:18.161295288Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 9 04:55:18.161300 containerd[1873]: time="2025-09-09T04:55:18.161308304Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 9 04:55:18.161443 containerd[1873]: time="2025-09-09T04:55:18.161428040Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 9 04:55:18.161443 containerd[1873]: time="2025-09-09T04:55:18.161441536Z" level=info msg="loading plugin" 
id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 9 04:55:18.161469 containerd[1873]: time="2025-09-09T04:55:18.161458032Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 04:55:18.161503 containerd[1873]: time="2025-09-09T04:55:18.161492088Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 04:55:18.161503 containerd[1873]: time="2025-09-09T04:55:18.161500592Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 04:55:18.161664 containerd[1873]: time="2025-09-09T04:55:18.161649576Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 04:55:18.161664 containerd[1873]: time="2025-09-09T04:55:18.161661568Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 04:55:18.161697 containerd[1873]: time="2025-09-09T04:55:18.161668280Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 04:55:18.161697 containerd[1873]: time="2025-09-09T04:55:18.161674280Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 9 04:55:18.161772 containerd[1873]: time="2025-09-09T04:55:18.161760840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 9 04:55:18.162343 containerd[1873]: time="2025-09-09T04:55:18.162319040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 
04:55:18.162387 containerd[1873]: time="2025-09-09T04:55:18.162365392Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 04:55:18.162387 containerd[1873]: time="2025-09-09T04:55:18.162373328Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 9 04:55:18.162527 containerd[1873]: time="2025-09-09T04:55:18.162397488Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 9 04:55:18.162890 containerd[1873]: time="2025-09-09T04:55:18.162565712Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 9 04:55:18.162890 containerd[1873]: time="2025-09-09T04:55:18.162627576Z" level=info msg="metadata content store policy set" policy=shared Sep 9 04:55:18.173466 containerd[1873]: time="2025-09-09T04:55:18.173437232Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 9 04:55:18.173542 containerd[1873]: time="2025-09-09T04:55:18.173481368Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 9 04:55:18.173542 containerd[1873]: time="2025-09-09T04:55:18.173497488Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 9 04:55:18.173542 containerd[1873]: time="2025-09-09T04:55:18.173505704Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 9 04:55:18.173542 containerd[1873]: time="2025-09-09T04:55:18.173513688Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 9 04:55:18.173542 containerd[1873]: time="2025-09-09T04:55:18.173521608Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service 
type=io.containerd.service.v1 Sep 9 04:55:18.173542 containerd[1873]: time="2025-09-09T04:55:18.173532816Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 9 04:55:18.173542 containerd[1873]: time="2025-09-09T04:55:18.173539904Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 9 04:55:18.173940 containerd[1873]: time="2025-09-09T04:55:18.173547392Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 9 04:55:18.173940 containerd[1873]: time="2025-09-09T04:55:18.173553696Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 9 04:55:18.173940 containerd[1873]: time="2025-09-09T04:55:18.173559512Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 9 04:55:18.173940 containerd[1873]: time="2025-09-09T04:55:18.173568328Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 9 04:55:18.173940 containerd[1873]: time="2025-09-09T04:55:18.173668264Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 9 04:55:18.173940 containerd[1873]: time="2025-09-09T04:55:18.173681480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 9 04:55:18.173940 containerd[1873]: time="2025-09-09T04:55:18.173691960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 9 04:55:18.173940 containerd[1873]: time="2025-09-09T04:55:18.173698448Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 9 04:55:18.173940 containerd[1873]: time="2025-09-09T04:55:18.173707872Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 9 
04:55:18.173940 containerd[1873]: time="2025-09-09T04:55:18.173715152Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 9 04:55:18.173940 containerd[1873]: time="2025-09-09T04:55:18.173722104Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 9 04:55:18.173940 containerd[1873]: time="2025-09-09T04:55:18.173728520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 9 04:55:18.173940 containerd[1873]: time="2025-09-09T04:55:18.173735264Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 9 04:55:18.173940 containerd[1873]: time="2025-09-09T04:55:18.173741088Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 9 04:55:18.173940 containerd[1873]: time="2025-09-09T04:55:18.173750408Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 9 04:55:18.174124 containerd[1873]: time="2025-09-09T04:55:18.173804000Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 9 04:55:18.174124 containerd[1873]: time="2025-09-09T04:55:18.173814864Z" level=info msg="Start snapshots syncer" Sep 9 04:55:18.174124 containerd[1873]: time="2025-09-09T04:55:18.173830848Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 9 04:55:18.174160 containerd[1873]: time="2025-09-09T04:55:18.174004632Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 9 04:55:18.174160 containerd[1873]: time="2025-09-09T04:55:18.174044248Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 9 04:55:18.174160 containerd[1873]: time="2025-09-09T04:55:18.174096384Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 9 04:55:18.174248 containerd[1873]: time="2025-09-09T04:55:18.174183216Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 9 04:55:18.174248 containerd[1873]: time="2025-09-09T04:55:18.174198096Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 9 04:55:18.174248 containerd[1873]: time="2025-09-09T04:55:18.174206728Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 9 04:55:18.174248 containerd[1873]: time="2025-09-09T04:55:18.174214512Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 9 04:55:18.174248 containerd[1873]: time="2025-09-09T04:55:18.174221960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 9 04:55:18.174248 containerd[1873]: time="2025-09-09T04:55:18.174228536Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 9 04:55:18.174248 containerd[1873]: time="2025-09-09T04:55:18.174234712Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 9 04:55:18.174329 containerd[1873]: time="2025-09-09T04:55:18.174251360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 9 04:55:18.174329 containerd[1873]: time="2025-09-09T04:55:18.174258664Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 9 04:55:18.174329 containerd[1873]: time="2025-09-09T04:55:18.174265784Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 9 04:55:18.174329 containerd[1873]: time="2025-09-09T04:55:18.174286008Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 04:55:18.174329 containerd[1873]: time="2025-09-09T04:55:18.174294632Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 04:55:18.174329 containerd[1873]: time="2025-09-09T04:55:18.174299840Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 04:55:18.174329 containerd[1873]: time="2025-09-09T04:55:18.174305368Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 04:55:18.174329 containerd[1873]: time="2025-09-09T04:55:18.174309896Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 9 04:55:18.174329 containerd[1873]: time="2025-09-09T04:55:18.174315232Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 9 04:55:18.174329 containerd[1873]: time="2025-09-09T04:55:18.174321320Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 9 04:55:18.174445 containerd[1873]: time="2025-09-09T04:55:18.174333288Z" level=info msg="runtime interface created" Sep 9 04:55:18.174445 containerd[1873]: time="2025-09-09T04:55:18.174337008Z" level=info msg="created NRI interface" Sep 9 04:55:18.174445 containerd[1873]: time="2025-09-09T04:55:18.174341856Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 9 04:55:18.174445 containerd[1873]: time="2025-09-09T04:55:18.174349160Z" level=info msg="Connect containerd service" Sep 9 04:55:18.174445 containerd[1873]: time="2025-09-09T04:55:18.174367240Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 9 04:55:18.175895 containerd[1873]: 
time="2025-09-09T04:55:18.175092656Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 04:55:18.249725 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 04:55:18.430585 containerd[1873]: time="2025-09-09T04:55:18.430493432Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 9 04:55:18.430585 containerd[1873]: time="2025-09-09T04:55:18.430559896Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 9 04:55:18.430740 containerd[1873]: time="2025-09-09T04:55:18.430684536Z" level=info msg="Start subscribing containerd event" Sep 9 04:55:18.430759 containerd[1873]: time="2025-09-09T04:55:18.430741168Z" level=info msg="Start recovering state" Sep 9 04:55:18.432898 containerd[1873]: time="2025-09-09T04:55:18.430974544Z" level=info msg="Start event monitor" Sep 9 04:55:18.432898 containerd[1873]: time="2025-09-09T04:55:18.430994088Z" level=info msg="Start cni network conf syncer for default" Sep 9 04:55:18.432898 containerd[1873]: time="2025-09-09T04:55:18.431000536Z" level=info msg="Start streaming server" Sep 9 04:55:18.432898 containerd[1873]: time="2025-09-09T04:55:18.431006736Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 9 04:55:18.432898 containerd[1873]: time="2025-09-09T04:55:18.431012584Z" level=info msg="runtime interface starting up..." Sep 9 04:55:18.432898 containerd[1873]: time="2025-09-09T04:55:18.431016392Z" level=info msg="starting plugins..." 
Sep 9 04:55:18.432898 containerd[1873]: time="2025-09-09T04:55:18.431027320Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 9 04:55:18.432898 containerd[1873]: time="2025-09-09T04:55:18.431149232Z" level=info msg="containerd successfully booted in 0.276912s" Sep 9 04:55:18.431294 systemd[1]: Started containerd.service - containerd container runtime. Sep 9 04:55:18.437199 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 9 04:55:18.443528 systemd[1]: Startup finished in 1.665s (kernel) + 14.356s (initrd) + 16.311s (userspace) = 32.333s. Sep 9 04:55:18.589634 (kubelet)[2033]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 04:55:18.958835 kubelet[2033]: E0909 04:55:18.958730 2033 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 04:55:18.960939 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 04:55:18.961047 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 04:55:18.961989 systemd[1]: kubelet.service: Consumed 552ms CPU time, 257.7M memory peak. Sep 9 04:55:19.162444 login[2016]: pam_lastlog(login:session): file /var/log/lastlog is locked/read, retrying Sep 9 04:55:19.177037 login[2015]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:55:19.185739 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 9 04:55:19.187966 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 9 04:55:19.191208 systemd-logind[1855]: New session 2 of user core. 
Sep 9 04:55:19.222503 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 9 04:55:19.224198 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 9 04:55:19.280007 (systemd)[2049]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 9 04:55:19.282840 systemd-logind[1855]: New session c1 of user core. Sep 9 04:55:19.577745 systemd[2049]: Queued start job for default target default.target. Sep 9 04:55:19.585575 systemd[2049]: Created slice app.slice - User Application Slice. Sep 9 04:55:19.585596 systemd[2049]: Reached target paths.target - Paths. Sep 9 04:55:19.585624 systemd[2049]: Reached target timers.target - Timers. Sep 9 04:55:19.586620 systemd[2049]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 9 04:55:19.594225 systemd[2049]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 9 04:55:19.594268 systemd[2049]: Reached target sockets.target - Sockets. Sep 9 04:55:19.594300 systemd[2049]: Reached target basic.target - Basic System. Sep 9 04:55:19.594321 systemd[2049]: Reached target default.target - Main User Target. Sep 9 04:55:19.594339 systemd[2049]: Startup finished in 306ms. Sep 9 04:55:19.594544 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 9 04:55:19.596375 systemd[1]: Started session-2.scope - Session 2 of User core. 
Sep 9 04:55:19.999852 waagent[2011]: 2025-09-09T04:55:19.999724Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 Sep 9 04:55:20.004333 waagent[2011]: 2025-09-09T04:55:20.004297Z INFO Daemon Daemon OS: flatcar 4452.0.0 Sep 9 04:55:20.007721 waagent[2011]: 2025-09-09T04:55:20.007689Z INFO Daemon Daemon Python: 3.11.13 Sep 9 04:55:20.012885 waagent[2011]: 2025-09-09T04:55:20.011143Z INFO Daemon Daemon Run daemon Sep 9 04:55:20.014518 waagent[2011]: 2025-09-09T04:55:20.014488Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4452.0.0' Sep 9 04:55:20.022070 waagent[2011]: 2025-09-09T04:55:20.022032Z INFO Daemon Daemon Using waagent for provisioning Sep 9 04:55:20.025763 waagent[2011]: 2025-09-09T04:55:20.025731Z INFO Daemon Daemon Activate resource disk Sep 9 04:55:20.029239 waagent[2011]: 2025-09-09T04:55:20.029209Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Sep 9 04:55:20.037159 waagent[2011]: 2025-09-09T04:55:20.037128Z INFO Daemon Daemon Found device: None Sep 9 04:55:20.040359 waagent[2011]: 2025-09-09T04:55:20.040328Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Sep 9 04:55:20.048061 waagent[2011]: 2025-09-09T04:55:20.048034Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Sep 9 04:55:20.056983 waagent[2011]: 2025-09-09T04:55:20.056932Z INFO Daemon Daemon Clean protocol and wireserver endpoint Sep 9 04:55:20.062678 waagent[2011]: 2025-09-09T04:55:20.062642Z INFO Daemon Daemon Running default provisioning handler Sep 9 04:55:20.072337 waagent[2011]: 2025-09-09T04:55:20.072294Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. 
Sep 9 04:55:20.083504 waagent[2011]: 2025-09-09T04:55:20.083465Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Sep 9 04:55:20.091120 waagent[2011]: 2025-09-09T04:55:20.091089Z INFO Daemon Daemon cloud-init is enabled: False Sep 9 04:55:20.094743 waagent[2011]: 2025-09-09T04:55:20.094718Z INFO Daemon Daemon Copying ovf-env.xml Sep 9 04:55:20.165841 login[2016]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:55:20.169672 systemd-logind[1855]: New session 1 of user core. Sep 9 04:55:20.175998 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 9 04:55:20.215597 waagent[2011]: 2025-09-09T04:55:20.215518Z INFO Daemon Daemon Successfully mounted dvd Sep 9 04:55:20.252404 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Sep 9 04:55:20.255897 waagent[2011]: 2025-09-09T04:55:20.254434Z INFO Daemon Daemon Detect protocol endpoint Sep 9 04:55:20.258078 waagent[2011]: 2025-09-09T04:55:20.258043Z INFO Daemon Daemon Clean protocol and wireserver endpoint Sep 9 04:55:20.262116 waagent[2011]: 2025-09-09T04:55:20.262076Z INFO Daemon Daemon WireServer endpoint is not found. 
Rerun dhcp handler Sep 9 04:55:20.266664 waagent[2011]: 2025-09-09T04:55:20.266633Z INFO Daemon Daemon Test for route to 168.63.129.16 Sep 9 04:55:20.270549 waagent[2011]: 2025-09-09T04:55:20.270518Z INFO Daemon Daemon Route to 168.63.129.16 exists Sep 9 04:55:20.274354 waagent[2011]: 2025-09-09T04:55:20.274327Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Sep 9 04:55:20.314588 waagent[2011]: 2025-09-09T04:55:20.314547Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Sep 9 04:55:20.319534 waagent[2011]: 2025-09-09T04:55:20.319508Z INFO Daemon Daemon Wire protocol version:2012-11-30 Sep 9 04:55:20.323174 waagent[2011]: 2025-09-09T04:55:20.323144Z INFO Daemon Daemon Server preferred version:2015-04-05 Sep 9 04:55:20.430676 waagent[2011]: 2025-09-09T04:55:20.430589Z INFO Daemon Daemon Initializing goal state during protocol detection Sep 9 04:55:20.435304 waagent[2011]: 2025-09-09T04:55:20.435266Z INFO Daemon Daemon Forcing an update of the goal state. Sep 9 04:55:20.443357 waagent[2011]: 2025-09-09T04:55:20.443321Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Sep 9 04:55:20.487383 waagent[2011]: 2025-09-09T04:55:20.487344Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.175 Sep 9 04:55:20.491535 waagent[2011]: 2025-09-09T04:55:20.491500Z INFO Daemon Sep 9 04:55:20.493551 waagent[2011]: 2025-09-09T04:55:20.493525Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 55e5083b-d73a-419e-b030-b21c5dbf4f14 eTag: 10552783479927610072 source: Fabric] Sep 9 04:55:20.501522 waagent[2011]: 2025-09-09T04:55:20.501490Z INFO Daemon The vmSettings originated via Fabric; will ignore them. 
Sep 9 04:55:20.506075 waagent[2011]: 2025-09-09T04:55:20.506015Z INFO Daemon Sep 9 04:55:20.507989 waagent[2011]: 2025-09-09T04:55:20.507963Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Sep 9 04:55:20.516341 waagent[2011]: 2025-09-09T04:55:20.516313Z INFO Daemon Daemon Downloading artifacts profile blob Sep 9 04:55:20.571579 waagent[2011]: 2025-09-09T04:55:20.571527Z INFO Daemon Downloaded certificate {'thumbprint': '4329B1B1A8F2CB3651158D6668040035B3601B1F', 'hasPrivateKey': True} Sep 9 04:55:20.578676 waagent[2011]: 2025-09-09T04:55:20.578641Z INFO Daemon Fetch goal state completed Sep 9 04:55:20.587785 waagent[2011]: 2025-09-09T04:55:20.587754Z INFO Daemon Daemon Starting provisioning Sep 9 04:55:20.591574 waagent[2011]: 2025-09-09T04:55:20.591545Z INFO Daemon Daemon Handle ovf-env.xml. Sep 9 04:55:20.595355 waagent[2011]: 2025-09-09T04:55:20.595330Z INFO Daemon Daemon Set hostname [ci-4452.0.0-n-e60618bb0b] Sep 9 04:55:20.613428 waagent[2011]: 2025-09-09T04:55:20.613388Z INFO Daemon Daemon Publish hostname [ci-4452.0.0-n-e60618bb0b] Sep 9 04:55:20.617691 waagent[2011]: 2025-09-09T04:55:20.617659Z INFO Daemon Daemon Examine /proc/net/route for primary interface Sep 9 04:55:20.621736 waagent[2011]: 2025-09-09T04:55:20.621710Z INFO Daemon Daemon Primary interface is [eth0] Sep 9 04:55:20.631114 systemd-networkd[1682]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 04:55:20.631119 systemd-networkd[1682]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Sep 9 04:55:20.631162 systemd-networkd[1682]: eth0: DHCP lease lost Sep 9 04:55:20.634896 waagent[2011]: 2025-09-09T04:55:20.631928Z INFO Daemon Daemon Create user account if not exists Sep 9 04:55:20.636129 waagent[2011]: 2025-09-09T04:55:20.636098Z INFO Daemon Daemon User core already exists, skip useradd Sep 9 04:55:20.640389 waagent[2011]: 2025-09-09T04:55:20.640364Z INFO Daemon Daemon Configure sudoer Sep 9 04:55:20.650656 waagent[2011]: 2025-09-09T04:55:20.647916Z INFO Daemon Daemon Configure sshd Sep 9 04:55:20.653687 waagent[2011]: 2025-09-09T04:55:20.653648Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Sep 9 04:55:20.662306 waagent[2011]: 2025-09-09T04:55:20.662278Z INFO Daemon Daemon Deploy ssh public key. Sep 9 04:55:20.670926 systemd-networkd[1682]: eth0: DHCPv4 address 10.200.20.4/24, gateway 10.200.20.1 acquired from 168.63.129.16 Sep 9 04:55:21.783949 waagent[2011]: 2025-09-09T04:55:21.783818Z INFO Daemon Daemon Provisioning complete Sep 9 04:55:21.796966 waagent[2011]: 2025-09-09T04:55:21.796851Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Sep 9 04:55:21.801146 waagent[2011]: 2025-09-09T04:55:21.801114Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. 
Sep 9 04:55:21.807711 waagent[2011]: 2025-09-09T04:55:21.807686Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent Sep 9 04:55:21.905914 waagent[2101]: 2025-09-09T04:55:21.904958Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) Sep 9 04:55:21.905914 waagent[2101]: 2025-09-09T04:55:21.905084Z INFO ExtHandler ExtHandler OS: flatcar 4452.0.0 Sep 9 04:55:21.905914 waagent[2101]: 2025-09-09T04:55:21.905121Z INFO ExtHandler ExtHandler Python: 3.11.13 Sep 9 04:55:21.905914 waagent[2101]: 2025-09-09T04:55:21.905155Z INFO ExtHandler ExtHandler CPU Arch: aarch64 Sep 9 04:55:21.966236 waagent[2101]: 2025-09-09T04:55:21.966172Z INFO ExtHandler ExtHandler Distro: flatcar-4452.0.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; Sep 9 04:55:21.966530 waagent[2101]: 2025-09-09T04:55:21.966502Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Sep 9 04:55:21.966663 waagent[2101]: 2025-09-09T04:55:21.966638Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Sep 9 04:55:21.972776 waagent[2101]: 2025-09-09T04:55:21.972731Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Sep 9 04:55:21.977612 waagent[2101]: 2025-09-09T04:55:21.977583Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.175 Sep 9 04:55:21.978062 waagent[2101]: 2025-09-09T04:55:21.978026Z INFO ExtHandler Sep 9 04:55:21.978211 waagent[2101]: 2025-09-09T04:55:21.978186Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: faa49362-d08b-4ec7-895a-571d52fd848d eTag: 10552783479927610072 source: Fabric] Sep 9 04:55:21.978510 waagent[2101]: 2025-09-09T04:55:21.978481Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. 
Sep 9 04:55:21.979044 waagent[2101]: 2025-09-09T04:55:21.979013Z INFO ExtHandler Sep 9 04:55:21.979158 waagent[2101]: 2025-09-09T04:55:21.979135Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Sep 9 04:55:21.982679 waagent[2101]: 2025-09-09T04:55:21.982652Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Sep 9 04:55:22.034373 waagent[2101]: 2025-09-09T04:55:22.033277Z INFO ExtHandler Downloaded certificate {'thumbprint': '4329B1B1A8F2CB3651158D6668040035B3601B1F', 'hasPrivateKey': True} Sep 9 04:55:22.034373 waagent[2101]: 2025-09-09T04:55:22.033657Z INFO ExtHandler Fetch goal state completed Sep 9 04:55:22.045533 waagent[2101]: 2025-09-09T04:55:22.045496Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.2 1 Jul 2025 (Library: OpenSSL 3.4.2 1 Jul 2025) Sep 9 04:55:22.049357 waagent[2101]: 2025-09-09T04:55:22.049309Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2101 Sep 9 04:55:22.049551 waagent[2101]: 2025-09-09T04:55:22.049522Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Sep 9 04:55:22.049896 waagent[2101]: 2025-09-09T04:55:22.049852Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** Sep 9 04:55:22.051098 waagent[2101]: 2025-09-09T04:55:22.051059Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4452.0.0', '', 'Flatcar Container Linux by Kinvolk'] Sep 9 04:55:22.051495 waagent[2101]: 2025-09-09T04:55:22.051461Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4452.0.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Sep 9 04:55:22.051713 waagent[2101]: 2025-09-09T04:55:22.051681Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Sep 9 04:55:22.052255 waagent[2101]: 2025-09-09T04:55:22.052221Z INFO ExtHandler ExtHandler Starting setup for 
Persistent firewall rules Sep 9 04:55:22.114866 waagent[2101]: 2025-09-09T04:55:22.114832Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Sep 9 04:55:22.115192 waagent[2101]: 2025-09-09T04:55:22.115164Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Sep 9 04:55:22.119518 waagent[2101]: 2025-09-09T04:55:22.119497Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Sep 9 04:55:22.124007 systemd[1]: Reload requested from client PID 2116 ('systemctl') (unit waagent.service)... Sep 9 04:55:22.124019 systemd[1]: Reloading... Sep 9 04:55:22.178905 zram_generator::config[2155]: No configuration found. Sep 9 04:55:22.341726 systemd[1]: Reloading finished in 217 ms. Sep 9 04:55:22.364316 waagent[2101]: 2025-09-09T04:55:22.363623Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Sep 9 04:55:22.364316 waagent[2101]: 2025-09-09T04:55:22.363759Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Sep 9 04:55:23.393699 waagent[2101]: 2025-09-09T04:55:23.392932Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. Sep 9 04:55:23.393699 waagent[2101]: 2025-09-09T04:55:23.393244Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. 
python supported: [True] Sep 9 04:55:23.394067 waagent[2101]: 2025-09-09T04:55:23.393949Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Sep 9 04:55:23.394067 waagent[2101]: 2025-09-09T04:55:23.394024Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Sep 9 04:55:23.394211 waagent[2101]: 2025-09-09T04:55:23.394180Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Sep 9 04:55:23.394301 waagent[2101]: 2025-09-09T04:55:23.394261Z INFO ExtHandler ExtHandler Starting env monitor service. Sep 9 04:55:23.394424 waagent[2101]: 2025-09-09T04:55:23.394394Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Sep 9 04:55:23.394424 waagent[2101]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Sep 9 04:55:23.394424 waagent[2101]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0 Sep 9 04:55:23.394424 waagent[2101]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Sep 9 04:55:23.394424 waagent[2101]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Sep 9 04:55:23.394424 waagent[2101]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Sep 9 04:55:23.394424 waagent[2101]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Sep 9 04:55:23.394858 waagent[2101]: 2025-09-09T04:55:23.394825Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. 
Sep 9 04:55:23.395156 waagent[2101]: 2025-09-09T04:55:23.395125Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Sep 9 04:55:23.395208 waagent[2101]: 2025-09-09T04:55:23.395188Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Sep 9 04:55:23.395311 waagent[2101]: 2025-09-09T04:55:23.395286Z INFO EnvHandler ExtHandler Configure routes Sep 9 04:55:23.395352 waagent[2101]: 2025-09-09T04:55:23.395334Z INFO EnvHandler ExtHandler Gateway:None Sep 9 04:55:23.395381 waagent[2101]: 2025-09-09T04:55:23.395367Z INFO EnvHandler ExtHandler Routes:None Sep 9 04:55:23.395655 waagent[2101]: 2025-09-09T04:55:23.395579Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Sep 9 04:55:23.395748 waagent[2101]: 2025-09-09T04:55:23.395649Z INFO ExtHandler ExtHandler Start Extension Telemetry service. Sep 9 04:55:23.396368 waagent[2101]: 2025-09-09T04:55:23.396339Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Sep 9 04:55:23.396475 waagent[2101]: 2025-09-09T04:55:23.396435Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. Sep 9 04:55:23.396987 waagent[2101]: 2025-09-09T04:55:23.396914Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Sep 9 04:55:23.401961 waagent[2101]: 2025-09-09T04:55:23.401932Z INFO ExtHandler ExtHandler Sep 9 04:55:23.402087 waagent[2101]: 2025-09-09T04:55:23.402063Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: a0135878-7c5f-4144-8727-a9b07736a942 correlation 91b74ffb-188f-4b2a-872c-740af10ef2b1 created: 2025-09-09T04:54:05.122627Z] Sep 9 04:55:23.402422 waagent[2101]: 2025-09-09T04:55:23.402393Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Sep 9 04:55:23.402920 waagent[2101]: 2025-09-09T04:55:23.402891Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms] Sep 9 04:55:23.429638 waagent[2101]: 2025-09-09T04:55:23.429576Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command Sep 9 04:55:23.429638 waagent[2101]: Try `iptables -h' or 'iptables --help' for more information.) Sep 9 04:55:23.430001 waagent[2101]: 2025-09-09T04:55:23.429965Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 3A42B12E-51E7-495C-AC10-9DED48942550;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] Sep 9 04:55:23.477620 waagent[2101]: 2025-09-09T04:55:23.477548Z INFO MonitorHandler ExtHandler Network interfaces: Sep 9 04:55:23.477620 waagent[2101]: Executing ['ip', '-a', '-o', 'link']: Sep 9 04:55:23.477620 waagent[2101]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Sep 9 04:55:23.477620 waagent[2101]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:22:48:b7:57:12 brd ff:ff:ff:ff:ff:ff Sep 9 04:55:23.477620 waagent[2101]: 3: enP55924s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:22:48:b7:57:12 brd ff:ff:ff:ff:ff:ff\ altname enP55924p0s2 Sep 9 04:55:23.477620 waagent[2101]: Executing ['ip', '-4', '-a', '-o', 'address']: Sep 9 04:55:23.477620 waagent[2101]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Sep 9 04:55:23.477620 waagent[2101]: 2: eth0 inet 10.200.20.4/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever Sep 9 04:55:23.477620 waagent[2101]: Executing ['ip', '-6', '-a', '-o', 
'address']: Sep 9 04:55:23.477620 waagent[2101]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Sep 9 04:55:23.477620 waagent[2101]: 2: eth0 inet6 fe80::222:48ff:feb7:5712/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Sep 9 04:55:23.534780 waagent[2101]: 2025-09-09T04:55:23.534146Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: Sep 9 04:55:23.534780 waagent[2101]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Sep 9 04:55:23.534780 waagent[2101]: pkts bytes target prot opt in out source destination Sep 9 04:55:23.534780 waagent[2101]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Sep 9 04:55:23.534780 waagent[2101]: pkts bytes target prot opt in out source destination Sep 9 04:55:23.534780 waagent[2101]: Chain OUTPUT (policy ACCEPT 4 packets, 401 bytes) Sep 9 04:55:23.534780 waagent[2101]: pkts bytes target prot opt in out source destination Sep 9 04:55:23.534780 waagent[2101]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Sep 9 04:55:23.534780 waagent[2101]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Sep 9 04:55:23.534780 waagent[2101]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Sep 9 04:55:23.536483 waagent[2101]: 2025-09-09T04:55:23.536451Z INFO EnvHandler ExtHandler Current Firewall rules: Sep 9 04:55:23.536483 waagent[2101]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Sep 9 04:55:23.536483 waagent[2101]: pkts bytes target prot opt in out source destination Sep 9 04:55:23.536483 waagent[2101]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Sep 9 04:55:23.536483 waagent[2101]: pkts bytes target prot opt in out source destination Sep 9 04:55:23.536483 waagent[2101]: Chain OUTPUT (policy ACCEPT 4 packets, 401 bytes) Sep 9 04:55:23.536483 waagent[2101]: pkts bytes target prot opt in out source destination Sep 9 04:55:23.536483 waagent[2101]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Sep 9 
04:55:23.536483 waagent[2101]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Sep 9 04:55:23.536483 waagent[2101]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Sep 9 04:55:23.536862 waagent[2101]: 2025-09-09T04:55:23.536838Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300 Sep 9 04:55:29.058507 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 9 04:55:29.060195 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:55:29.154657 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 04:55:29.157261 (kubelet)[2255]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 04:55:29.256474 kubelet[2255]: E0909 04:55:29.256425 2255 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 04:55:29.259228 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 04:55:29.259330 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 04:55:29.259936 systemd[1]: kubelet.service: Consumed 104ms CPU time, 106.8M memory peak. Sep 9 04:55:39.308619 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 9 04:55:39.310342 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:55:39.405510 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 9 04:55:39.407969 (kubelet)[2269]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 04:55:39.520131 kubelet[2269]: E0909 04:55:39.520068 2269 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 04:55:39.522453 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 04:55:39.522658 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 04:55:39.523169 systemd[1]: kubelet.service: Consumed 104ms CPU time, 107.7M memory peak. Sep 9 04:55:41.143689 chronyd[1835]: Selected source PHC0 Sep 9 04:55:49.558746 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 9 04:55:49.561127 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:55:49.648978 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 04:55:49.651507 (kubelet)[2284]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 04:55:49.803696 kubelet[2284]: E0909 04:55:49.803642 2284 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 04:55:49.805985 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 04:55:49.806213 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Sep 9 04:55:49.806757 systemd[1]: kubelet.service: Consumed 101ms CPU time, 105.2M memory peak. Sep 9 04:55:52.145006 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 9 04:55:52.146459 systemd[1]: Started sshd@0-10.200.20.4:22-10.200.16.10:38354.service - OpenSSH per-connection server daemon (10.200.16.10:38354). Sep 9 04:55:52.722844 sshd[2292]: Accepted publickey for core from 10.200.16.10 port 38354 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:55:52.724044 sshd-session[2292]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:55:52.727400 systemd-logind[1855]: New session 3 of user core. Sep 9 04:55:52.738997 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 9 04:55:53.099662 systemd[1]: Started sshd@1-10.200.20.4:22-10.200.16.10:38358.service - OpenSSH per-connection server daemon (10.200.16.10:38358). Sep 9 04:55:53.551786 sshd[2298]: Accepted publickey for core from 10.200.16.10 port 38358 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:55:53.552157 sshd-session[2298]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:55:53.555802 systemd-logind[1855]: New session 4 of user core. Sep 9 04:55:53.563982 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 9 04:55:53.869223 sshd[2301]: Connection closed by 10.200.16.10 port 38358 Sep 9 04:55:53.869792 sshd-session[2298]: pam_unix(sshd:session): session closed for user core Sep 9 04:55:53.872809 systemd[1]: sshd@1-10.200.20.4:22-10.200.16.10:38358.service: Deactivated successfully. Sep 9 04:55:53.874169 systemd[1]: session-4.scope: Deactivated successfully. Sep 9 04:55:53.874755 systemd-logind[1855]: Session 4 logged out. Waiting for processes to exit. Sep 9 04:55:53.876168 systemd-logind[1855]: Removed session 4. 
Sep 9 04:55:53.949076 systemd[1]: Started sshd@2-10.200.20.4:22-10.200.16.10:38360.service - OpenSSH per-connection server daemon (10.200.16.10:38360). Sep 9 04:55:54.365052 sshd[2307]: Accepted publickey for core from 10.200.16.10 port 38360 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:55:54.366133 sshd-session[2307]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:55:54.369633 systemd-logind[1855]: New session 5 of user core. Sep 9 04:55:54.377010 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 9 04:55:54.667785 sshd[2310]: Connection closed by 10.200.16.10 port 38360 Sep 9 04:55:54.668436 sshd-session[2307]: pam_unix(sshd:session): session closed for user core Sep 9 04:55:54.671217 systemd-logind[1855]: Session 5 logged out. Waiting for processes to exit. Sep 9 04:55:54.671341 systemd[1]: sshd@2-10.200.20.4:22-10.200.16.10:38360.service: Deactivated successfully. Sep 9 04:55:54.672634 systemd[1]: session-5.scope: Deactivated successfully. Sep 9 04:55:54.674272 systemd-logind[1855]: Removed session 5. Sep 9 04:55:54.746272 systemd[1]: Started sshd@3-10.200.20.4:22-10.200.16.10:38362.service - OpenSSH per-connection server daemon (10.200.16.10:38362). Sep 9 04:55:55.158038 sshd[2316]: Accepted publickey for core from 10.200.16.10 port 38362 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:55:55.159107 sshd-session[2316]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:55:55.162463 systemd-logind[1855]: New session 6 of user core. Sep 9 04:55:55.171164 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 9 04:55:55.475794 sshd[2319]: Connection closed by 10.200.16.10 port 38362 Sep 9 04:55:55.476088 sshd-session[2316]: pam_unix(sshd:session): session closed for user core Sep 9 04:55:55.479484 systemd-logind[1855]: Session 6 logged out. Waiting for processes to exit. 
Sep 9 04:55:55.480158 systemd[1]: sshd@3-10.200.20.4:22-10.200.16.10:38362.service: Deactivated successfully.
Sep 9 04:55:55.482998 systemd[1]: session-6.scope: Deactivated successfully.
Sep 9 04:55:55.484301 systemd-logind[1855]: Removed session 6.
Sep 9 04:55:55.556205 systemd[1]: Started sshd@4-10.200.20.4:22-10.200.16.10:38364.service - OpenSSH per-connection server daemon (10.200.16.10:38364).
Sep 9 04:55:55.972840 sshd[2325]: Accepted publickey for core from 10.200.16.10 port 38364 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:55:55.973887 sshd-session[2325]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:55:55.977247 systemd-logind[1855]: New session 7 of user core.
Sep 9 04:55:55.985153 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 9 04:55:56.352691 sudo[2329]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 9 04:55:56.352940 sudo[2329]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:55:56.376181 sudo[2329]: pam_unix(sudo:session): session closed for user root
Sep 9 04:55:56.456663 sshd[2328]: Connection closed by 10.200.16.10 port 38364
Sep 9 04:55:56.457322 sshd-session[2325]: pam_unix(sshd:session): session closed for user core
Sep 9 04:55:56.460762 systemd[1]: sshd@4-10.200.20.4:22-10.200.16.10:38364.service: Deactivated successfully.
Sep 9 04:55:56.462358 systemd[1]: session-7.scope: Deactivated successfully.
Sep 9 04:55:56.463387 systemd-logind[1855]: Session 7 logged out. Waiting for processes to exit.
Sep 9 04:55:56.464381 systemd-logind[1855]: Removed session 7.
Sep 9 04:55:56.535290 systemd[1]: Started sshd@5-10.200.20.4:22-10.200.16.10:38380.service - OpenSSH per-connection server daemon (10.200.16.10:38380).
Sep 9 04:55:56.955165 sshd[2335]: Accepted publickey for core from 10.200.16.10 port 38380 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:55:56.956208 sshd-session[2335]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:55:56.959586 systemd-logind[1855]: New session 8 of user core.
Sep 9 04:55:56.967006 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 9 04:55:57.190986 sudo[2340]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 9 04:55:57.191191 sudo[2340]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:55:57.197141 sudo[2340]: pam_unix(sudo:session): session closed for user root
Sep 9 04:55:57.200620 sudo[2339]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 9 04:55:57.200805 sudo[2339]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:55:57.207766 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 04:55:57.237309 augenrules[2362]: No rules
Sep 9 04:55:57.238499 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 04:55:57.239909 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 04:55:57.240541 sudo[2339]: pam_unix(sudo:session): session closed for user root
Sep 9 04:55:57.305078 sshd[2338]: Connection closed by 10.200.16.10 port 38380
Sep 9 04:55:57.305584 sshd-session[2335]: pam_unix(sshd:session): session closed for user core
Sep 9 04:55:57.309442 systemd[1]: sshd@5-10.200.20.4:22-10.200.16.10:38380.service: Deactivated successfully.
Sep 9 04:55:57.310766 systemd[1]: session-8.scope: Deactivated successfully.
Sep 9 04:55:57.311383 systemd-logind[1855]: Session 8 logged out. Waiting for processes to exit.
Sep 9 04:55:57.312498 systemd-logind[1855]: Removed session 8.
Sep 9 04:55:57.388256 systemd[1]: Started sshd@6-10.200.20.4:22-10.200.16.10:38394.service - OpenSSH per-connection server daemon (10.200.16.10:38394).
Sep 9 04:55:57.810080 sshd[2371]: Accepted publickey for core from 10.200.16.10 port 38394 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:55:57.811143 sshd-session[2371]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:55:57.814737 systemd-logind[1855]: New session 9 of user core.
Sep 9 04:55:57.821990 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 9 04:55:58.046181 sudo[2375]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 9 04:55:58.046385 sudo[2375]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:55:59.403797 kernel: hv_balloon: Max. dynamic memory size: 4096 MB
Sep 9 04:55:59.481862 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 9 04:55:59.492280 (dockerd)[2394]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 9 04:55:59.808456 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Sep 9 04:55:59.811140 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:55:59.981153 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:55:59.988068 (kubelet)[2407]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 04:56:00.062762 kubelet[2407]: E0909 04:56:00.062625 2407 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 04:56:00.065306 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 04:56:00.065422 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 04:56:00.065935 systemd[1]: kubelet.service: Consumed 103ms CPU time, 106.4M memory peak.
Sep 9 04:56:00.560285 dockerd[2394]: time="2025-09-09T04:56:00.560242693Z" level=info msg="Starting up"
Sep 9 04:56:00.561184 dockerd[2394]: time="2025-09-09T04:56:00.561156168Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 9 04:56:00.569851 dockerd[2394]: time="2025-09-09T04:56:00.569824033Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 9 04:56:00.592473 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2657651299-merged.mount: Deactivated successfully.
Sep 9 04:56:00.652908 dockerd[2394]: time="2025-09-09T04:56:00.652778871Z" level=info msg="Loading containers: start."
Sep 9 04:56:00.716893 kernel: Initializing XFRM netlink socket
Sep 9 04:56:01.141916 systemd-networkd[1682]: docker0: Link UP
Sep 9 04:56:01.154872 dockerd[2394]: time="2025-09-09T04:56:01.154792606Z" level=info msg="Loading containers: done."
Sep 9 04:56:01.172111 dockerd[2394]: time="2025-09-09T04:56:01.172075327Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 9 04:56:01.172259 dockerd[2394]: time="2025-09-09T04:56:01.172146297Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 9 04:56:01.172259 dockerd[2394]: time="2025-09-09T04:56:01.172223371Z" level=info msg="Initializing buildkit"
Sep 9 04:56:01.208228 dockerd[2394]: time="2025-09-09T04:56:01.208193838Z" level=info msg="Completed buildkit initialization"
Sep 9 04:56:01.213182 dockerd[2394]: time="2025-09-09T04:56:01.213149153Z" level=info msg="Daemon has completed initialization"
Sep 9 04:56:01.213429 dockerd[2394]: time="2025-09-09T04:56:01.213395673Z" level=info msg="API listen on /run/docker.sock"
Sep 9 04:56:01.213853 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 9 04:56:01.590273 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck896696921-merged.mount: Deactivated successfully.
Sep 9 04:56:02.039294 containerd[1873]: time="2025-09-09T04:56:02.038980823Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\""
Sep 9 04:56:02.784605 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1733417374.mount: Deactivated successfully.
Sep 9 04:56:03.112004 update_engine[1856]: I20250909 04:56:03.111914 1856 update_attempter.cc:509] Updating boot flags...
Sep 9 04:56:03.724793 containerd[1873]: time="2025-09-09T04:56:03.724190606Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:03.726411 containerd[1873]: time="2025-09-09T04:56:03.726389239Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.4: active requests=0, bytes read=27352613"
Sep 9 04:56:03.729013 containerd[1873]: time="2025-09-09T04:56:03.728994141Z" level=info msg="ImageCreate event name:\"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:03.732135 containerd[1873]: time="2025-09-09T04:56:03.732113129Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:03.732614 containerd[1873]: time="2025-09-09T04:56:03.732586687Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.4\" with image id \"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\", size \"27349413\" in 1.693567087s"
Sep 9 04:56:03.732659 containerd[1873]: time="2025-09-09T04:56:03.732619280Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\" returns image reference \"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\""
Sep 9 04:56:03.733808 containerd[1873]: time="2025-09-09T04:56:03.733787651Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\""
Sep 9 04:56:04.946231 containerd[1873]: time="2025-09-09T04:56:04.946179142Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:04.948306 containerd[1873]: time="2025-09-09T04:56:04.948154720Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.4: active requests=0, bytes read=23536977"
Sep 9 04:56:04.950798 containerd[1873]: time="2025-09-09T04:56:04.950776054Z" level=info msg="ImageCreate event name:\"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:04.954257 containerd[1873]: time="2025-09-09T04:56:04.954226332Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:04.954945 containerd[1873]: time="2025-09-09T04:56:04.954802134Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.4\" with image id \"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\", size \"25093155\" in 1.220992138s"
Sep 9 04:56:04.954945 containerd[1873]: time="2025-09-09T04:56:04.954830150Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\" returns image reference \"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\""
Sep 9 04:56:04.955551 containerd[1873]: time="2025-09-09T04:56:04.955535275Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\""
Sep 9 04:56:05.862176 containerd[1873]: time="2025-09-09T04:56:05.862121911Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:05.865304 containerd[1873]: time="2025-09-09T04:56:05.865275727Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.4: active requests=0, bytes read=18292014"
Sep 9 04:56:05.867825 containerd[1873]: time="2025-09-09T04:56:05.867802009Z" level=info msg="ImageCreate event name:\"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:05.872301 containerd[1873]: time="2025-09-09T04:56:05.872275057Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:05.873289 containerd[1873]: time="2025-09-09T04:56:05.873262972Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.4\" with image id \"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\", size \"19848210\" in 917.638366ms"
Sep 9 04:56:05.873328 containerd[1873]: time="2025-09-09T04:56:05.873294933Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\" returns image reference \"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\""
Sep 9 04:56:05.874400 containerd[1873]: time="2025-09-09T04:56:05.874376100Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\""
Sep 9 04:56:06.759429 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3426560466.mount: Deactivated successfully.
Sep 9 04:56:07.049984 containerd[1873]: time="2025-09-09T04:56:07.049697034Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:07.051766 containerd[1873]: time="2025-09-09T04:56:07.051722922Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.4: active requests=0, bytes read=28199959"
Sep 9 04:56:07.054421 containerd[1873]: time="2025-09-09T04:56:07.054361920Z" level=info msg="ImageCreate event name:\"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:07.057869 containerd[1873]: time="2025-09-09T04:56:07.057509608Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:07.057869 containerd[1873]: time="2025-09-09T04:56:07.057758929Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.4\" with image id \"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\", repo tag \"registry.k8s.io/kube-proxy:v1.33.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\", size \"28198978\" in 1.18334958s"
Sep 9 04:56:07.057869 containerd[1873]: time="2025-09-09T04:56:07.057786114Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\" returns image reference \"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\""
Sep 9 04:56:07.058320 containerd[1873]: time="2025-09-09T04:56:07.058295724Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Sep 9 04:56:08.199554 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3557994061.mount: Deactivated successfully.
Sep 9 04:56:08.902044 containerd[1873]: time="2025-09-09T04:56:08.901990328Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:08.904155 containerd[1873]: time="2025-09-09T04:56:08.904129604Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152117"
Sep 9 04:56:08.906626 containerd[1873]: time="2025-09-09T04:56:08.906573635Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:08.912339 containerd[1873]: time="2025-09-09T04:56:08.911688897Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:08.912339 containerd[1873]: time="2025-09-09T04:56:08.912218908Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.853893567s"
Sep 9 04:56:08.912339 containerd[1873]: time="2025-09-09T04:56:08.912246741Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\""
Sep 9 04:56:08.913030 containerd[1873]: time="2025-09-09T04:56:08.913007048Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 9 04:56:09.579763 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount129958831.mount: Deactivated successfully.
Sep 9 04:56:09.595482 containerd[1873]: time="2025-09-09T04:56:09.595441723Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 04:56:09.598140 containerd[1873]: time="2025-09-09T04:56:09.598116171Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703"
Sep 9 04:56:09.600592 containerd[1873]: time="2025-09-09T04:56:09.600569962Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 04:56:09.604323 containerd[1873]: time="2025-09-09T04:56:09.604067887Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 04:56:09.604619 containerd[1873]: time="2025-09-09T04:56:09.604592754Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 691.561288ms"
Sep 9 04:56:09.604659 containerd[1873]: time="2025-09-09T04:56:09.604618506Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 9 04:56:09.605097 containerd[1873]: time="2025-09-09T04:56:09.605075075Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\""
Sep 9 04:56:10.122716 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Sep 9 04:56:10.123901 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:56:10.140779 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3639371576.mount: Deactivated successfully.
Sep 9 04:56:10.539037 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:56:10.541608 (kubelet)[2819]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 04:56:10.566652 kubelet[2819]: E0909 04:56:10.566608 2819 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 04:56:10.568555 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 04:56:10.568655 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 04:56:10.569953 systemd[1]: kubelet.service: Consumed 99ms CPU time, 106.7M memory peak.
Sep 9 04:56:12.856303 containerd[1873]: time="2025-09-09T04:56:12.856249669Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:12.858816 containerd[1873]: time="2025-09-09T04:56:12.858782060Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=69465295"
Sep 9 04:56:12.862901 containerd[1873]: time="2025-09-09T04:56:12.862324993Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:12.866010 containerd[1873]: time="2025-09-09T04:56:12.865986834Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:56:12.867243 containerd[1873]: time="2025-09-09T04:56:12.867219424Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 3.262118053s"
Sep 9 04:56:12.867351 containerd[1873]: time="2025-09-09T04:56:12.867336220Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\""
Sep 9 04:56:16.223976 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:56:16.224254 systemd[1]: kubelet.service: Consumed 99ms CPU time, 106.7M memory peak.
Sep 9 04:56:16.227251 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:56:16.246377 systemd[1]: Reload requested from client PID 2901 ('systemctl') (unit session-9.scope)...
Sep 9 04:56:16.246388 systemd[1]: Reloading...
Sep 9 04:56:16.348912 zram_generator::config[2953]: No configuration found.
Sep 9 04:56:16.487080 systemd[1]: Reloading finished in 240 ms.
Sep 9 04:56:16.520796 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:56:16.522740 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:56:16.524771 systemd[1]: kubelet.service: Deactivated successfully.
Sep 9 04:56:16.524955 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:56:16.524992 systemd[1]: kubelet.service: Consumed 72ms CPU time, 95.3M memory peak.
Sep 9 04:56:16.526390 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:56:17.177963 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:56:17.184097 (kubelet)[3017]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 04:56:17.207965 kubelet[3017]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:56:17.208226 kubelet[3017]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 9 04:56:17.208267 kubelet[3017]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:56:17.208374 kubelet[3017]: I0909 04:56:17.208346 3017 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 04:56:17.700025 kubelet[3017]: I0909 04:56:17.699989 3017 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Sep 9 04:56:17.700172 kubelet[3017]: I0909 04:56:17.700163 3017 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 04:56:17.700416 kubelet[3017]: I0909 04:56:17.700401 3017 server.go:956] "Client rotation is on, will bootstrap in background"
Sep 9 04:56:18.303709 kubelet[3017]: E0909 04:56:18.011350 3017 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.20.4:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.4:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Sep 9 04:56:18.303709 kubelet[3017]: I0909 04:56:18.013088 3017 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 04:56:18.303709 kubelet[3017]: I0909 04:56:18.020106 3017 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 04:56:18.303709 kubelet[3017]: I0909 04:56:18.022698 3017 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 04:56:18.303709 kubelet[3017]: I0909 04:56:18.023616 3017 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 04:56:18.304181 kubelet[3017]: I0909 04:56:18.023643 3017 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4452.0.0-n-e60618bb0b","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 04:56:18.304181 kubelet[3017]: I0909 04:56:18.023761 3017 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 04:56:18.304181 kubelet[3017]: I0909 04:56:18.023768 3017 container_manager_linux.go:303] "Creating device plugin manager"
Sep 9 04:56:18.307881 kubelet[3017]: I0909 04:56:18.307831 3017 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:56:18.310758 kubelet[3017]: I0909 04:56:18.310737 3017 kubelet.go:480] "Attempting to sync node with API server"
Sep 9 04:56:18.310790 kubelet[3017]: I0909 04:56:18.310770 3017 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 04:56:18.310912 kubelet[3017]: I0909 04:56:18.310902 3017 kubelet.go:386] "Adding apiserver pod source"
Sep 9 04:56:18.312013 kubelet[3017]: I0909 04:56:18.312000 3017 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 04:56:18.313673 kubelet[3017]: E0909 04:56:18.313645 3017 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.4:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4452.0.0-n-e60618bb0b&limit=500&resourceVersion=0\": dial tcp 10.200.20.4:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Sep 9 04:56:18.314976 kubelet[3017]: E0909 04:56:18.313961 3017 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.4:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.4:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Sep 9 04:56:18.314976 kubelet[3017]: I0909 04:56:18.314337 3017 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 04:56:18.314976 kubelet[3017]: I0909 04:56:18.314712 3017 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Sep 9 04:56:18.314976 kubelet[3017]: W0909 04:56:18.314753 3017 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 9 04:56:18.317445 kubelet[3017]: I0909 04:56:18.317419 3017 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 9 04:56:18.317505 kubelet[3017]: I0909 04:56:18.317454 3017 server.go:1289] "Started kubelet"
Sep 9 04:56:18.318298 kubelet[3017]: I0909 04:56:18.318273 3017 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 04:56:18.318991 kubelet[3017]: I0909 04:56:18.318978 3017 server.go:317] "Adding debug handlers to kubelet server"
Sep 9 04:56:18.321119 kubelet[3017]: I0909 04:56:18.321068 3017 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 04:56:18.321441 kubelet[3017]: I0909 04:56:18.321421 3017 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 04:56:18.322917 kubelet[3017]: E0909 04:56:18.321526 3017 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.4:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.4:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4452.0.0-n-e60618bb0b.186384548f179d58 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4452.0.0-n-e60618bb0b,UID:ci-4452.0.0-n-e60618bb0b,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4452.0.0-n-e60618bb0b,},FirstTimestamp:2025-09-09 04:56:18.3174342 +0000 UTC m=+1.130197306,LastTimestamp:2025-09-09 04:56:18.3174342 +0000 UTC m=+1.130197306,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4452.0.0-n-e60618bb0b,}"
Sep 9 04:56:18.324169 kubelet[3017]: I0909 04:56:18.324152 3017 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 04:56:18.325478 kubelet[3017]: I0909 04:56:18.325455 3017 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 04:56:18.327647 kubelet[3017]: I0909 04:56:18.327633 3017 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 9 04:56:18.327819 kubelet[3017]: E0909 04:56:18.327805 3017 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452.0.0-n-e60618bb0b\" not found"
Sep 9 04:56:18.329816 kubelet[3017]: I0909 04:56:18.329799 3017 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 9 04:56:18.329957 kubelet[3017]: I0909 04:56:18.329948 3017 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 04:56:18.330605 kubelet[3017]: E0909 04:56:18.330585 3017 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.4:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.4:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Sep 9 04:56:18.330939 kubelet[3017]: E0909 04:56:18.330870 3017 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.4:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4452.0.0-n-e60618bb0b?timeout=10s\": dial tcp 10.200.20.4:6443: connect: connection refused" interval="200ms"
Sep 9 04:56:18.331448 kubelet[3017]: I0909 04:56:18.331335 3017 factory.go:223] Registration of the systemd container factory successfully
Sep 9 04:56:18.331448 kubelet[3017]: I0909 04:56:18.331416 3017 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 04:56:18.332197 kubelet[3017]:
E0909 04:56:18.332178 3017 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 04:56:18.333563 kubelet[3017]: I0909 04:56:18.333545 3017 factory.go:223] Registration of the containerd container factory successfully Sep 9 04:56:18.354249 kubelet[3017]: I0909 04:56:18.354229 3017 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 04:56:18.354249 kubelet[3017]: I0909 04:56:18.354245 3017 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 04:56:18.354341 kubelet[3017]: I0909 04:56:18.354262 3017 state_mem.go:36] "Initialized new in-memory state store" Sep 9 04:56:18.358286 kubelet[3017]: I0909 04:56:18.358270 3017 policy_none.go:49] "None policy: Start" Sep 9 04:56:18.358319 kubelet[3017]: I0909 04:56:18.358290 3017 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 04:56:18.358319 kubelet[3017]: I0909 04:56:18.358299 3017 state_mem.go:35] "Initializing new in-memory state store" Sep 9 04:56:18.365655 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 9 04:56:18.373401 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 9 04:56:18.376260 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 9 04:56:18.386266 kubelet[3017]: E0909 04:56:18.386242 3017 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 9 04:56:18.387609 kubelet[3017]: I0909 04:56:18.387591 3017 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 04:56:18.387708 kubelet[3017]: I0909 04:56:18.387681 3017 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 04:56:18.387863 kubelet[3017]: I0909 04:56:18.387416 3017 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Sep 9 04:56:18.388738 kubelet[3017]: I0909 04:56:18.388723 3017 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 9 04:56:18.388807 kubelet[3017]: I0909 04:56:18.388799 3017 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 9 04:56:18.388862 kubelet[3017]: I0909 04:56:18.388854 3017 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 9 04:56:18.390226 kubelet[3017]: I0909 04:56:18.390208 3017 kubelet.go:2436] "Starting kubelet main sync loop" Sep 9 04:56:18.390283 kubelet[3017]: E0909 04:56:18.390244 3017 kubelet.go:2460] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Sep 9 04:56:18.390283 kubelet[3017]: I0909 04:56:18.389414 3017 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 04:56:18.391064 kubelet[3017]: E0909 04:56:18.391045 3017 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 9 04:56:18.391123 kubelet[3017]: E0909 04:56:18.391076 3017 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4452.0.0-n-e60618bb0b\" not found" Sep 9 04:56:18.391614 kubelet[3017]: E0909 04:56:18.391320 3017 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.4:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.4:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 9 04:56:18.492516 kubelet[3017]: I0909 04:56:18.492492 3017 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.493072 kubelet[3017]: E0909 04:56:18.493048 3017 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.4:6443/api/v1/nodes\": dial tcp 10.200.20.4:6443: connect: connection refused" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.503597 systemd[1]: Created slice kubepods-burstable-podf394f8ff1ec66b0b9c1cf3ecb6384e5a.slice - libcontainer container kubepods-burstable-podf394f8ff1ec66b0b9c1cf3ecb6384e5a.slice. Sep 9 04:56:18.509449 kubelet[3017]: E0909 04:56:18.509408 3017 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452.0.0-n-e60618bb0b\" not found" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.513250 systemd[1]: Created slice kubepods-burstable-pod1b739791776c036ade4b80f5135f754d.slice - libcontainer container kubepods-burstable-pod1b739791776c036ade4b80f5135f754d.slice. 
Sep 9 04:56:18.525398 kubelet[3017]: E0909 04:56:18.525377 3017 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452.0.0-n-e60618bb0b\" not found" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.527327 systemd[1]: Created slice kubepods-burstable-pod6aa4e5cd59b02d992996fa43864c11f2.slice - libcontainer container kubepods-burstable-pod6aa4e5cd59b02d992996fa43864c11f2.slice. Sep 9 04:56:18.528826 kubelet[3017]: E0909 04:56:18.528810 3017 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452.0.0-n-e60618bb0b\" not found" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.531271 kubelet[3017]: E0909 04:56:18.531248 3017 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.4:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4452.0.0-n-e60618bb0b?timeout=10s\": dial tcp 10.200.20.4:6443: connect: connection refused" interval="400ms" Sep 9 04:56:18.631740 kubelet[3017]: I0909 04:56:18.631691 3017 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f394f8ff1ec66b0b9c1cf3ecb6384e5a-ca-certs\") pod \"kube-apiserver-ci-4452.0.0-n-e60618bb0b\" (UID: \"f394f8ff1ec66b0b9c1cf3ecb6384e5a\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.632082 kubelet[3017]: I0909 04:56:18.631966 3017 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f394f8ff1ec66b0b9c1cf3ecb6384e5a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4452.0.0-n-e60618bb0b\" (UID: \"f394f8ff1ec66b0b9c1cf3ecb6384e5a\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.632082 kubelet[3017]: I0909 04:56:18.631995 3017 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1b739791776c036ade4b80f5135f754d-ca-certs\") pod \"kube-controller-manager-ci-4452.0.0-n-e60618bb0b\" (UID: \"1b739791776c036ade4b80f5135f754d\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.632082 kubelet[3017]: I0909 04:56:18.632008 3017 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1b739791776c036ade4b80f5135f754d-flexvolume-dir\") pod \"kube-controller-manager-ci-4452.0.0-n-e60618bb0b\" (UID: \"1b739791776c036ade4b80f5135f754d\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.632082 kubelet[3017]: I0909 04:56:18.632018 3017 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1b739791776c036ade4b80f5135f754d-k8s-certs\") pod \"kube-controller-manager-ci-4452.0.0-n-e60618bb0b\" (UID: \"1b739791776c036ade4b80f5135f754d\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.632082 kubelet[3017]: I0909 04:56:18.632028 3017 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1b739791776c036ade4b80f5135f754d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4452.0.0-n-e60618bb0b\" (UID: \"1b739791776c036ade4b80f5135f754d\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.632210 kubelet[3017]: I0909 04:56:18.632043 3017 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6aa4e5cd59b02d992996fa43864c11f2-kubeconfig\") pod \"kube-scheduler-ci-4452.0.0-n-e60618bb0b\" (UID: 
\"6aa4e5cd59b02d992996fa43864c11f2\") " pod="kube-system/kube-scheduler-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.632210 kubelet[3017]: I0909 04:56:18.632054 3017 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f394f8ff1ec66b0b9c1cf3ecb6384e5a-k8s-certs\") pod \"kube-apiserver-ci-4452.0.0-n-e60618bb0b\" (UID: \"f394f8ff1ec66b0b9c1cf3ecb6384e5a\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.632210 kubelet[3017]: I0909 04:56:18.632064 3017 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1b739791776c036ade4b80f5135f754d-kubeconfig\") pod \"kube-controller-manager-ci-4452.0.0-n-e60618bb0b\" (UID: \"1b739791776c036ade4b80f5135f754d\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.694641 kubelet[3017]: I0909 04:56:18.694552 3017 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.694983 kubelet[3017]: E0909 04:56:18.694958 3017 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.4:6443/api/v1/nodes\": dial tcp 10.200.20.4:6443: connect: connection refused" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:18.811352 containerd[1873]: time="2025-09-09T04:56:18.811306775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4452.0.0-n-e60618bb0b,Uid:f394f8ff1ec66b0b9c1cf3ecb6384e5a,Namespace:kube-system,Attempt:0,}" Sep 9 04:56:18.826522 containerd[1873]: time="2025-09-09T04:56:18.826483980Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4452.0.0-n-e60618bb0b,Uid:1b739791776c036ade4b80f5135f754d,Namespace:kube-system,Attempt:0,}" Sep 9 04:56:18.830447 containerd[1873]: time="2025-09-09T04:56:18.830420337Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4452.0.0-n-e60618bb0b,Uid:6aa4e5cd59b02d992996fa43864c11f2,Namespace:kube-system,Attempt:0,}" Sep 9 04:56:18.881382 containerd[1873]: time="2025-09-09T04:56:18.881228798Z" level=info msg="connecting to shim 1b8be6071f78350700feeff85e9f8075962343c9fa830d5bd18826f608777388" address="unix:///run/containerd/s/b00f69329a4e82f876ee3a903cb3a0b2fae01b5561f56ffa267a6f068117f4e0" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:56:18.902000 systemd[1]: Started cri-containerd-1b8be6071f78350700feeff85e9f8075962343c9fa830d5bd18826f608777388.scope - libcontainer container 1b8be6071f78350700feeff85e9f8075962343c9fa830d5bd18826f608777388. Sep 9 04:56:18.913856 containerd[1873]: time="2025-09-09T04:56:18.913763143Z" level=info msg="connecting to shim aca46059e051661b9ea17b4352861509aea232f499d4feac9c64d25f822b4bc4" address="unix:///run/containerd/s/022db06199a7eb8c6e7871716a60e435ffe5ec13f18ea8d9830b933795d0790e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:56:18.918510 containerd[1873]: time="2025-09-09T04:56:18.918444429Z" level=info msg="connecting to shim 4f93937e71b92ae5347ac84a4e67c01178658b83d727255b1bc5bf406bf72b1d" address="unix:///run/containerd/s/4d1723796a109ba00b69b13e58355d3a011eaa8468d0c919189c4c0ccb3206d2" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:56:18.931951 kubelet[3017]: E0909 04:56:18.931855 3017 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.4:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4452.0.0-n-e60618bb0b?timeout=10s\": dial tcp 10.200.20.4:6443: connect: connection refused" interval="800ms" Sep 9 04:56:18.950900 systemd[1]: Started cri-containerd-aca46059e051661b9ea17b4352861509aea232f499d4feac9c64d25f822b4bc4.scope - libcontainer container aca46059e051661b9ea17b4352861509aea232f499d4feac9c64d25f822b4bc4. 
Sep 9 04:56:18.954313 systemd[1]: Started cri-containerd-4f93937e71b92ae5347ac84a4e67c01178658b83d727255b1bc5bf406bf72b1d.scope - libcontainer container 4f93937e71b92ae5347ac84a4e67c01178658b83d727255b1bc5bf406bf72b1d. Sep 9 04:56:18.969355 containerd[1873]: time="2025-09-09T04:56:18.968368363Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4452.0.0-n-e60618bb0b,Uid:f394f8ff1ec66b0b9c1cf3ecb6384e5a,Namespace:kube-system,Attempt:0,} returns sandbox id \"1b8be6071f78350700feeff85e9f8075962343c9fa830d5bd18826f608777388\"" Sep 9 04:56:18.980938 containerd[1873]: time="2025-09-09T04:56:18.980316405Z" level=info msg="CreateContainer within sandbox \"1b8be6071f78350700feeff85e9f8075962343c9fa830d5bd18826f608777388\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 9 04:56:18.999755 containerd[1873]: time="2025-09-09T04:56:18.999721490Z" level=info msg="Container d901f75296cc862760d2841d21538e749de6dfee7dd66901b64f5880d150272d: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:19.001185 containerd[1873]: time="2025-09-09T04:56:19.001158493Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4452.0.0-n-e60618bb0b,Uid:1b739791776c036ade4b80f5135f754d,Namespace:kube-system,Attempt:0,} returns sandbox id \"aca46059e051661b9ea17b4352861509aea232f499d4feac9c64d25f822b4bc4\"" Sep 9 04:56:19.004129 containerd[1873]: time="2025-09-09T04:56:19.004100710Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4452.0.0-n-e60618bb0b,Uid:6aa4e5cd59b02d992996fa43864c11f2,Namespace:kube-system,Attempt:0,} returns sandbox id \"4f93937e71b92ae5347ac84a4e67c01178658b83d727255b1bc5bf406bf72b1d\"" Sep 9 04:56:19.007799 containerd[1873]: time="2025-09-09T04:56:19.007270143Z" level=info msg="CreateContainer within sandbox \"aca46059e051661b9ea17b4352861509aea232f499d4feac9c64d25f822b4bc4\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 9 
04:56:19.025258 containerd[1873]: time="2025-09-09T04:56:19.025225512Z" level=info msg="CreateContainer within sandbox \"4f93937e71b92ae5347ac84a4e67c01178658b83d727255b1bc5bf406bf72b1d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 9 04:56:19.026055 containerd[1873]: time="2025-09-09T04:56:19.026026068Z" level=info msg="CreateContainer within sandbox \"1b8be6071f78350700feeff85e9f8075962343c9fa830d5bd18826f608777388\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"d901f75296cc862760d2841d21538e749de6dfee7dd66901b64f5880d150272d\"" Sep 9 04:56:19.026631 containerd[1873]: time="2025-09-09T04:56:19.026610441Z" level=info msg="StartContainer for \"d901f75296cc862760d2841d21538e749de6dfee7dd66901b64f5880d150272d\"" Sep 9 04:56:19.027569 containerd[1873]: time="2025-09-09T04:56:19.027538874Z" level=info msg="connecting to shim d901f75296cc862760d2841d21538e749de6dfee7dd66901b64f5880d150272d" address="unix:///run/containerd/s/b00f69329a4e82f876ee3a903cb3a0b2fae01b5561f56ffa267a6f068117f4e0" protocol=ttrpc version=3 Sep 9 04:56:19.030002 containerd[1873]: time="2025-09-09T04:56:19.029978193Z" level=info msg="Container 9a82babc774a8457ea696d82e6dac53816e5a5020cb4c0e3c3e3186fb989b9f8: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:19.048002 systemd[1]: Started cri-containerd-d901f75296cc862760d2841d21538e749de6dfee7dd66901b64f5880d150272d.scope - libcontainer container d901f75296cc862760d2841d21538e749de6dfee7dd66901b64f5880d150272d. 
Sep 9 04:56:19.051010 containerd[1873]: time="2025-09-09T04:56:19.050949374Z" level=info msg="Container 8e0e8fe1ab0fd97d880cb3e0fa4d3f6054d849be3850065fa919d2d9a269cb3c: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:19.060276 containerd[1873]: time="2025-09-09T04:56:19.060232697Z" level=info msg="CreateContainer within sandbox \"aca46059e051661b9ea17b4352861509aea232f499d4feac9c64d25f822b4bc4\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"9a82babc774a8457ea696d82e6dac53816e5a5020cb4c0e3c3e3186fb989b9f8\"" Sep 9 04:56:19.060816 containerd[1873]: time="2025-09-09T04:56:19.060792501Z" level=info msg="StartContainer for \"9a82babc774a8457ea696d82e6dac53816e5a5020cb4c0e3c3e3186fb989b9f8\"" Sep 9 04:56:19.062762 containerd[1873]: time="2025-09-09T04:56:19.062468737Z" level=info msg="connecting to shim 9a82babc774a8457ea696d82e6dac53816e5a5020cb4c0e3c3e3186fb989b9f8" address="unix:///run/containerd/s/022db06199a7eb8c6e7871716a60e435ffe5ec13f18ea8d9830b933795d0790e" protocol=ttrpc version=3 Sep 9 04:56:19.067341 containerd[1873]: time="2025-09-09T04:56:19.067314774Z" level=info msg="CreateContainer within sandbox \"4f93937e71b92ae5347ac84a4e67c01178658b83d727255b1bc5bf406bf72b1d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"8e0e8fe1ab0fd97d880cb3e0fa4d3f6054d849be3850065fa919d2d9a269cb3c\"" Sep 9 04:56:19.069256 containerd[1873]: time="2025-09-09T04:56:19.069230474Z" level=info msg="StartContainer for \"8e0e8fe1ab0fd97d880cb3e0fa4d3f6054d849be3850065fa919d2d9a269cb3c\"" Sep 9 04:56:19.070858 containerd[1873]: time="2025-09-09T04:56:19.070773913Z" level=info msg="connecting to shim 8e0e8fe1ab0fd97d880cb3e0fa4d3f6054d849be3850065fa919d2d9a269cb3c" address="unix:///run/containerd/s/4d1723796a109ba00b69b13e58355d3a011eaa8468d0c919189c4c0ccb3206d2" protocol=ttrpc version=3 Sep 9 04:56:19.088050 systemd[1]: Started 
cri-containerd-9a82babc774a8457ea696d82e6dac53816e5a5020cb4c0e3c3e3186fb989b9f8.scope - libcontainer container 9a82babc774a8457ea696d82e6dac53816e5a5020cb4c0e3c3e3186fb989b9f8. Sep 9 04:56:19.098003 systemd[1]: Started cri-containerd-8e0e8fe1ab0fd97d880cb3e0fa4d3f6054d849be3850065fa919d2d9a269cb3c.scope - libcontainer container 8e0e8fe1ab0fd97d880cb3e0fa4d3f6054d849be3850065fa919d2d9a269cb3c. Sep 9 04:56:19.099401 kubelet[3017]: I0909 04:56:19.099272 3017 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:19.100300 kubelet[3017]: E0909 04:56:19.100269 3017 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.4:6443/api/v1/nodes\": dial tcp 10.200.20.4:6443: connect: connection refused" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:19.102543 containerd[1873]: time="2025-09-09T04:56:19.102509949Z" level=info msg="StartContainer for \"d901f75296cc862760d2841d21538e749de6dfee7dd66901b64f5880d150272d\" returns successfully" Sep 9 04:56:19.120311 kubelet[3017]: E0909 04:56:19.120274 3017 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.4:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.4:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 9 04:56:19.154681 containerd[1873]: time="2025-09-09T04:56:19.154468339Z" level=info msg="StartContainer for \"8e0e8fe1ab0fd97d880cb3e0fa4d3f6054d849be3850065fa919d2d9a269cb3c\" returns successfully" Sep 9 04:56:19.155653 containerd[1873]: time="2025-09-09T04:56:19.155475055Z" level=info msg="StartContainer for \"9a82babc774a8457ea696d82e6dac53816e5a5020cb4c0e3c3e3186fb989b9f8\" returns successfully" Sep 9 04:56:19.397353 kubelet[3017]: E0909 04:56:19.397328 3017 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the 
cluster" err="node \"ci-4452.0.0-n-e60618bb0b\" not found" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:19.399959 kubelet[3017]: E0909 04:56:19.399694 3017 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452.0.0-n-e60618bb0b\" not found" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:19.401270 kubelet[3017]: E0909 04:56:19.401252 3017 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452.0.0-n-e60618bb0b\" not found" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:19.904910 kubelet[3017]: I0909 04:56:19.903149 3017 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:20.410038 kubelet[3017]: E0909 04:56:20.410012 3017 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452.0.0-n-e60618bb0b\" not found" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:20.410730 kubelet[3017]: E0909 04:56:20.410656 3017 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452.0.0-n-e60618bb0b\" not found" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:20.411118 kubelet[3017]: E0909 04:56:20.411091 3017 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452.0.0-n-e60618bb0b\" not found" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:20.507108 kubelet[3017]: I0909 04:56:20.506992 3017 kubelet_node_status.go:78] "Successfully registered node" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:20.529064 kubelet[3017]: I0909 04:56:20.529034 3017 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:20.616248 kubelet[3017]: E0909 04:56:20.616212 3017 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-node-lease\" not found" interval="1.6s" Sep 9 
04:56:20.621008 kubelet[3017]: E0909 04:56:20.620980 3017 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4452.0.0-n-e60618bb0b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:20.621008 kubelet[3017]: I0909 04:56:20.621004 3017 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:20.626576 kubelet[3017]: E0909 04:56:20.626555 3017 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4452.0.0-n-e60618bb0b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:20.626753 kubelet[3017]: I0909 04:56:20.626660 3017 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:20.628740 kubelet[3017]: E0909 04:56:20.628712 3017 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4452.0.0-n-e60618bb0b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:21.315051 kubelet[3017]: I0909 04:56:21.314940 3017 apiserver.go:52] "Watching apiserver" Sep 9 04:56:21.330559 kubelet[3017]: I0909 04:56:21.330511 3017 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 04:56:21.407669 kubelet[3017]: I0909 04:56:21.407600 3017 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:21.417440 kubelet[3017]: I0909 04:56:21.417395 3017 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 9 04:56:22.613792 systemd[1]: Reload requested from 
client PID 3292 ('systemctl') (unit session-9.scope)... Sep 9 04:56:22.613804 systemd[1]: Reloading... Sep 9 04:56:22.685907 zram_generator::config[3336]: No configuration found. Sep 9 04:56:22.848259 systemd[1]: Reloading finished in 234 ms. Sep 9 04:56:22.868431 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:56:22.884817 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 04:56:22.885031 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 04:56:22.885080 systemd[1]: kubelet.service: Consumed 796ms CPU time, 126.9M memory peak. Sep 9 04:56:22.886517 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:56:22.980785 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 04:56:22.989243 (kubelet)[3403]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 04:56:23.022591 kubelet[3403]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 04:56:23.023898 kubelet[3403]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 9 04:56:23.023898 kubelet[3403]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 9 04:56:23.023898 kubelet[3403]: I0909 04:56:23.022987 3403 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 04:56:23.027319 kubelet[3403]: I0909 04:56:23.027295 3403 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 9 04:56:23.027418 kubelet[3403]: I0909 04:56:23.027408 3403 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 04:56:23.027640 kubelet[3403]: I0909 04:56:23.027624 3403 server.go:956] "Client rotation is on, will bootstrap in background" Sep 9 04:56:23.028588 kubelet[3403]: I0909 04:56:23.028573 3403 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Sep 9 04:56:23.030382 kubelet[3403]: I0909 04:56:23.030354 3403 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 04:56:23.034959 kubelet[3403]: I0909 04:56:23.034917 3403 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 04:56:23.037485 kubelet[3403]: I0909 04:56:23.037469 3403 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 9 04:56:23.037736 kubelet[3403]: I0909 04:56:23.037713 3403 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 04:56:23.037907 kubelet[3403]: I0909 04:56:23.037790 3403 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4452.0.0-n-e60618bb0b","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 04:56:23.038028 kubelet[3403]: I0909 04:56:23.038017 3403 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 
04:56:23.038141 kubelet[3403]: I0909 04:56:23.038132 3403 container_manager_linux.go:303] "Creating device plugin manager" Sep 9 04:56:23.038220 kubelet[3403]: I0909 04:56:23.038211 3403 state_mem.go:36] "Initialized new in-memory state store" Sep 9 04:56:23.038396 kubelet[3403]: I0909 04:56:23.038385 3403 kubelet.go:480] "Attempting to sync node with API server" Sep 9 04:56:23.038459 kubelet[3403]: I0909 04:56:23.038450 3403 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 04:56:23.038523 kubelet[3403]: I0909 04:56:23.038515 3403 kubelet.go:386] "Adding apiserver pod source" Sep 9 04:56:23.038571 kubelet[3403]: I0909 04:56:23.038564 3403 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 04:56:23.045560 kubelet[3403]: I0909 04:56:23.045538 3403 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 04:56:23.045969 kubelet[3403]: I0909 04:56:23.045945 3403 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 9 04:56:23.049179 kubelet[3403]: I0909 04:56:23.049158 3403 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 9 04:56:23.049299 kubelet[3403]: I0909 04:56:23.049286 3403 server.go:1289] "Started kubelet" Sep 9 04:56:23.049375 kubelet[3403]: I0909 04:56:23.049357 3403 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 04:56:23.049780 kubelet[3403]: I0909 04:56:23.049746 3403 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 04:56:23.050406 kubelet[3403]: I0909 04:56:23.050374 3403 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 04:56:23.051226 kubelet[3403]: I0909 04:56:23.051029 3403 server.go:317] "Adding debug handlers to kubelet server" Sep 9 04:56:23.053449 
kubelet[3403]: I0909 04:56:23.053421 3403 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 04:56:23.058010 kubelet[3403]: I0909 04:56:23.057982 3403 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 04:56:23.060520 kubelet[3403]: I0909 04:56:23.060370 3403 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 9 04:56:23.060520 kubelet[3403]: I0909 04:56:23.060477 3403 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 9 04:56:23.060831 kubelet[3403]: I0909 04:56:23.060602 3403 reconciler.go:26] "Reconciler: start to sync state" Sep 9 04:56:23.061475 kubelet[3403]: I0909 04:56:23.061452 3403 factory.go:223] Registration of the systemd container factory successfully Sep 9 04:56:23.061569 kubelet[3403]: I0909 04:56:23.061550 3403 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 04:56:23.062114 kubelet[3403]: E0909 04:56:23.062093 3403 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 04:56:23.064020 kubelet[3403]: I0909 04:56:23.063679 3403 factory.go:223] Registration of the containerd container factory successfully Sep 9 04:56:23.068340 kubelet[3403]: I0909 04:56:23.068307 3403 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 9 04:56:23.069961 kubelet[3403]: I0909 04:56:23.069937 3403 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Sep 9 04:56:23.069961 kubelet[3403]: I0909 04:56:23.069956 3403 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 9 04:56:23.070046 kubelet[3403]: I0909 04:56:23.069971 3403 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 9 04:56:23.070046 kubelet[3403]: I0909 04:56:23.069975 3403 kubelet.go:2436] "Starting kubelet main sync loop" Sep 9 04:56:23.070046 kubelet[3403]: E0909 04:56:23.070012 3403 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 04:56:23.123592 kubelet[3403]: I0909 04:56:23.123086 3403 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 04:56:23.123592 kubelet[3403]: I0909 04:56:23.123103 3403 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 04:56:23.123592 kubelet[3403]: I0909 04:56:23.123122 3403 state_mem.go:36] "Initialized new in-memory state store" Sep 9 04:56:23.123592 kubelet[3403]: I0909 04:56:23.123219 3403 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 9 04:56:23.123592 kubelet[3403]: I0909 04:56:23.123225 3403 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 9 04:56:23.123592 kubelet[3403]: I0909 04:56:23.123238 3403 policy_none.go:49] "None policy: Start" Sep 9 04:56:23.123592 kubelet[3403]: I0909 04:56:23.123245 3403 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 04:56:23.123592 kubelet[3403]: I0909 04:56:23.123252 3403 state_mem.go:35] "Initializing new in-memory state store" Sep 9 04:56:23.123592 kubelet[3403]: I0909 04:56:23.123313 3403 state_mem.go:75] "Updated machine memory state" Sep 9 04:56:23.126957 kubelet[3403]: E0909 04:56:23.126936 3403 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 9 04:56:23.127099 kubelet[3403]: I0909 04:56:23.127081 
3403 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 04:56:23.127136 kubelet[3403]: I0909 04:56:23.127105 3403 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 04:56:23.127953 kubelet[3403]: I0909 04:56:23.127756 3403 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 04:56:23.129123 kubelet[3403]: E0909 04:56:23.129091 3403 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 9 04:56:23.170803 kubelet[3403]: I0909 04:56:23.170775 3403 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:23.172757 kubelet[3403]: I0909 04:56:23.170806 3403 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:23.172757 kubelet[3403]: I0909 04:56:23.170858 3403 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:23.180309 kubelet[3403]: I0909 04:56:23.180232 3403 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 9 04:56:23.185266 kubelet[3403]: I0909 04:56:23.185164 3403 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 9 04:56:23.186120 kubelet[3403]: I0909 04:56:23.186095 3403 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 9 04:56:23.186220 kubelet[3403]: E0909 04:56:23.186206 3403 kubelet.go:3311] "Failed creating a mirror pod" err="pods 
\"kube-scheduler-ci-4452.0.0-n-e60618bb0b\" already exists" pod="kube-system/kube-scheduler-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:23.232540 kubelet[3403]: I0909 04:56:23.232521 3403 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:23.245346 kubelet[3403]: I0909 04:56:23.245322 3403 kubelet_node_status.go:124] "Node was previously registered" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:23.245396 kubelet[3403]: I0909 04:56:23.245387 3403 kubelet_node_status.go:78] "Successfully registered node" node="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:23.361795 kubelet[3403]: I0909 04:56:23.361629 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6aa4e5cd59b02d992996fa43864c11f2-kubeconfig\") pod \"kube-scheduler-ci-4452.0.0-n-e60618bb0b\" (UID: \"6aa4e5cd59b02d992996fa43864c11f2\") " pod="kube-system/kube-scheduler-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:23.361795 kubelet[3403]: I0909 04:56:23.361655 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f394f8ff1ec66b0b9c1cf3ecb6384e5a-ca-certs\") pod \"kube-apiserver-ci-4452.0.0-n-e60618bb0b\" (UID: \"f394f8ff1ec66b0b9c1cf3ecb6384e5a\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:23.361795 kubelet[3403]: I0909 04:56:23.361668 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1b739791776c036ade4b80f5135f754d-k8s-certs\") pod \"kube-controller-manager-ci-4452.0.0-n-e60618bb0b\" (UID: \"1b739791776c036ade4b80f5135f754d\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:23.361795 kubelet[3403]: I0909 04:56:23.361681 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f394f8ff1ec66b0b9c1cf3ecb6384e5a-k8s-certs\") pod \"kube-apiserver-ci-4452.0.0-n-e60618bb0b\" (UID: \"f394f8ff1ec66b0b9c1cf3ecb6384e5a\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:23.361795 kubelet[3403]: I0909 04:56:23.361694 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f394f8ff1ec66b0b9c1cf3ecb6384e5a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4452.0.0-n-e60618bb0b\" (UID: \"f394f8ff1ec66b0b9c1cf3ecb6384e5a\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:23.362033 kubelet[3403]: I0909 04:56:23.361703 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1b739791776c036ade4b80f5135f754d-ca-certs\") pod \"kube-controller-manager-ci-4452.0.0-n-e60618bb0b\" (UID: \"1b739791776c036ade4b80f5135f754d\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:23.362033 kubelet[3403]: I0909 04:56:23.361714 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1b739791776c036ade4b80f5135f754d-flexvolume-dir\") pod \"kube-controller-manager-ci-4452.0.0-n-e60618bb0b\" (UID: \"1b739791776c036ade4b80f5135f754d\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:23.362033 kubelet[3403]: I0909 04:56:23.361722 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1b739791776c036ade4b80f5135f754d-kubeconfig\") pod \"kube-controller-manager-ci-4452.0.0-n-e60618bb0b\" (UID: \"1b739791776c036ade4b80f5135f754d\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-e60618bb0b" Sep 9 
04:56:23.362033 kubelet[3403]: I0909 04:56:23.361739 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1b739791776c036ade4b80f5135f754d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4452.0.0-n-e60618bb0b\" (UID: \"1b739791776c036ade4b80f5135f754d\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:24.045495 kubelet[3403]: I0909 04:56:24.045247 3403 apiserver.go:52] "Watching apiserver" Sep 9 04:56:24.061147 kubelet[3403]: I0909 04:56:24.061106 3403 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 04:56:24.069962 kubelet[3403]: I0909 04:56:24.069909 3403 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4452.0.0-n-e60618bb0b" podStartSLOduration=1.069896308 podStartE2EDuration="1.069896308s" podCreationTimestamp="2025-09-09 04:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:56:24.06955921 +0000 UTC m=+1.075957824" watchObservedRunningTime="2025-09-09 04:56:24.069896308 +0000 UTC m=+1.076294914" Sep 9 04:56:24.089547 kubelet[3403]: I0909 04:56:24.089493 3403 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4452.0.0-n-e60618bb0b" podStartSLOduration=1.089478654 podStartE2EDuration="1.089478654s" podCreationTimestamp="2025-09-09 04:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:56:24.088396692 +0000 UTC m=+1.094795298" watchObservedRunningTime="2025-09-09 04:56:24.089478654 +0000 UTC m=+1.095877308" Sep 9 04:56:24.098466 kubelet[3403]: I0909 04:56:24.098364 3403 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-scheduler-ci-4452.0.0-n-e60618bb0b" podStartSLOduration=3.098355082 podStartE2EDuration="3.098355082s" podCreationTimestamp="2025-09-09 04:56:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:56:24.09834645 +0000 UTC m=+1.104745056" watchObservedRunningTime="2025-09-09 04:56:24.098355082 +0000 UTC m=+1.104753688" Sep 9 04:56:24.106337 kubelet[3403]: I0909 04:56:24.106261 3403 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:24.120251 kubelet[3403]: I0909 04:56:24.120227 3403 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Sep 9 04:56:24.120435 kubelet[3403]: E0909 04:56:24.120268 3403 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4452.0.0-n-e60618bb0b\" already exists" pod="kube-system/kube-apiserver-ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:28.177359 kubelet[3403]: I0909 04:56:28.177146 3403 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 9 04:56:28.177691 kubelet[3403]: I0909 04:56:28.177679 3403 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 9 04:56:28.177722 containerd[1873]: time="2025-09-09T04:56:28.177492534Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 9 04:56:29.144910 systemd[1]: Created slice kubepods-besteffort-pod6a3830f8_8c51_4be2_967a_22189dc69a2c.slice - libcontainer container kubepods-besteffort-pod6a3830f8_8c51_4be2_967a_22189dc69a2c.slice. 
Sep 9 04:56:29.189173 kubelet[3403]: I0909 04:56:29.189039 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/6a3830f8-8c51-4be2-967a-22189dc69a2c-kube-proxy\") pod \"kube-proxy-vb5cq\" (UID: \"6a3830f8-8c51-4be2-967a-22189dc69a2c\") " pod="kube-system/kube-proxy-vb5cq" Sep 9 04:56:29.189173 kubelet[3403]: I0909 04:56:29.189086 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lrgx\" (UniqueName: \"kubernetes.io/projected/6a3830f8-8c51-4be2-967a-22189dc69a2c-kube-api-access-4lrgx\") pod \"kube-proxy-vb5cq\" (UID: \"6a3830f8-8c51-4be2-967a-22189dc69a2c\") " pod="kube-system/kube-proxy-vb5cq" Sep 9 04:56:29.189173 kubelet[3403]: I0909 04:56:29.189104 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6a3830f8-8c51-4be2-967a-22189dc69a2c-xtables-lock\") pod \"kube-proxy-vb5cq\" (UID: \"6a3830f8-8c51-4be2-967a-22189dc69a2c\") " pod="kube-system/kube-proxy-vb5cq" Sep 9 04:56:29.189173 kubelet[3403]: I0909 04:56:29.189115 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a3830f8-8c51-4be2-967a-22189dc69a2c-lib-modules\") pod \"kube-proxy-vb5cq\" (UID: \"6a3830f8-8c51-4be2-967a-22189dc69a2c\") " pod="kube-system/kube-proxy-vb5cq" Sep 9 04:56:29.452134 systemd[1]: Created slice kubepods-besteffort-podeff144b8_3f43_468a_862f_caa88824b790.slice - libcontainer container kubepods-besteffort-podeff144b8_3f43_468a_862f_caa88824b790.slice. 
Sep 9 04:56:29.453746 containerd[1873]: time="2025-09-09T04:56:29.453710028Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vb5cq,Uid:6a3830f8-8c51-4be2-967a-22189dc69a2c,Namespace:kube-system,Attempt:0,}" Sep 9 04:56:29.491015 kubelet[3403]: I0909 04:56:29.490951 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzjhb\" (UniqueName: \"kubernetes.io/projected/eff144b8-3f43-468a-862f-caa88824b790-kube-api-access-kzjhb\") pod \"tigera-operator-755d956888-tsfft\" (UID: \"eff144b8-3f43-468a-862f-caa88824b790\") " pod="tigera-operator/tigera-operator-755d956888-tsfft" Sep 9 04:56:29.491184 kubelet[3403]: I0909 04:56:29.491064 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/eff144b8-3f43-468a-862f-caa88824b790-var-lib-calico\") pod \"tigera-operator-755d956888-tsfft\" (UID: \"eff144b8-3f43-468a-862f-caa88824b790\") " pod="tigera-operator/tigera-operator-755d956888-tsfft" Sep 9 04:56:29.493772 containerd[1873]: time="2025-09-09T04:56:29.493741987Z" level=info msg="connecting to shim d97eef525a5ff67ae017517e3fbbb8ab4d6ae7f2d8e1d4e5938504d8f2368d1c" address="unix:///run/containerd/s/5c96fa6f7646e42b75dc9c306df6dff84f38c69cb035f66c8d3a5f20b408a537" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:56:29.514019 systemd[1]: Started cri-containerd-d97eef525a5ff67ae017517e3fbbb8ab4d6ae7f2d8e1d4e5938504d8f2368d1c.scope - libcontainer container d97eef525a5ff67ae017517e3fbbb8ab4d6ae7f2d8e1d4e5938504d8f2368d1c. 
Sep 9 04:56:29.532168 containerd[1873]: time="2025-09-09T04:56:29.532109953Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vb5cq,Uid:6a3830f8-8c51-4be2-967a-22189dc69a2c,Namespace:kube-system,Attempt:0,} returns sandbox id \"d97eef525a5ff67ae017517e3fbbb8ab4d6ae7f2d8e1d4e5938504d8f2368d1c\"" Sep 9 04:56:29.539849 containerd[1873]: time="2025-09-09T04:56:29.539797013Z" level=info msg="CreateContainer within sandbox \"d97eef525a5ff67ae017517e3fbbb8ab4d6ae7f2d8e1d4e5938504d8f2368d1c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 9 04:56:29.558243 containerd[1873]: time="2025-09-09T04:56:29.557690033Z" level=info msg="Container f7c4001a5e7dc18571ad422dfc219cb89842bd41e2440506d2ad060d5e41d2fb: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:29.576392 containerd[1873]: time="2025-09-09T04:56:29.576353582Z" level=info msg="CreateContainer within sandbox \"d97eef525a5ff67ae017517e3fbbb8ab4d6ae7f2d8e1d4e5938504d8f2368d1c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f7c4001a5e7dc18571ad422dfc219cb89842bd41e2440506d2ad060d5e41d2fb\"" Sep 9 04:56:29.577392 containerd[1873]: time="2025-09-09T04:56:29.577355904Z" level=info msg="StartContainer for \"f7c4001a5e7dc18571ad422dfc219cb89842bd41e2440506d2ad060d5e41d2fb\"" Sep 9 04:56:29.578556 containerd[1873]: time="2025-09-09T04:56:29.578534488Z" level=info msg="connecting to shim f7c4001a5e7dc18571ad422dfc219cb89842bd41e2440506d2ad060d5e41d2fb" address="unix:///run/containerd/s/5c96fa6f7646e42b75dc9c306df6dff84f38c69cb035f66c8d3a5f20b408a537" protocol=ttrpc version=3 Sep 9 04:56:29.594060 systemd[1]: Started cri-containerd-f7c4001a5e7dc18571ad422dfc219cb89842bd41e2440506d2ad060d5e41d2fb.scope - libcontainer container f7c4001a5e7dc18571ad422dfc219cb89842bd41e2440506d2ad060d5e41d2fb. 
Sep 9 04:56:29.624953 containerd[1873]: time="2025-09-09T04:56:29.624911797Z" level=info msg="StartContainer for \"f7c4001a5e7dc18571ad422dfc219cb89842bd41e2440506d2ad060d5e41d2fb\" returns successfully" Sep 9 04:56:29.755630 containerd[1873]: time="2025-09-09T04:56:29.755332846Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-tsfft,Uid:eff144b8-3f43-468a-862f-caa88824b790,Namespace:tigera-operator,Attempt:0,}" Sep 9 04:56:29.788403 containerd[1873]: time="2025-09-09T04:56:29.788337192Z" level=info msg="connecting to shim 687231e7a9b3f38281e60a0c347dae8901dd49d467eb765a32115f28effd8a4f" address="unix:///run/containerd/s/a7365c417128e0ed44f042fe6aed8c6809fc48756585d36bf95bc0c7540e4e2e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:56:29.805001 systemd[1]: Started cri-containerd-687231e7a9b3f38281e60a0c347dae8901dd49d467eb765a32115f28effd8a4f.scope - libcontainer container 687231e7a9b3f38281e60a0c347dae8901dd49d467eb765a32115f28effd8a4f. Sep 9 04:56:29.838725 containerd[1873]: time="2025-09-09T04:56:29.838601296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-tsfft,Uid:eff144b8-3f43-468a-862f-caa88824b790,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"687231e7a9b3f38281e60a0c347dae8901dd49d467eb765a32115f28effd8a4f\"" Sep 9 04:56:29.841459 containerd[1873]: time="2025-09-09T04:56:29.841428071Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 9 04:56:31.063096 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3090600250.mount: Deactivated successfully. 
Sep 9 04:56:31.351724 containerd[1873]: time="2025-09-09T04:56:31.351174146Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:31.353321 containerd[1873]: time="2025-09-09T04:56:31.353297794Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 9 04:56:31.357241 containerd[1873]: time="2025-09-09T04:56:31.357221862Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:31.361989 containerd[1873]: time="2025-09-09T04:56:31.361959166Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:31.362535 containerd[1873]: time="2025-09-09T04:56:31.362512177Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 1.521051976s" Sep 9 04:56:31.362615 containerd[1873]: time="2025-09-09T04:56:31.362602748Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 9 04:56:31.368716 containerd[1873]: time="2025-09-09T04:56:31.368458601Z" level=info msg="CreateContainer within sandbox \"687231e7a9b3f38281e60a0c347dae8901dd49d467eb765a32115f28effd8a4f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 9 04:56:31.388045 containerd[1873]: time="2025-09-09T04:56:31.387678906Z" level=info msg="Container 
a9895118537685949a252adbdf46127ece5e87b181c96338af2b26bbbee01537: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:31.399132 containerd[1873]: time="2025-09-09T04:56:31.399104739Z" level=info msg="CreateContainer within sandbox \"687231e7a9b3f38281e60a0c347dae8901dd49d467eb765a32115f28effd8a4f\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"a9895118537685949a252adbdf46127ece5e87b181c96338af2b26bbbee01537\"" Sep 9 04:56:31.400626 containerd[1873]: time="2025-09-09T04:56:31.399653406Z" level=info msg="StartContainer for \"a9895118537685949a252adbdf46127ece5e87b181c96338af2b26bbbee01537\"" Sep 9 04:56:31.401540 containerd[1873]: time="2025-09-09T04:56:31.401519813Z" level=info msg="connecting to shim a9895118537685949a252adbdf46127ece5e87b181c96338af2b26bbbee01537" address="unix:///run/containerd/s/a7365c417128e0ed44f042fe6aed8c6809fc48756585d36bf95bc0c7540e4e2e" protocol=ttrpc version=3 Sep 9 04:56:31.422003 systemd[1]: Started cri-containerd-a9895118537685949a252adbdf46127ece5e87b181c96338af2b26bbbee01537.scope - libcontainer container a9895118537685949a252adbdf46127ece5e87b181c96338af2b26bbbee01537. 
Sep 9 04:56:31.444685 containerd[1873]: time="2025-09-09T04:56:31.444616507Z" level=info msg="StartContainer for \"a9895118537685949a252adbdf46127ece5e87b181c96338af2b26bbbee01537\" returns successfully" Sep 9 04:56:32.132468 kubelet[3403]: I0909 04:56:32.132417 3403 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-vb5cq" podStartSLOduration=3.132403917 podStartE2EDuration="3.132403917s" podCreationTimestamp="2025-09-09 04:56:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:56:30.129303426 +0000 UTC m=+7.135702040" watchObservedRunningTime="2025-09-09 04:56:32.132403917 +0000 UTC m=+9.138802523" Sep 9 04:56:32.133181 kubelet[3403]: I0909 04:56:32.132490 3403 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-tsfft" podStartSLOduration=1.609984391 podStartE2EDuration="3.13248524s" podCreationTimestamp="2025-09-09 04:56:29 +0000 UTC" firstStartedPulling="2025-09-09 04:56:29.840669326 +0000 UTC m=+6.847067932" lastFinishedPulling="2025-09-09 04:56:31.363170167 +0000 UTC m=+8.369568781" observedRunningTime="2025-09-09 04:56:32.132234903 +0000 UTC m=+9.138633509" watchObservedRunningTime="2025-09-09 04:56:32.13248524 +0000 UTC m=+9.138883846" Sep 9 04:56:36.503194 sudo[2375]: pam_unix(sudo:session): session closed for user root Sep 9 04:56:36.574026 sshd[2374]: Connection closed by 10.200.16.10 port 38394 Sep 9 04:56:36.576069 sshd-session[2371]: pam_unix(sshd:session): session closed for user core Sep 9 04:56:36.579699 systemd[1]: sshd@6-10.200.20.4:22-10.200.16.10:38394.service: Deactivated successfully. Sep 9 04:56:36.585540 systemd[1]: session-9.scope: Deactivated successfully. Sep 9 04:56:36.586134 systemd[1]: session-9.scope: Consumed 3.895s CPU time, 219.8M memory peak. Sep 9 04:56:36.588435 systemd-logind[1855]: Session 9 logged out. 
Waiting for processes to exit. Sep 9 04:56:36.589911 systemd-logind[1855]: Removed session 9. Sep 9 04:56:41.331782 systemd[1]: Created slice kubepods-besteffort-podeefabd71_1391_4276_9aff_e9cddcda9b79.slice - libcontainer container kubepods-besteffort-podeefabd71_1391_4276_9aff_e9cddcda9b79.slice. Sep 9 04:56:41.354864 kubelet[3403]: I0909 04:56:41.354833 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52cnw\" (UniqueName: \"kubernetes.io/projected/eefabd71-1391-4276-9aff-e9cddcda9b79-kube-api-access-52cnw\") pod \"calico-typha-6868b4bd7-p6b2v\" (UID: \"eefabd71-1391-4276-9aff-e9cddcda9b79\") " pod="calico-system/calico-typha-6868b4bd7-p6b2v" Sep 9 04:56:41.355290 kubelet[3403]: I0909 04:56:41.355230 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/eefabd71-1391-4276-9aff-e9cddcda9b79-typha-certs\") pod \"calico-typha-6868b4bd7-p6b2v\" (UID: \"eefabd71-1391-4276-9aff-e9cddcda9b79\") " pod="calico-system/calico-typha-6868b4bd7-p6b2v" Sep 9 04:56:41.355290 kubelet[3403]: I0909 04:56:41.355254 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eefabd71-1391-4276-9aff-e9cddcda9b79-tigera-ca-bundle\") pod \"calico-typha-6868b4bd7-p6b2v\" (UID: \"eefabd71-1391-4276-9aff-e9cddcda9b79\") " pod="calico-system/calico-typha-6868b4bd7-p6b2v" Sep 9 04:56:41.450965 systemd[1]: Created slice kubepods-besteffort-podee55ef52_23e2_488f_9db8_059ab05ef9d7.slice - libcontainer container kubepods-besteffort-podee55ef52_23e2_488f_9db8_059ab05ef9d7.slice. 
Sep 9 04:56:41.455804 kubelet[3403]: I0909 04:56:41.455775 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx8s4\" (UniqueName: \"kubernetes.io/projected/ee55ef52-23e2-488f-9db8-059ab05ef9d7-kube-api-access-xx8s4\") pod \"calico-node-26285\" (UID: \"ee55ef52-23e2-488f-9db8-059ab05ef9d7\") " pod="calico-system/calico-node-26285" Sep 9 04:56:41.455804 kubelet[3403]: I0909 04:56:41.455806 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ee55ef52-23e2-488f-9db8-059ab05ef9d7-node-certs\") pod \"calico-node-26285\" (UID: \"ee55ef52-23e2-488f-9db8-059ab05ef9d7\") " pod="calico-system/calico-node-26285" Sep 9 04:56:41.455921 kubelet[3403]: I0909 04:56:41.455819 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee55ef52-23e2-488f-9db8-059ab05ef9d7-tigera-ca-bundle\") pod \"calico-node-26285\" (UID: \"ee55ef52-23e2-488f-9db8-059ab05ef9d7\") " pod="calico-system/calico-node-26285" Sep 9 04:56:41.455921 kubelet[3403]: I0909 04:56:41.455828 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ee55ef52-23e2-488f-9db8-059ab05ef9d7-var-run-calico\") pod \"calico-node-26285\" (UID: \"ee55ef52-23e2-488f-9db8-059ab05ef9d7\") " pod="calico-system/calico-node-26285" Sep 9 04:56:41.455921 kubelet[3403]: I0909 04:56:41.455850 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ee55ef52-23e2-488f-9db8-059ab05ef9d7-cni-bin-dir\") pod \"calico-node-26285\" (UID: \"ee55ef52-23e2-488f-9db8-059ab05ef9d7\") " pod="calico-system/calico-node-26285" Sep 9 04:56:41.455921 kubelet[3403]: I0909 04:56:41.455859 3403 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ee55ef52-23e2-488f-9db8-059ab05ef9d7-cni-net-dir\") pod \"calico-node-26285\" (UID: \"ee55ef52-23e2-488f-9db8-059ab05ef9d7\") " pod="calico-system/calico-node-26285" Sep 9 04:56:41.455921 kubelet[3403]: I0909 04:56:41.455867 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ee55ef52-23e2-488f-9db8-059ab05ef9d7-xtables-lock\") pod \"calico-node-26285\" (UID: \"ee55ef52-23e2-488f-9db8-059ab05ef9d7\") " pod="calico-system/calico-node-26285" Sep 9 04:56:41.456004 kubelet[3403]: I0909 04:56:41.455903 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ee55ef52-23e2-488f-9db8-059ab05ef9d7-flexvol-driver-host\") pod \"calico-node-26285\" (UID: \"ee55ef52-23e2-488f-9db8-059ab05ef9d7\") " pod="calico-system/calico-node-26285" Sep 9 04:56:41.456004 kubelet[3403]: I0909 04:56:41.455925 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ee55ef52-23e2-488f-9db8-059ab05ef9d7-cni-log-dir\") pod \"calico-node-26285\" (UID: \"ee55ef52-23e2-488f-9db8-059ab05ef9d7\") " pod="calico-system/calico-node-26285" Sep 9 04:56:41.456004 kubelet[3403]: I0909 04:56:41.455943 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ee55ef52-23e2-488f-9db8-059ab05ef9d7-policysync\") pod \"calico-node-26285\" (UID: \"ee55ef52-23e2-488f-9db8-059ab05ef9d7\") " pod="calico-system/calico-node-26285" Sep 9 04:56:41.456004 kubelet[3403]: I0909 04:56:41.455953 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee55ef52-23e2-488f-9db8-059ab05ef9d7-lib-modules\") pod \"calico-node-26285\" (UID: \"ee55ef52-23e2-488f-9db8-059ab05ef9d7\") " pod="calico-system/calico-node-26285" Sep 9 04:56:41.456004 kubelet[3403]: I0909 04:56:41.455961 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ee55ef52-23e2-488f-9db8-059ab05ef9d7-var-lib-calico\") pod \"calico-node-26285\" (UID: \"ee55ef52-23e2-488f-9db8-059ab05ef9d7\") " pod="calico-system/calico-node-26285" Sep 9 04:56:41.560421 kubelet[3403]: E0909 04:56:41.559103 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.560421 kubelet[3403]: W0909 04:56:41.559126 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.560421 kubelet[3403]: E0909 04:56:41.559389 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.560421 kubelet[3403]: E0909 04:56:41.559530 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.560421 kubelet[3403]: W0909 04:56:41.559538 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.560421 kubelet[3403]: E0909 04:56:41.559547 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.560421 kubelet[3403]: E0909 04:56:41.559756 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.560421 kubelet[3403]: W0909 04:56:41.559764 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.560421 kubelet[3403]: E0909 04:56:41.559773 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.560421 kubelet[3403]: E0909 04:56:41.560230 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.560682 kubelet[3403]: W0909 04:56:41.560240 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.560682 kubelet[3403]: E0909 04:56:41.560250 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.560682 kubelet[3403]: E0909 04:56:41.560584 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.560682 kubelet[3403]: W0909 04:56:41.560593 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.561340 kubelet[3403]: E0909 04:56:41.560786 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.561340 kubelet[3403]: E0909 04:56:41.560940 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.561340 kubelet[3403]: W0909 04:56:41.560948 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.561340 kubelet[3403]: E0909 04:56:41.560962 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.563831 kubelet[3403]: E0909 04:56:41.563802 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.563831 kubelet[3403]: W0909 04:56:41.563818 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.563831 kubelet[3403]: E0909 04:56:41.563830 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.564221 kubelet[3403]: E0909 04:56:41.564014 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.564221 kubelet[3403]: W0909 04:56:41.564024 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.564221 kubelet[3403]: E0909 04:56:41.564039 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.564858 kubelet[3403]: E0909 04:56:41.564813 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.564858 kubelet[3403]: W0909 04:56:41.564826 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.564858 kubelet[3403]: E0909 04:56:41.564836 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.598964 kubelet[3403]: E0909 04:56:41.598478 3403 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vm259" podUID="aa78c1cd-bf72-4406-ba6d-7cd42efb9427" Sep 9 04:56:41.602233 kubelet[3403]: E0909 04:56:41.601643 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.602233 kubelet[3403]: W0909 04:56:41.601659 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.602233 kubelet[3403]: E0909 04:56:41.601672 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.636347 containerd[1873]: time="2025-09-09T04:56:41.636311480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6868b4bd7-p6b2v,Uid:eefabd71-1391-4276-9aff-e9cddcda9b79,Namespace:calico-system,Attempt:0,}" Sep 9 04:56:41.648690 kubelet[3403]: E0909 04:56:41.648662 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.648690 kubelet[3403]: W0909 04:56:41.648682 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.648825 kubelet[3403]: E0909 04:56:41.648702 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.648994 kubelet[3403]: E0909 04:56:41.648978 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.649136 kubelet[3403]: W0909 04:56:41.648992 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.649174 kubelet[3403]: E0909 04:56:41.649137 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.649623 kubelet[3403]: E0909 04:56:41.649606 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.649623 kubelet[3403]: W0909 04:56:41.649619 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.649760 kubelet[3403]: E0909 04:56:41.649736 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.650116 kubelet[3403]: E0909 04:56:41.650101 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.650176 kubelet[3403]: W0909 04:56:41.650112 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.650176 kubelet[3403]: E0909 04:56:41.650136 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.650932 kubelet[3403]: E0909 04:56:41.650914 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.650932 kubelet[3403]: W0909 04:56:41.650927 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.651076 kubelet[3403]: E0909 04:56:41.650937 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.652029 kubelet[3403]: E0909 04:56:41.652010 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.652029 kubelet[3403]: W0909 04:56:41.652023 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.652107 kubelet[3403]: E0909 04:56:41.652035 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.652188 kubelet[3403]: E0909 04:56:41.652170 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.652188 kubelet[3403]: W0909 04:56:41.652179 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.652188 kubelet[3403]: E0909 04:56:41.652187 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.652605 kubelet[3403]: E0909 04:56:41.652590 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.652605 kubelet[3403]: W0909 04:56:41.652602 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.652664 kubelet[3403]: E0909 04:56:41.652614 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.652801 kubelet[3403]: E0909 04:56:41.652787 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.652801 kubelet[3403]: W0909 04:56:41.652798 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.652865 kubelet[3403]: E0909 04:56:41.652808 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.653123 kubelet[3403]: E0909 04:56:41.652953 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.653123 kubelet[3403]: W0909 04:56:41.652961 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.653123 kubelet[3403]: E0909 04:56:41.652970 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.653223 kubelet[3403]: E0909 04:56:41.653208 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.653223 kubelet[3403]: W0909 04:56:41.653220 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.653273 kubelet[3403]: E0909 04:56:41.653229 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.653370 kubelet[3403]: E0909 04:56:41.653359 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.653370 kubelet[3403]: W0909 04:56:41.653368 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.653432 kubelet[3403]: E0909 04:56:41.653375 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.653672 kubelet[3403]: E0909 04:56:41.653657 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.653672 kubelet[3403]: W0909 04:56:41.653669 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.653745 kubelet[3403]: E0909 04:56:41.653678 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.653817 kubelet[3403]: E0909 04:56:41.653800 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.653964 kubelet[3403]: W0909 04:56:41.653810 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.654012 kubelet[3403]: E0909 04:56:41.653970 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.654488 kubelet[3403]: E0909 04:56:41.654472 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.654488 kubelet[3403]: W0909 04:56:41.654484 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.654488 kubelet[3403]: E0909 04:56:41.654493 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.654810 kubelet[3403]: E0909 04:56:41.654795 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.654810 kubelet[3403]: W0909 04:56:41.654808 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.654871 kubelet[3403]: E0909 04:56:41.654818 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.654871 kubelet[3403]: E0909 04:56:41.654957 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.654871 kubelet[3403]: W0909 04:56:41.654964 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.654871 kubelet[3403]: E0909 04:56:41.654973 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.655671 kubelet[3403]: E0909 04:56:41.655657 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.655671 kubelet[3403]: W0909 04:56:41.655667 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.655742 kubelet[3403]: E0909 04:56:41.655676 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.655790 kubelet[3403]: E0909 04:56:41.655780 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.655790 kubelet[3403]: W0909 04:56:41.655787 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.655834 kubelet[3403]: E0909 04:56:41.655796 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.656967 kubelet[3403]: E0909 04:56:41.655931 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.656967 kubelet[3403]: W0909 04:56:41.655937 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.656967 kubelet[3403]: E0909 04:56:41.655944 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.657753 kubelet[3403]: E0909 04:56:41.657267 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.657753 kubelet[3403]: W0909 04:56:41.657728 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.658031 kubelet[3403]: E0909 04:56:41.657861 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.658031 kubelet[3403]: I0909 04:56:41.657935 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aa78c1cd-bf72-4406-ba6d-7cd42efb9427-socket-dir\") pod \"csi-node-driver-vm259\" (UID: \"aa78c1cd-bf72-4406-ba6d-7cd42efb9427\") " pod="calico-system/csi-node-driver-vm259" Sep 9 04:56:41.658189 kubelet[3403]: E0909 04:56:41.658156 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.658259 kubelet[3403]: W0909 04:56:41.658245 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.658310 kubelet[3403]: E0909 04:56:41.658298 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.658367 kubelet[3403]: I0909 04:56:41.658357 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa78c1cd-bf72-4406-ba6d-7cd42efb9427-kubelet-dir\") pod \"csi-node-driver-vm259\" (UID: \"aa78c1cd-bf72-4406-ba6d-7cd42efb9427\") " pod="calico-system/csi-node-driver-vm259" Sep 9 04:56:41.658501 kubelet[3403]: E0909 04:56:41.658484 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.658501 kubelet[3403]: W0909 04:56:41.658497 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.658645 kubelet[3403]: E0909 04:56:41.658507 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.658970 kubelet[3403]: E0909 04:56:41.658956 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.658970 kubelet[3403]: W0909 04:56:41.658967 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.659070 kubelet[3403]: E0909 04:56:41.658980 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.659129 kubelet[3403]: E0909 04:56:41.659116 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.659158 kubelet[3403]: W0909 04:56:41.659138 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.659158 kubelet[3403]: E0909 04:56:41.659146 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.659191 kubelet[3403]: I0909 04:56:41.659166 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/aa78c1cd-bf72-4406-ba6d-7cd42efb9427-varrun\") pod \"csi-node-driver-vm259\" (UID: \"aa78c1cd-bf72-4406-ba6d-7cd42efb9427\") " pod="calico-system/csi-node-driver-vm259" Sep 9 04:56:41.659589 kubelet[3403]: E0909 04:56:41.659572 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.659589 kubelet[3403]: W0909 04:56:41.659585 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.659953 kubelet[3403]: E0909 04:56:41.659595 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.659953 kubelet[3403]: I0909 04:56:41.659611 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aa78c1cd-bf72-4406-ba6d-7cd42efb9427-registration-dir\") pod \"csi-node-driver-vm259\" (UID: \"aa78c1cd-bf72-4406-ba6d-7cd42efb9427\") " pod="calico-system/csi-node-driver-vm259" Sep 9 04:56:41.660013 kubelet[3403]: E0909 04:56:41.659992 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.660013 kubelet[3403]: W0909 04:56:41.660003 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.660126 kubelet[3403]: E0909 04:56:41.660013 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.660126 kubelet[3403]: I0909 04:56:41.660045 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsfv8\" (UniqueName: \"kubernetes.io/projected/aa78c1cd-bf72-4406-ba6d-7cd42efb9427-kube-api-access-xsfv8\") pod \"csi-node-driver-vm259\" (UID: \"aa78c1cd-bf72-4406-ba6d-7cd42efb9427\") " pod="calico-system/csi-node-driver-vm259" Sep 9 04:56:41.660502 kubelet[3403]: E0909 04:56:41.660460 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.660502 kubelet[3403]: W0909 04:56:41.660498 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.660600 kubelet[3403]: E0909 04:56:41.660508 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.660674 kubelet[3403]: E0909 04:56:41.660662 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.660674 kubelet[3403]: W0909 04:56:41.660672 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.660915 kubelet[3403]: E0909 04:56:41.660681 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.660915 kubelet[3403]: E0909 04:56:41.660803 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.660915 kubelet[3403]: W0909 04:56:41.660810 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.660915 kubelet[3403]: E0909 04:56:41.660817 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.661017 kubelet[3403]: E0909 04:56:41.660956 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.661017 kubelet[3403]: W0909 04:56:41.660963 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.661017 kubelet[3403]: E0909 04:56:41.660970 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.661123 kubelet[3403]: E0909 04:56:41.661080 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.661123 kubelet[3403]: W0909 04:56:41.661098 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.661123 kubelet[3403]: E0909 04:56:41.661107 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.661257 kubelet[3403]: E0909 04:56:41.661234 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.661304 kubelet[3403]: W0909 04:56:41.661268 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.661304 kubelet[3403]: E0909 04:56:41.661277 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.661516 kubelet[3403]: E0909 04:56:41.661398 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.661516 kubelet[3403]: W0909 04:56:41.661405 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.661516 kubelet[3403]: E0909 04:56:41.661423 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.661583 kubelet[3403]: E0909 04:56:41.661550 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.661583 kubelet[3403]: W0909 04:56:41.661557 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.661583 kubelet[3403]: E0909 04:56:41.661564 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.675168 containerd[1873]: time="2025-09-09T04:56:41.675134026Z" level=info msg="connecting to shim 5c78bf4d231609e0edd2c32becb29b3af9f34c1a0cb2dde5b7f5bea962d22068" address="unix:///run/containerd/s/0aa7e2fb8116509c87351109a627e4a6b2dc3f61ff365e137690a5b6d4896c1a" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:56:41.704008 systemd[1]: Started cri-containerd-5c78bf4d231609e0edd2c32becb29b3af9f34c1a0cb2dde5b7f5bea962d22068.scope - libcontainer container 5c78bf4d231609e0edd2c32becb29b3af9f34c1a0cb2dde5b7f5bea962d22068. 
Sep 9 04:56:41.742972 containerd[1873]: time="2025-09-09T04:56:41.742927369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6868b4bd7-p6b2v,Uid:eefabd71-1391-4276-9aff-e9cddcda9b79,Namespace:calico-system,Attempt:0,} returns sandbox id \"5c78bf4d231609e0edd2c32becb29b3af9f34c1a0cb2dde5b7f5bea962d22068\"" Sep 9 04:56:41.745248 containerd[1873]: time="2025-09-09T04:56:41.745098593Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 9 04:56:41.754715 containerd[1873]: time="2025-09-09T04:56:41.754686683Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-26285,Uid:ee55ef52-23e2-488f-9db8-059ab05ef9d7,Namespace:calico-system,Attempt:0,}" Sep 9 04:56:41.761758 kubelet[3403]: E0909 04:56:41.761722 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.761758 kubelet[3403]: W0909 04:56:41.761742 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.762746 kubelet[3403]: E0909 04:56:41.762718 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:41.774024 kubelet[3403]: E0909 04:56:41.774000 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:41.775487 kubelet[3403]: W0909 04:56:41.774166 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:41.775487 kubelet[3403]: E0909 04:56:41.774195 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:41.788776 containerd[1873]: time="2025-09-09T04:56:41.788572795Z" level=info msg="connecting to shim c0ca3f7b7ef145c81b3f1751b5686588ea4f6f4ff909ab73a5845807fced2ff4" address="unix:///run/containerd/s/43234f7a33ce59e8c108cfd983a6827cca60c30f589d89a9e572bf3732a46d75" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:56:41.810177 systemd[1]: Started cri-containerd-c0ca3f7b7ef145c81b3f1751b5686588ea4f6f4ff909ab73a5845807fced2ff4.scope - libcontainer container c0ca3f7b7ef145c81b3f1751b5686588ea4f6f4ff909ab73a5845807fced2ff4. Sep 9 04:56:41.841361 containerd[1873]: time="2025-09-09T04:56:41.841281316Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-26285,Uid:ee55ef52-23e2-488f-9db8-059ab05ef9d7,Namespace:calico-system,Attempt:0,} returns sandbox id \"c0ca3f7b7ef145c81b3f1751b5686588ea4f6f4ff909ab73a5845807fced2ff4\"" Sep 9 04:56:42.789899 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3769853463.mount: Deactivated successfully. 
Sep 9 04:56:43.328957 containerd[1873]: time="2025-09-09T04:56:43.328909553Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:43.332990 containerd[1873]: time="2025-09-09T04:56:43.332857634Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 9 04:56:43.336292 containerd[1873]: time="2025-09-09T04:56:43.336262562Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:43.341432 containerd[1873]: time="2025-09-09T04:56:43.341402514Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:43.342148 containerd[1873]: time="2025-09-09T04:56:43.342117642Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 1.596861781s" Sep 9 04:56:43.342212 containerd[1873]: time="2025-09-09T04:56:43.342158795Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 9 04:56:43.343368 containerd[1873]: time="2025-09-09T04:56:43.343337498Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 9 04:56:43.360812 containerd[1873]: time="2025-09-09T04:56:43.360766030Z" level=info msg="CreateContainer within sandbox \"5c78bf4d231609e0edd2c32becb29b3af9f34c1a0cb2dde5b7f5bea962d22068\" for container 
&ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 9 04:56:43.374477 containerd[1873]: time="2025-09-09T04:56:43.374442766Z" level=info msg="Container 6b50ac54d5007ead42adeefe082b8df2f13f49161794ff358fae768f494332e8: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:43.391673 containerd[1873]: time="2025-09-09T04:56:43.391620458Z" level=info msg="CreateContainer within sandbox \"5c78bf4d231609e0edd2c32becb29b3af9f34c1a0cb2dde5b7f5bea962d22068\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"6b50ac54d5007ead42adeefe082b8df2f13f49161794ff358fae768f494332e8\"" Sep 9 04:56:43.392341 containerd[1873]: time="2025-09-09T04:56:43.392310440Z" level=info msg="StartContainer for \"6b50ac54d5007ead42adeefe082b8df2f13f49161794ff358fae768f494332e8\"" Sep 9 04:56:43.393296 containerd[1873]: time="2025-09-09T04:56:43.393270736Z" level=info msg="connecting to shim 6b50ac54d5007ead42adeefe082b8df2f13f49161794ff358fae768f494332e8" address="unix:///run/containerd/s/0aa7e2fb8116509c87351109a627e4a6b2dc3f61ff365e137690a5b6d4896c1a" protocol=ttrpc version=3 Sep 9 04:56:43.408018 systemd[1]: Started cri-containerd-6b50ac54d5007ead42adeefe082b8df2f13f49161794ff358fae768f494332e8.scope - libcontainer container 6b50ac54d5007ead42adeefe082b8df2f13f49161794ff358fae768f494332e8. 
Sep 9 04:56:43.445226 containerd[1873]: time="2025-09-09T04:56:43.444490120Z" level=info msg="StartContainer for \"6b50ac54d5007ead42adeefe082b8df2f13f49161794ff358fae768f494332e8\" returns successfully" Sep 9 04:56:44.070960 kubelet[3403]: E0909 04:56:44.070535 3403 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vm259" podUID="aa78c1cd-bf72-4406-ba6d-7cd42efb9427" Sep 9 04:56:44.167893 kubelet[3403]: E0909 04:56:44.167688 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.168303 kubelet[3403]: W0909 04:56:44.168059 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.168303 kubelet[3403]: E0909 04:56:44.168084 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:44.168303 kubelet[3403]: I0909 04:56:44.167796 3403 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6868b4bd7-p6b2v" podStartSLOduration=1.569292211 podStartE2EDuration="3.167782213s" podCreationTimestamp="2025-09-09 04:56:41 +0000 UTC" firstStartedPulling="2025-09-09 04:56:41.744725756 +0000 UTC m=+18.751124362" lastFinishedPulling="2025-09-09 04:56:43.343215758 +0000 UTC m=+20.349614364" observedRunningTime="2025-09-09 04:56:44.166869855 +0000 UTC m=+21.173268469" watchObservedRunningTime="2025-09-09 04:56:44.167782213 +0000 UTC m=+21.174180827" Sep 9 04:56:44.168892 kubelet[3403]: E0909 04:56:44.168683 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.168892 kubelet[3403]: W0909 04:56:44.168696 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.168892 kubelet[3403]: E0909 04:56:44.168733 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:44.169384 kubelet[3403]: E0909 04:56:44.169271 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.169384 kubelet[3403]: W0909 04:56:44.169283 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.169384 kubelet[3403]: E0909 04:56:44.169294 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:44.179506 kubelet[3403]: E0909 04:56:44.179488 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.179506 kubelet[3403]: W0909 04:56:44.179502 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.179587 kubelet[3403]: E0909 04:56:44.179513 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:44.179668 kubelet[3403]: E0909 04:56:44.179658 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.179668 kubelet[3403]: W0909 04:56:44.179666 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.179723 kubelet[3403]: E0909 04:56:44.179673 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:44.179818 kubelet[3403]: E0909 04:56:44.179807 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.179818 kubelet[3403]: W0909 04:56:44.179816 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.179857 kubelet[3403]: E0909 04:56:44.179822 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:44.179959 kubelet[3403]: E0909 04:56:44.179946 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.179959 kubelet[3403]: W0909 04:56:44.179956 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.180008 kubelet[3403]: E0909 04:56:44.179964 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:44.180079 kubelet[3403]: E0909 04:56:44.180067 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.180079 kubelet[3403]: W0909 04:56:44.180075 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.180112 kubelet[3403]: E0909 04:56:44.180081 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:44.180176 kubelet[3403]: E0909 04:56:44.180164 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.180176 kubelet[3403]: W0909 04:56:44.180172 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.180241 kubelet[3403]: E0909 04:56:44.180178 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:44.180289 kubelet[3403]: E0909 04:56:44.180278 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.180289 kubelet[3403]: W0909 04:56:44.180285 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.180328 kubelet[3403]: E0909 04:56:44.180291 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:44.180544 kubelet[3403]: E0909 04:56:44.180531 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.180544 kubelet[3403]: W0909 04:56:44.180541 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.180599 kubelet[3403]: E0909 04:56:44.180548 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:44.180668 kubelet[3403]: E0909 04:56:44.180655 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.180668 kubelet[3403]: W0909 04:56:44.180663 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.180734 kubelet[3403]: E0909 04:56:44.180669 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:44.180768 kubelet[3403]: E0909 04:56:44.180756 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.180768 kubelet[3403]: W0909 04:56:44.180760 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.180768 kubelet[3403]: E0909 04:56:44.180765 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:44.180940 kubelet[3403]: E0909 04:56:44.180927 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.180940 kubelet[3403]: W0909 04:56:44.180937 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.180990 kubelet[3403]: E0909 04:56:44.180944 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:44.181254 kubelet[3403]: E0909 04:56:44.181164 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.181254 kubelet[3403]: W0909 04:56:44.181179 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.181254 kubelet[3403]: E0909 04:56:44.181190 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:44.181392 kubelet[3403]: E0909 04:56:44.181382 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.181440 kubelet[3403]: W0909 04:56:44.181430 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.181488 kubelet[3403]: E0909 04:56:44.181478 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:44.181667 kubelet[3403]: E0909 04:56:44.181657 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.181807 kubelet[3403]: W0909 04:56:44.181721 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.181807 kubelet[3403]: E0909 04:56:44.181735 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:44.181950 kubelet[3403]: E0909 04:56:44.181939 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.182007 kubelet[3403]: W0909 04:56:44.181997 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.182049 kubelet[3403]: E0909 04:56:44.182040 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:44.182304 kubelet[3403]: E0909 04:56:44.182212 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.182304 kubelet[3403]: W0909 04:56:44.182222 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.182304 kubelet[3403]: E0909 04:56:44.182230 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:44.182452 kubelet[3403]: E0909 04:56:44.182441 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.182496 kubelet[3403]: W0909 04:56:44.182487 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.182886 kubelet[3403]: E0909 04:56:44.182532 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:56:44.182981 kubelet[3403]: E0909 04:56:44.182970 3403 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:56:44.183040 kubelet[3403]: W0909 04:56:44.183029 3403 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:56:44.183097 kubelet[3403]: E0909 04:56:44.183085 3403 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:56:44.527248 containerd[1873]: time="2025-09-09T04:56:44.527118816Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:44.529590 containerd[1873]: time="2025-09-09T04:56:44.529558624Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 9 04:56:44.531511 containerd[1873]: time="2025-09-09T04:56:44.531485279Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:44.535656 containerd[1873]: time="2025-09-09T04:56:44.535629895Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:44.536180 containerd[1873]: time="2025-09-09T04:56:44.535862575Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.19249854s" Sep 9 04:56:44.536180 containerd[1873]: time="2025-09-09T04:56:44.535903936Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 9 04:56:44.543023 containerd[1873]: time="2025-09-09T04:56:44.542995625Z" level=info msg="CreateContainer within sandbox \"c0ca3f7b7ef145c81b3f1751b5686588ea4f6f4ff909ab73a5845807fced2ff4\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 04:56:44.558766 containerd[1873]: time="2025-09-09T04:56:44.558089400Z" level=info msg="Container adf14c6ddd7bd07c7d1ee992da6074057f955716d07e095142e6634246c1480d: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:44.573036 containerd[1873]: time="2025-09-09T04:56:44.573006161Z" level=info msg="CreateContainer within sandbox \"c0ca3f7b7ef145c81b3f1751b5686588ea4f6f4ff909ab73a5845807fced2ff4\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"adf14c6ddd7bd07c7d1ee992da6074057f955716d07e095142e6634246c1480d\"" Sep 9 04:56:44.574530 containerd[1873]: time="2025-09-09T04:56:44.574300443Z" level=info msg="StartContainer for \"adf14c6ddd7bd07c7d1ee992da6074057f955716d07e095142e6634246c1480d\"" Sep 9 04:56:44.577889 containerd[1873]: time="2025-09-09T04:56:44.577790422Z" level=info msg="connecting to shim adf14c6ddd7bd07c7d1ee992da6074057f955716d07e095142e6634246c1480d" address="unix:///run/containerd/s/43234f7a33ce59e8c108cfd983a6827cca60c30f589d89a9e572bf3732a46d75" protocol=ttrpc version=3 Sep 9 04:56:44.603019 systemd[1]: Started cri-containerd-adf14c6ddd7bd07c7d1ee992da6074057f955716d07e095142e6634246c1480d.scope - libcontainer container adf14c6ddd7bd07c7d1ee992da6074057f955716d07e095142e6634246c1480d. Sep 9 04:56:44.637713 containerd[1873]: time="2025-09-09T04:56:44.637673938Z" level=info msg="StartContainer for \"adf14c6ddd7bd07c7d1ee992da6074057f955716d07e095142e6634246c1480d\" returns successfully" Sep 9 04:56:44.642300 systemd[1]: cri-containerd-adf14c6ddd7bd07c7d1ee992da6074057f955716d07e095142e6634246c1480d.scope: Deactivated successfully. 
Sep 9 04:56:44.645839 containerd[1873]: time="2025-09-09T04:56:44.645794133Z" level=info msg="TaskExit event in podsandbox handler container_id:\"adf14c6ddd7bd07c7d1ee992da6074057f955716d07e095142e6634246c1480d\" id:\"adf14c6ddd7bd07c7d1ee992da6074057f955716d07e095142e6634246c1480d\" pid:4075 exited_at:{seconds:1757393804 nanos:645152872}" Sep 9 04:56:44.645966 containerd[1873]: time="2025-09-09T04:56:44.645947866Z" level=info msg="received exit event container_id:\"adf14c6ddd7bd07c7d1ee992da6074057f955716d07e095142e6634246c1480d\" id:\"adf14c6ddd7bd07c7d1ee992da6074057f955716d07e095142e6634246c1480d\" pid:4075 exited_at:{seconds:1757393804 nanos:645152872}" Sep 9 04:56:44.661648 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-adf14c6ddd7bd07c7d1ee992da6074057f955716d07e095142e6634246c1480d-rootfs.mount: Deactivated successfully. Sep 9 04:56:45.155898 kubelet[3403]: I0909 04:56:45.155722 3403 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:56:46.071223 kubelet[3403]: E0909 04:56:46.071175 3403 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vm259" podUID="aa78c1cd-bf72-4406-ba6d-7cd42efb9427" Sep 9 04:56:46.161963 containerd[1873]: time="2025-09-09T04:56:46.161924837Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 04:56:48.071107 kubelet[3403]: E0909 04:56:48.071063 3403 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vm259" podUID="aa78c1cd-bf72-4406-ba6d-7cd42efb9427" Sep 9 04:56:48.393740 containerd[1873]: time="2025-09-09T04:56:48.393628382Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:48.396048 containerd[1873]: time="2025-09-09T04:56:48.395930143Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 9 04:56:48.398770 containerd[1873]: time="2025-09-09T04:56:48.398300371Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:48.402234 containerd[1873]: time="2025-09-09T04:56:48.402211663Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:48.402523 containerd[1873]: time="2025-09-09T04:56:48.402498057Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 2.240455255s" Sep 9 04:56:48.402523 containerd[1873]: time="2025-09-09T04:56:48.402523841Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 9 04:56:48.409320 containerd[1873]: time="2025-09-09T04:56:48.409296329Z" level=info msg="CreateContainer within sandbox \"c0ca3f7b7ef145c81b3f1751b5686588ea4f6f4ff909ab73a5845807fced2ff4\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 04:56:48.427888 containerd[1873]: time="2025-09-09T04:56:48.425485148Z" level=info msg="Container c53c745b7cb00b633d6b751f118761b77c3c7aadc4503de65005e083cf4c0f3b: CDI devices from CRI Config.CDIDevices: 
[]" Sep 9 04:56:48.439895 containerd[1873]: time="2025-09-09T04:56:48.439853102Z" level=info msg="CreateContainer within sandbox \"c0ca3f7b7ef145c81b3f1751b5686588ea4f6f4ff909ab73a5845807fced2ff4\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"c53c745b7cb00b633d6b751f118761b77c3c7aadc4503de65005e083cf4c0f3b\"" Sep 9 04:56:48.441653 containerd[1873]: time="2025-09-09T04:56:48.440215937Z" level=info msg="StartContainer for \"c53c745b7cb00b633d6b751f118761b77c3c7aadc4503de65005e083cf4c0f3b\"" Sep 9 04:56:48.441653 containerd[1873]: time="2025-09-09T04:56:48.441088589Z" level=info msg="connecting to shim c53c745b7cb00b633d6b751f118761b77c3c7aadc4503de65005e083cf4c0f3b" address="unix:///run/containerd/s/43234f7a33ce59e8c108cfd983a6827cca60c30f589d89a9e572bf3732a46d75" protocol=ttrpc version=3 Sep 9 04:56:48.464014 systemd[1]: Started cri-containerd-c53c745b7cb00b633d6b751f118761b77c3c7aadc4503de65005e083cf4c0f3b.scope - libcontainer container c53c745b7cb00b633d6b751f118761b77c3c7aadc4503de65005e083cf4c0f3b. Sep 9 04:56:48.497888 containerd[1873]: time="2025-09-09T04:56:48.497831092Z" level=info msg="StartContainer for \"c53c745b7cb00b633d6b751f118761b77c3c7aadc4503de65005e083cf4c0f3b\" returns successfully" Sep 9 04:56:49.622699 containerd[1873]: time="2025-09-09T04:56:49.622659022Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 04:56:49.625982 systemd[1]: cri-containerd-c53c745b7cb00b633d6b751f118761b77c3c7aadc4503de65005e083cf4c0f3b.scope: Deactivated successfully. Sep 9 04:56:49.626665 systemd[1]: cri-containerd-c53c745b7cb00b633d6b751f118761b77c3c7aadc4503de65005e083cf4c0f3b.scope: Consumed 311ms CPU time, 186.3M memory peak, 165.8M written to disk. 
Sep 9 04:56:49.627489 containerd[1873]: time="2025-09-09T04:56:49.627423158Z" level=info msg="received exit event container_id:\"c53c745b7cb00b633d6b751f118761b77c3c7aadc4503de65005e083cf4c0f3b\" id:\"c53c745b7cb00b633d6b751f118761b77c3c7aadc4503de65005e083cf4c0f3b\" pid:4133 exited_at:{seconds:1757393809 nanos:627241744}" Sep 9 04:56:49.628110 containerd[1873]: time="2025-09-09T04:56:49.628003016Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c53c745b7cb00b633d6b751f118761b77c3c7aadc4503de65005e083cf4c0f3b\" id:\"c53c745b7cb00b633d6b751f118761b77c3c7aadc4503de65005e083cf4c0f3b\" pid:4133 exited_at:{seconds:1757393809 nanos:627241744}" Sep 9 04:56:49.632597 kubelet[3403]: I0909 04:56:49.632067 3403 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 9 04:56:49.651645 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c53c745b7cb00b633d6b751f118761b77c3c7aadc4503de65005e083cf4c0f3b-rootfs.mount: Deactivated successfully. Sep 9 04:56:50.452652 systemd[1]: Created slice kubepods-burstable-pod6ab60380_d76f_4b58_882b_8a015f1d3fb8.slice - libcontainer container kubepods-burstable-pod6ab60380_d76f_4b58_882b_8a015f1d3fb8.slice. Sep 9 04:56:50.505107 systemd[1]: Created slice kubepods-burstable-pod49d4a79b_d00b_46a0_9f2c_ce1ae43d94e8.slice - libcontainer container kubepods-burstable-pod49d4a79b_d00b_46a0_9f2c_ce1ae43d94e8.slice. Sep 9 04:56:50.512528 systemd[1]: Created slice kubepods-besteffort-pod3aca8d68_baeb_4a2b_974a_4711dcb4eb5e.slice - libcontainer container kubepods-besteffort-pod3aca8d68_baeb_4a2b_974a_4711dcb4eb5e.slice. Sep 9 04:56:50.518248 systemd[1]: Created slice kubepods-besteffort-podaa78c1cd_bf72_4406_ba6d_7cd42efb9427.slice - libcontainer container kubepods-besteffort-podaa78c1cd_bf72_4406_ba6d_7cd42efb9427.slice. 
Sep 9 04:56:50.520911 kubelet[3403]: I0909 04:56:50.520577 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49d4a79b-d00b-46a0-9f2c-ce1ae43d94e8-config-volume\") pod \"coredns-674b8bbfcf-2mkdx\" (UID: \"49d4a79b-d00b-46a0-9f2c-ce1ae43d94e8\") " pod="kube-system/coredns-674b8bbfcf-2mkdx" Sep 9 04:56:50.520911 kubelet[3403]: I0909 04:56:50.520636 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np4dc\" (UniqueName: \"kubernetes.io/projected/6ab60380-d76f-4b58-882b-8a015f1d3fb8-kube-api-access-np4dc\") pod \"coredns-674b8bbfcf-bdn2k\" (UID: \"6ab60380-d76f-4b58-882b-8a015f1d3fb8\") " pod="kube-system/coredns-674b8bbfcf-bdn2k" Sep 9 04:56:50.520911 kubelet[3403]: I0909 04:56:50.520652 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kjgw\" (UniqueName: \"kubernetes.io/projected/3aca8d68-baeb-4a2b-974a-4711dcb4eb5e-kube-api-access-8kjgw\") pod \"calico-kube-controllers-549ccbd58c-24j9h\" (UID: \"3aca8d68-baeb-4a2b-974a-4711dcb4eb5e\") " pod="calico-system/calico-kube-controllers-549ccbd58c-24j9h" Sep 9 04:56:50.520911 kubelet[3403]: I0909 04:56:50.520667 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d83ef8c9-b986-4f3b-8e65-4557cf85482c-calico-apiserver-certs\") pod \"calico-apiserver-7f6f5df448-vgtr8\" (UID: \"d83ef8c9-b986-4f3b-8e65-4557cf85482c\") " pod="calico-apiserver/calico-apiserver-7f6f5df448-vgtr8" Sep 9 04:56:50.520911 kubelet[3403]: I0909 04:56:50.520690 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3aca8d68-baeb-4a2b-974a-4711dcb4eb5e-tigera-ca-bundle\") pod 
\"calico-kube-controllers-549ccbd58c-24j9h\" (UID: \"3aca8d68-baeb-4a2b-974a-4711dcb4eb5e\") " pod="calico-system/calico-kube-controllers-549ccbd58c-24j9h" Sep 9 04:56:50.521090 kubelet[3403]: I0909 04:56:50.520700 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swhcf\" (UniqueName: \"kubernetes.io/projected/49d4a79b-d00b-46a0-9f2c-ce1ae43d94e8-kube-api-access-swhcf\") pod \"coredns-674b8bbfcf-2mkdx\" (UID: \"49d4a79b-d00b-46a0-9f2c-ce1ae43d94e8\") " pod="kube-system/coredns-674b8bbfcf-2mkdx" Sep 9 04:56:50.521090 kubelet[3403]: I0909 04:56:50.520711 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l5ng\" (UniqueName: \"kubernetes.io/projected/d83ef8c9-b986-4f3b-8e65-4557cf85482c-kube-api-access-8l5ng\") pod \"calico-apiserver-7f6f5df448-vgtr8\" (UID: \"d83ef8c9-b986-4f3b-8e65-4557cf85482c\") " pod="calico-apiserver/calico-apiserver-7f6f5df448-vgtr8" Sep 9 04:56:50.521090 kubelet[3403]: I0909 04:56:50.520725 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ab60380-d76f-4b58-882b-8a015f1d3fb8-config-volume\") pod \"coredns-674b8bbfcf-bdn2k\" (UID: \"6ab60380-d76f-4b58-882b-8a015f1d3fb8\") " pod="kube-system/coredns-674b8bbfcf-bdn2k" Sep 9 04:56:50.523282 containerd[1873]: time="2025-09-09T04:56:50.522729981Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vm259,Uid:aa78c1cd-bf72-4406-ba6d-7cd42efb9427,Namespace:calico-system,Attempt:0,}" Sep 9 04:56:50.527036 systemd[1]: Created slice kubepods-besteffort-podd83ef8c9_b986_4f3b_8e65_4557cf85482c.slice - libcontainer container kubepods-besteffort-podd83ef8c9_b986_4f3b_8e65_4557cf85482c.slice. 
Sep 9 04:56:50.539524 systemd[1]: Created slice kubepods-besteffort-pod4fdc260c_5577_4668_b511_8035243e1c1f.slice - libcontainer container kubepods-besteffort-pod4fdc260c_5577_4668_b511_8035243e1c1f.slice. Sep 9 04:56:50.565727 systemd[1]: Created slice kubepods-besteffort-pod74e7e670_28ae_4273_9e5e_ade00eca3b9e.slice - libcontainer container kubepods-besteffort-pod74e7e670_28ae_4273_9e5e_ade00eca3b9e.slice. Sep 9 04:56:50.571177 systemd[1]: Created slice kubepods-besteffort-pod2a41e2c8_6787_485d_af56_a43f1e724c36.slice - libcontainer container kubepods-besteffort-pod2a41e2c8_6787_485d_af56_a43f1e724c36.slice. Sep 9 04:56:50.579188 containerd[1873]: time="2025-09-09T04:56:50.579149625Z" level=error msg="Failed to destroy network for sandbox \"385dd919b4be59954dbb594df4dd0baaf38f4e7dfe8a26a78bede0c9bcbab1b6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.581021 systemd[1]: run-netns-cni\x2d6b7974f8\x2da4b3\x2d33ca\x2d7566\x2d58b93d4f2049.mount: Deactivated successfully. 
Sep 9 04:56:50.584551 containerd[1873]: time="2025-09-09T04:56:50.584510300Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vm259,Uid:aa78c1cd-bf72-4406-ba6d-7cd42efb9427,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"385dd919b4be59954dbb594df4dd0baaf38f4e7dfe8a26a78bede0c9bcbab1b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.584772 kubelet[3403]: E0909 04:56:50.584737 3403 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"385dd919b4be59954dbb594df4dd0baaf38f4e7dfe8a26a78bede0c9bcbab1b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.584819 kubelet[3403]: E0909 04:56:50.584796 3403 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"385dd919b4be59954dbb594df4dd0baaf38f4e7dfe8a26a78bede0c9bcbab1b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vm259" Sep 9 04:56:50.584819 kubelet[3403]: E0909 04:56:50.584812 3403 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"385dd919b4be59954dbb594df4dd0baaf38f4e7dfe8a26a78bede0c9bcbab1b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vm259" Sep 9 
04:56:50.584900 kubelet[3403]: E0909 04:56:50.584848 3403 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vm259_calico-system(aa78c1cd-bf72-4406-ba6d-7cd42efb9427)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-vm259_calico-system(aa78c1cd-bf72-4406-ba6d-7cd42efb9427)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"385dd919b4be59954dbb594df4dd0baaf38f4e7dfe8a26a78bede0c9bcbab1b6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vm259" podUID="aa78c1cd-bf72-4406-ba6d-7cd42efb9427" Sep 9 04:56:50.621030 kubelet[3403]: I0909 04:56:50.620991 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/74e7e670-28ae-4273-9e5e-ade00eca3b9e-whisker-backend-key-pair\") pod \"whisker-679d689cf7-cm2jc\" (UID: \"74e7e670-28ae-4273-9e5e-ade00eca3b9e\") " pod="calico-system/whisker-679d689cf7-cm2jc" Sep 9 04:56:50.621149 kubelet[3403]: I0909 04:56:50.621050 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxmdh\" (UniqueName: \"kubernetes.io/projected/4fdc260c-5577-4668-b511-8035243e1c1f-kube-api-access-hxmdh\") pod \"calico-apiserver-7f6f5df448-5fphh\" (UID: \"4fdc260c-5577-4668-b511-8035243e1c1f\") " pod="calico-apiserver/calico-apiserver-7f6f5df448-5fphh" Sep 9 04:56:50.621149 kubelet[3403]: I0909 04:56:50.621063 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74e7e670-28ae-4273-9e5e-ade00eca3b9e-whisker-ca-bundle\") pod \"whisker-679d689cf7-cm2jc\" (UID: \"74e7e670-28ae-4273-9e5e-ade00eca3b9e\") " 
pod="calico-system/whisker-679d689cf7-cm2jc" Sep 9 04:56:50.621149 kubelet[3403]: I0909 04:56:50.621088 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a41e2c8-6787-485d-af56-a43f1e724c36-config\") pod \"goldmane-54d579b49d-cn2np\" (UID: \"2a41e2c8-6787-485d-af56-a43f1e724c36\") " pod="calico-system/goldmane-54d579b49d-cn2np" Sep 9 04:56:50.621149 kubelet[3403]: I0909 04:56:50.621101 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6f2q\" (UniqueName: \"kubernetes.io/projected/2a41e2c8-6787-485d-af56-a43f1e724c36-kube-api-access-k6f2q\") pod \"goldmane-54d579b49d-cn2np\" (UID: \"2a41e2c8-6787-485d-af56-a43f1e724c36\") " pod="calico-system/goldmane-54d579b49d-cn2np" Sep 9 04:56:50.621149 kubelet[3403]: I0909 04:56:50.621128 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvvbv\" (UniqueName: \"kubernetes.io/projected/74e7e670-28ae-4273-9e5e-ade00eca3b9e-kube-api-access-vvvbv\") pod \"whisker-679d689cf7-cm2jc\" (UID: \"74e7e670-28ae-4273-9e5e-ade00eca3b9e\") " pod="calico-system/whisker-679d689cf7-cm2jc" Sep 9 04:56:50.621278 kubelet[3403]: I0909 04:56:50.621156 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a41e2c8-6787-485d-af56-a43f1e724c36-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-cn2np\" (UID: \"2a41e2c8-6787-485d-af56-a43f1e724c36\") " pod="calico-system/goldmane-54d579b49d-cn2np" Sep 9 04:56:50.621278 kubelet[3403]: I0909 04:56:50.621165 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/2a41e2c8-6787-485d-af56-a43f1e724c36-goldmane-key-pair\") pod \"goldmane-54d579b49d-cn2np\" (UID: 
\"2a41e2c8-6787-485d-af56-a43f1e724c36\") " pod="calico-system/goldmane-54d579b49d-cn2np" Sep 9 04:56:50.621278 kubelet[3403]: I0909 04:56:50.621177 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4fdc260c-5577-4668-b511-8035243e1c1f-calico-apiserver-certs\") pod \"calico-apiserver-7f6f5df448-5fphh\" (UID: \"4fdc260c-5577-4668-b511-8035243e1c1f\") " pod="calico-apiserver/calico-apiserver-7f6f5df448-5fphh" Sep 9 04:56:50.756212 containerd[1873]: time="2025-09-09T04:56:50.755663141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-bdn2k,Uid:6ab60380-d76f-4b58-882b-8a015f1d3fb8,Namespace:kube-system,Attempt:0,}" Sep 9 04:56:50.790427 containerd[1873]: time="2025-09-09T04:56:50.790379942Z" level=error msg="Failed to destroy network for sandbox \"5aa80c37b118cec49bd2ae0fda51ba59b9fbee35111099e45ba756ee83b4d1ce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.793286 containerd[1873]: time="2025-09-09T04:56:50.793239105Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-bdn2k,Uid:6ab60380-d76f-4b58-882b-8a015f1d3fb8,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5aa80c37b118cec49bd2ae0fda51ba59b9fbee35111099e45ba756ee83b4d1ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.793490 kubelet[3403]: E0909 04:56:50.793443 3403 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5aa80c37b118cec49bd2ae0fda51ba59b9fbee35111099e45ba756ee83b4d1ce\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.793784 kubelet[3403]: E0909 04:56:50.793513 3403 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5aa80c37b118cec49bd2ae0fda51ba59b9fbee35111099e45ba756ee83b4d1ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-bdn2k" Sep 9 04:56:50.793784 kubelet[3403]: E0909 04:56:50.793538 3403 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5aa80c37b118cec49bd2ae0fda51ba59b9fbee35111099e45ba756ee83b4d1ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-bdn2k" Sep 9 04:56:50.793784 kubelet[3403]: E0909 04:56:50.793580 3403 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-bdn2k_kube-system(6ab60380-d76f-4b58-882b-8a015f1d3fb8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-bdn2k_kube-system(6ab60380-d76f-4b58-882b-8a015f1d3fb8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5aa80c37b118cec49bd2ae0fda51ba59b9fbee35111099e45ba756ee83b4d1ce\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-bdn2k" podUID="6ab60380-d76f-4b58-882b-8a015f1d3fb8" Sep 9 04:56:50.810415 containerd[1873]: 
time="2025-09-09T04:56:50.810172588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2mkdx,Uid:49d4a79b-d00b-46a0-9f2c-ce1ae43d94e8,Namespace:kube-system,Attempt:0,}" Sep 9 04:56:50.821997 containerd[1873]: time="2025-09-09T04:56:50.821963875Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-549ccbd58c-24j9h,Uid:3aca8d68-baeb-4a2b-974a-4711dcb4eb5e,Namespace:calico-system,Attempt:0,}" Sep 9 04:56:50.834362 containerd[1873]: time="2025-09-09T04:56:50.834323237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6f5df448-vgtr8,Uid:d83ef8c9-b986-4f3b-8e65-4557cf85482c,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:56:50.852201 containerd[1873]: time="2025-09-09T04:56:50.852098723Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6f5df448-5fphh,Uid:4fdc260c-5577-4668-b511-8035243e1c1f,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:56:50.861988 containerd[1873]: time="2025-09-09T04:56:50.861943012Z" level=error msg="Failed to destroy network for sandbox \"64d83b07a1360ad085f8dd2b2c6d658d28db9e958fd9f6d7ac3c995e17268dd5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.865561 containerd[1873]: time="2025-09-09T04:56:50.865521270Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2mkdx,Uid:49d4a79b-d00b-46a0-9f2c-ce1ae43d94e8,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"64d83b07a1360ad085f8dd2b2c6d658d28db9e958fd9f6d7ac3c995e17268dd5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.866188 kubelet[3403]: E0909 04:56:50.865864 3403 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"64d83b07a1360ad085f8dd2b2c6d658d28db9e958fd9f6d7ac3c995e17268dd5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.866188 kubelet[3403]: E0909 04:56:50.865949 3403 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"64d83b07a1360ad085f8dd2b2c6d658d28db9e958fd9f6d7ac3c995e17268dd5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-2mkdx" Sep 9 04:56:50.866188 kubelet[3403]: E0909 04:56:50.865966 3403 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"64d83b07a1360ad085f8dd2b2c6d658d28db9e958fd9f6d7ac3c995e17268dd5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-2mkdx" Sep 9 04:56:50.867045 kubelet[3403]: E0909 04:56:50.866006 3403 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-2mkdx_kube-system(49d4a79b-d00b-46a0-9f2c-ce1ae43d94e8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-2mkdx_kube-system(49d4a79b-d00b-46a0-9f2c-ce1ae43d94e8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"64d83b07a1360ad085f8dd2b2c6d658d28db9e958fd9f6d7ac3c995e17268dd5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-2mkdx" podUID="49d4a79b-d00b-46a0-9f2c-ce1ae43d94e8" Sep 9 04:56:50.877736 containerd[1873]: time="2025-09-09T04:56:50.877691282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-679d689cf7-cm2jc,Uid:74e7e670-28ae-4273-9e5e-ade00eca3b9e,Namespace:calico-system,Attempt:0,}" Sep 9 04:56:50.878321 containerd[1873]: time="2025-09-09T04:56:50.878296709Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-cn2np,Uid:2a41e2c8-6787-485d-af56-a43f1e724c36,Namespace:calico-system,Attempt:0,}" Sep 9 04:56:50.886165 containerd[1873]: time="2025-09-09T04:56:50.886140223Z" level=error msg="Failed to destroy network for sandbox \"4b2136cebd876c816a1d62226b7a56583a8e5fdd2a0e0147954fc18d6897301b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.890748 containerd[1873]: time="2025-09-09T04:56:50.890716288Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-549ccbd58c-24j9h,Uid:3aca8d68-baeb-4a2b-974a-4711dcb4eb5e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b2136cebd876c816a1d62226b7a56583a8e5fdd2a0e0147954fc18d6897301b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.892020 kubelet[3403]: E0909 04:56:50.891981 3403 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b2136cebd876c816a1d62226b7a56583a8e5fdd2a0e0147954fc18d6897301b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.892098 kubelet[3403]: E0909 04:56:50.892030 3403 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b2136cebd876c816a1d62226b7a56583a8e5fdd2a0e0147954fc18d6897301b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-549ccbd58c-24j9h" Sep 9 04:56:50.892098 kubelet[3403]: E0909 04:56:50.892050 3403 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b2136cebd876c816a1d62226b7a56583a8e5fdd2a0e0147954fc18d6897301b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-549ccbd58c-24j9h" Sep 9 04:56:50.892098 kubelet[3403]: E0909 04:56:50.892086 3403 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-549ccbd58c-24j9h_calico-system(3aca8d68-baeb-4a2b-974a-4711dcb4eb5e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-549ccbd58c-24j9h_calico-system(3aca8d68-baeb-4a2b-974a-4711dcb4eb5e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4b2136cebd876c816a1d62226b7a56583a8e5fdd2a0e0147954fc18d6897301b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-549ccbd58c-24j9h" podUID="3aca8d68-baeb-4a2b-974a-4711dcb4eb5e" Sep 9 04:56:50.920218 containerd[1873]: time="2025-09-09T04:56:50.920166618Z" 
level=error msg="Failed to destroy network for sandbox \"3ce784c4f668cc2e33aacc92d213fdea5e1a02c3dbedbc892a13e2fb6841338a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.923361 containerd[1873]: time="2025-09-09T04:56:50.923116776Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6f5df448-vgtr8,Uid:d83ef8c9-b986-4f3b-8e65-4557cf85482c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ce784c4f668cc2e33aacc92d213fdea5e1a02c3dbedbc892a13e2fb6841338a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.923833 kubelet[3403]: E0909 04:56:50.923786 3403 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ce784c4f668cc2e33aacc92d213fdea5e1a02c3dbedbc892a13e2fb6841338a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.924081 kubelet[3403]: E0909 04:56:50.923995 3403 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ce784c4f668cc2e33aacc92d213fdea5e1a02c3dbedbc892a13e2fb6841338a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f6f5df448-vgtr8" Sep 9 04:56:50.924081 kubelet[3403]: E0909 04:56:50.924020 3403 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"3ce784c4f668cc2e33aacc92d213fdea5e1a02c3dbedbc892a13e2fb6841338a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f6f5df448-vgtr8" Sep 9 04:56:50.924373 kubelet[3403]: E0909 04:56:50.924229 3403 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7f6f5df448-vgtr8_calico-apiserver(d83ef8c9-b986-4f3b-8e65-4557cf85482c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7f6f5df448-vgtr8_calico-apiserver(d83ef8c9-b986-4f3b-8e65-4557cf85482c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3ce784c4f668cc2e33aacc92d213fdea5e1a02c3dbedbc892a13e2fb6841338a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7f6f5df448-vgtr8" podUID="d83ef8c9-b986-4f3b-8e65-4557cf85482c" Sep 9 04:56:50.930406 containerd[1873]: time="2025-09-09T04:56:50.930363695Z" level=error msg="Failed to destroy network for sandbox \"414717b7174e0d95064132713c6a7a7c8f8fa13508ecdc982c50c3fd82f7c1fa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.933699 containerd[1873]: time="2025-09-09T04:56:50.933616126Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6f5df448-5fphh,Uid:4fdc260c-5577-4668-b511-8035243e1c1f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"414717b7174e0d95064132713c6a7a7c8f8fa13508ecdc982c50c3fd82f7c1fa\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.933961 kubelet[3403]: E0909 04:56:50.933894 3403 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"414717b7174e0d95064132713c6a7a7c8f8fa13508ecdc982c50c3fd82f7c1fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.934082 kubelet[3403]: E0909 04:56:50.934026 3403 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"414717b7174e0d95064132713c6a7a7c8f8fa13508ecdc982c50c3fd82f7c1fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f6f5df448-5fphh" Sep 9 04:56:50.934082 kubelet[3403]: E0909 04:56:50.934055 3403 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"414717b7174e0d95064132713c6a7a7c8f8fa13508ecdc982c50c3fd82f7c1fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f6f5df448-5fphh" Sep 9 04:56:50.934223 kubelet[3403]: E0909 04:56:50.934172 3403 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7f6f5df448-5fphh_calico-apiserver(4fdc260c-5577-4668-b511-8035243e1c1f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-7f6f5df448-5fphh_calico-apiserver(4fdc260c-5577-4668-b511-8035243e1c1f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"414717b7174e0d95064132713c6a7a7c8f8fa13508ecdc982c50c3fd82f7c1fa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7f6f5df448-5fphh" podUID="4fdc260c-5577-4668-b511-8035243e1c1f" Sep 9 04:56:50.951293 containerd[1873]: time="2025-09-09T04:56:50.951252976Z" level=error msg="Failed to destroy network for sandbox \"c28a8222f11463ebfda4c64aa44ea2558f6630ac3f0d083d14812dbc3d362f0c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.952971 containerd[1873]: time="2025-09-09T04:56:50.952877603Z" level=error msg="Failed to destroy network for sandbox \"cc666dc05f733412ca876979c4954ea24edf91565dd66a8d0a45687a96aa700e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.953819 containerd[1873]: time="2025-09-09T04:56:50.953785992Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-679d689cf7-cm2jc,Uid:74e7e670-28ae-4273-9e5e-ade00eca3b9e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c28a8222f11463ebfda4c64aa44ea2558f6630ac3f0d083d14812dbc3d362f0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.954066 kubelet[3403]: E0909 04:56:50.954006 3403 log.go:32] "RunPodSandbox from runtime service 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c28a8222f11463ebfda4c64aa44ea2558f6630ac3f0d083d14812dbc3d362f0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.954066 kubelet[3403]: E0909 04:56:50.954060 3403 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c28a8222f11463ebfda4c64aa44ea2558f6630ac3f0d083d14812dbc3d362f0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-679d689cf7-cm2jc" Sep 9 04:56:50.954206 kubelet[3403]: E0909 04:56:50.954079 3403 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c28a8222f11463ebfda4c64aa44ea2558f6630ac3f0d083d14812dbc3d362f0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-679d689cf7-cm2jc" Sep 9 04:56:50.954206 kubelet[3403]: E0909 04:56:50.954119 3403 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-679d689cf7-cm2jc_calico-system(74e7e670-28ae-4273-9e5e-ade00eca3b9e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-679d689cf7-cm2jc_calico-system(74e7e670-28ae-4273-9e5e-ade00eca3b9e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c28a8222f11463ebfda4c64aa44ea2558f6630ac3f0d083d14812dbc3d362f0c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" pod="calico-system/whisker-679d689cf7-cm2jc" podUID="74e7e670-28ae-4273-9e5e-ade00eca3b9e" Sep 9 04:56:50.956343 containerd[1873]: time="2025-09-09T04:56:50.956315345Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-cn2np,Uid:2a41e2c8-6787-485d-af56-a43f1e724c36,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc666dc05f733412ca876979c4954ea24edf91565dd66a8d0a45687a96aa700e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.956673 kubelet[3403]: E0909 04:56:50.956628 3403 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc666dc05f733412ca876979c4954ea24edf91565dd66a8d0a45687a96aa700e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:56:50.956836 kubelet[3403]: E0909 04:56:50.956663 3403 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc666dc05f733412ca876979c4954ea24edf91565dd66a8d0a45687a96aa700e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-cn2np" Sep 9 04:56:50.956836 kubelet[3403]: E0909 04:56:50.956767 3403 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc666dc05f733412ca876979c4954ea24edf91565dd66a8d0a45687a96aa700e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-cn2np" Sep 9 04:56:50.956836 kubelet[3403]: E0909 04:56:50.956801 3403 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-cn2np_calico-system(2a41e2c8-6787-485d-af56-a43f1e724c36)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-cn2np_calico-system(2a41e2c8-6787-485d-af56-a43f1e724c36)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cc666dc05f733412ca876979c4954ea24edf91565dd66a8d0a45687a96aa700e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-cn2np" podUID="2a41e2c8-6787-485d-af56-a43f1e724c36" Sep 9 04:56:51.174092 containerd[1873]: time="2025-09-09T04:56:51.174038268Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 04:56:54.786138 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount434870355.mount: Deactivated successfully. 
Sep 9 04:56:55.199765 containerd[1873]: time="2025-09-09T04:56:55.199719767Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:55.202414 containerd[1873]: time="2025-09-09T04:56:55.202391125Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 9 04:56:55.204949 containerd[1873]: time="2025-09-09T04:56:55.204924911Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:55.208311 containerd[1873]: time="2025-09-09T04:56:55.208283275Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:55.208904 containerd[1873]: time="2025-09-09T04:56:55.208863558Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 4.034791401s" Sep 9 04:56:55.208936 containerd[1873]: time="2025-09-09T04:56:55.208908008Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 9 04:56:55.227899 containerd[1873]: time="2025-09-09T04:56:55.227664173Z" level=info msg="CreateContainer within sandbox \"c0ca3f7b7ef145c81b3f1751b5686588ea4f6f4ff909ab73a5845807fced2ff4\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 04:56:55.249673 containerd[1873]: time="2025-09-09T04:56:55.248014597Z" level=info msg="Container 
26208bde266aa9c726fa5190983d72429c0cbff97f0e5070b7b8638e5288d504: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:55.269531 containerd[1873]: time="2025-09-09T04:56:55.269412871Z" level=info msg="CreateContainer within sandbox \"c0ca3f7b7ef145c81b3f1751b5686588ea4f6f4ff909ab73a5845807fced2ff4\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"26208bde266aa9c726fa5190983d72429c0cbff97f0e5070b7b8638e5288d504\"" Sep 9 04:56:55.270009 containerd[1873]: time="2025-09-09T04:56:55.269983865Z" level=info msg="StartContainer for \"26208bde266aa9c726fa5190983d72429c0cbff97f0e5070b7b8638e5288d504\"" Sep 9 04:56:55.272012 containerd[1873]: time="2025-09-09T04:56:55.271462169Z" level=info msg="connecting to shim 26208bde266aa9c726fa5190983d72429c0cbff97f0e5070b7b8638e5288d504" address="unix:///run/containerd/s/43234f7a33ce59e8c108cfd983a6827cca60c30f589d89a9e572bf3732a46d75" protocol=ttrpc version=3 Sep 9 04:56:55.293015 systemd[1]: Started cri-containerd-26208bde266aa9c726fa5190983d72429c0cbff97f0e5070b7b8638e5288d504.scope - libcontainer container 26208bde266aa9c726fa5190983d72429c0cbff97f0e5070b7b8638e5288d504. Sep 9 04:56:55.329030 containerd[1873]: time="2025-09-09T04:56:55.328991608Z" level=info msg="StartContainer for \"26208bde266aa9c726fa5190983d72429c0cbff97f0e5070b7b8638e5288d504\" returns successfully" Sep 9 04:56:55.644978 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 04:56:55.645106 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 9 04:56:55.854164 kubelet[3403]: I0909 04:56:55.854093 3403 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/74e7e670-28ae-4273-9e5e-ade00eca3b9e-whisker-backend-key-pair\") pod \"74e7e670-28ae-4273-9e5e-ade00eca3b9e\" (UID: \"74e7e670-28ae-4273-9e5e-ade00eca3b9e\") " Sep 9 04:56:55.856427 kubelet[3403]: I0909 04:56:55.854239 3403 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74e7e670-28ae-4273-9e5e-ade00eca3b9e-whisker-ca-bundle\") pod \"74e7e670-28ae-4273-9e5e-ade00eca3b9e\" (UID: \"74e7e670-28ae-4273-9e5e-ade00eca3b9e\") " Sep 9 04:56:55.856427 kubelet[3403]: I0909 04:56:55.854286 3403 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvvbv\" (UniqueName: \"kubernetes.io/projected/74e7e670-28ae-4273-9e5e-ade00eca3b9e-kube-api-access-vvvbv\") pod \"74e7e670-28ae-4273-9e5e-ade00eca3b9e\" (UID: \"74e7e670-28ae-4273-9e5e-ade00eca3b9e\") " Sep 9 04:56:55.860547 kubelet[3403]: I0909 04:56:55.856861 3403 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74e7e670-28ae-4273-9e5e-ade00eca3b9e-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "74e7e670-28ae-4273-9e5e-ade00eca3b9e" (UID: "74e7e670-28ae-4273-9e5e-ade00eca3b9e"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 9 04:56:55.860141 systemd[1]: var-lib-kubelet-pods-74e7e670\x2d28ae\x2d4273\x2d9e5e\x2dade00eca3b9e-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dvvvbv.mount: Deactivated successfully. Sep 9 04:56:55.860222 systemd[1]: var-lib-kubelet-pods-74e7e670\x2d28ae\x2d4273\x2d9e5e\x2dade00eca3b9e-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Sep 9 04:56:55.861970 kubelet[3403]: I0909 04:56:55.861662 3403 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e7e670-28ae-4273-9e5e-ade00eca3b9e-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "74e7e670-28ae-4273-9e5e-ade00eca3b9e" (UID: "74e7e670-28ae-4273-9e5e-ade00eca3b9e"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 04:56:55.863417 kubelet[3403]: I0909 04:56:55.863375 3403 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e7e670-28ae-4273-9e5e-ade00eca3b9e-kube-api-access-vvvbv" (OuterVolumeSpecName: "kube-api-access-vvvbv") pod "74e7e670-28ae-4273-9e5e-ade00eca3b9e" (UID: "74e7e670-28ae-4273-9e5e-ade00eca3b9e"). InnerVolumeSpecName "kube-api-access-vvvbv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 04:56:55.955569 kubelet[3403]: I0909 04:56:55.955465 3403 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vvvbv\" (UniqueName: \"kubernetes.io/projected/74e7e670-28ae-4273-9e5e-ade00eca3b9e-kube-api-access-vvvbv\") on node \"ci-4452.0.0-n-e60618bb0b\" DevicePath \"\"" Sep 9 04:56:55.955569 kubelet[3403]: I0909 04:56:55.955500 3403 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/74e7e670-28ae-4273-9e5e-ade00eca3b9e-whisker-backend-key-pair\") on node \"ci-4452.0.0-n-e60618bb0b\" DevicePath \"\"" Sep 9 04:56:55.955569 kubelet[3403]: I0909 04:56:55.955507 3403 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74e7e670-28ae-4273-9e5e-ade00eca3b9e-whisker-ca-bundle\") on node \"ci-4452.0.0-n-e60618bb0b\" DevicePath \"\"" Sep 9 04:56:56.200711 systemd[1]: Removed slice kubepods-besteffort-pod74e7e670_28ae_4273_9e5e_ade00eca3b9e.slice - libcontainer container 
kubepods-besteffort-pod74e7e670_28ae_4273_9e5e_ade00eca3b9e.slice. Sep 9 04:56:56.224318 kubelet[3403]: I0909 04:56:56.224050 3403 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-26285" podStartSLOduration=1.857591565 podStartE2EDuration="15.224035834s" podCreationTimestamp="2025-09-09 04:56:41 +0000 UTC" firstStartedPulling="2025-09-09 04:56:41.842985475 +0000 UTC m=+18.849384081" lastFinishedPulling="2025-09-09 04:56:55.209429744 +0000 UTC m=+32.215828350" observedRunningTime="2025-09-09 04:56:56.211679571 +0000 UTC m=+33.218078185" watchObservedRunningTime="2025-09-09 04:56:56.224035834 +0000 UTC m=+33.230434440" Sep 9 04:56:56.282900 containerd[1873]: time="2025-09-09T04:56:56.282228015Z" level=info msg="TaskExit event in podsandbox handler container_id:\"26208bde266aa9c726fa5190983d72429c0cbff97f0e5070b7b8638e5288d504\" id:\"d8a6ee13bf63696bec9c365d6afe73cc0421254a653ef34f618904cb76f932ee\" pid:4460 exit_status:1 exited_at:{seconds:1757393816 nanos:281838714}" Sep 9 04:56:56.305141 systemd[1]: Created slice kubepods-besteffort-pod70667ba6_b5e6_4182_b98d_723d98164839.slice - libcontainer container kubepods-besteffort-pod70667ba6_b5e6_4182_b98d_723d98164839.slice. 
Sep 9 04:56:56.361676 kubelet[3403]: I0909 04:56:56.361612 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/70667ba6-b5e6-4182-b98d-723d98164839-whisker-backend-key-pair\") pod \"whisker-f47cf97b4-vw9cx\" (UID: \"70667ba6-b5e6-4182-b98d-723d98164839\") " pod="calico-system/whisker-f47cf97b4-vw9cx" Sep 9 04:56:56.361816 kubelet[3403]: I0909 04:56:56.361692 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70667ba6-b5e6-4182-b98d-723d98164839-whisker-ca-bundle\") pod \"whisker-f47cf97b4-vw9cx\" (UID: \"70667ba6-b5e6-4182-b98d-723d98164839\") " pod="calico-system/whisker-f47cf97b4-vw9cx" Sep 9 04:56:56.361816 kubelet[3403]: I0909 04:56:56.361733 3403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnsx5\" (UniqueName: \"kubernetes.io/projected/70667ba6-b5e6-4182-b98d-723d98164839-kube-api-access-wnsx5\") pod \"whisker-f47cf97b4-vw9cx\" (UID: \"70667ba6-b5e6-4182-b98d-723d98164839\") " pod="calico-system/whisker-f47cf97b4-vw9cx" Sep 9 04:56:56.609946 containerd[1873]: time="2025-09-09T04:56:56.609903142Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-f47cf97b4-vw9cx,Uid:70667ba6-b5e6-4182-b98d-723d98164839,Namespace:calico-system,Attempt:0,}" Sep 9 04:56:56.725865 systemd-networkd[1682]: cali6675255dc74: Link UP Sep 9 04:56:56.726076 systemd-networkd[1682]: cali6675255dc74: Gained carrier Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.630 [INFO][4474] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.653 [INFO][4474] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4452.0.0--n--e60618bb0b-k8s-whisker--f47cf97b4--vw9cx-eth0 whisker-f47cf97b4- calico-system 70667ba6-b5e6-4182-b98d-723d98164839 870 0 2025-09-09 04:56:56 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:f47cf97b4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4452.0.0-n-e60618bb0b whisker-f47cf97b4-vw9cx eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali6675255dc74 [] [] }} ContainerID="a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" Namespace="calico-system" Pod="whisker-f47cf97b4-vw9cx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-whisker--f47cf97b4--vw9cx-" Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.653 [INFO][4474] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" Namespace="calico-system" Pod="whisker-f47cf97b4-vw9cx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-whisker--f47cf97b4--vw9cx-eth0" Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.670 [INFO][4486] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" HandleID="k8s-pod-network.a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" Workload="ci--4452.0.0--n--e60618bb0b-k8s-whisker--f47cf97b4--vw9cx-eth0" Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.670 [INFO][4486] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" HandleID="k8s-pod-network.a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" Workload="ci--4452.0.0--n--e60618bb0b-k8s-whisker--f47cf97b4--vw9cx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024afc0), Attrs:map[string]string{"namespace":"calico-system", 
"node":"ci-4452.0.0-n-e60618bb0b", "pod":"whisker-f47cf97b4-vw9cx", "timestamp":"2025-09-09 04:56:56.670079019 +0000 UTC"}, Hostname:"ci-4452.0.0-n-e60618bb0b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.670 [INFO][4486] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.670 [INFO][4486] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.670 [INFO][4486] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-e60618bb0b' Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.676 [INFO][4486] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.679 [INFO][4486] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.682 [INFO][4486] ipam/ipam.go 511: Trying affinity for 192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.683 [INFO][4486] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.684 [INFO][4486] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.685 [INFO][4486] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.128/26 
handle="k8s-pod-network.a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.686 [INFO][4486] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41 Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.694 [INFO][4486] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.128/26 handle="k8s-pod-network.a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.699 [INFO][4486] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.35.129/26] block=192.168.35.128/26 handle="k8s-pod-network.a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.699 [INFO][4486] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.129/26] handle="k8s-pod-network.a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.699 [INFO][4486] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 04:56:56.742985 containerd[1873]: 2025-09-09 04:56:56.700 [INFO][4486] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.129/26] IPv6=[] ContainerID="a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" HandleID="k8s-pod-network.a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" Workload="ci--4452.0.0--n--e60618bb0b-k8s-whisker--f47cf97b4--vw9cx-eth0" Sep 9 04:56:56.743626 containerd[1873]: 2025-09-09 04:56:56.703 [INFO][4474] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" Namespace="calico-system" Pod="whisker-f47cf97b4-vw9cx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-whisker--f47cf97b4--vw9cx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-whisker--f47cf97b4--vw9cx-eth0", GenerateName:"whisker-f47cf97b4-", Namespace:"calico-system", SelfLink:"", UID:"70667ba6-b5e6-4182-b98d-723d98164839", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"f47cf97b4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"", Pod:"whisker-f47cf97b4-vw9cx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.35.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"cali6675255dc74", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:56:56.743626 containerd[1873]: 2025-09-09 04:56:56.704 [INFO][4474] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.129/32] ContainerID="a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" Namespace="calico-system" Pod="whisker-f47cf97b4-vw9cx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-whisker--f47cf97b4--vw9cx-eth0" Sep 9 04:56:56.743626 containerd[1873]: 2025-09-09 04:56:56.704 [INFO][4474] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6675255dc74 ContainerID="a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" Namespace="calico-system" Pod="whisker-f47cf97b4-vw9cx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-whisker--f47cf97b4--vw9cx-eth0" Sep 9 04:56:56.743626 containerd[1873]: 2025-09-09 04:56:56.726 [INFO][4474] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" Namespace="calico-system" Pod="whisker-f47cf97b4-vw9cx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-whisker--f47cf97b4--vw9cx-eth0" Sep 9 04:56:56.743626 containerd[1873]: 2025-09-09 04:56:56.727 [INFO][4474] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" Namespace="calico-system" Pod="whisker-f47cf97b4-vw9cx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-whisker--f47cf97b4--vw9cx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-whisker--f47cf97b4--vw9cx-eth0", GenerateName:"whisker-f47cf97b4-", Namespace:"calico-system", SelfLink:"", UID:"70667ba6-b5e6-4182-b98d-723d98164839", 
ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"f47cf97b4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41", Pod:"whisker-f47cf97b4-vw9cx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.35.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali6675255dc74", MAC:"0a:26:fc:3c:ab:08", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:56:56.743626 containerd[1873]: 2025-09-09 04:56:56.739 [INFO][4474] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" Namespace="calico-system" Pod="whisker-f47cf97b4-vw9cx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-whisker--f47cf97b4--vw9cx-eth0" Sep 9 04:56:56.774969 containerd[1873]: time="2025-09-09T04:56:56.774926352Z" level=info msg="connecting to shim a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41" address="unix:///run/containerd/s/682a3a50b4afc18af2cb47a512af67c85f04ddedd2a53144098c8c14e90ffbd3" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:56:56.803014 systemd[1]: Started cri-containerd-a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41.scope - libcontainer container 
a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41. Sep 9 04:56:56.834695 containerd[1873]: time="2025-09-09T04:56:56.834597260Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-f47cf97b4-vw9cx,Uid:70667ba6-b5e6-4182-b98d-723d98164839,Namespace:calico-system,Attempt:0,} returns sandbox id \"a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41\"" Sep 9 04:56:56.836679 containerd[1873]: time="2025-09-09T04:56:56.836146574Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 04:56:57.073706 kubelet[3403]: I0909 04:56:57.073584 3403 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e7e670-28ae-4273-9e5e-ade00eca3b9e" path="/var/lib/kubelet/pods/74e7e670-28ae-4273-9e5e-ade00eca3b9e/volumes" Sep 9 04:56:57.245025 containerd[1873]: time="2025-09-09T04:56:57.244917765Z" level=info msg="TaskExit event in podsandbox handler container_id:\"26208bde266aa9c726fa5190983d72429c0cbff97f0e5070b7b8638e5288d504\" id:\"81d4345998bdef26c1a8687112c16e6d74d48a3ccf7de956135b22d8f42005ba\" pid:4655 exit_status:1 exited_at:{seconds:1757393817 nanos:244605027}" Sep 9 04:56:58.046041 systemd-networkd[1682]: cali6675255dc74: Gained IPv6LL Sep 9 04:56:58.483910 containerd[1873]: time="2025-09-09T04:56:58.483767967Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:58.486425 containerd[1873]: time="2025-09-09T04:56:58.486398396Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 9 04:56:58.488909 containerd[1873]: time="2025-09-09T04:56:58.488888292Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:58.492863 containerd[1873]: time="2025-09-09T04:56:58.492836460Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:56:58.493297 containerd[1873]: time="2025-09-09T04:56:58.493273730Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.657101467s" Sep 9 04:56:58.493321 containerd[1873]: time="2025-09-09T04:56:58.493300051Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 9 04:56:58.499055 containerd[1873]: time="2025-09-09T04:56:58.499029507Z" level=info msg="CreateContainer within sandbox \"a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 04:56:58.514980 containerd[1873]: time="2025-09-09T04:56:58.514410475Z" level=info msg="Container 1c0dc174f90dcc82e7ee15c8d9ac2f66a55f1baa219ce6496625391a45f2cb21: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:56:58.517552 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3530767442.mount: Deactivated successfully. 
Sep 9 04:56:58.528632 containerd[1873]: time="2025-09-09T04:56:58.528597989Z" level=info msg="CreateContainer within sandbox \"a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"1c0dc174f90dcc82e7ee15c8d9ac2f66a55f1baa219ce6496625391a45f2cb21\"" Sep 9 04:56:58.529925 containerd[1873]: time="2025-09-09T04:56:58.529903479Z" level=info msg="StartContainer for \"1c0dc174f90dcc82e7ee15c8d9ac2f66a55f1baa219ce6496625391a45f2cb21\"" Sep 9 04:56:58.530616 containerd[1873]: time="2025-09-09T04:56:58.530590677Z" level=info msg="connecting to shim 1c0dc174f90dcc82e7ee15c8d9ac2f66a55f1baa219ce6496625391a45f2cb21" address="unix:///run/containerd/s/682a3a50b4afc18af2cb47a512af67c85f04ddedd2a53144098c8c14e90ffbd3" protocol=ttrpc version=3 Sep 9 04:56:58.547992 systemd[1]: Started cri-containerd-1c0dc174f90dcc82e7ee15c8d9ac2f66a55f1baa219ce6496625391a45f2cb21.scope - libcontainer container 1c0dc174f90dcc82e7ee15c8d9ac2f66a55f1baa219ce6496625391a45f2cb21. Sep 9 04:56:58.579097 containerd[1873]: time="2025-09-09T04:56:58.579071785Z" level=info msg="StartContainer for \"1c0dc174f90dcc82e7ee15c8d9ac2f66a55f1baa219ce6496625391a45f2cb21\" returns successfully" Sep 9 04:56:58.580199 containerd[1873]: time="2025-09-09T04:56:58.580181181Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 04:57:00.021137 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3897343934.mount: Deactivated successfully. 
Sep 9 04:57:00.066725 containerd[1873]: time="2025-09-09T04:57:00.066672401Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:00.068902 containerd[1873]: time="2025-09-09T04:57:00.068817590Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 9 04:57:00.071378 containerd[1873]: time="2025-09-09T04:57:00.071324791Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:00.075167 containerd[1873]: time="2025-09-09T04:57:00.075122937Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:00.075582 containerd[1873]: time="2025-09-09T04:57:00.075418675Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 1.495045535s" Sep 9 04:57:00.075582 containerd[1873]: time="2025-09-09T04:57:00.075446924Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 9 04:57:00.081447 containerd[1873]: time="2025-09-09T04:57:00.081423748Z" level=info msg="CreateContainer within sandbox \"a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 04:57:00.098582 
containerd[1873]: time="2025-09-09T04:57:00.098009819Z" level=info msg="Container 1a5668adcd4b5eb69b8063f0f4760b00f75ccc2dc3d9dc1542f01cafe0ccce1d: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:00.114123 containerd[1873]: time="2025-09-09T04:57:00.114093586Z" level=info msg="CreateContainer within sandbox \"a119d4cfb93a6fd082886c105724758ac8c64e721ef5630a28e840d9a4fe9e41\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"1a5668adcd4b5eb69b8063f0f4760b00f75ccc2dc3d9dc1542f01cafe0ccce1d\"" Sep 9 04:57:00.115354 containerd[1873]: time="2025-09-09T04:57:00.115311489Z" level=info msg="StartContainer for \"1a5668adcd4b5eb69b8063f0f4760b00f75ccc2dc3d9dc1542f01cafe0ccce1d\"" Sep 9 04:57:00.116414 containerd[1873]: time="2025-09-09T04:57:00.116368515Z" level=info msg="connecting to shim 1a5668adcd4b5eb69b8063f0f4760b00f75ccc2dc3d9dc1542f01cafe0ccce1d" address="unix:///run/containerd/s/682a3a50b4afc18af2cb47a512af67c85f04ddedd2a53144098c8c14e90ffbd3" protocol=ttrpc version=3 Sep 9 04:57:00.137114 systemd[1]: Started cri-containerd-1a5668adcd4b5eb69b8063f0f4760b00f75ccc2dc3d9dc1542f01cafe0ccce1d.scope - libcontainer container 1a5668adcd4b5eb69b8063f0f4760b00f75ccc2dc3d9dc1542f01cafe0ccce1d. 
Sep 9 04:57:00.169298 containerd[1873]: time="2025-09-09T04:57:00.169270974Z" level=info msg="StartContainer for \"1a5668adcd4b5eb69b8063f0f4760b00f75ccc2dc3d9dc1542f01cafe0ccce1d\" returns successfully" Sep 9 04:57:00.218427 kubelet[3403]: I0909 04:57:00.218204 3403 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-f47cf97b4-vw9cx" podStartSLOduration=0.97789221 podStartE2EDuration="4.218190199s" podCreationTimestamp="2025-09-09 04:56:56 +0000 UTC" firstStartedPulling="2025-09-09 04:56:56.835856941 +0000 UTC m=+33.842255547" lastFinishedPulling="2025-09-09 04:57:00.07615493 +0000 UTC m=+37.082553536" observedRunningTime="2025-09-09 04:57:00.21797936 +0000 UTC m=+37.224377966" watchObservedRunningTime="2025-09-09 04:57:00.218190199 +0000 UTC m=+37.224588805" Sep 9 04:57:01.073634 containerd[1873]: time="2025-09-09T04:57:01.073594651Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vm259,Uid:aa78c1cd-bf72-4406-ba6d-7cd42efb9427,Namespace:calico-system,Attempt:0,}" Sep 9 04:57:01.155595 systemd-networkd[1682]: cali8a11d86dbaf: Link UP Sep 9 04:57:01.156448 systemd-networkd[1682]: cali8a11d86dbaf: Gained carrier Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.096 [INFO][4809] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.105 [INFO][4809] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--e60618bb0b-k8s-csi--node--driver--vm259-eth0 csi-node-driver- calico-system aa78c1cd-bf72-4406-ba6d-7cd42efb9427 699 0 2025-09-09 04:56:41 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s 
ci-4452.0.0-n-e60618bb0b csi-node-driver-vm259 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali8a11d86dbaf [] [] }} ContainerID="cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" Namespace="calico-system" Pod="csi-node-driver-vm259" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-csi--node--driver--vm259-" Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.105 [INFO][4809] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" Namespace="calico-system" Pod="csi-node-driver-vm259" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-csi--node--driver--vm259-eth0" Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.122 [INFO][4821] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" HandleID="k8s-pod-network.cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" Workload="ci--4452.0.0--n--e60618bb0b-k8s-csi--node--driver--vm259-eth0" Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.122 [INFO][4821] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" HandleID="k8s-pod-network.cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" Workload="ci--4452.0.0--n--e60618bb0b-k8s-csi--node--driver--vm259-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b230), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452.0.0-n-e60618bb0b", "pod":"csi-node-driver-vm259", "timestamp":"2025-09-09 04:57:01.122254404 +0000 UTC"}, Hostname:"ci-4452.0.0-n-e60618bb0b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:57:01.171892 containerd[1873]: 
2025-09-09 04:57:01.122 [INFO][4821] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.122 [INFO][4821] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.122 [INFO][4821] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-e60618bb0b' Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.129 [INFO][4821] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.132 [INFO][4821] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.134 [INFO][4821] ipam/ipam.go 511: Trying affinity for 192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.136 [INFO][4821] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.137 [INFO][4821] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.137 [INFO][4821] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.128/26 handle="k8s-pod-network.cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.138 [INFO][4821] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.142 [INFO][4821] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.128/26 
handle="k8s-pod-network.cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.150 [INFO][4821] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.35.130/26] block=192.168.35.128/26 handle="k8s-pod-network.cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.150 [INFO][4821] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.130/26] handle="k8s-pod-network.cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.150 [INFO][4821] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:57:01.171892 containerd[1873]: 2025-09-09 04:57:01.150 [INFO][4821] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.130/26] IPv6=[] ContainerID="cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" HandleID="k8s-pod-network.cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" Workload="ci--4452.0.0--n--e60618bb0b-k8s-csi--node--driver--vm259-eth0" Sep 9 04:57:01.172318 containerd[1873]: 2025-09-09 04:57:01.152 [INFO][4809] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" Namespace="calico-system" Pod="csi-node-driver-vm259" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-csi--node--driver--vm259-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-csi--node--driver--vm259-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"aa78c1cd-bf72-4406-ba6d-7cd42efb9427", ResourceVersion:"699", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 41, 0, time.Local), 
DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"", Pod:"csi-node-driver-vm259", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.35.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8a11d86dbaf", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:01.172318 containerd[1873]: 2025-09-09 04:57:01.152 [INFO][4809] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.130/32] ContainerID="cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" Namespace="calico-system" Pod="csi-node-driver-vm259" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-csi--node--driver--vm259-eth0" Sep 9 04:57:01.172318 containerd[1873]: 2025-09-09 04:57:01.152 [INFO][4809] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8a11d86dbaf ContainerID="cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" Namespace="calico-system" Pod="csi-node-driver-vm259" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-csi--node--driver--vm259-eth0" Sep 9 04:57:01.172318 containerd[1873]: 2025-09-09 04:57:01.156 [INFO][4809] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" Namespace="calico-system" Pod="csi-node-driver-vm259" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-csi--node--driver--vm259-eth0" Sep 9 04:57:01.172318 containerd[1873]: 2025-09-09 04:57:01.157 [INFO][4809] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" Namespace="calico-system" Pod="csi-node-driver-vm259" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-csi--node--driver--vm259-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-csi--node--driver--vm259-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"aa78c1cd-bf72-4406-ba6d-7cd42efb9427", ResourceVersion:"699", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d", Pod:"csi-node-driver-vm259", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.35.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8a11d86dbaf", MAC:"b6:7a:2e:5a:75:5e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:01.172318 containerd[1873]: 2025-09-09 04:57:01.170 [INFO][4809] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" Namespace="calico-system" Pod="csi-node-driver-vm259" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-csi--node--driver--vm259-eth0" Sep 9 04:57:01.212514 containerd[1873]: time="2025-09-09T04:57:01.212391983Z" level=info msg="connecting to shim cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d" address="unix:///run/containerd/s/d303b3cd457500dd8dbd5bbd98cc78bec9a0cc18af770769254588cc467d8106" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:57:01.230998 systemd[1]: Started cri-containerd-cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d.scope - libcontainer container cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d. 
Sep 9 04:57:01.255514 containerd[1873]: time="2025-09-09T04:57:01.255479941Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vm259,Uid:aa78c1cd-bf72-4406-ba6d-7cd42efb9427,Namespace:calico-system,Attempt:0,} returns sandbox id \"cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d\"" Sep 9 04:57:01.257334 containerd[1873]: time="2025-09-09T04:57:01.257266374Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 04:57:02.071688 containerd[1873]: time="2025-09-09T04:57:02.071635672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-bdn2k,Uid:6ab60380-d76f-4b58-882b-8a015f1d3fb8,Namespace:kube-system,Attempt:0,}" Sep 9 04:57:02.156340 systemd-networkd[1682]: calied4937b7cf4: Link UP Sep 9 04:57:02.157182 systemd-networkd[1682]: calied4937b7cf4: Gained carrier Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.094 [INFO][4902] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.102 [INFO][4902] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--bdn2k-eth0 coredns-674b8bbfcf- kube-system 6ab60380-d76f-4b58-882b-8a015f1d3fb8 802 0 2025-09-09 04:56:29 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4452.0.0-n-e60618bb0b coredns-674b8bbfcf-bdn2k eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calied4937b7cf4 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" Namespace="kube-system" Pod="coredns-674b8bbfcf-bdn2k" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--bdn2k-" Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 
04:57:02.102 [INFO][4902] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" Namespace="kube-system" Pod="coredns-674b8bbfcf-bdn2k" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--bdn2k-eth0" Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.121 [INFO][4913] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" HandleID="k8s-pod-network.5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" Workload="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--bdn2k-eth0" Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.121 [INFO][4913] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" HandleID="k8s-pod-network.5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" Workload="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--bdn2k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4452.0.0-n-e60618bb0b", "pod":"coredns-674b8bbfcf-bdn2k", "timestamp":"2025-09-09 04:57:02.120981424 +0000 UTC"}, Hostname:"ci-4452.0.0-n-e60618bb0b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.121 [INFO][4913] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.121 [INFO][4913] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.121 [INFO][4913] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-e60618bb0b' Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.127 [INFO][4913] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.131 [INFO][4913] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.135 [INFO][4913] ipam/ipam.go 511: Trying affinity for 192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.136 [INFO][4913] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.138 [INFO][4913] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.138 [INFO][4913] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.128/26 handle="k8s-pod-network.5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.139 [INFO][4913] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00 Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.146 [INFO][4913] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.128/26 handle="k8s-pod-network.5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.151 [INFO][4913] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.35.131/26] block=192.168.35.128/26 handle="k8s-pod-network.5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.152 [INFO][4913] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.131/26] handle="k8s-pod-network.5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.152 [INFO][4913] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:57:02.174828 containerd[1873]: 2025-09-09 04:57:02.152 [INFO][4913] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.131/26] IPv6=[] ContainerID="5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" HandleID="k8s-pod-network.5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" Workload="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--bdn2k-eth0" Sep 9 04:57:02.175525 containerd[1873]: 2025-09-09 04:57:02.153 [INFO][4902] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" Namespace="kube-system" Pod="coredns-674b8bbfcf-bdn2k" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--bdn2k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--bdn2k-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"6ab60380-d76f-4b58-882b-8a015f1d3fb8", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"", Pod:"coredns-674b8bbfcf-bdn2k", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calied4937b7cf4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:02.175525 containerd[1873]: 2025-09-09 04:57:02.153 [INFO][4902] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.131/32] ContainerID="5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" Namespace="kube-system" Pod="coredns-674b8bbfcf-bdn2k" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--bdn2k-eth0" Sep 9 04:57:02.175525 containerd[1873]: 2025-09-09 04:57:02.153 [INFO][4902] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calied4937b7cf4 ContainerID="5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" Namespace="kube-system" Pod="coredns-674b8bbfcf-bdn2k" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--bdn2k-eth0" Sep 9 04:57:02.175525 containerd[1873]: 2025-09-09 04:57:02.159 [INFO][4902] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" Namespace="kube-system" Pod="coredns-674b8bbfcf-bdn2k" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--bdn2k-eth0" Sep 9 04:57:02.175525 containerd[1873]: 2025-09-09 04:57:02.159 [INFO][4902] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" Namespace="kube-system" Pod="coredns-674b8bbfcf-bdn2k" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--bdn2k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--bdn2k-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"6ab60380-d76f-4b58-882b-8a015f1d3fb8", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00", Pod:"coredns-674b8bbfcf-bdn2k", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calied4937b7cf4", 
MAC:"9a:05:79:5d:b6:10", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:02.175647 containerd[1873]: 2025-09-09 04:57:02.172 [INFO][4902] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" Namespace="kube-system" Pod="coredns-674b8bbfcf-bdn2k" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--bdn2k-eth0" Sep 9 04:57:02.208053 containerd[1873]: time="2025-09-09T04:57:02.208002302Z" level=info msg="connecting to shim 5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00" address="unix:///run/containerd/s/c8915612a7bd667803a6f6b66342337e36d478edc75e8634e677ffc084586b75" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:57:02.227985 systemd[1]: Started cri-containerd-5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00.scope - libcontainer container 5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00. 
Sep 9 04:57:02.266506 containerd[1873]: time="2025-09-09T04:57:02.266470518Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-bdn2k,Uid:6ab60380-d76f-4b58-882b-8a015f1d3fb8,Namespace:kube-system,Attempt:0,} returns sandbox id \"5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00\"" Sep 9 04:57:02.275909 containerd[1873]: time="2025-09-09T04:57:02.275855990Z" level=info msg="CreateContainer within sandbox \"5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 04:57:02.299975 containerd[1873]: time="2025-09-09T04:57:02.299941572Z" level=info msg="Container fcce5ecfc7e9f468081c27c0f9dbfae9e8e9f89ee2658dade7882202b7577688: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:02.302594 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3035200987.mount: Deactivated successfully. Sep 9 04:57:02.318984 containerd[1873]: time="2025-09-09T04:57:02.318910163Z" level=info msg="CreateContainer within sandbox \"5be9bcc1bff05fe74d109aa1555a59aa16af1367b8aab277b35b76a5f0b51f00\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"fcce5ecfc7e9f468081c27c0f9dbfae9e8e9f89ee2658dade7882202b7577688\"" Sep 9 04:57:02.320536 containerd[1873]: time="2025-09-09T04:57:02.320501574Z" level=info msg="StartContainer for \"fcce5ecfc7e9f468081c27c0f9dbfae9e8e9f89ee2658dade7882202b7577688\"" Sep 9 04:57:02.322310 containerd[1873]: time="2025-09-09T04:57:02.321959142Z" level=info msg="connecting to shim fcce5ecfc7e9f468081c27c0f9dbfae9e8e9f89ee2658dade7882202b7577688" address="unix:///run/containerd/s/c8915612a7bd667803a6f6b66342337e36d478edc75e8634e677ffc084586b75" protocol=ttrpc version=3 Sep 9 04:57:02.355042 systemd[1]: Started cri-containerd-fcce5ecfc7e9f468081c27c0f9dbfae9e8e9f89ee2658dade7882202b7577688.scope - libcontainer container fcce5ecfc7e9f468081c27c0f9dbfae9e8e9f89ee2658dade7882202b7577688. 
Sep 9 04:57:02.393287 containerd[1873]: time="2025-09-09T04:57:02.393249373Z" level=info msg="StartContainer for \"fcce5ecfc7e9f468081c27c0f9dbfae9e8e9f89ee2658dade7882202b7577688\" returns successfully" Sep 9 04:57:02.527059 systemd-networkd[1682]: cali8a11d86dbaf: Gained IPv6LL Sep 9 04:57:02.816980 containerd[1873]: time="2025-09-09T04:57:02.816475427Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:02.819710 containerd[1873]: time="2025-09-09T04:57:02.819688179Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 9 04:57:02.822384 containerd[1873]: time="2025-09-09T04:57:02.822364418Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:02.825585 containerd[1873]: time="2025-09-09T04:57:02.825560553Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:02.825922 containerd[1873]: time="2025-09-09T04:57:02.825894748Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.568580012s" Sep 9 04:57:02.825922 containerd[1873]: time="2025-09-09T04:57:02.825923773Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 9 04:57:02.832619 containerd[1873]: time="2025-09-09T04:57:02.832581453Z" level=info 
msg="CreateContainer within sandbox \"cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 04:57:02.849560 containerd[1873]: time="2025-09-09T04:57:02.848805363Z" level=info msg="Container df8051b6861a16d08381b05e9c050c20b15fd1089ce827f64bb44a11c9e824aa: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:02.862638 containerd[1873]: time="2025-09-09T04:57:02.862603554Z" level=info msg="CreateContainer within sandbox \"cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"df8051b6861a16d08381b05e9c050c20b15fd1089ce827f64bb44a11c9e824aa\"" Sep 9 04:57:02.863344 containerd[1873]: time="2025-09-09T04:57:02.863280824Z" level=info msg="StartContainer for \"df8051b6861a16d08381b05e9c050c20b15fd1089ce827f64bb44a11c9e824aa\"" Sep 9 04:57:02.864812 containerd[1873]: time="2025-09-09T04:57:02.864788041Z" level=info msg="connecting to shim df8051b6861a16d08381b05e9c050c20b15fd1089ce827f64bb44a11c9e824aa" address="unix:///run/containerd/s/d303b3cd457500dd8dbd5bbd98cc78bec9a0cc18af770769254588cc467d8106" protocol=ttrpc version=3 Sep 9 04:57:02.878018 systemd[1]: Started cri-containerd-df8051b6861a16d08381b05e9c050c20b15fd1089ce827f64bb44a11c9e824aa.scope - libcontainer container df8051b6861a16d08381b05e9c050c20b15fd1089ce827f64bb44a11c9e824aa. 
Sep 9 04:57:02.907550 containerd[1873]: time="2025-09-09T04:57:02.907460073Z" level=info msg="StartContainer for \"df8051b6861a16d08381b05e9c050c20b15fd1089ce827f64bb44a11c9e824aa\" returns successfully" Sep 9 04:57:02.908978 containerd[1873]: time="2025-09-09T04:57:02.908689113Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 04:57:03.254143 kubelet[3403]: I0909 04:57:03.253827 3403 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-bdn2k" podStartSLOduration=34.253813201 podStartE2EDuration="34.253813201s" podCreationTimestamp="2025-09-09 04:56:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:57:03.237697902 +0000 UTC m=+40.244096508" watchObservedRunningTime="2025-09-09 04:57:03.253813201 +0000 UTC m=+40.260211807" Sep 9 04:57:03.934074 systemd-networkd[1682]: calied4937b7cf4: Gained IPv6LL Sep 9 04:57:04.071647 containerd[1873]: time="2025-09-09T04:57:04.071611689Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6f5df448-5fphh,Uid:4fdc260c-5577-4668-b511-8035243e1c1f,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:57:04.072297 containerd[1873]: time="2025-09-09T04:57:04.072146363Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-cn2np,Uid:2a41e2c8-6787-485d-af56-a43f1e724c36,Namespace:calico-system,Attempt:0,}" Sep 9 04:57:04.072347 containerd[1873]: time="2025-09-09T04:57:04.071678083Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-549ccbd58c-24j9h,Uid:3aca8d68-baeb-4a2b-974a-4711dcb4eb5e,Namespace:calico-system,Attempt:0,}" Sep 9 04:57:04.237040 systemd-networkd[1682]: cali5cae8cdb3b1: Link UP Sep 9 04:57:04.238583 systemd-networkd[1682]: cali5cae8cdb3b1: Gained carrier Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.120 [INFO][5096] cni-plugin/utils.go 
100: File /var/lib/calico/mtu does not exist Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.135 [INFO][5096] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--5fphh-eth0 calico-apiserver-7f6f5df448- calico-apiserver 4fdc260c-5577-4668-b511-8035243e1c1f 806 0 2025-09-09 04:56:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7f6f5df448 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4452.0.0-n-e60618bb0b calico-apiserver-7f6f5df448-5fphh eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5cae8cdb3b1 [] [] }} ContainerID="3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-5fphh" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--5fphh-" Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.135 [INFO][5096] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-5fphh" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--5fphh-eth0" Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.191 [INFO][5135] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" HandleID="k8s-pod-network.3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" Workload="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--5fphh-eth0" Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.191 [INFO][5135] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" HandleID="k8s-pod-network.3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" Workload="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--5fphh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b970), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4452.0.0-n-e60618bb0b", "pod":"calico-apiserver-7f6f5df448-5fphh", "timestamp":"2025-09-09 04:57:04.191165766 +0000 UTC"}, Hostname:"ci-4452.0.0-n-e60618bb0b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.191 [INFO][5135] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.191 [INFO][5135] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.191 [INFO][5135] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-e60618bb0b' Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.200 [INFO][5135] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.205 [INFO][5135] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.209 [INFO][5135] ipam/ipam.go 511: Trying affinity for 192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.211 [INFO][5135] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.213 [INFO][5135] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.213 [INFO][5135] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.128/26 handle="k8s-pod-network.3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.214 [INFO][5135] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8 Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.222 [INFO][5135] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.128/26 handle="k8s-pod-network.3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.229 [INFO][5135] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.35.132/26] block=192.168.35.128/26 handle="k8s-pod-network.3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.229 [INFO][5135] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.132/26] handle="k8s-pod-network.3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.229 [INFO][5135] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:57:04.253904 containerd[1873]: 2025-09-09 04:57:04.229 [INFO][5135] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.132/26] IPv6=[] ContainerID="3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" HandleID="k8s-pod-network.3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" Workload="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--5fphh-eth0" Sep 9 04:57:04.254301 containerd[1873]: 2025-09-09 04:57:04.232 [INFO][5096] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-5fphh" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--5fphh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--5fphh-eth0", GenerateName:"calico-apiserver-7f6f5df448-", Namespace:"calico-apiserver", SelfLink:"", UID:"4fdc260c-5577-4668-b511-8035243e1c1f", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"7f6f5df448", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"", Pod:"calico-apiserver-7f6f5df448-5fphh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5cae8cdb3b1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:04.254301 containerd[1873]: 2025-09-09 04:57:04.232 [INFO][5096] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.132/32] ContainerID="3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-5fphh" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--5fphh-eth0" Sep 9 04:57:04.254301 containerd[1873]: 2025-09-09 04:57:04.232 [INFO][5096] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5cae8cdb3b1 ContainerID="3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-5fphh" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--5fphh-eth0" Sep 9 04:57:04.254301 containerd[1873]: 2025-09-09 04:57:04.239 [INFO][5096] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-5fphh" 
WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--5fphh-eth0" Sep 9 04:57:04.254301 containerd[1873]: 2025-09-09 04:57:04.239 [INFO][5096] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-5fphh" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--5fphh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--5fphh-eth0", GenerateName:"calico-apiserver-7f6f5df448-", Namespace:"calico-apiserver", SelfLink:"", UID:"4fdc260c-5577-4668-b511-8035243e1c1f", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7f6f5df448", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8", Pod:"calico-apiserver-7f6f5df448-5fphh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5cae8cdb3b1", MAC:"ae:fd:f6:22:cb:4a", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:04.254301 containerd[1873]: 2025-09-09 04:57:04.251 [INFO][5096] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-5fphh" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--5fphh-eth0" Sep 9 04:57:04.302989 containerd[1873]: time="2025-09-09T04:57:04.302946967Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:04.305402 containerd[1873]: time="2025-09-09T04:57:04.305303932Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 9 04:57:04.310633 containerd[1873]: time="2025-09-09T04:57:04.310597111Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:04.316431 containerd[1873]: time="2025-09-09T04:57:04.316319169Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:04.317112 containerd[1873]: time="2025-09-09T04:57:04.317050089Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.408286869s" Sep 9 04:57:04.317112 
containerd[1873]: time="2025-09-09T04:57:04.317074481Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 9 04:57:04.326109 containerd[1873]: time="2025-09-09T04:57:04.326051541Z" level=info msg="CreateContainer within sandbox \"cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 04:57:04.336192 containerd[1873]: time="2025-09-09T04:57:04.336048649Z" level=info msg="connecting to shim 3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8" address="unix:///run/containerd/s/912162ada4b35536827a2aa8980dfd0e996d63577dd1eb250cab4c653db2a105" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:57:04.350656 containerd[1873]: time="2025-09-09T04:57:04.350625634Z" level=info msg="Container 0ddb9f5d85e037c28c7775bb7e842daeb38dc0b50b7cbc00c0ce50e922ae1176: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:04.358366 systemd-networkd[1682]: calif5509619a2b: Link UP Sep 9 04:57:04.358857 systemd-networkd[1682]: calif5509619a2b: Gained carrier Sep 9 04:57:04.371085 containerd[1873]: time="2025-09-09T04:57:04.370775879Z" level=info msg="CreateContainer within sandbox \"cea989028fe77455d6b45e4373c98e2671a0795a5e453acd6f012c8a0a574d7d\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"0ddb9f5d85e037c28c7775bb7e842daeb38dc0b50b7cbc00c0ce50e922ae1176\"" Sep 9 04:57:04.372247 containerd[1873]: time="2025-09-09T04:57:04.372222790Z" level=info msg="StartContainer for \"0ddb9f5d85e037c28c7775bb7e842daeb38dc0b50b7cbc00c0ce50e922ae1176\"" Sep 9 04:57:04.376293 containerd[1873]: time="2025-09-09T04:57:04.376262457Z" level=info msg="connecting to shim 0ddb9f5d85e037c28c7775bb7e842daeb38dc0b50b7cbc00c0ce50e922ae1176" 
address="unix:///run/containerd/s/d303b3cd457500dd8dbd5bbd98cc78bec9a0cc18af770769254588cc467d8106" protocol=ttrpc version=3 Sep 9 04:57:04.379104 systemd[1]: Started cri-containerd-3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8.scope - libcontainer container 3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8. Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.131 [INFO][5104] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.150 [INFO][5104] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--e60618bb0b-k8s-goldmane--54d579b49d--cn2np-eth0 goldmane-54d579b49d- calico-system 2a41e2c8-6787-485d-af56-a43f1e724c36 808 0 2025-09-09 04:56:41 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4452.0.0-n-e60618bb0b goldmane-54d579b49d-cn2np eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calif5509619a2b [] [] }} ContainerID="9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" Namespace="calico-system" Pod="goldmane-54d579b49d-cn2np" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-goldmane--54d579b49d--cn2np-" Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.152 [INFO][5104] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" Namespace="calico-system" Pod="goldmane-54d579b49d-cn2np" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-goldmane--54d579b49d--cn2np-eth0" Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.192 [INFO][5141] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" HandleID="k8s-pod-network.9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" Workload="ci--4452.0.0--n--e60618bb0b-k8s-goldmane--54d579b49d--cn2np-eth0" Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.192 [INFO][5141] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" HandleID="k8s-pod-network.9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" Workload="ci--4452.0.0--n--e60618bb0b-k8s-goldmane--54d579b49d--cn2np-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3150), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452.0.0-n-e60618bb0b", "pod":"goldmane-54d579b49d-cn2np", "timestamp":"2025-09-09 04:57:04.192108213 +0000 UTC"}, Hostname:"ci-4452.0.0-n-e60618bb0b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.192 [INFO][5141] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.229 [INFO][5141] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.230 [INFO][5141] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-e60618bb0b' Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.300 [INFO][5141] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.305 [INFO][5141] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.318 [INFO][5141] ipam/ipam.go 511: Trying affinity for 192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.321 [INFO][5141] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.323 [INFO][5141] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.324 [INFO][5141] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.128/26 handle="k8s-pod-network.9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.328 [INFO][5141] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898 Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.336 [INFO][5141] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.128/26 handle="k8s-pod-network.9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.352 [INFO][5141] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.35.133/26] block=192.168.35.128/26 handle="k8s-pod-network.9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.352 [INFO][5141] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.133/26] handle="k8s-pod-network.9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.352 [INFO][5141] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:57:04.385238 containerd[1873]: 2025-09-09 04:57:04.352 [INFO][5141] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.133/26] IPv6=[] ContainerID="9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" HandleID="k8s-pod-network.9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" Workload="ci--4452.0.0--n--e60618bb0b-k8s-goldmane--54d579b49d--cn2np-eth0" Sep 9 04:57:04.385728 containerd[1873]: 2025-09-09 04:57:04.354 [INFO][5104] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" Namespace="calico-system" Pod="goldmane-54d579b49d-cn2np" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-goldmane--54d579b49d--cn2np-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-goldmane--54d579b49d--cn2np-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"2a41e2c8-6787-485d-af56-a43f1e724c36", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"", Pod:"goldmane-54d579b49d-cn2np", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.35.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif5509619a2b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:04.385728 containerd[1873]: 2025-09-09 04:57:04.354 [INFO][5104] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.133/32] ContainerID="9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" Namespace="calico-system" Pod="goldmane-54d579b49d-cn2np" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-goldmane--54d579b49d--cn2np-eth0" Sep 9 04:57:04.385728 containerd[1873]: 2025-09-09 04:57:04.354 [INFO][5104] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif5509619a2b ContainerID="9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" Namespace="calico-system" Pod="goldmane-54d579b49d-cn2np" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-goldmane--54d579b49d--cn2np-eth0" Sep 9 04:57:04.385728 containerd[1873]: 2025-09-09 04:57:04.360 [INFO][5104] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" Namespace="calico-system" Pod="goldmane-54d579b49d-cn2np" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-goldmane--54d579b49d--cn2np-eth0" Sep 9 04:57:04.385728 containerd[1873]: 2025-09-09 04:57:04.363 [INFO][5104] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" Namespace="calico-system" Pod="goldmane-54d579b49d-cn2np" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-goldmane--54d579b49d--cn2np-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-goldmane--54d579b49d--cn2np-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"2a41e2c8-6787-485d-af56-a43f1e724c36", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898", Pod:"goldmane-54d579b49d-cn2np", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.35.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif5509619a2b", MAC:"32:e5:f7:cb:56:6c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:04.385728 containerd[1873]: 2025-09-09 04:57:04.381 [INFO][5104] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" Namespace="calico-system" Pod="goldmane-54d579b49d-cn2np" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-goldmane--54d579b49d--cn2np-eth0" Sep 9 04:57:04.408021 systemd[1]: Started cri-containerd-0ddb9f5d85e037c28c7775bb7e842daeb38dc0b50b7cbc00c0ce50e922ae1176.scope - libcontainer container 0ddb9f5d85e037c28c7775bb7e842daeb38dc0b50b7cbc00c0ce50e922ae1176. Sep 9 04:57:04.439801 containerd[1873]: time="2025-09-09T04:57:04.439769148Z" level=info msg="connecting to shim 9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898" address="unix:///run/containerd/s/3d0d2e579d0a9a89161edef7734e6fb0a91898438c1fc69f2a7639ba85990004" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:57:04.442793 containerd[1873]: time="2025-09-09T04:57:04.442579928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6f5df448-5fphh,Uid:4fdc260c-5577-4668-b511-8035243e1c1f,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8\"" Sep 9 04:57:04.447145 containerd[1873]: time="2025-09-09T04:57:04.447111226Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 04:57:04.461489 systemd-networkd[1682]: cali8b8904c5012: Link UP Sep 9 04:57:04.462938 systemd-networkd[1682]: cali8b8904c5012: Gained carrier Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.144 [INFO][5119] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.159 [INFO][5119] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--e60618bb0b-k8s-calico--kube--controllers--549ccbd58c--24j9h-eth0 calico-kube-controllers-549ccbd58c- calico-system 3aca8d68-baeb-4a2b-974a-4711dcb4eb5e 805 0 2025-09-09 04:56:41 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers 
k8s-app:calico-kube-controllers pod-template-hash:549ccbd58c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4452.0.0-n-e60618bb0b calico-kube-controllers-549ccbd58c-24j9h eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali8b8904c5012 [] [] }} ContainerID="f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" Namespace="calico-system" Pod="calico-kube-controllers-549ccbd58c-24j9h" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--kube--controllers--549ccbd58c--24j9h-" Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.159 [INFO][5119] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" Namespace="calico-system" Pod="calico-kube-controllers-549ccbd58c-24j9h" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--kube--controllers--549ccbd58c--24j9h-eth0" Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.205 [INFO][5146] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" HandleID="k8s-pod-network.f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" Workload="ci--4452.0.0--n--e60618bb0b-k8s-calico--kube--controllers--549ccbd58c--24j9h-eth0" Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.206 [INFO][5146] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" HandleID="k8s-pod-network.f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" Workload="ci--4452.0.0--n--e60618bb0b-k8s-calico--kube--controllers--549ccbd58c--24j9h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dcf70), Attrs:map[string]string{"namespace":"calico-system", 
"node":"ci-4452.0.0-n-e60618bb0b", "pod":"calico-kube-controllers-549ccbd58c-24j9h", "timestamp":"2025-09-09 04:57:04.205706694 +0000 UTC"}, Hostname:"ci-4452.0.0-n-e60618bb0b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.206 [INFO][5146] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.352 [INFO][5146] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.352 [INFO][5146] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-e60618bb0b' Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.402 [INFO][5146] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.407 [INFO][5146] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.411 [INFO][5146] ipam/ipam.go 511: Trying affinity for 192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.414 [INFO][5146] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.424 [INFO][5146] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.425 [INFO][5146] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.128/26 
handle="k8s-pod-network.f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.428 [INFO][5146] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.432 [INFO][5146] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.128/26 handle="k8s-pod-network.f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.444 [INFO][5146] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.35.134/26] block=192.168.35.128/26 handle="k8s-pod-network.f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.445 [INFO][5146] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.134/26] handle="k8s-pod-network.f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:04.490581 containerd[1873]: 2025-09-09 04:57:04.445 [INFO][5146] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 04:57:04.490977 containerd[1873]: 2025-09-09 04:57:04.445 [INFO][5146] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.134/26] IPv6=[] ContainerID="f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" HandleID="k8s-pod-network.f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" Workload="ci--4452.0.0--n--e60618bb0b-k8s-calico--kube--controllers--549ccbd58c--24j9h-eth0" Sep 9 04:57:04.490977 containerd[1873]: 2025-09-09 04:57:04.454 [INFO][5119] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" Namespace="calico-system" Pod="calico-kube-controllers-549ccbd58c-24j9h" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--kube--controllers--549ccbd58c--24j9h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-calico--kube--controllers--549ccbd58c--24j9h-eth0", GenerateName:"calico-kube-controllers-549ccbd58c-", Namespace:"calico-system", SelfLink:"", UID:"3aca8d68-baeb-4a2b-974a-4711dcb4eb5e", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"549ccbd58c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"", Pod:"calico-kube-controllers-549ccbd58c-24j9h", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.35.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8b8904c5012", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:04.490977 containerd[1873]: 2025-09-09 04:57:04.456 [INFO][5119] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.134/32] ContainerID="f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" Namespace="calico-system" Pod="calico-kube-controllers-549ccbd58c-24j9h" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--kube--controllers--549ccbd58c--24j9h-eth0" Sep 9 04:57:04.490977 containerd[1873]: 2025-09-09 04:57:04.456 [INFO][5119] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8b8904c5012 ContainerID="f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" Namespace="calico-system" Pod="calico-kube-controllers-549ccbd58c-24j9h" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--kube--controllers--549ccbd58c--24j9h-eth0" Sep 9 04:57:04.490977 containerd[1873]: 2025-09-09 04:57:04.463 [INFO][5119] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" Namespace="calico-system" Pod="calico-kube-controllers-549ccbd58c-24j9h" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--kube--controllers--549ccbd58c--24j9h-eth0" Sep 9 04:57:04.490977 containerd[1873]: 2025-09-09 04:57:04.464 [INFO][5119] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" Namespace="calico-system" Pod="calico-kube-controllers-549ccbd58c-24j9h" 
WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--kube--controllers--549ccbd58c--24j9h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-calico--kube--controllers--549ccbd58c--24j9h-eth0", GenerateName:"calico-kube-controllers-549ccbd58c-", Namespace:"calico-system", SelfLink:"", UID:"3aca8d68-baeb-4a2b-974a-4711dcb4eb5e", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"549ccbd58c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e", Pod:"calico-kube-controllers-549ccbd58c-24j9h", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.35.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8b8904c5012", MAC:"3e:a1:39:83:88:7d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:04.491106 containerd[1873]: 2025-09-09 04:57:04.481 [INFO][5119] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" Namespace="calico-system" 
Pod="calico-kube-controllers-549ccbd58c-24j9h" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--kube--controllers--549ccbd58c--24j9h-eth0" Sep 9 04:57:04.493178 containerd[1873]: time="2025-09-09T04:57:04.491671192Z" level=info msg="StartContainer for \"0ddb9f5d85e037c28c7775bb7e842daeb38dc0b50b7cbc00c0ce50e922ae1176\" returns successfully" Sep 9 04:57:04.493029 systemd[1]: Started cri-containerd-9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898.scope - libcontainer container 9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898. Sep 9 04:57:04.536021 containerd[1873]: time="2025-09-09T04:57:04.535988877Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-cn2np,Uid:2a41e2c8-6787-485d-af56-a43f1e724c36,Namespace:calico-system,Attempt:0,} returns sandbox id \"9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898\"" Sep 9 04:57:04.542258 containerd[1873]: time="2025-09-09T04:57:04.542181286Z" level=info msg="connecting to shim f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e" address="unix:///run/containerd/s/55ce5ca3521981fda698740c76121c0ecc25273a78bc756ef8c99f682b1e3d9e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:57:04.567006 systemd[1]: Started cri-containerd-f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e.scope - libcontainer container f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e. 
Sep 9 04:57:04.596286 containerd[1873]: time="2025-09-09T04:57:04.596255591Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-549ccbd58c-24j9h,Uid:3aca8d68-baeb-4a2b-974a-4711dcb4eb5e,Namespace:calico-system,Attempt:0,} returns sandbox id \"f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e\"" Sep 9 04:57:05.155699 kubelet[3403]: I0909 04:57:05.155458 3403 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 04:57:05.155699 kubelet[3403]: I0909 04:57:05.155497 3403 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 04:57:05.241541 kubelet[3403]: I0909 04:57:05.241468 3403 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-vm259" podStartSLOduration=21.179939376 podStartE2EDuration="24.241449674s" podCreationTimestamp="2025-09-09 04:56:41 +0000 UTC" firstStartedPulling="2025-09-09 04:57:01.256727613 +0000 UTC m=+38.263126219" lastFinishedPulling="2025-09-09 04:57:04.318237911 +0000 UTC m=+41.324636517" observedRunningTime="2025-09-09 04:57:05.241273053 +0000 UTC m=+42.247671699" watchObservedRunningTime="2025-09-09 04:57:05.241449674 +0000 UTC m=+42.247848280" Sep 9 04:57:05.854034 systemd-networkd[1682]: cali5cae8cdb3b1: Gained IPv6LL Sep 9 04:57:06.072407 containerd[1873]: time="2025-09-09T04:57:06.072101476Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2mkdx,Uid:49d4a79b-d00b-46a0-9f2c-ce1ae43d94e8,Namespace:kube-system,Attempt:0,}" Sep 9 04:57:06.073233 containerd[1873]: time="2025-09-09T04:57:06.072101556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6f5df448-vgtr8,Uid:d83ef8c9-b986-4f3b-8e65-4557cf85482c,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:57:06.174170 
systemd-networkd[1682]: cali8b8904c5012: Gained IPv6LL Sep 9 04:57:06.214773 systemd-networkd[1682]: cali651f787449c: Link UP Sep 9 04:57:06.216049 systemd-networkd[1682]: cali651f787449c: Gained carrier Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.111 [INFO][5397] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.124 [INFO][5397] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--vgtr8-eth0 calico-apiserver-7f6f5df448- calico-apiserver d83ef8c9-b986-4f3b-8e65-4557cf85482c 804 0 2025-09-09 04:56:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7f6f5df448 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4452.0.0-n-e60618bb0b calico-apiserver-7f6f5df448-vgtr8 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali651f787449c [] [] }} ContainerID="243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-vgtr8" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--vgtr8-" Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.124 [INFO][5397] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-vgtr8" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--vgtr8-eth0" Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.160 [INFO][5413] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" 
HandleID="k8s-pod-network.243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" Workload="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--vgtr8-eth0" Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.160 [INFO][5413] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" HandleID="k8s-pod-network.243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" Workload="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--vgtr8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ab4a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4452.0.0-n-e60618bb0b", "pod":"calico-apiserver-7f6f5df448-vgtr8", "timestamp":"2025-09-09 04:57:06.160330217 +0000 UTC"}, Hostname:"ci-4452.0.0-n-e60618bb0b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.160 [INFO][5413] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.160 [INFO][5413] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.160 [INFO][5413] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-e60618bb0b' Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.169 [INFO][5413] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.173 [INFO][5413] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.178 [INFO][5413] ipam/ipam.go 511: Trying affinity for 192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.180 [INFO][5413] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.183 [INFO][5413] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.183 [INFO][5413] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.128/26 handle="k8s-pod-network.243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.185 [INFO][5413] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63 Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.191 [INFO][5413] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.128/26 handle="k8s-pod-network.243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.201 [INFO][5413] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.35.135/26] block=192.168.35.128/26 handle="k8s-pod-network.243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.201 [INFO][5413] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.135/26] handle="k8s-pod-network.243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.201 [INFO][5413] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:57:06.237167 containerd[1873]: 2025-09-09 04:57:06.201 [INFO][5413] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.135/26] IPv6=[] ContainerID="243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" HandleID="k8s-pod-network.243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" Workload="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--vgtr8-eth0" Sep 9 04:57:06.238775 containerd[1873]: 2025-09-09 04:57:06.203 [INFO][5397] cni-plugin/k8s.go 418: Populated endpoint ContainerID="243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-vgtr8" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--vgtr8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--vgtr8-eth0", GenerateName:"calico-apiserver-7f6f5df448-", Namespace:"calico-apiserver", SelfLink:"", UID:"d83ef8c9-b986-4f3b-8e65-4557cf85482c", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"7f6f5df448", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"", Pod:"calico-apiserver-7f6f5df448-vgtr8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali651f787449c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:06.238775 containerd[1873]: 2025-09-09 04:57:06.203 [INFO][5397] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.135/32] ContainerID="243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-vgtr8" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--vgtr8-eth0" Sep 9 04:57:06.238775 containerd[1873]: 2025-09-09 04:57:06.203 [INFO][5397] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali651f787449c ContainerID="243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-vgtr8" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--vgtr8-eth0" Sep 9 04:57:06.238775 containerd[1873]: 2025-09-09 04:57:06.217 [INFO][5397] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-vgtr8" 
WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--vgtr8-eth0" Sep 9 04:57:06.238775 containerd[1873]: 2025-09-09 04:57:06.217 [INFO][5397] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-vgtr8" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--vgtr8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--vgtr8-eth0", GenerateName:"calico-apiserver-7f6f5df448-", Namespace:"calico-apiserver", SelfLink:"", UID:"d83ef8c9-b986-4f3b-8e65-4557cf85482c", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7f6f5df448", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63", Pod:"calico-apiserver-7f6f5df448-vgtr8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali651f787449c", MAC:"52:34:c9:30:2f:03", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:06.238775 containerd[1873]: 2025-09-09 04:57:06.232 [INFO][5397] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" Namespace="calico-apiserver" Pod="calico-apiserver-7f6f5df448-vgtr8" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-calico--apiserver--7f6f5df448--vgtr8-eth0" Sep 9 04:57:06.281804 containerd[1873]: time="2025-09-09T04:57:06.281767331Z" level=info msg="connecting to shim 243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63" address="unix:///run/containerd/s/61866f68688b82509679a77fe947144117f5235ad166cbf25df4e75db059c5bc" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:57:06.316053 systemd[1]: Started cri-containerd-243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63.scope - libcontainer container 243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63. 
Sep 9 04:57:06.340476 systemd-networkd[1682]: cali4aaab3ad014: Link UP Sep 9 04:57:06.342681 systemd-networkd[1682]: cali4aaab3ad014: Gained carrier Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.109 [INFO][5390] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.124 [INFO][5390] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--2mkdx-eth0 coredns-674b8bbfcf- kube-system 49d4a79b-d00b-46a0-9f2c-ce1ae43d94e8 803 0 2025-09-09 04:56:29 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4452.0.0-n-e60618bb0b coredns-674b8bbfcf-2mkdx eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4aaab3ad014 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mkdx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--2mkdx-" Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.124 [INFO][5390] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mkdx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--2mkdx-eth0" Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.161 [INFO][5418] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" HandleID="k8s-pod-network.e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" Workload="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--2mkdx-eth0" Sep 9 
04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.162 [INFO][5418] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" HandleID="k8s-pod-network.e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" Workload="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--2mkdx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3640), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4452.0.0-n-e60618bb0b", "pod":"coredns-674b8bbfcf-2mkdx", "timestamp":"2025-09-09 04:57:06.161775032 +0000 UTC"}, Hostname:"ci-4452.0.0-n-e60618bb0b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.163 [INFO][5418] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.201 [INFO][5418] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.201 [INFO][5418] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-e60618bb0b' Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.270 [INFO][5418] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.284 [INFO][5418] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.291 [INFO][5418] ipam/ipam.go 511: Trying affinity for 192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.294 [INFO][5418] ipam/ipam.go 158: Attempting to load block cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.297 [INFO][5418] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.35.128/26 host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.297 [INFO][5418] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.35.128/26 handle="k8s-pod-network.e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.299 [INFO][5418] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488 Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.304 [INFO][5418] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.35.128/26 handle="k8s-pod-network.e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.323 [INFO][5418] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.35.136/26] block=192.168.35.128/26 handle="k8s-pod-network.e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.323 [INFO][5418] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.35.136/26] handle="k8s-pod-network.e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" host="ci-4452.0.0-n-e60618bb0b" Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.323 [INFO][5418] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:57:06.365862 containerd[1873]: 2025-09-09 04:57:06.323 [INFO][5418] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.35.136/26] IPv6=[] ContainerID="e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" HandleID="k8s-pod-network.e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" Workload="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--2mkdx-eth0" Sep 9 04:57:06.366357 containerd[1873]: 2025-09-09 04:57:06.326 [INFO][5390] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mkdx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--2mkdx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--2mkdx-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"49d4a79b-d00b-46a0-9f2c-ce1ae43d94e8", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"", Pod:"coredns-674b8bbfcf-2mkdx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4aaab3ad014", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:06.366357 containerd[1873]: 2025-09-09 04:57:06.327 [INFO][5390] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.136/32] ContainerID="e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mkdx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--2mkdx-eth0" Sep 9 04:57:06.366357 containerd[1873]: 2025-09-09 04:57:06.327 [INFO][5390] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4aaab3ad014 ContainerID="e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mkdx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--2mkdx-eth0" Sep 9 04:57:06.366357 containerd[1873]: 2025-09-09 04:57:06.343 [INFO][5390] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mkdx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--2mkdx-eth0" Sep 9 04:57:06.366357 containerd[1873]: 2025-09-09 04:57:06.344 [INFO][5390] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mkdx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--2mkdx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--2mkdx-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"49d4a79b-d00b-46a0-9f2c-ce1ae43d94e8", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 56, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-e60618bb0b", ContainerID:"e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488", Pod:"coredns-674b8bbfcf-2mkdx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4aaab3ad014", 
MAC:"12:b1:54:57:57:32", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:57:06.366471 containerd[1873]: 2025-09-09 04:57:06.358 [INFO][5390] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mkdx" WorkloadEndpoint="ci--4452.0.0--n--e60618bb0b-k8s-coredns--674b8bbfcf--2mkdx-eth0" Sep 9 04:57:06.366764 systemd-networkd[1682]: calif5509619a2b: Gained IPv6LL Sep 9 04:57:06.388306 containerd[1873]: time="2025-09-09T04:57:06.388262729Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6f5df448-vgtr8,Uid:d83ef8c9-b986-4f3b-8e65-4557cf85482c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63\"" Sep 9 04:57:06.414019 containerd[1873]: time="2025-09-09T04:57:06.413976691Z" level=info msg="connecting to shim e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488" address="unix:///run/containerd/s/c411e91151debf9b9d9cd546a7a79263b5c35804ae34fc614f59ecbeb5071296" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:57:06.444046 systemd[1]: Started cri-containerd-e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488.scope - libcontainer container e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488. 
Sep 9 04:57:06.492363 containerd[1873]: time="2025-09-09T04:57:06.492326591Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2mkdx,Uid:49d4a79b-d00b-46a0-9f2c-ce1ae43d94e8,Namespace:kube-system,Attempt:0,} returns sandbox id \"e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488\"" Sep 9 04:57:06.501208 containerd[1873]: time="2025-09-09T04:57:06.501142133Z" level=info msg="CreateContainer within sandbox \"e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 04:57:06.522543 containerd[1873]: time="2025-09-09T04:57:06.521976753Z" level=info msg="Container 8a1883c672e3ae5e2449b2bf36e96ebeba38a23ff4e43509bc206146860259d8: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:06.537388 containerd[1873]: time="2025-09-09T04:57:06.537352267Z" level=info msg="CreateContainer within sandbox \"e420ba5089d2b96ad5ecd791a4741c0455240de355f1c5380168d2270526e488\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8a1883c672e3ae5e2449b2bf36e96ebeba38a23ff4e43509bc206146860259d8\"" Sep 9 04:57:06.539140 containerd[1873]: time="2025-09-09T04:57:06.539027882Z" level=info msg="StartContainer for \"8a1883c672e3ae5e2449b2bf36e96ebeba38a23ff4e43509bc206146860259d8\"" Sep 9 04:57:06.539671 containerd[1873]: time="2025-09-09T04:57:06.539625645Z" level=info msg="connecting to shim 8a1883c672e3ae5e2449b2bf36e96ebeba38a23ff4e43509bc206146860259d8" address="unix:///run/containerd/s/c411e91151debf9b9d9cd546a7a79263b5c35804ae34fc614f59ecbeb5071296" protocol=ttrpc version=3 Sep 9 04:57:06.571061 systemd[1]: Started cri-containerd-8a1883c672e3ae5e2449b2bf36e96ebeba38a23ff4e43509bc206146860259d8.scope - libcontainer container 8a1883c672e3ae5e2449b2bf36e96ebeba38a23ff4e43509bc206146860259d8. 
Sep 9 04:57:06.609567 containerd[1873]: time="2025-09-09T04:57:06.609338706Z" level=info msg="StartContainer for \"8a1883c672e3ae5e2449b2bf36e96ebeba38a23ff4e43509bc206146860259d8\" returns successfully" Sep 9 04:57:06.788036 containerd[1873]: time="2025-09-09T04:57:06.787910617Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:06.790038 containerd[1873]: time="2025-09-09T04:57:06.789982668Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 9 04:57:06.793143 containerd[1873]: time="2025-09-09T04:57:06.793093641Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:06.797869 containerd[1873]: time="2025-09-09T04:57:06.797823738Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:06.798310 containerd[1873]: time="2025-09-09T04:57:06.798128572Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 2.350989561s" Sep 9 04:57:06.798310 containerd[1873]: time="2025-09-09T04:57:06.798156301Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 9 04:57:06.799962 containerd[1873]: time="2025-09-09T04:57:06.799926959Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 04:57:06.806932 containerd[1873]: time="2025-09-09T04:57:06.806903393Z" level=info msg="CreateContainer within sandbox \"3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 04:57:06.820452 containerd[1873]: time="2025-09-09T04:57:06.820416911Z" level=info msg="Container 4cf8217a2eaf8ed99f6781629d5ff259a18f4a7d1ef6d2c02c630498a8eeeff1: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:06.845894 containerd[1873]: time="2025-09-09T04:57:06.845778230Z" level=info msg="CreateContainer within sandbox \"3a503b319eb89868d38c5499fcd1026dec06f3db34719a2ba4a2082f70dd8fd8\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4cf8217a2eaf8ed99f6781629d5ff259a18f4a7d1ef6d2c02c630498a8eeeff1\"" Sep 9 04:57:06.847442 containerd[1873]: time="2025-09-09T04:57:06.847391458Z" level=info msg="StartContainer for \"4cf8217a2eaf8ed99f6781629d5ff259a18f4a7d1ef6d2c02c630498a8eeeff1\"" Sep 9 04:57:06.848839 containerd[1873]: time="2025-09-09T04:57:06.848815576Z" level=info msg="connecting to shim 4cf8217a2eaf8ed99f6781629d5ff259a18f4a7d1ef6d2c02c630498a8eeeff1" address="unix:///run/containerd/s/912162ada4b35536827a2aa8980dfd0e996d63577dd1eb250cab4c653db2a105" protocol=ttrpc version=3 Sep 9 04:57:06.866009 systemd[1]: Started cri-containerd-4cf8217a2eaf8ed99f6781629d5ff259a18f4a7d1ef6d2c02c630498a8eeeff1.scope - libcontainer container 4cf8217a2eaf8ed99f6781629d5ff259a18f4a7d1ef6d2c02c630498a8eeeff1. 
Sep 9 04:57:06.902834 containerd[1873]: time="2025-09-09T04:57:06.902788230Z" level=info msg="StartContainer for \"4cf8217a2eaf8ed99f6781629d5ff259a18f4a7d1ef6d2c02c630498a8eeeff1\" returns successfully" Sep 9 04:57:07.280703 kubelet[3403]: I0909 04:57:07.280558 3403 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7f6f5df448-5fphh" podStartSLOduration=27.927506597 podStartE2EDuration="30.280544257s" podCreationTimestamp="2025-09-09 04:56:37 +0000 UTC" firstStartedPulling="2025-09-09 04:57:04.446063128 +0000 UTC m=+41.452461734" lastFinishedPulling="2025-09-09 04:57:06.799100788 +0000 UTC m=+43.805499394" observedRunningTime="2025-09-09 04:57:07.263650677 +0000 UTC m=+44.270049315" watchObservedRunningTime="2025-09-09 04:57:07.280544257 +0000 UTC m=+44.286942863" Sep 9 04:57:07.774020 systemd-networkd[1682]: cali4aaab3ad014: Gained IPv6LL Sep 9 04:57:07.903021 systemd-networkd[1682]: cali651f787449c: Gained IPv6LL Sep 9 04:57:08.295307 kubelet[3403]: I0909 04:57:08.255203 3403 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:57:08.323351 kubelet[3403]: I0909 04:57:08.323317 3403 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:57:08.343698 kubelet[3403]: I0909 04:57:08.343555 3403 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-2mkdx" podStartSLOduration=39.343540609 podStartE2EDuration="39.343540609s" podCreationTimestamp="2025-09-09 04:56:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:57:07.281675486 +0000 UTC m=+44.288074100" watchObservedRunningTime="2025-09-09 04:57:08.343540609 +0000 UTC m=+45.349939223" Sep 9 04:57:08.962841 systemd-networkd[1682]: vxlan.calico: Link UP Sep 9 04:57:08.962849 systemd-networkd[1682]: vxlan.calico: Gained carrier Sep 9 04:57:09.442094 
systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2155565372.mount: Deactivated successfully. Sep 9 04:57:09.810894 containerd[1873]: time="2025-09-09T04:57:09.810793142Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:09.814055 containerd[1873]: time="2025-09-09T04:57:09.813918980Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 9 04:57:09.816832 containerd[1873]: time="2025-09-09T04:57:09.816557833Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:09.820554 containerd[1873]: time="2025-09-09T04:57:09.820102980Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:09.820776 containerd[1873]: time="2025-09-09T04:57:09.820719807Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 3.020757807s" Sep 9 04:57:09.820776 containerd[1873]: time="2025-09-09T04:57:09.820748984Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 9 04:57:09.822424 containerd[1873]: time="2025-09-09T04:57:09.822402134Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 04:57:09.829502 containerd[1873]: time="2025-09-09T04:57:09.829474835Z" 
level=info msg="CreateContainer within sandbox \"9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 04:57:09.848121 containerd[1873]: time="2025-09-09T04:57:09.847996794Z" level=info msg="Container 1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:09.855175 kubelet[3403]: I0909 04:57:09.855140 3403 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:57:09.866437 containerd[1873]: time="2025-09-09T04:57:09.866391557Z" level=info msg="CreateContainer within sandbox \"9e72b4f4adfda812757fbca2b45bdb65b7d1ef2f98f7567d1735e11983402898\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0\"" Sep 9 04:57:09.868473 containerd[1873]: time="2025-09-09T04:57:09.868123821Z" level=info msg="StartContainer for \"1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0\"" Sep 9 04:57:09.872375 containerd[1873]: time="2025-09-09T04:57:09.872346230Z" level=info msg="connecting to shim 1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0" address="unix:///run/containerd/s/3d0d2e579d0a9a89161edef7734e6fb0a91898438c1fc69f2a7639ba85990004" protocol=ttrpc version=3 Sep 9 04:57:09.918013 systemd[1]: Started cri-containerd-1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0.scope - libcontainer container 1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0. 
Sep 9 04:57:09.990151 containerd[1873]: time="2025-09-09T04:57:09.989748716Z" level=info msg="StartContainer for \"1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0\" returns successfully" Sep 9 04:57:10.142019 systemd-networkd[1682]: vxlan.calico: Gained IPv6LL Sep 9 04:57:10.279201 kubelet[3403]: I0909 04:57:10.279147 3403 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-cn2np" podStartSLOduration=23.994270377 podStartE2EDuration="29.279135336s" podCreationTimestamp="2025-09-09 04:56:41 +0000 UTC" firstStartedPulling="2025-09-09 04:57:04.537478389 +0000 UTC m=+41.543876995" lastFinishedPulling="2025-09-09 04:57:09.822343308 +0000 UTC m=+46.828741954" observedRunningTime="2025-09-09 04:57:10.276248442 +0000 UTC m=+47.282647048" watchObservedRunningTime="2025-09-09 04:57:10.279135336 +0000 UTC m=+47.285533942" Sep 9 04:57:10.333620 containerd[1873]: time="2025-09-09T04:57:10.333546048Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0\" id:\"a9d9bd01c70199185e5872688c66076e95817c87d2878c87f35c8560ebc8d096\" pid:5852 exit_status:1 exited_at:{seconds:1757393830 nanos:328428018}" Sep 9 04:57:11.334310 containerd[1873]: time="2025-09-09T04:57:11.334251322Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0\" id:\"267764038d41e08ae81d67f165aa751df0bf8e86c9ed7d7749ecc588fc550564\" pid:5877 exit_status:1 exited_at:{seconds:1757393831 nanos:333067740}" Sep 9 04:57:12.359787 containerd[1873]: time="2025-09-09T04:57:12.359746190Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0\" id:\"ff44d46be41f626bc1f84262478a5fd3c661ff953db41b7e6028da78531f5a8d\" pid:5905 exit_status:1 exited_at:{seconds:1757393832 nanos:358450532}" Sep 9 04:57:13.069155 
containerd[1873]: time="2025-09-09T04:57:13.069097293Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:13.073560 containerd[1873]: time="2025-09-09T04:57:13.073524284Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 9 04:57:13.075660 containerd[1873]: time="2025-09-09T04:57:13.075632600Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:13.079383 containerd[1873]: time="2025-09-09T04:57:13.079351329Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:13.079822 containerd[1873]: time="2025-09-09T04:57:13.079737261Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 3.257307582s" Sep 9 04:57:13.079822 containerd[1873]: time="2025-09-09T04:57:13.079764142Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 9 04:57:13.081278 containerd[1873]: time="2025-09-09T04:57:13.080833065Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 04:57:13.097712 containerd[1873]: time="2025-09-09T04:57:13.097684354Z" level=info msg="CreateContainer within sandbox 
\"f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 04:57:13.119237 containerd[1873]: time="2025-09-09T04:57:13.118655968Z" level=info msg="Container da31e91eae84835cbbda67290c6ce736ceaa8d088c8f7910da64ea21f0a1ffb3: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:13.133117 containerd[1873]: time="2025-09-09T04:57:13.133088131Z" level=info msg="CreateContainer within sandbox \"f7fb3f1d1a2acb0ec35e33e36a80c40b3cf70d58447b0a87782effdc7596427e\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"da31e91eae84835cbbda67290c6ce736ceaa8d088c8f7910da64ea21f0a1ffb3\"" Sep 9 04:57:13.133564 containerd[1873]: time="2025-09-09T04:57:13.133545602Z" level=info msg="StartContainer for \"da31e91eae84835cbbda67290c6ce736ceaa8d088c8f7910da64ea21f0a1ffb3\"" Sep 9 04:57:13.134526 containerd[1873]: time="2025-09-09T04:57:13.134496337Z" level=info msg="connecting to shim da31e91eae84835cbbda67290c6ce736ceaa8d088c8f7910da64ea21f0a1ffb3" address="unix:///run/containerd/s/55ce5ca3521981fda698740c76121c0ecc25273a78bc756ef8c99f682b1e3d9e" protocol=ttrpc version=3 Sep 9 04:57:13.167019 systemd[1]: Started cri-containerd-da31e91eae84835cbbda67290c6ce736ceaa8d088c8f7910da64ea21f0a1ffb3.scope - libcontainer container da31e91eae84835cbbda67290c6ce736ceaa8d088c8f7910da64ea21f0a1ffb3. 
Sep 9 04:57:13.200332 containerd[1873]: time="2025-09-09T04:57:13.200294538Z" level=info msg="StartContainer for \"da31e91eae84835cbbda67290c6ce736ceaa8d088c8f7910da64ea21f0a1ffb3\" returns successfully" Sep 9 04:57:13.329192 containerd[1873]: time="2025-09-09T04:57:13.328806952Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da31e91eae84835cbbda67290c6ce736ceaa8d088c8f7910da64ea21f0a1ffb3\" id:\"0f00a647632c92d6b764f8280dba471ae8d0031ce50d0ea7d8b94fcf1847fedc\" pid:5971 exited_at:{seconds:1757393833 nanos:328544535}" Sep 9 04:57:13.343641 kubelet[3403]: I0909 04:57:13.343583 3403 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-549ccbd58c-24j9h" podStartSLOduration=23.860067119 podStartE2EDuration="32.343244867s" podCreationTimestamp="2025-09-09 04:56:41 +0000 UTC" firstStartedPulling="2025-09-09 04:57:04.59727032 +0000 UTC m=+41.603668926" lastFinishedPulling="2025-09-09 04:57:13.080448068 +0000 UTC m=+50.086846674" observedRunningTime="2025-09-09 04:57:13.298808453 +0000 UTC m=+50.305207059" watchObservedRunningTime="2025-09-09 04:57:13.343244867 +0000 UTC m=+50.349643473" Sep 9 04:57:13.446172 containerd[1873]: time="2025-09-09T04:57:13.446120516Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:57:13.448492 containerd[1873]: time="2025-09-09T04:57:13.448184918Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 04:57:13.449466 containerd[1873]: time="2025-09-09T04:57:13.449443975Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 368.576525ms" Sep 9 04:57:13.449555 containerd[1873]: time="2025-09-09T04:57:13.449541402Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 9 04:57:13.457428 containerd[1873]: time="2025-09-09T04:57:13.457293101Z" level=info msg="CreateContainer within sandbox \"243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 04:57:13.480234 containerd[1873]: time="2025-09-09T04:57:13.480202042Z" level=info msg="Container bcdd628ed5d4a5e23a4b05623ed82d765512c81cd3531ec48edd1b24af91766e: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:57:13.497923 containerd[1873]: time="2025-09-09T04:57:13.497110229Z" level=info msg="CreateContainer within sandbox \"243f0b1a4deb5547062ed5a22714ac89f4da2a0478503dea0ff8178f1a88ad63\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"bcdd628ed5d4a5e23a4b05623ed82d765512c81cd3531ec48edd1b24af91766e\"" Sep 9 04:57:13.498986 containerd[1873]: time="2025-09-09T04:57:13.498829349Z" level=info msg="StartContainer for \"bcdd628ed5d4a5e23a4b05623ed82d765512c81cd3531ec48edd1b24af91766e\"" Sep 9 04:57:13.500588 containerd[1873]: time="2025-09-09T04:57:13.500562549Z" level=info msg="connecting to shim bcdd628ed5d4a5e23a4b05623ed82d765512c81cd3531ec48edd1b24af91766e" address="unix:///run/containerd/s/61866f68688b82509679a77fe947144117f5235ad166cbf25df4e75db059c5bc" protocol=ttrpc version=3 Sep 9 04:57:13.519002 systemd[1]: Started cri-containerd-bcdd628ed5d4a5e23a4b05623ed82d765512c81cd3531ec48edd1b24af91766e.scope - libcontainer container bcdd628ed5d4a5e23a4b05623ed82d765512c81cd3531ec48edd1b24af91766e. 
Sep 9 04:57:13.809814 containerd[1873]: time="2025-09-09T04:57:13.808808466Z" level=info msg="StartContainer for \"bcdd628ed5d4a5e23a4b05623ed82d765512c81cd3531ec48edd1b24af91766e\" returns successfully" Sep 9 04:57:14.309261 kubelet[3403]: I0909 04:57:14.309212 3403 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7f6f5df448-vgtr8" podStartSLOduration=30.248964713 podStartE2EDuration="37.309197752s" podCreationTimestamp="2025-09-09 04:56:37 +0000 UTC" firstStartedPulling="2025-09-09 04:57:06.390049379 +0000 UTC m=+43.396447993" lastFinishedPulling="2025-09-09 04:57:13.450282426 +0000 UTC m=+50.456681032" observedRunningTime="2025-09-09 04:57:14.309037067 +0000 UTC m=+51.315435697" watchObservedRunningTime="2025-09-09 04:57:14.309197752 +0000 UTC m=+51.315596358" Sep 9 04:57:15.286037 kubelet[3403]: I0909 04:57:15.285988 3403 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:57:24.546257 containerd[1873]: time="2025-09-09T04:57:24.546192862Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0\" id:\"f795d12fb322fc3f6f825359431e5a865eb47a6b340207ff2cd7400ff89e89b7\" pid:6049 exited_at:{seconds:1757393844 nanos:545901909}" Sep 9 04:57:27.258545 containerd[1873]: time="2025-09-09T04:57:27.258505449Z" level=info msg="TaskExit event in podsandbox handler container_id:\"26208bde266aa9c726fa5190983d72429c0cbff97f0e5070b7b8638e5288d504\" id:\"8709df908f25170f4276fdcadf912c12b923f4334597f01e6bb4725a92b3b6e1\" pid:6071 exited_at:{seconds:1757393847 nanos:258244376}" Sep 9 04:57:39.675812 kubelet[3403]: I0909 04:57:39.675283 3403 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:57:42.323545 containerd[1873]: time="2025-09-09T04:57:42.323475095Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0\" id:\"26ac6c0d971f0f602e0f32a0c67d66358a776c1afe4b8ddf971b5ac11b8097b1\" pid:6112 exited_at:{seconds:1757393862 nanos:323188334}" Sep 9 04:57:43.304054 containerd[1873]: time="2025-09-09T04:57:43.304016632Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da31e91eae84835cbbda67290c6ce736ceaa8d088c8f7910da64ea21f0a1ffb3\" id:\"f51d20ad9ef2ec267b0c3819bc04919cc2df9aa5ef04e3673c39f686c89a86b0\" pid:6136 exited_at:{seconds:1757393863 nanos:303538928}" Sep 9 04:57:48.927286 containerd[1873]: time="2025-09-09T04:57:48.927108454Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da31e91eae84835cbbda67290c6ce736ceaa8d088c8f7910da64ea21f0a1ffb3\" id:\"50388dccdddc419d89387a0fa02212f08679bc8afceae8669fcbce71621fb0ae\" pid:6157 exited_at:{seconds:1757393868 nanos:926779219}" Sep 9 04:57:57.304566 containerd[1873]: time="2025-09-09T04:57:57.304526515Z" level=info msg="TaskExit event in podsandbox handler container_id:\"26208bde266aa9c726fa5190983d72429c0cbff97f0e5070b7b8638e5288d504\" id:\"4e88454831cf9d375ab2495f67d8afb871a847e1203a2ae99633d07b2d7b6458\" pid:6185 exited_at:{seconds:1757393877 nanos:303925472}" Sep 9 04:58:12.322235 containerd[1873]: time="2025-09-09T04:58:12.322152814Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0\" id:\"b72f0d7d0d0d40c33884379dec4299ed61c6d66ee95e31bf518ba30a6232a056\" pid:6211 exited_at:{seconds:1757393892 nanos:321799082}" Sep 9 04:58:13.304954 containerd[1873]: time="2025-09-09T04:58:13.304640669Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da31e91eae84835cbbda67290c6ce736ceaa8d088c8f7910da64ea21f0a1ffb3\" id:\"be9c541bac7593671b7314cba5cbe097d85daa57f302dce64d1d013f28cd78b2\" pid:6231 exited_at:{seconds:1757393893 nanos:304486192}" Sep 9 04:58:24.543606 containerd[1873]: time="2025-09-09T04:58:24.543562465Z" 
level=info msg="TaskExit event in podsandbox handler container_id:\"1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0\" id:\"abbe9c9f1ec462c1679df78903ceeee7b852ed1f921902afdd925661c870b8de\" pid:6257 exited_at:{seconds:1757393904 nanos:543145324}" Sep 9 04:58:27.243772 containerd[1873]: time="2025-09-09T04:58:27.243721302Z" level=info msg="TaskExit event in podsandbox handler container_id:\"26208bde266aa9c726fa5190983d72429c0cbff97f0e5070b7b8638e5288d504\" id:\"7137b517a48dd46e6da50a80e5db9213a9aeb6eb149617584a3334bdc68b5421\" pid:6277 exited_at:{seconds:1757393907 nanos:243327714}" Sep 9 04:58:38.132845 systemd[1]: Started sshd@7-10.200.20.4:22-10.200.16.10:51126.service - OpenSSH per-connection server daemon (10.200.16.10:51126). Sep 9 04:58:38.554179 sshd[6302]: Accepted publickey for core from 10.200.16.10 port 51126 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:58:38.555946 sshd-session[6302]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:58:38.563802 systemd-logind[1855]: New session 10 of user core. Sep 9 04:58:38.567524 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 9 04:58:38.927214 sshd[6305]: Connection closed by 10.200.16.10 port 51126 Sep 9 04:58:38.927742 sshd-session[6302]: pam_unix(sshd:session): session closed for user core Sep 9 04:58:38.930683 systemd[1]: sshd@7-10.200.20.4:22-10.200.16.10:51126.service: Deactivated successfully. Sep 9 04:58:38.932355 systemd[1]: session-10.scope: Deactivated successfully. Sep 9 04:58:38.933195 systemd-logind[1855]: Session 10 logged out. Waiting for processes to exit. Sep 9 04:58:38.934513 systemd-logind[1855]: Removed session 10. 
Sep 9 04:58:42.320783 containerd[1873]: time="2025-09-09T04:58:42.320740949Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0\" id:\"42b06babaedd0e1429fd95e49b7c336e45420a47ed08dc19e9b96dadbe67975a\" pid:6343 exited_at:{seconds:1757393922 nanos:320442307}" Sep 9 04:58:43.328632 containerd[1873]: time="2025-09-09T04:58:43.328586703Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da31e91eae84835cbbda67290c6ce736ceaa8d088c8f7910da64ea21f0a1ffb3\" id:\"c54215935d4a0b2489f467a33247f9d45a2aaccd1042fbec6315d1af7ae562eb\" pid:6365 exited_at:{seconds:1757393923 nanos:328213739}" Sep 9 04:58:44.011078 systemd[1]: Started sshd@8-10.200.20.4:22-10.200.16.10:46788.service - OpenSSH per-connection server daemon (10.200.16.10:46788). Sep 9 04:58:44.445937 sshd[6375]: Accepted publickey for core from 10.200.16.10 port 46788 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:58:44.447075 sshd-session[6375]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:58:44.452583 systemd-logind[1855]: New session 11 of user core. Sep 9 04:58:44.460007 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 9 04:58:44.822773 sshd[6378]: Connection closed by 10.200.16.10 port 46788 Sep 9 04:58:44.823719 sshd-session[6375]: pam_unix(sshd:session): session closed for user core Sep 9 04:58:44.826799 systemd[1]: sshd@8-10.200.20.4:22-10.200.16.10:46788.service: Deactivated successfully. Sep 9 04:58:44.831892 systemd[1]: session-11.scope: Deactivated successfully. Sep 9 04:58:44.833765 systemd-logind[1855]: Session 11 logged out. Waiting for processes to exit. Sep 9 04:58:44.835754 systemd-logind[1855]: Removed session 11. 
Sep 9 04:58:48.880575 containerd[1873]: time="2025-09-09T04:58:48.880422171Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da31e91eae84835cbbda67290c6ce736ceaa8d088c8f7910da64ea21f0a1ffb3\" id:\"e792c7b3b119710dd761146b956e9b7f6da85712f53642e199cc19b53941c2c9\" pid:6409 exited_at:{seconds:1757393928 nanos:880014270}" Sep 9 04:58:49.906936 systemd[1]: Started sshd@9-10.200.20.4:22-10.200.16.10:38394.service - OpenSSH per-connection server daemon (10.200.16.10:38394). Sep 9 04:58:50.324050 sshd[6420]: Accepted publickey for core from 10.200.16.10 port 38394 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:58:50.325145 sshd-session[6420]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:58:50.329038 systemd-logind[1855]: New session 12 of user core. Sep 9 04:58:50.337202 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 9 04:58:50.672375 sshd[6423]: Connection closed by 10.200.16.10 port 38394 Sep 9 04:58:50.673049 sshd-session[6420]: pam_unix(sshd:session): session closed for user core Sep 9 04:58:50.676324 systemd[1]: sshd@9-10.200.20.4:22-10.200.16.10:38394.service: Deactivated successfully. Sep 9 04:58:50.678146 systemd[1]: session-12.scope: Deactivated successfully. Sep 9 04:58:50.678933 systemd-logind[1855]: Session 12 logged out. Waiting for processes to exit. Sep 9 04:58:50.680594 systemd-logind[1855]: Removed session 12. Sep 9 04:58:50.754997 systemd[1]: Started sshd@10-10.200.20.4:22-10.200.16.10:38406.service - OpenSSH per-connection server daemon (10.200.16.10:38406). Sep 9 04:58:51.215576 sshd[6436]: Accepted publickey for core from 10.200.16.10 port 38406 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:58:51.216316 sshd-session[6436]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:58:51.219785 systemd-logind[1855]: New session 13 of user core. 
Sep 9 04:58:51.224989 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 9 04:58:51.612711 sshd[6439]: Connection closed by 10.200.16.10 port 38406 Sep 9 04:58:51.612610 sshd-session[6436]: pam_unix(sshd:session): session closed for user core Sep 9 04:58:51.615840 systemd[1]: sshd@10-10.200.20.4:22-10.200.16.10:38406.service: Deactivated successfully. Sep 9 04:58:51.617432 systemd[1]: session-13.scope: Deactivated successfully. Sep 9 04:58:51.618224 systemd-logind[1855]: Session 13 logged out. Waiting for processes to exit. Sep 9 04:58:51.619277 systemd-logind[1855]: Removed session 13. Sep 9 04:58:51.690578 systemd[1]: Started sshd@11-10.200.20.4:22-10.200.16.10:38418.service - OpenSSH per-connection server daemon (10.200.16.10:38418). Sep 9 04:58:52.107366 sshd[6449]: Accepted publickey for core from 10.200.16.10 port 38418 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:58:52.108453 sshd-session[6449]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:58:52.112136 systemd-logind[1855]: New session 14 of user core. Sep 9 04:58:52.121191 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 9 04:58:52.453329 sshd[6452]: Connection closed by 10.200.16.10 port 38418 Sep 9 04:58:52.453992 sshd-session[6449]: pam_unix(sshd:session): session closed for user core Sep 9 04:58:52.457582 systemd[1]: sshd@11-10.200.20.4:22-10.200.16.10:38418.service: Deactivated successfully. Sep 9 04:58:52.459308 systemd[1]: session-14.scope: Deactivated successfully. Sep 9 04:58:52.460923 systemd-logind[1855]: Session 14 logged out. Waiting for processes to exit. Sep 9 04:58:52.462145 systemd-logind[1855]: Removed session 14. 
Sep 9 04:58:57.249614 containerd[1873]: time="2025-09-09T04:58:57.249431912Z" level=info msg="TaskExit event in podsandbox handler container_id:\"26208bde266aa9c726fa5190983d72429c0cbff97f0e5070b7b8638e5288d504\" id:\"79c2df2e9bf889f0ef51b87e0da63b065f80a5147a25fbccaf7542dbdb1956f7\" pid:6478 exited_at:{seconds:1757393937 nanos:249165463}" Sep 9 04:58:57.530101 systemd[1]: Started sshd@12-10.200.20.4:22-10.200.16.10:38422.service - OpenSSH per-connection server daemon (10.200.16.10:38422). Sep 9 04:58:57.949461 sshd[6490]: Accepted publickey for core from 10.200.16.10 port 38422 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:58:57.950965 sshd-session[6490]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:58:57.956607 systemd-logind[1855]: New session 15 of user core. Sep 9 04:58:57.963199 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 9 04:58:58.303259 sshd[6493]: Connection closed by 10.200.16.10 port 38422 Sep 9 04:58:58.303944 sshd-session[6490]: pam_unix(sshd:session): session closed for user core Sep 9 04:58:58.308224 systemd[1]: sshd@12-10.200.20.4:22-10.200.16.10:38422.service: Deactivated successfully. Sep 9 04:58:58.310094 systemd[1]: session-15.scope: Deactivated successfully. Sep 9 04:58:58.310910 systemd-logind[1855]: Session 15 logged out. Waiting for processes to exit. Sep 9 04:58:58.312696 systemd-logind[1855]: Removed session 15. Sep 9 04:59:03.380938 systemd[1]: Started sshd@13-10.200.20.4:22-10.200.16.10:46910.service - OpenSSH per-connection server daemon (10.200.16.10:46910). Sep 9 04:59:03.813393 sshd[6507]: Accepted publickey for core from 10.200.16.10 port 46910 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:59:03.815040 sshd-session[6507]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:59:03.819766 systemd-logind[1855]: New session 16 of user core. 
Sep 9 04:59:03.823984 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 9 04:59:04.181994 sshd[6510]: Connection closed by 10.200.16.10 port 46910 Sep 9 04:59:04.182678 sshd-session[6507]: pam_unix(sshd:session): session closed for user core Sep 9 04:59:04.185950 systemd[1]: sshd@13-10.200.20.4:22-10.200.16.10:46910.service: Deactivated successfully. Sep 9 04:59:04.189336 systemd[1]: session-16.scope: Deactivated successfully. Sep 9 04:59:04.189973 systemd-logind[1855]: Session 16 logged out. Waiting for processes to exit. Sep 9 04:59:04.191073 systemd-logind[1855]: Removed session 16. Sep 9 04:59:09.263999 systemd[1]: Started sshd@14-10.200.20.4:22-10.200.16.10:46922.service - OpenSSH per-connection server daemon (10.200.16.10:46922). Sep 9 04:59:09.719472 sshd[6522]: Accepted publickey for core from 10.200.16.10 port 46922 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:59:09.720544 sshd-session[6522]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:59:09.724128 systemd-logind[1855]: New session 17 of user core. Sep 9 04:59:09.731117 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 9 04:59:10.090620 sshd[6525]: Connection closed by 10.200.16.10 port 46922 Sep 9 04:59:10.091230 sshd-session[6522]: pam_unix(sshd:session): session closed for user core Sep 9 04:59:10.094520 systemd[1]: sshd@14-10.200.20.4:22-10.200.16.10:46922.service: Deactivated successfully. Sep 9 04:59:10.096434 systemd[1]: session-17.scope: Deactivated successfully. Sep 9 04:59:10.097146 systemd-logind[1855]: Session 17 logged out. Waiting for processes to exit. Sep 9 04:59:10.098426 systemd-logind[1855]: Removed session 17. Sep 9 04:59:10.173792 systemd[1]: Started sshd@15-10.200.20.4:22-10.200.16.10:34370.service - OpenSSH per-connection server daemon (10.200.16.10:34370). 
Sep 9 04:59:10.636840 sshd[6536]: Accepted publickey for core from 10.200.16.10 port 34370 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:59:10.637934 sshd-session[6536]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:59:10.641818 systemd-logind[1855]: New session 18 of user core. Sep 9 04:59:10.647015 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 9 04:59:11.156288 sshd[6539]: Connection closed by 10.200.16.10 port 34370 Sep 9 04:59:11.155306 sshd-session[6536]: pam_unix(sshd:session): session closed for user core Sep 9 04:59:11.159052 systemd[1]: sshd@15-10.200.20.4:22-10.200.16.10:34370.service: Deactivated successfully. Sep 9 04:59:11.161520 systemd[1]: session-18.scope: Deactivated successfully. Sep 9 04:59:11.163932 systemd-logind[1855]: Session 18 logged out. Waiting for processes to exit. Sep 9 04:59:11.166418 systemd-logind[1855]: Removed session 18. Sep 9 04:59:11.231214 systemd[1]: Started sshd@16-10.200.20.4:22-10.200.16.10:34382.service - OpenSSH per-connection server daemon (10.200.16.10:34382). Sep 9 04:59:11.654615 sshd[6549]: Accepted publickey for core from 10.200.16.10 port 34382 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8 Sep 9 04:59:11.656439 sshd-session[6549]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:59:11.661789 systemd-logind[1855]: New session 19 of user core. Sep 9 04:59:11.667995 systemd[1]: Started session-19.scope - Session 19 of User core. 
Sep 9 04:59:12.543456 sshd[6552]: Connection closed by 10.200.16.10 port 34382
Sep 9 04:59:12.543774 sshd-session[6549]: pam_unix(sshd:session): session closed for user core
Sep 9 04:59:12.546820 containerd[1873]: time="2025-09-09T04:59:12.546782016Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0\" id:\"aa439d51c9b84a374aa3d9a664c362a61a97d4298a8b064e753b1aa88c2c11fc\" pid:6573 exited_at:{seconds:1757393952 nanos:546405580}"
Sep 9 04:59:12.548028 systemd[1]: sshd@16-10.200.20.4:22-10.200.16.10:34382.service: Deactivated successfully.
Sep 9 04:59:12.549677 systemd[1]: session-19.scope: Deactivated successfully.
Sep 9 04:59:12.550049 systemd-logind[1855]: Session 19 logged out. Waiting for processes to exit.
Sep 9 04:59:12.554689 systemd-logind[1855]: Removed session 19.
Sep 9 04:59:12.620689 systemd[1]: Started sshd@17-10.200.20.4:22-10.200.16.10:34390.service - OpenSSH per-connection server daemon (10.200.16.10:34390).
Sep 9 04:59:13.043253 sshd[6592]: Accepted publickey for core from 10.200.16.10 port 34390 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:59:13.044331 sshd-session[6592]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:59:13.047746 systemd-logind[1855]: New session 20 of user core.
Sep 9 04:59:13.053360 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 9 04:59:13.335949 containerd[1873]: time="2025-09-09T04:59:13.335867642Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da31e91eae84835cbbda67290c6ce736ceaa8d088c8f7910da64ea21f0a1ffb3\" id:\"f390f47ed958489698eabe379984f22512dc530560641e3ad11365bd818c8714\" pid:6612 exited_at:{seconds:1757393953 nanos:335639218}"
Sep 9 04:59:13.522220 sshd[6595]: Connection closed by 10.200.16.10 port 34390
Sep 9 04:59:13.522792 sshd-session[6592]: pam_unix(sshd:session): session closed for user core
Sep 9 04:59:13.527774 systemd[1]: sshd@17-10.200.20.4:22-10.200.16.10:34390.service: Deactivated successfully.
Sep 9 04:59:13.531443 systemd[1]: session-20.scope: Deactivated successfully.
Sep 9 04:59:13.532713 systemd-logind[1855]: Session 20 logged out. Waiting for processes to exit.
Sep 9 04:59:13.534615 systemd-logind[1855]: Removed session 20.
Sep 9 04:59:13.598081 systemd[1]: Started sshd@18-10.200.20.4:22-10.200.16.10:34400.service - OpenSSH per-connection server daemon (10.200.16.10:34400).
Sep 9 04:59:14.017703 sshd[6628]: Accepted publickey for core from 10.200.16.10 port 34400 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:59:14.018938 sshd-session[6628]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:59:14.025324 systemd-logind[1855]: New session 21 of user core.
Sep 9 04:59:14.030978 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 9 04:59:14.388196 sshd[6631]: Connection closed by 10.200.16.10 port 34400
Sep 9 04:59:14.388951 sshd-session[6628]: pam_unix(sshd:session): session closed for user core
Sep 9 04:59:14.392024 systemd[1]: sshd@18-10.200.20.4:22-10.200.16.10:34400.service: Deactivated successfully.
Sep 9 04:59:14.393905 systemd[1]: session-21.scope: Deactivated successfully.
Sep 9 04:59:14.394798 systemd-logind[1855]: Session 21 logged out. Waiting for processes to exit.
Sep 9 04:59:14.396085 systemd-logind[1855]: Removed session 21.
Sep 9 04:59:19.465726 systemd[1]: Started sshd@19-10.200.20.4:22-10.200.16.10:34402.service - OpenSSH per-connection server daemon (10.200.16.10:34402).
Sep 9 04:59:19.878157 sshd[6645]: Accepted publickey for core from 10.200.16.10 port 34402 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:59:19.879213 sshd-session[6645]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:59:19.883172 systemd-logind[1855]: New session 22 of user core.
Sep 9 04:59:19.890996 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 9 04:59:20.231337 sshd[6648]: Connection closed by 10.200.16.10 port 34402
Sep 9 04:59:20.231903 sshd-session[6645]: pam_unix(sshd:session): session closed for user core
Sep 9 04:59:20.235043 systemd[1]: sshd@19-10.200.20.4:22-10.200.16.10:34402.service: Deactivated successfully.
Sep 9 04:59:20.236800 systemd[1]: session-22.scope: Deactivated successfully.
Sep 9 04:59:20.237709 systemd-logind[1855]: Session 22 logged out. Waiting for processes to exit.
Sep 9 04:59:20.239410 systemd-logind[1855]: Removed session 22.
Sep 9 04:59:24.547216 containerd[1873]: time="2025-09-09T04:59:24.547170299Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0\" id:\"64fb173f245169297b7f47d729c537847b5262c9e8ce2c83eb21fbd2d2848ee4\" pid:6673 exited_at:{seconds:1757393964 nanos:546691091}"
Sep 9 04:59:25.308082 systemd[1]: Started sshd@20-10.200.20.4:22-10.200.16.10:48532.service - OpenSSH per-connection server daemon (10.200.16.10:48532).
Sep 9 04:59:25.725349 sshd[6684]: Accepted publickey for core from 10.200.16.10 port 48532 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:59:25.726419 sshd-session[6684]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:59:25.729906 systemd-logind[1855]: New session 23 of user core.
Sep 9 04:59:25.741268 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 9 04:59:26.071094 sshd[6687]: Connection closed by 10.200.16.10 port 48532
Sep 9 04:59:26.071424 sshd-session[6684]: pam_unix(sshd:session): session closed for user core
Sep 9 04:59:26.075392 systemd[1]: sshd@20-10.200.20.4:22-10.200.16.10:48532.service: Deactivated successfully.
Sep 9 04:59:26.076810 systemd[1]: session-23.scope: Deactivated successfully.
Sep 9 04:59:26.077435 systemd-logind[1855]: Session 23 logged out. Waiting for processes to exit.
Sep 9 04:59:26.079641 systemd-logind[1855]: Removed session 23.
Sep 9 04:59:27.247778 containerd[1873]: time="2025-09-09T04:59:27.247696037Z" level=info msg="TaskExit event in podsandbox handler container_id:\"26208bde266aa9c726fa5190983d72429c0cbff97f0e5070b7b8638e5288d504\" id:\"37c789b47f7c6948e7e72ed426be9185a9d87fe91b7bcaa8ca9c8f5d2175a64a\" pid:6709 exited_at:{seconds:1757393967 nanos:247286496}"
Sep 9 04:59:31.154350 systemd[1]: Started sshd@21-10.200.20.4:22-10.200.16.10:50534.service - OpenSSH per-connection server daemon (10.200.16.10:50534).
Sep 9 04:59:31.569527 sshd[6724]: Accepted publickey for core from 10.200.16.10 port 50534 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:59:31.570615 sshd-session[6724]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:59:31.574575 systemd-logind[1855]: New session 24 of user core.
Sep 9 04:59:31.583141 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 9 04:59:31.917154 sshd[6727]: Connection closed by 10.200.16.10 port 50534
Sep 9 04:59:31.917662 sshd-session[6724]: pam_unix(sshd:session): session closed for user core
Sep 9 04:59:31.921479 systemd-logind[1855]: Session 24 logged out. Waiting for processes to exit.
Sep 9 04:59:31.921721 systemd[1]: sshd@21-10.200.20.4:22-10.200.16.10:50534.service: Deactivated successfully.
Sep 9 04:59:31.923753 systemd[1]: session-24.scope: Deactivated successfully.
Sep 9 04:59:31.925258 systemd-logind[1855]: Removed session 24.
Sep 9 04:59:36.994464 systemd[1]: Started sshd@22-10.200.20.4:22-10.200.16.10:50550.service - OpenSSH per-connection server daemon (10.200.16.10:50550).
Sep 9 04:59:37.417701 sshd[6738]: Accepted publickey for core from 10.200.16.10 port 50550 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:59:37.419147 sshd-session[6738]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:59:37.422751 systemd-logind[1855]: New session 25 of user core.
Sep 9 04:59:37.431123 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 9 04:59:37.773980 sshd[6741]: Connection closed by 10.200.16.10 port 50550
Sep 9 04:59:37.774546 sshd-session[6738]: pam_unix(sshd:session): session closed for user core
Sep 9 04:59:37.778159 systemd[1]: sshd@22-10.200.20.4:22-10.200.16.10:50550.service: Deactivated successfully.
Sep 9 04:59:37.780515 systemd[1]: session-25.scope: Deactivated successfully.
Sep 9 04:59:37.781787 systemd-logind[1855]: Session 25 logged out. Waiting for processes to exit.
Sep 9 04:59:37.782965 systemd-logind[1855]: Removed session 25.
Sep 9 04:59:42.322347 containerd[1873]: time="2025-09-09T04:59:42.322261567Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1832a28e9366033b146846ad5ffce6e6e5f8a66dcf42db2b5bb02098c89b6eb0\" id:\"3674d0d39e5c0b04c3e5987b40daab353ffbec07c8d37ef70388e2cb124d8106\" pid:6764 exited_at:{seconds:1757393982 nanos:321857610}"
Sep 9 04:59:42.850078 systemd[1]: Started sshd@23-10.200.20.4:22-10.200.16.10:34122.service - OpenSSH per-connection server daemon (10.200.16.10:34122).
Sep 9 04:59:43.269102 sshd[6775]: Accepted publickey for core from 10.200.16.10 port 34122 ssh2: RSA SHA256:3adVRBnOdwvD6jSty2nKG447nUgR57TdklVE855KMY8
Sep 9 04:59:43.270558 sshd-session[6775]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:59:43.275921 systemd-logind[1855]: New session 26 of user core.
Sep 9 04:59:43.279986 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 9 04:59:43.305071 containerd[1873]: time="2025-09-09T04:59:43.305040319Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da31e91eae84835cbbda67290c6ce736ceaa8d088c8f7910da64ea21f0a1ffb3\" id:\"bf8faef908e73ea8df5c664edca9e5d50e53c2cc5190fed5358f2f86dc15c899\" pid:6791 exited_at:{seconds:1757393983 nanos:304729429}"
Sep 9 04:59:43.616284 sshd[6779]: Connection closed by 10.200.16.10 port 34122
Sep 9 04:59:43.616901 sshd-session[6775]: pam_unix(sshd:session): session closed for user core
Sep 9 04:59:43.620094 systemd[1]: sshd@23-10.200.20.4:22-10.200.16.10:34122.service: Deactivated successfully.
Sep 9 04:59:43.622057 systemd[1]: session-26.scope: Deactivated successfully.
Sep 9 04:59:43.623374 systemd-logind[1855]: Session 26 logged out. Waiting for processes to exit.
Sep 9 04:59:43.625066 systemd-logind[1855]: Removed session 26.
Sep 9 04:59:47.113542 update_engine[1856]: I20250909 04:59:47.113415 1856 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Sep 9 04:59:47.113542 update_engine[1856]: I20250909 04:59:47.113462 1856 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Sep 9 04:59:47.115983 update_engine[1856]: I20250909 04:59:47.114019 1856 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Sep 9 04:59:47.116443 update_engine[1856]: I20250909 04:59:47.116410 1856 omaha_request_params.cc:62] Current group set to developer
Sep 9 04:59:47.117346 update_engine[1856]: I20250909 04:59:47.116973 1856 update_attempter.cc:499] Already updated boot flags. Skipping.
Sep 9 04:59:47.117346 update_engine[1856]: I20250909 04:59:47.117192 1856 update_attempter.cc:643] Scheduling an action processor start.
Sep 9 04:59:47.117346 update_engine[1856]: I20250909 04:59:47.117214 1856 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Sep 9 04:59:47.119611 update_engine[1856]: I20250909 04:59:47.118935 1856 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Sep 9 04:59:47.119611 update_engine[1856]: I20250909 04:59:47.119014 1856 omaha_request_action.cc:271] Posting an Omaha request to disabled
Sep 9 04:59:47.119611 update_engine[1856]: I20250909 04:59:47.119020 1856 omaha_request_action.cc:272] Request:
Sep 9 04:59:47.119611 update_engine[1856]:
Sep 9 04:59:47.119611 update_engine[1856]:
Sep 9 04:59:47.119611 update_engine[1856]:
Sep 9 04:59:47.119611 update_engine[1856]:
Sep 9 04:59:47.119611 update_engine[1856]:
Sep 9 04:59:47.119611 update_engine[1856]:
Sep 9 04:59:47.119611 update_engine[1856]:
Sep 9 04:59:47.119611 update_engine[1856]:
Sep 9 04:59:47.119611 update_engine[1856]: I20250909 04:59:47.119024 1856 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 9 04:59:47.121740 update_engine[1856]: I20250909 04:59:47.120611 1856 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 9 04:59:47.121740 update_engine[1856]: I20250909 04:59:47.121147 1856 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 9 04:59:47.124936 locksmithd[1975]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Sep 9 04:59:47.188262 update_engine[1856]: E20250909 04:59:47.188194 1856 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 9 04:59:47.188425 update_engine[1856]: I20250909 04:59:47.188300 1856 libcurl_http_fetcher.cc:283] No HTTP response, retry 1