Aug 19 00:24:33.830829 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Aug 19 00:24:33.830850 kernel: Linux version 6.12.41-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Mon Aug 18 22:15:14 -00 2025
Aug 19 00:24:33.830860 kernel: KASLR enabled
Aug 19 00:24:33.830866 kernel: efi: EFI v2.7 by EDK II
Aug 19 00:24:33.830871 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdb832018 ACPI 2.0=0xdbfd0018 RNG=0xdbfd0a18 MEMRESERVE=0xdb838218
Aug 19 00:24:33.830878 kernel: random: crng init done
Aug 19 00:24:33.830885 kernel: secureboot: Secure boot disabled
Aug 19 00:24:33.830890 kernel: ACPI: Early table checksum verification disabled
Aug 19 00:24:33.830897 kernel: ACPI: RSDP 0x00000000DBFD0018 000024 (v02 BOCHS )
Aug 19 00:24:33.830904 kernel: ACPI: XSDT 0x00000000DBFD0F18 000064 (v01 BOCHS BXPC 00000001 01000013)
Aug 19 00:24:33.830910 kernel: ACPI: FACP 0x00000000DBFD0B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:24:33.830916 kernel: ACPI: DSDT 0x00000000DBF0E018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:24:33.830922 kernel: ACPI: APIC 0x00000000DBFD0C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:24:33.830928 kernel: ACPI: PPTT 0x00000000DBFD0098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:24:33.830935 kernel: ACPI: GTDT 0x00000000DBFD0818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:24:33.830943 kernel: ACPI: MCFG 0x00000000DBFD0A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:24:33.830949 kernel: ACPI: SPCR 0x00000000DBFD0918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:24:33.830955 kernel: ACPI: DBG2 0x00000000DBFD0998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:24:33.830961 kernel: ACPI: IORT 0x00000000DBFD0198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:24:33.830967 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600
Aug 19 00:24:33.830973 kernel: ACPI: Use ACPI SPCR as default console: Yes
Aug 19 00:24:33.830979 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff]
Aug 19 00:24:33.830986 kernel: NODE_DATA(0) allocated [mem 0xdc965a00-0xdc96cfff]
Aug 19 00:24:33.830992 kernel: Zone ranges:
Aug 19 00:24:33.830998 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff]
Aug 19 00:24:33.831005 kernel: DMA32 empty
Aug 19 00:24:33.831011 kernel: Normal empty
Aug 19 00:24:33.831017 kernel: Device empty
Aug 19 00:24:33.831023 kernel: Movable zone start for each node
Aug 19 00:24:33.831029 kernel: Early memory node ranges
Aug 19 00:24:33.831035 kernel: node 0: [mem 0x0000000040000000-0x00000000db81ffff]
Aug 19 00:24:33.831042 kernel: node 0: [mem 0x00000000db820000-0x00000000db82ffff]
Aug 19 00:24:33.831048 kernel: node 0: [mem 0x00000000db830000-0x00000000dc09ffff]
Aug 19 00:24:33.831054 kernel: node 0: [mem 0x00000000dc0a0000-0x00000000dc2dffff]
Aug 19 00:24:33.831061 kernel: node 0: [mem 0x00000000dc2e0000-0x00000000dc36ffff]
Aug 19 00:24:33.831067 kernel: node 0: [mem 0x00000000dc370000-0x00000000dc45ffff]
Aug 19 00:24:33.831073 kernel: node 0: [mem 0x00000000dc460000-0x00000000dc52ffff]
Aug 19 00:24:33.831081 kernel: node 0: [mem 0x00000000dc530000-0x00000000dc5cffff]
Aug 19 00:24:33.831094 kernel: node 0: [mem 0x00000000dc5d0000-0x00000000dce1ffff]
Aug 19 00:24:33.831100 kernel: node 0: [mem 0x00000000dce20000-0x00000000dceaffff]
Aug 19 00:24:33.831109 kernel: node 0: [mem 0x00000000dceb0000-0x00000000dcebffff]
Aug 19 00:24:33.831116 kernel: node 0: [mem 0x00000000dcec0000-0x00000000dcfdffff]
Aug 19 00:24:33.831122 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff]
Aug 19 00:24:33.831130 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff]
Aug 19 00:24:33.831137 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges
Aug 19 00:24:33.831143 kernel: cma: Reserved 16 MiB at 0x00000000d8000000 on node -1
Aug 19 00:24:33.831150 kernel: psci: probing for conduit method from ACPI.
Aug 19 00:24:33.831156 kernel: psci: PSCIv1.1 detected in firmware.
Aug 19 00:24:33.831163 kernel: psci: Using standard PSCI v0.2 function IDs
Aug 19 00:24:33.831169 kernel: psci: Trusted OS migration not required
Aug 19 00:24:33.831180 kernel: psci: SMC Calling Convention v1.1
Aug 19 00:24:33.831186 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Aug 19 00:24:33.831195 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Aug 19 00:24:33.831206 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Aug 19 00:24:33.831213 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Aug 19 00:24:33.831222 kernel: Detected PIPT I-cache on CPU0
Aug 19 00:24:33.831229 kernel: CPU features: detected: GIC system register CPU interface
Aug 19 00:24:33.831235 kernel: CPU features: detected: Spectre-v4
Aug 19 00:24:33.831242 kernel: CPU features: detected: Spectre-BHB
Aug 19 00:24:33.831248 kernel: CPU features: kernel page table isolation forced ON by KASLR
Aug 19 00:24:33.831255 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Aug 19 00:24:33.831261 kernel: CPU features: detected: ARM erratum 1418040
Aug 19 00:24:33.831268 kernel: CPU features: detected: SSBS not fully self-synchronizing
Aug 19 00:24:33.831274 kernel: alternatives: applying boot alternatives
Aug 19 00:24:33.831282 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=a868ccde263e96e0a18737fdbf04ca04bbf30dfe23963f1ae3994966e8fc9468
Aug 19 00:24:33.831290 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Aug 19 00:24:33.831297 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Aug 19 00:24:33.831303 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Aug 19 00:24:33.831310 kernel: Fallback order for Node 0: 0
Aug 19 00:24:33.831317 kernel: Built 1 zonelists, mobility grouping on. Total pages: 643072
Aug 19 00:24:33.831323 kernel: Policy zone: DMA
Aug 19 00:24:33.831330 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Aug 19 00:24:33.831336 kernel: software IO TLB: SWIOTLB bounce buffer size adjusted to 2MB
Aug 19 00:24:33.831343 kernel: software IO TLB: area num 4.
Aug 19 00:24:33.831349 kernel: software IO TLB: SWIOTLB bounce buffer size roundup to 4MB
Aug 19 00:24:33.831356 kernel: software IO TLB: mapped [mem 0x00000000d7c00000-0x00000000d8000000] (4MB)
Aug 19 00:24:33.831364 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Aug 19 00:24:33.831371 kernel: rcu: Preemptible hierarchical RCU implementation.
Aug 19 00:24:33.831378 kernel: rcu: RCU event tracing is enabled.
Aug 19 00:24:33.831385 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Aug 19 00:24:33.831392 kernel: Trampoline variant of Tasks RCU enabled.
Aug 19 00:24:33.831398 kernel: Tracing variant of Tasks RCU enabled.
Aug 19 00:24:33.831405 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Aug 19 00:24:33.831412 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Aug 19 00:24:33.831418 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Aug 19 00:24:33.831425 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Aug 19 00:24:33.831432 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Aug 19 00:24:33.831440 kernel: GICv3: 256 SPIs implemented
Aug 19 00:24:33.831447 kernel: GICv3: 0 Extended SPIs implemented
Aug 19 00:24:33.831453 kernel: Root IRQ handler: gic_handle_irq
Aug 19 00:24:33.831460 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Aug 19 00:24:33.831467 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Aug 19 00:24:33.831473 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Aug 19 00:24:33.831480 kernel: ITS [mem 0x08080000-0x0809ffff]
Aug 19 00:24:33.831486 kernel: ITS@0x0000000008080000: allocated 8192 Devices @40110000 (indirect, esz 8, psz 64K, shr 1)
Aug 19 00:24:33.831493 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @40120000 (flat, esz 8, psz 64K, shr 1)
Aug 19 00:24:33.831500 kernel: GICv3: using LPI property table @0x0000000040130000
Aug 19 00:24:33.831507 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040140000
Aug 19 00:24:33.831513 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Aug 19 00:24:33.831522 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Aug 19 00:24:33.831528 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Aug 19 00:24:33.831535 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Aug 19 00:24:33.831542 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Aug 19 00:24:33.831549 kernel: arm-pv: using stolen time PV
Aug 19 00:24:33.831556 kernel: Console: colour dummy device 80x25
Aug 19 00:24:33.831563 kernel: ACPI: Core revision 20240827
Aug 19 00:24:33.831570 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Aug 19 00:24:33.831577 kernel: pid_max: default: 32768 minimum: 301
Aug 19 00:24:33.831584 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Aug 19 00:24:33.831592 kernel: landlock: Up and running.
Aug 19 00:24:33.831599 kernel: SELinux: Initializing.
Aug 19 00:24:33.831605 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Aug 19 00:24:33.831613 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Aug 19 00:24:33.831619 kernel: rcu: Hierarchical SRCU implementation.
Aug 19 00:24:33.831627 kernel: rcu: Max phase no-delay instances is 400.
Aug 19 00:24:33.831634 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Aug 19 00:24:33.831640 kernel: Remapping and enabling EFI services.
Aug 19 00:24:33.831647 kernel: smp: Bringing up secondary CPUs ...
Aug 19 00:24:33.831660 kernel: Detected PIPT I-cache on CPU1
Aug 19 00:24:33.831667 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Aug 19 00:24:33.831675 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040150000
Aug 19 00:24:33.831683 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Aug 19 00:24:33.831690 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Aug 19 00:24:33.831697 kernel: Detected PIPT I-cache on CPU2
Aug 19 00:24:33.831705 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Aug 19 00:24:33.831712 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040160000
Aug 19 00:24:33.831720 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Aug 19 00:24:33.831728 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Aug 19 00:24:33.831748 kernel: Detected PIPT I-cache on CPU3
Aug 19 00:24:33.831767 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Aug 19 00:24:33.831775 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040170000
Aug 19 00:24:33.831782 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Aug 19 00:24:33.831789 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Aug 19 00:24:33.831796 kernel: smp: Brought up 1 node, 4 CPUs
Aug 19 00:24:33.831804 kernel: SMP: Total of 4 processors activated.
Aug 19 00:24:33.831813 kernel: CPU: All CPU(s) started at EL1
Aug 19 00:24:33.831820 kernel: CPU features: detected: 32-bit EL0 Support
Aug 19 00:24:33.831828 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Aug 19 00:24:33.831835 kernel: CPU features: detected: Common not Private translations
Aug 19 00:24:33.831842 kernel: CPU features: detected: CRC32 instructions
Aug 19 00:24:33.831849 kernel: CPU features: detected: Enhanced Virtualization Traps
Aug 19 00:24:33.831856 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Aug 19 00:24:33.831864 kernel: CPU features: detected: LSE atomic instructions
Aug 19 00:24:33.831871 kernel: CPU features: detected: Privileged Access Never
Aug 19 00:24:33.831878 kernel: CPU features: detected: RAS Extension Support
Aug 19 00:24:33.831887 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Aug 19 00:24:33.831894 kernel: alternatives: applying system-wide alternatives
Aug 19 00:24:33.831901 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Aug 19 00:24:33.831909 kernel: Memory: 2424544K/2572288K available (11136K kernel code, 2436K rwdata, 9060K rodata, 38912K init, 1038K bss, 125408K reserved, 16384K cma-reserved)
Aug 19 00:24:33.831916 kernel: devtmpfs: initialized
Aug 19 00:24:33.831923 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Aug 19 00:24:33.831931 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Aug 19 00:24:33.831939 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Aug 19 00:24:33.831947 kernel: 0 pages in range for non-PLT usage
Aug 19 00:24:33.831955 kernel: 508576 pages in range for PLT usage
Aug 19 00:24:33.831962 kernel: pinctrl core: initialized pinctrl subsystem
Aug 19 00:24:33.831969 kernel: SMBIOS 3.0.0 present.
Aug 19 00:24:33.831977 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022
Aug 19 00:24:33.831984 kernel: DMI: Memory slots populated: 1/1
Aug 19 00:24:33.831991 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Aug 19 00:24:33.831999 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Aug 19 00:24:33.832006 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Aug 19 00:24:33.832015 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Aug 19 00:24:33.832022 kernel: audit: initializing netlink subsys (disabled)
Aug 19 00:24:33.832029 kernel: audit: type=2000 audit(0.019:1): state=initialized audit_enabled=0 res=1
Aug 19 00:24:33.832037 kernel: thermal_sys: Registered thermal governor 'step_wise'
Aug 19 00:24:33.832044 kernel: cpuidle: using governor menu
Aug 19 00:24:33.832051 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Aug 19 00:24:33.832058 kernel: ASID allocator initialised with 32768 entries
Aug 19 00:24:33.832065 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Aug 19 00:24:33.832073 kernel: Serial: AMBA PL011 UART driver
Aug 19 00:24:33.832081 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Aug 19 00:24:33.832093 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Aug 19 00:24:33.832101 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Aug 19 00:24:33.832108 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Aug 19 00:24:33.832115 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Aug 19 00:24:33.832123 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Aug 19 00:24:33.832130 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Aug 19 00:24:33.832137 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Aug 19 00:24:33.832144 kernel: ACPI: Added _OSI(Module Device)
Aug 19 00:24:33.832153 kernel: ACPI: Added _OSI(Processor Device)
Aug 19 00:24:33.832160 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Aug 19 00:24:33.832168 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Aug 19 00:24:33.832175 kernel: ACPI: Interpreter enabled
Aug 19 00:24:33.832183 kernel: ACPI: Using GIC for interrupt routing
Aug 19 00:24:33.832190 kernel: ACPI: MCFG table detected, 1 entries
Aug 19 00:24:33.832197 kernel: ACPI: CPU0 has been hot-added
Aug 19 00:24:33.832204 kernel: ACPI: CPU1 has been hot-added
Aug 19 00:24:33.832211 kernel: ACPI: CPU2 has been hot-added
Aug 19 00:24:33.832218 kernel: ACPI: CPU3 has been hot-added
Aug 19 00:24:33.832227 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Aug 19 00:24:33.832234 kernel: printk: legacy console [ttyAMA0] enabled
Aug 19 00:24:33.832242 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Aug 19 00:24:33.832378 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Aug 19 00:24:33.832446 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Aug 19 00:24:33.832508 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Aug 19 00:24:33.832568 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Aug 19 00:24:33.832631 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Aug 19 00:24:33.832641 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Aug 19 00:24:33.832648 kernel: PCI host bridge to bus 0000:00
Aug 19 00:24:33.832716 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Aug 19 00:24:33.832790 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Aug 19 00:24:33.832848 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Aug 19 00:24:33.832902 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Aug 19 00:24:33.832984 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Aug 19 00:24:33.833061 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Aug 19 00:24:33.833137 kernel: pci 0000:00:01.0: BAR 0 [io 0x0000-0x001f]
Aug 19 00:24:33.833201 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]
Aug 19 00:24:33.833263 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Aug 19 00:24:33.833325 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Aug 19 00:24:33.833387 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]: assigned
Aug 19 00:24:33.833453 kernel: pci 0000:00:01.0: BAR 0 [io 0x1000-0x101f]: assigned
Aug 19 00:24:33.833509 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Aug 19 00:24:33.833564 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Aug 19 00:24:33.833618 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Aug 19 00:24:33.833628 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Aug 19 00:24:33.833635 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Aug 19 00:24:33.833643 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Aug 19 00:24:33.833652 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Aug 19 00:24:33.833659 kernel: iommu: Default domain type: Translated
Aug 19 00:24:33.833666 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Aug 19 00:24:33.833673 kernel: efivars: Registered efivars operations
Aug 19 00:24:33.833681 kernel: vgaarb: loaded
Aug 19 00:24:33.833688 kernel: clocksource: Switched to clocksource arch_sys_counter
Aug 19 00:24:33.833696 kernel: VFS: Disk quotas dquot_6.6.0
Aug 19 00:24:33.833703 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Aug 19 00:24:33.833710 kernel: pnp: PnP ACPI init
Aug 19 00:24:33.833793 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Aug 19 00:24:33.833805 kernel: pnp: PnP ACPI: found 1 devices
Aug 19 00:24:33.833812 kernel: NET: Registered PF_INET protocol family
Aug 19 00:24:33.833820 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Aug 19 00:24:33.833827 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Aug 19 00:24:33.833834 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Aug 19 00:24:33.833842 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Aug 19 00:24:33.833849 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Aug 19 00:24:33.833858 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Aug 19 00:24:33.833866 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Aug 19 00:24:33.833873 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Aug 19 00:24:33.833881 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Aug 19 00:24:33.833888 kernel: PCI: CLS 0 bytes, default 64
Aug 19 00:24:33.833895 kernel: kvm [1]: HYP mode not available
Aug 19 00:24:33.833903 kernel: Initialise system trusted keyrings
Aug 19 00:24:33.833910 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Aug 19 00:24:33.833917 kernel: Key type asymmetric registered
Aug 19 00:24:33.833925 kernel: Asymmetric key parser 'x509' registered
Aug 19 00:24:33.833933 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Aug 19 00:24:33.833940 kernel: io scheduler mq-deadline registered
Aug 19 00:24:33.833948 kernel: io scheduler kyber registered
Aug 19 00:24:33.833955 kernel: io scheduler bfq registered
Aug 19 00:24:33.833962 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Aug 19 00:24:33.833970 kernel: ACPI: button: Power Button [PWRB]
Aug 19 00:24:33.833977 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Aug 19 00:24:33.834041 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007)
Aug 19 00:24:33.834053 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Aug 19 00:24:33.834060 kernel: thunder_xcv, ver 1.0
Aug 19 00:24:33.834067 kernel: thunder_bgx, ver 1.0
Aug 19 00:24:33.834074 kernel: nicpf, ver 1.0
Aug 19 00:24:33.834082 kernel: nicvf, ver 1.0
Aug 19 00:24:33.834168 kernel: rtc-efi rtc-efi.0: registered as rtc0
Aug 19 00:24:33.834230 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-08-19T00:24:33 UTC (1755563073)
Aug 19 00:24:33.834240 kernel: hid: raw HID events driver (C) Jiri Kosina
Aug 19 00:24:33.834248 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Aug 19 00:24:33.834258 kernel: watchdog: NMI not fully supported
Aug 19 00:24:33.834265 kernel: watchdog: Hard watchdog permanently disabled
Aug 19 00:24:33.834272 kernel: NET: Registered PF_INET6 protocol family
Aug 19 00:24:33.834279 kernel: Segment Routing with IPv6
Aug 19 00:24:33.834287 kernel: In-situ OAM (IOAM) with IPv6
Aug 19 00:24:33.834294 kernel: NET: Registered PF_PACKET protocol family
Aug 19 00:24:33.834301 kernel: Key type dns_resolver registered
Aug 19 00:24:33.834309 kernel: registered taskstats version 1
Aug 19 00:24:33.834316 kernel: Loading compiled-in X.509 certificates
Aug 19 00:24:33.834324 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.41-flatcar: becc5a61d1c5dcbcd174f4649c64b863031dbaa8'
Aug 19 00:24:33.834332 kernel: Demotion targets for Node 0: null
Aug 19 00:24:33.834339 kernel: Key type .fscrypt registered
Aug 19 00:24:33.834346 kernel: Key type fscrypt-provisioning registered
Aug 19 00:24:33.834353 kernel: ima: No TPM chip found, activating TPM-bypass!
Aug 19 00:24:33.834360 kernel: ima: Allocated hash algorithm: sha1
Aug 19 00:24:33.834368 kernel: ima: No architecture policies found
Aug 19 00:24:33.834375 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Aug 19 00:24:33.834383 kernel: clk: Disabling unused clocks
Aug 19 00:24:33.834391 kernel: PM: genpd: Disabling unused power domains
Aug 19 00:24:33.834398 kernel: Warning: unable to open an initial console.
Aug 19 00:24:33.834405 kernel: Freeing unused kernel memory: 38912K
Aug 19 00:24:33.834412 kernel: Run /init as init process
Aug 19 00:24:33.834419 kernel: with arguments:
Aug 19 00:24:33.834427 kernel: /init
Aug 19 00:24:33.834434 kernel: with environment:
Aug 19 00:24:33.834440 kernel: HOME=/
Aug 19 00:24:33.834448 kernel: TERM=linux
Aug 19 00:24:33.834456 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Aug 19 00:24:33.834464 systemd[1]: Successfully made /usr/ read-only.
Aug 19 00:24:33.834475 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Aug 19 00:24:33.834483 systemd[1]: Detected virtualization kvm.
Aug 19 00:24:33.834491 systemd[1]: Detected architecture arm64.
Aug 19 00:24:33.834498 systemd[1]: Running in initrd.
Aug 19 00:24:33.834506 systemd[1]: No hostname configured, using default hostname.
Aug 19 00:24:33.834515 systemd[1]: Hostname set to .
Aug 19 00:24:33.834523 systemd[1]: Initializing machine ID from VM UUID.
Aug 19 00:24:33.834531 systemd[1]: Queued start job for default target initrd.target.
Aug 19 00:24:33.834538 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 19 00:24:33.834546 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 19 00:24:33.834555 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Aug 19 00:24:33.834563 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 19 00:24:33.834571 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Aug 19 00:24:33.834580 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Aug 19 00:24:33.834589 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Aug 19 00:24:33.834597 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Aug 19 00:24:33.834605 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 19 00:24:33.834613 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Aug 19 00:24:33.834621 systemd[1]: Reached target paths.target - Path Units.
Aug 19 00:24:33.834628 systemd[1]: Reached target slices.target - Slice Units.
Aug 19 00:24:33.834637 systemd[1]: Reached target swap.target - Swaps.
Aug 19 00:24:33.834645 systemd[1]: Reached target timers.target - Timer Units.
Aug 19 00:24:33.834653 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Aug 19 00:24:33.834661 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 19 00:24:33.834669 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Aug 19 00:24:33.834677 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Aug 19 00:24:33.834685 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 19 00:24:33.834693 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 19 00:24:33.834702 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 19 00:24:33.834710 systemd[1]: Reached target sockets.target - Socket Units.
Aug 19 00:24:33.834717 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Aug 19 00:24:33.834725 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 19 00:24:33.834742 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Aug 19 00:24:33.834752 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Aug 19 00:24:33.834760 systemd[1]: Starting systemd-fsck-usr.service...
Aug 19 00:24:33.834768 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 19 00:24:33.834776 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 19 00:24:33.834786 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 00:24:33.834794 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Aug 19 00:24:33.834802 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 19 00:24:33.834810 systemd[1]: Finished systemd-fsck-usr.service.
Aug 19 00:24:33.834819 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Aug 19 00:24:33.834844 systemd-journald[244]: Collecting audit messages is disabled.
Aug 19 00:24:33.834864 systemd-journald[244]: Journal started
Aug 19 00:24:33.834884 systemd-journald[244]: Runtime Journal (/run/log/journal/a9e498ed3d404dfdb5114743f5c82f5c) is 6M, max 48.5M, 42.4M free.
Aug 19 00:24:33.830139 systemd-modules-load[245]: Inserted module 'overlay'
Aug 19 00:24:33.842266 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 00:24:33.845879 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 19 00:24:33.848678 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Aug 19 00:24:33.850628 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Aug 19 00:24:33.860014 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Aug 19 00:24:33.860050 kernel: Bridge firewalling registered
Aug 19 00:24:33.858874 systemd-modules-load[245]: Inserted module 'br_netfilter'
Aug 19 00:24:33.861905 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 19 00:24:33.863744 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Aug 19 00:24:33.867458 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Aug 19 00:24:33.869004 systemd-tmpfiles[263]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Aug 19 00:24:33.869151 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Aug 19 00:24:33.876102 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 19 00:24:33.882362 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Aug 19 00:24:33.885051 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Aug 19 00:24:33.886448 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 19 00:24:33.888809 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 19 00:24:33.897513 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Aug 19 00:24:33.921490 dracut-cmdline[291]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=a868ccde263e96e0a18737fdbf04ca04bbf30dfe23963f1ae3994966e8fc9468
Aug 19 00:24:33.938701 systemd-resolved[287]: Positive Trust Anchors:
Aug 19 00:24:33.938722 systemd-resolved[287]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Aug 19 00:24:33.938763 systemd-resolved[287]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Aug 19 00:24:33.948060 systemd-resolved[287]: Defaulting to hostname 'linux'.
Aug 19 00:24:33.950226 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Aug 19 00:24:33.951628 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Aug 19 00:24:34.045772 kernel: SCSI subsystem initialized
Aug 19 00:24:34.050754 kernel: Loading iSCSI transport class v2.0-870.
Aug 19 00:24:34.060767 kernel: iscsi: registered transport (tcp)
Aug 19 00:24:34.073762 kernel: iscsi: registered transport (qla4xxx)
Aug 19 00:24:34.073785 kernel: QLogic iSCSI HBA Driver
Aug 19 00:24:34.091845 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Aug 19 00:24:34.117338 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Aug 19 00:24:34.119115 systemd[1]: Reached target network-pre.target - Preparation for Network.
Aug 19 00:24:34.193129 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Aug 19 00:24:34.195794 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Aug 19 00:24:34.274790 kernel: raid6: neonx8 gen() 15776 MB/s
Aug 19 00:24:34.291766 kernel: raid6: neonx4 gen() 15791 MB/s
Aug 19 00:24:34.308769 kernel: raid6: neonx2 gen() 13187 MB/s
Aug 19 00:24:34.325773 kernel: raid6: neonx1 gen() 10428 MB/s
Aug 19 00:24:34.342782 kernel: raid6: int64x8 gen() 6893 MB/s
Aug 19 00:24:34.359769 kernel: raid6: int64x4 gen() 7347 MB/s
Aug 19 00:24:34.376777 kernel: raid6: int64x2 gen() 6102 MB/s
Aug 19 00:24:34.394001 kernel: raid6: int64x1 gen() 5050 MB/s
Aug 19 00:24:34.394029 kernel: raid6: using algorithm neonx4 gen() 15791 MB/s
Aug 19 00:24:34.411920 kernel: raid6: .... xor() 12342 MB/s, rmw enabled
Aug 19 00:24:34.411944 kernel: raid6: using neon recovery algorithm
Aug 19 00:24:34.417888 kernel: xor: measuring software checksum speed
Aug 19 00:24:34.417917 kernel: 8regs : 21607 MB/sec
Aug 19 00:24:34.419279 kernel: 32regs : 21567 MB/sec
Aug 19 00:24:34.419294 kernel: arm64_neon : 27813 MB/sec
Aug 19 00:24:34.419304 kernel: xor: using function: arm64_neon (27813 MB/sec)
Aug 19 00:24:34.479767 kernel: Btrfs loaded, zoned=no, fsverity=no
Aug 19 00:24:34.486709 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Aug 19 00:24:34.490702 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 19 00:24:34.526337 systemd-udevd[498]: Using default interface naming scheme 'v255'.
Aug 19 00:24:34.530577 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 19 00:24:34.532716 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Aug 19 00:24:34.561283 dracut-pre-trigger[505]: rd.md=0: removing MD RAID activation
Aug 19 00:24:34.597024 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Aug 19 00:24:34.599784 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Aug 19 00:24:34.663549 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 19 00:24:34.666986 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Aug 19 00:24:34.719849 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues
Aug 19 00:24:34.720059 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
Aug 19 00:24:34.725877 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 19 00:24:34.726057 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 00:24:34.730157 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 00:24:34.735148 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Aug 19 00:24:34.735173 kernel: GPT:9289727 != 19775487
Aug 19 00:24:34.735184 kernel: GPT:Alternate GPT header not at the end of the disk.
Aug 19 00:24:34.735193 kernel: GPT:9289727 != 19775487
Aug 19 00:24:34.735202 kernel: GPT: Use GNU Parted to correct GPT errors.
Aug 19 00:24:34.735211 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Aug 19 00:24:34.732242 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 00:24:34.764908 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Aug 19 00:24:34.771020 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 00:24:34.784198 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Aug 19 00:24:34.785865 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Aug 19 00:24:34.793244 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Aug 19 00:24:34.794613 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Aug 19 00:24:34.804665 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Aug 19 00:24:34.806301 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Aug 19 00:24:34.808602 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 19 00:24:34.810985 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Aug 19 00:24:34.813888 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Aug 19 00:24:34.815870 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Aug 19 00:24:34.840008 disk-uuid[590]: Primary Header is updated.
Aug 19 00:24:34.840008 disk-uuid[590]: Secondary Entries is updated.
Aug 19 00:24:34.840008 disk-uuid[590]: Secondary Header is updated.
Aug 19 00:24:34.845458 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Aug 19 00:24:34.847708 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Aug 19 00:24:35.867757 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Aug 19 00:24:35.868516 disk-uuid[595]: The operation has completed successfully.
Aug 19 00:24:35.897783 systemd[1]: disk-uuid.service: Deactivated successfully.
Aug 19 00:24:35.897902 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Aug 19 00:24:35.918849 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Aug 19 00:24:35.945963 sh[608]: Success
Aug 19 00:24:35.960767 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Aug 19 00:24:35.960820 kernel: device-mapper: uevent: version 1.0.3
Aug 19 00:24:35.960833 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Aug 19 00:24:35.972806 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Aug 19 00:24:36.026486 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Aug 19 00:24:36.029519 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Aug 19 00:24:36.044638 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Aug 19 00:24:36.051781 kernel: BTRFS: device fsid 1e492084-d287-4a43-8dc6-ad086a072625 devid 1 transid 45 /dev/mapper/usr (253:0) scanned by mount (620)
Aug 19 00:24:36.051822 kernel: BTRFS info (device dm-0): first mount of filesystem 1e492084-d287-4a43-8dc6-ad086a072625
Aug 19 00:24:36.051832 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Aug 19 00:24:36.053398 kernel: BTRFS info (device dm-0): using free-space-tree
Aug 19 00:24:36.057408 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Aug 19 00:24:36.058825 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Aug 19 00:24:36.060327 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Aug 19 00:24:36.061263 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Aug 19 00:24:36.063024 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Aug 19 00:24:36.087852 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/vda6 (254:6) scanned by mount (652)
Aug 19 00:24:36.087909 kernel: BTRFS info (device vda6): first mount of filesystem de95eca0-5455-4710-9904-3d3a2312ef33
Aug 19 00:24:36.090125 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Aug 19 00:24:36.090176 kernel: BTRFS info (device vda6): using free-space-tree
Aug 19 00:24:36.097757 kernel: BTRFS info (device vda6): last unmount of filesystem de95eca0-5455-4710-9904-3d3a2312ef33
Aug 19 00:24:36.098843 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Aug 19 00:24:36.101882 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Aug 19 00:24:36.178839 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Aug 19 00:24:36.183325 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Aug 19 00:24:36.234305 systemd-networkd[796]: lo: Link UP
Aug 19 00:24:36.234317 systemd-networkd[796]: lo: Gained carrier
Aug 19 00:24:36.235226 systemd-networkd[796]: Enumeration completed
Aug 19 00:24:36.235318 systemd[1]: Started systemd-networkd.service - Network Configuration.
Aug 19 00:24:36.235781 systemd-networkd[796]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 19 00:24:36.235785 systemd-networkd[796]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Aug 19 00:24:36.236355 systemd-networkd[796]: eth0: Link UP
Aug 19 00:24:36.236657 systemd-networkd[796]: eth0: Gained carrier
Aug 19 00:24:36.236667 systemd-networkd[796]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 19 00:24:36.238012 systemd[1]: Reached target network.target - Network.
Aug 19 00:24:36.261794 systemd-networkd[796]: eth0: DHCPv4 address 10.0.0.116/16, gateway 10.0.0.1 acquired from 10.0.0.1
Aug 19 00:24:36.312028 ignition[701]: Ignition 2.21.0
Aug 19 00:24:36.312042 ignition[701]: Stage: fetch-offline
Aug 19 00:24:36.312073 ignition[701]: no configs at "/usr/lib/ignition/base.d"
Aug 19 00:24:36.312089 ignition[701]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Aug 19 00:24:36.312347 ignition[701]: parsed url from cmdline: ""
Aug 19 00:24:36.312351 ignition[701]: no config URL provided
Aug 19 00:24:36.312355 ignition[701]: reading system config file "/usr/lib/ignition/user.ign"
Aug 19 00:24:36.312362 ignition[701]: no config at "/usr/lib/ignition/user.ign"
Aug 19 00:24:36.312382 ignition[701]: op(1): [started] loading QEMU firmware config module
Aug 19 00:24:36.312387 ignition[701]: op(1): executing: "modprobe" "qemu_fw_cfg"
Aug 19 00:24:36.329408 ignition[701]: op(1): [finished] loading QEMU firmware config module
Aug 19 00:24:36.369272 ignition[701]: parsing config with SHA512: 91e4532626f44db51acd728b1ee9d909dae45bcb355cdab7aa93597c8843a9b7993308af6d03ff7cb2d872b81edf32fa715985deedc80ad1d4876396e4c28ff3
Aug 19 00:24:36.373983 unknown[701]: fetched base config from "system"
Aug 19 00:24:36.374008 unknown[701]: fetched user config from "qemu"
Aug 19 00:24:36.374489 ignition[701]: fetch-offline: fetch-offline passed
Aug 19 00:24:36.374555 ignition[701]: Ignition finished successfully
Aug 19 00:24:36.376590 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 19 00:24:36.378002 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Aug 19 00:24:36.379860 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Aug 19 00:24:36.406189 ignition[809]: Ignition 2.21.0
Aug 19 00:24:36.406207 ignition[809]: Stage: kargs
Aug 19 00:24:36.406899 ignition[809]: no configs at "/usr/lib/ignition/base.d"
Aug 19 00:24:36.406911 ignition[809]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Aug 19 00:24:36.408446 ignition[809]: kargs: kargs passed
Aug 19 00:24:36.410810 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Aug 19 00:24:36.408506 ignition[809]: Ignition finished successfully
Aug 19 00:24:36.412830 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Aug 19 00:24:36.445520 ignition[817]: Ignition 2.21.0
Aug 19 00:24:36.445540 ignition[817]: Stage: disks
Aug 19 00:24:36.445697 ignition[817]: no configs at "/usr/lib/ignition/base.d"
Aug 19 00:24:36.445706 ignition[817]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Aug 19 00:24:36.447853 ignition[817]: disks: disks passed
Aug 19 00:24:36.447959 ignition[817]: Ignition finished successfully
Aug 19 00:24:36.449593 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Aug 19 00:24:36.451413 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Aug 19 00:24:36.453033 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Aug 19 00:24:36.455173 systemd[1]: Reached target local-fs.target - Local File Systems.
Aug 19 00:24:36.457190 systemd[1]: Reached target sysinit.target - System Initialization.
Aug 19 00:24:36.458988 systemd[1]: Reached target basic.target - Basic System.
Aug 19 00:24:36.461618 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Aug 19 00:24:36.497042 systemd-fsck[826]: ROOT: clean, 15/553520 files, 52789/553472 blocks
Aug 19 00:24:36.502290 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Aug 19 00:24:36.504693 systemd[1]: Mounting sysroot.mount - /sysroot...
Aug 19 00:24:36.586761 kernel: EXT4-fs (vda9): mounted filesystem 593a9299-85f8-44ab-a00f-cf95b7233713 r/w with ordered data mode. Quota mode: none.
Aug 19 00:24:36.587571 systemd[1]: Mounted sysroot.mount - /sysroot.
Aug 19 00:24:36.588899 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Aug 19 00:24:36.592106 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 19 00:24:36.594443 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Aug 19 00:24:36.595475 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Aug 19 00:24:36.595528 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Aug 19 00:24:36.595566 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 19 00:24:36.612405 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Aug 19 00:24:36.614478 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Aug 19 00:24:36.623190 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/vda6 (254:6) scanned by mount (834)
Aug 19 00:24:36.623230 kernel: BTRFS info (device vda6): first mount of filesystem de95eca0-5455-4710-9904-3d3a2312ef33
Aug 19 00:24:36.623241 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Aug 19 00:24:36.624785 kernel: BTRFS info (device vda6): using free-space-tree
Aug 19 00:24:36.627685 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 19 00:24:36.660966 initrd-setup-root[858]: cut: /sysroot/etc/passwd: No such file or directory
Aug 19 00:24:36.664331 initrd-setup-root[865]: cut: /sysroot/etc/group: No such file or directory
Aug 19 00:24:36.667693 initrd-setup-root[872]: cut: /sysroot/etc/shadow: No such file or directory
Aug 19 00:24:36.671867 initrd-setup-root[879]: cut: /sysroot/etc/gshadow: No such file or directory
Aug 19 00:24:36.745320 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Aug 19 00:24:36.747476 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Aug 19 00:24:36.749239 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Aug 19 00:24:36.767754 kernel: BTRFS info (device vda6): last unmount of filesystem de95eca0-5455-4710-9904-3d3a2312ef33
Aug 19 00:24:36.785327 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Aug 19 00:24:36.795512 ignition[947]: INFO : Ignition 2.21.0
Aug 19 00:24:36.795512 ignition[947]: INFO : Stage: mount
Aug 19 00:24:36.797707 ignition[947]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 19 00:24:36.797707 ignition[947]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Aug 19 00:24:36.797707 ignition[947]: INFO : mount: mount passed
Aug 19 00:24:36.797707 ignition[947]: INFO : Ignition finished successfully
Aug 19 00:24:36.799547 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Aug 19 00:24:36.802759 systemd[1]: Starting ignition-files.service - Ignition (files)...
Aug 19 00:24:37.049962 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Aug 19 00:24:37.051516 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 19 00:24:37.069755 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/vda6 (254:6) scanned by mount (960)
Aug 19 00:24:37.069801 kernel: BTRFS info (device vda6): first mount of filesystem de95eca0-5455-4710-9904-3d3a2312ef33
Aug 19 00:24:37.071984 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Aug 19 00:24:37.072993 kernel: BTRFS info (device vda6): using free-space-tree
Aug 19 00:24:37.075553 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 19 00:24:37.101984 ignition[978]: INFO : Ignition 2.21.0
Aug 19 00:24:37.101984 ignition[978]: INFO : Stage: files
Aug 19 00:24:37.104265 ignition[978]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 19 00:24:37.104265 ignition[978]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Aug 19 00:24:37.104265 ignition[978]: DEBUG : files: compiled without relabeling support, skipping
Aug 19 00:24:37.107700 ignition[978]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Aug 19 00:24:37.107700 ignition[978]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Aug 19 00:24:37.110367 ignition[978]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Aug 19 00:24:37.110367 ignition[978]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Aug 19 00:24:37.113145 ignition[978]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Aug 19 00:24:37.113145 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Aug 19 00:24:37.113145 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Aug 19 00:24:37.110624 unknown[978]: wrote ssh authorized keys file for user: core
Aug 19 00:24:37.167167 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Aug 19 00:24:37.700102 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Aug 19 00:24:37.702252 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Aug 19 00:24:37.702252 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Aug 19 00:24:37.702252 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Aug 19 00:24:37.702252 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Aug 19 00:24:37.702252 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 19 00:24:37.702252 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 19 00:24:37.702252 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 19 00:24:37.702252 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 19 00:24:37.716246 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Aug 19 00:24:37.716246 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Aug 19 00:24:37.716246 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Aug 19 00:24:37.716246 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Aug 19 00:24:37.716246 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Aug 19 00:24:37.716246 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1
Aug 19 00:24:38.050028 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Aug 19 00:24:38.123868 systemd-networkd[796]: eth0: Gained IPv6LL
Aug 19 00:24:38.378782 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Aug 19 00:24:38.378782 ignition[978]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Aug 19 00:24:38.382857 ignition[978]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 19 00:24:38.382857 ignition[978]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 19 00:24:38.382857 ignition[978]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Aug 19 00:24:38.382857 ignition[978]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Aug 19 00:24:38.382857 ignition[978]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Aug 19 00:24:38.382857 ignition[978]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Aug 19 00:24:38.382857 ignition[978]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Aug 19 00:24:38.382857 ignition[978]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Aug 19 00:24:38.424817 ignition[978]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Aug 19 00:24:38.429211 ignition[978]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Aug 19 00:24:38.430759 ignition[978]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Aug 19 00:24:38.430759 ignition[978]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Aug 19 00:24:38.430759 ignition[978]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Aug 19 00:24:38.430759 ignition[978]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Aug 19 00:24:38.430759 ignition[978]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Aug 19 00:24:38.430759 ignition[978]: INFO : files: files passed
Aug 19 00:24:38.430759 ignition[978]: INFO : Ignition finished successfully
Aug 19 00:24:38.431844 systemd[1]: Finished ignition-files.service - Ignition (files).
Aug 19 00:24:38.436329 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Aug 19 00:24:38.438908 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Aug 19 00:24:38.454549 systemd[1]: ignition-quench.service: Deactivated successfully.
Aug 19 00:24:38.454666 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Aug 19 00:24:38.456770 initrd-setup-root-after-ignition[1006]: grep: /sysroot/oem/oem-release: No such file or directory
Aug 19 00:24:38.462174 initrd-setup-root-after-ignition[1008]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 19 00:24:38.462174 initrd-setup-root-after-ignition[1008]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Aug 19 00:24:38.465918 initrd-setup-root-after-ignition[1012]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 19 00:24:38.467724 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 19 00:24:38.469863 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Aug 19 00:24:38.472651 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Aug 19 00:24:38.516894 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Aug 19 00:24:38.517826 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Aug 19 00:24:38.519462 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Aug 19 00:24:38.522960 systemd[1]: Reached target initrd.target - Initrd Default Target.
Aug 19 00:24:38.525014 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Aug 19 00:24:38.525898 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Aug 19 00:24:38.564842 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 19 00:24:38.567646 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Aug 19 00:24:38.589774 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Aug 19 00:24:38.591170 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 19 00:24:38.593217 systemd[1]: Stopped target timers.target - Timer Units. Aug 19 00:24:38.594913 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Aug 19 00:24:38.595055 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Aug 19 00:24:38.597662 systemd[1]: Stopped target initrd.target - Initrd Default Target. Aug 19 00:24:38.599806 systemd[1]: Stopped target basic.target - Basic System. Aug 19 00:24:38.601452 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Aug 19 00:24:38.603165 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Aug 19 00:24:38.605213 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Aug 19 00:24:38.607185 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Aug 19 00:24:38.609187 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Aug 19 00:24:38.611026 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Aug 19 00:24:38.613027 systemd[1]: Stopped target sysinit.target - System Initialization. Aug 19 00:24:38.615345 systemd[1]: Stopped target local-fs.target - Local File Systems. Aug 19 00:24:38.617279 systemd[1]: Stopped target swap.target - Swaps. Aug 19 00:24:38.618877 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Aug 19 00:24:38.619015 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Aug 19 00:24:38.621385 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Aug 19 00:24:38.623443 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 19 00:24:38.625488 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Aug 19 00:24:38.625571 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 19 00:24:38.627613 systemd[1]: dracut-initqueue.service: Deactivated successfully. Aug 19 00:24:38.627751 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Aug 19 00:24:38.630688 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Aug 19 00:24:38.630841 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Aug 19 00:24:38.632875 systemd[1]: Stopped target paths.target - Path Units. Aug 19 00:24:38.634489 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Aug 19 00:24:38.637795 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 19 00:24:38.639767 systemd[1]: Stopped target slices.target - Slice Units. Aug 19 00:24:38.642265 systemd[1]: Stopped target sockets.target - Socket Units. Aug 19 00:24:38.643913 systemd[1]: iscsid.socket: Deactivated successfully. Aug 19 00:24:38.644009 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Aug 19 00:24:38.645579 systemd[1]: iscsiuio.socket: Deactivated successfully. Aug 19 00:24:38.645663 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Aug 19 00:24:38.647313 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Aug 19 00:24:38.647440 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Aug 19 00:24:38.649244 systemd[1]: ignition-files.service: Deactivated successfully. Aug 19 00:24:38.649353 systemd[1]: Stopped ignition-files.service - Ignition (files). 
Aug 19 00:24:38.651699 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Aug 19 00:24:38.653633 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Aug 19 00:24:38.653796 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Aug 19 00:24:38.656800 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Aug 19 00:24:38.658552 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Aug 19 00:24:38.658697 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Aug 19 00:24:38.660667 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Aug 19 00:24:38.660787 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Aug 19 00:24:38.668432 systemd[1]: initrd-cleanup.service: Deactivated successfully. Aug 19 00:24:38.668594 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Aug 19 00:24:38.673985 systemd[1]: sysroot-boot.mount: Deactivated successfully. Aug 19 00:24:38.678543 ignition[1032]: INFO : Ignition 2.21.0 Aug 19 00:24:38.678543 ignition[1032]: INFO : Stage: umount Aug 19 00:24:38.680218 ignition[1032]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 19 00:24:38.680218 ignition[1032]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Aug 19 00:24:38.682321 ignition[1032]: INFO : umount: umount passed Aug 19 00:24:38.682321 ignition[1032]: INFO : Ignition finished successfully Aug 19 00:24:38.684431 systemd[1]: ignition-mount.service: Deactivated successfully. Aug 19 00:24:38.684574 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Aug 19 00:24:38.686875 systemd[1]: Stopped target network.target - Network. Aug 19 00:24:38.687862 systemd[1]: ignition-disks.service: Deactivated successfully. Aug 19 00:24:38.687933 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Aug 19 00:24:38.689818 systemd[1]: ignition-kargs.service: Deactivated successfully. Aug 19 00:24:38.689866 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Aug 19 00:24:38.691543 systemd[1]: ignition-setup.service: Deactivated successfully. Aug 19 00:24:38.691591 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Aug 19 00:24:38.693275 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Aug 19 00:24:38.693316 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Aug 19 00:24:38.695252 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Aug 19 00:24:38.697084 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Aug 19 00:24:38.699001 systemd[1]: sysroot-boot.service: Deactivated successfully. Aug 19 00:24:38.699108 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Aug 19 00:24:38.701384 systemd[1]: initrd-setup-root.service: Deactivated successfully. Aug 19 00:24:38.701475 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Aug 19 00:24:38.706539 systemd[1]: systemd-resolved.service: Deactivated successfully. Aug 19 00:24:38.706668 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Aug 19 00:24:38.710186 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Aug 19 00:24:38.710386 systemd[1]: systemd-networkd.service: Deactivated successfully. Aug 19 00:24:38.710502 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Aug 19 00:24:38.714121 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. 
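Every entry in this teardown stretch follows the same shape: a timestamp, the logging process with its PID in brackets, and the message. Purely as an illustration of how to pick those fields apart when grepping a capture like this one (the regex and field names are mine, not part of any tool referenced in the log):

import re

# Matches lines of the form:
#   "Aug 19 00:24:38.684431 systemd[1]: ignition-mount.service: Deactivated successfully."
# The PID part is optional, so bare sources such as "kernel:" also match.
LINE = re.compile(
    r"^(?P<month>\w{3}) (?P<day>\d{1,2}) (?P<time>\d{2}:\d{2}:\d{2}\.\d+) "
    r"(?P<source>[^\s\[]+)(?:\[(?P<pid>\d+)\])?: (?P<message>.*)$"
)

sample = "Aug 19 00:24:38.684431 systemd[1]: ignition-mount.service: Deactivated successfully."
m = LINE.match(sample)
if m:
    print(m.group("source"), m.group("pid"), m.group("message"))
    # -> systemd 1 ignition-mount.service: Deactivated successfully.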
Aug 19 00:24:38.714949 systemd[1]: Stopped target network-pre.target - Preparation for Network. Aug 19 00:24:38.716788 systemd[1]: systemd-networkd.socket: Deactivated successfully. Aug 19 00:24:38.716832 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Aug 19 00:24:38.719959 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Aug 19 00:24:38.721809 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Aug 19 00:24:38.721897 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Aug 19 00:24:38.724526 systemd[1]: systemd-sysctl.service: Deactivated successfully. Aug 19 00:24:38.724572 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Aug 19 00:24:38.727936 systemd[1]: systemd-modules-load.service: Deactivated successfully. Aug 19 00:24:38.727987 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Aug 19 00:24:38.730024 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Aug 19 00:24:38.730081 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 19 00:24:38.733073 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 19 00:24:38.738348 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Aug 19 00:24:38.738419 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Aug 19 00:24:38.748336 systemd[1]: systemd-udevd.service: Deactivated successfully. Aug 19 00:24:38.755946 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 19 00:24:38.758073 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Aug 19 00:24:38.758153 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Aug 19 00:24:38.759896 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Aug 19 00:24:38.759933 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Aug 19 00:24:38.761874 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Aug 19 00:24:38.761926 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Aug 19 00:24:38.764861 systemd[1]: dracut-cmdline.service: Deactivated successfully. Aug 19 00:24:38.764910 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Aug 19 00:24:38.767658 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Aug 19 00:24:38.767722 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 19 00:24:38.771569 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Aug 19 00:24:38.772813 systemd[1]: systemd-network-generator.service: Deactivated successfully. Aug 19 00:24:38.772887 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Aug 19 00:24:38.776133 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Aug 19 00:24:38.776210 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 19 00:24:38.779618 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 19 00:24:38.779673 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 00:24:38.784443 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. 
Aug 19 00:24:38.784495 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Aug 19 00:24:38.784528 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Aug 19 00:24:38.784832 systemd[1]: network-cleanup.service: Deactivated successfully. Aug 19 00:24:38.793948 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Aug 19 00:24:38.800353 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Aug 19 00:24:38.800476 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Aug 19 00:24:38.802819 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Aug 19 00:24:38.805598 systemd[1]: Starting initrd-switch-root.service - Switch Root... Aug 19 00:24:38.843078 systemd[1]: Switching root. Aug 19 00:24:38.878350 systemd-journald[244]: Journal stopped Aug 19 00:24:39.765835 systemd-journald[244]: Received SIGTERM from PID 1 (systemd). Aug 19 00:24:39.765898 kernel: SELinux: policy capability network_peer_controls=1 Aug 19 00:24:39.765914 kernel: SELinux: policy capability open_perms=1 Aug 19 00:24:39.765923 kernel: SELinux: policy capability extended_socket_class=1 Aug 19 00:24:39.765933 kernel: SELinux: policy capability always_check_network=0 Aug 19 00:24:39.765942 kernel: SELinux: policy capability cgroup_seclabel=1 Aug 19 00:24:39.765956 kernel: SELinux: policy capability nnp_nosuid_transition=1 Aug 19 00:24:39.765965 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Aug 19 00:24:39.765973 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Aug 19 00:24:39.765984 kernel: SELinux: policy capability userspace_initial_context=0 Aug 19 00:24:39.765993 kernel: audit: type=1403 audit(1755563079.103:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Aug 19 00:24:39.766007 systemd[1]: Successfully loaded SELinux policy in 69.266ms. Aug 19 00:24:39.766030 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.527ms. Aug 19 00:24:39.766046 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Aug 19 00:24:39.766057 systemd[1]: Detected virtualization kvm. Aug 19 00:24:39.766078 systemd[1]: Detected architecture arm64. Aug 19 00:24:39.766090 systemd[1]: Detected first boot. Aug 19 00:24:39.766099 systemd[1]: Initializing machine ID from VM UUID. Aug 19 00:24:39.766109 zram_generator::config[1078]: No configuration found. Aug 19 00:24:39.766120 kernel: NET: Registered PF_VSOCK protocol family Aug 19 00:24:39.766129 systemd[1]: Populated /etc with preset unit settings. Aug 19 00:24:39.766143 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Aug 19 00:24:39.766153 systemd[1]: initrd-switch-root.service: Deactivated successfully. Aug 19 00:24:39.766162 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Aug 19 00:24:39.766172 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Aug 19 00:24:39.766182 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Aug 19 00:24:39.766192 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. 
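The systemd 256.8 banner a few entries up lists its build-time features as a +/- string (+PAM, -APPARMOR, and so on). A throwaway snippet that splits the string quoted from this boot into enabled and disabled sets, just to make the notation easier to scan:

# Feature string copied verbatim from the systemd banner above.
flags = ("+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS "
         "+OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD "
         "+LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY "
         "-P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK "
         "-XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE")

enabled  = [f[1:] for f in flags.split() if f.startswith("+")]
disabled = [f[1:] for f in flags.split() if f.startswith("-")]
print("built with:   ", ", ".join(enabled))
print("built without:", ", ".join(disabled))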
Aug 19 00:24:39.766202 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Aug 19 00:24:39.766211 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Aug 19 00:24:39.766225 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Aug 19 00:24:39.766235 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Aug 19 00:24:39.766245 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Aug 19 00:24:39.766258 systemd[1]: Created slice user.slice - User and Session Slice. Aug 19 00:24:39.766268 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 19 00:24:39.766278 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 19 00:24:39.766288 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Aug 19 00:24:39.766298 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Aug 19 00:24:39.766308 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Aug 19 00:24:39.766327 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Aug 19 00:24:39.766340 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Aug 19 00:24:39.766352 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 19 00:24:39.766364 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 19 00:24:39.766373 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Aug 19 00:24:39.766386 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Aug 19 00:24:39.766396 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Aug 19 00:24:39.766406 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Aug 19 00:24:39.766418 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 19 00:24:39.766428 systemd[1]: Reached target remote-fs.target - Remote File Systems. Aug 19 00:24:39.766438 systemd[1]: Reached target slices.target - Slice Units. Aug 19 00:24:39.766448 systemd[1]: Reached target swap.target - Swaps. Aug 19 00:24:39.766458 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Aug 19 00:24:39.766468 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Aug 19 00:24:39.766477 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Aug 19 00:24:39.766487 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Aug 19 00:24:39.766497 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Aug 19 00:24:39.766508 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Aug 19 00:24:39.766518 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Aug 19 00:24:39.766531 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Aug 19 00:24:39.766542 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Aug 19 00:24:39.766557 systemd[1]: Mounting media.mount - External Media Directory... Aug 19 00:24:39.766568 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... 
Aug 19 00:24:39.766578 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Aug 19 00:24:39.766588 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Aug 19 00:24:39.766598 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Aug 19 00:24:39.766611 systemd[1]: Reached target machines.target - Containers. Aug 19 00:24:39.766621 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Aug 19 00:24:39.766631 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 00:24:39.766641 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Aug 19 00:24:39.766651 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Aug 19 00:24:39.766661 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 19 00:24:39.766671 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Aug 19 00:24:39.766680 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 19 00:24:39.766692 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Aug 19 00:24:39.766701 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 19 00:24:39.766711 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Aug 19 00:24:39.766721 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Aug 19 00:24:39.766731 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Aug 19 00:24:39.766752 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Aug 19 00:24:39.766769 systemd[1]: Stopped systemd-fsck-usr.service. Aug 19 00:24:39.766785 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 00:24:39.766804 kernel: loop: module loaded Aug 19 00:24:39.766814 systemd[1]: Starting systemd-journald.service - Journal Service... Aug 19 00:24:39.766824 kernel: fuse: init (API version 7.41) Aug 19 00:24:39.766833 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Aug 19 00:24:39.766843 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Aug 19 00:24:39.766852 kernel: ACPI: bus type drm_connector registered Aug 19 00:24:39.766862 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Aug 19 00:24:39.766872 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Aug 19 00:24:39.766882 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Aug 19 00:24:39.766893 systemd[1]: verity-setup.service: Deactivated successfully. Aug 19 00:24:39.766903 systemd[1]: Stopped verity-setup.service. Aug 19 00:24:39.766912 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Aug 19 00:24:39.766948 systemd-journald[1143]: Collecting audit messages is disabled. Aug 19 00:24:39.766972 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. 
Aug 19 00:24:39.766983 systemd-journald[1143]: Journal started Aug 19 00:24:39.767004 systemd-journald[1143]: Runtime Journal (/run/log/journal/a9e498ed3d404dfdb5114743f5c82f5c) is 6M, max 48.5M, 42.4M free. Aug 19 00:24:39.528920 systemd[1]: Queued start job for default target multi-user.target. Aug 19 00:24:39.544758 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Aug 19 00:24:39.545155 systemd[1]: systemd-journald.service: Deactivated successfully. Aug 19 00:24:39.768803 systemd[1]: Started systemd-journald.service - Journal Service. Aug 19 00:24:39.770871 systemd[1]: Mounted media.mount - External Media Directory. Aug 19 00:24:39.772006 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Aug 19 00:24:39.773240 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Aug 19 00:24:39.774475 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Aug 19 00:24:39.775852 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Aug 19 00:24:39.778783 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Aug 19 00:24:39.780319 systemd[1]: modprobe@configfs.service: Deactivated successfully. Aug 19 00:24:39.780504 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Aug 19 00:24:39.782001 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 19 00:24:39.782190 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 19 00:24:39.783585 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 19 00:24:39.783809 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 19 00:24:39.785055 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 19 00:24:39.785234 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 19 00:24:39.786686 systemd[1]: modprobe@fuse.service: Deactivated successfully. Aug 19 00:24:39.786882 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Aug 19 00:24:39.788230 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 19 00:24:39.788375 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 19 00:24:39.789791 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Aug 19 00:24:39.791203 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Aug 19 00:24:39.792922 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Aug 19 00:24:39.794425 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Aug 19 00:24:39.806299 systemd[1]: Reached target network-pre.target - Preparation for Network. Aug 19 00:24:39.808629 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Aug 19 00:24:39.810806 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Aug 19 00:24:39.811963 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Aug 19 00:24:39.811993 systemd[1]: Reached target local-fs.target - Local File Systems. Aug 19 00:24:39.813873 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Aug 19 00:24:39.822870 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... 
Aug 19 00:24:39.824161 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 00:24:39.828150 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Aug 19 00:24:39.830499 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Aug 19 00:24:39.832028 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 19 00:24:39.835902 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Aug 19 00:24:39.837230 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 19 00:24:39.838981 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Aug 19 00:24:39.845011 systemd-journald[1143]: Time spent on flushing to /var/log/journal/a9e498ed3d404dfdb5114743f5c82f5c is 23.339ms for 882 entries. Aug 19 00:24:39.845011 systemd-journald[1143]: System Journal (/var/log/journal/a9e498ed3d404dfdb5114743f5c82f5c) is 8M, max 195.6M, 187.6M free. Aug 19 00:24:39.876203 systemd-journald[1143]: Received client request to flush runtime journal. Aug 19 00:24:39.876237 kernel: loop0: detected capacity change from 0 to 119320 Aug 19 00:24:39.843376 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Aug 19 00:24:39.846415 systemd[1]: Starting systemd-sysusers.service - Create System Users... Aug 19 00:24:39.849968 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Aug 19 00:24:39.852579 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Aug 19 00:24:39.855212 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Aug 19 00:24:39.869775 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Aug 19 00:24:39.871865 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Aug 19 00:24:39.876957 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Aug 19 00:24:39.878483 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Aug 19 00:24:39.895602 systemd[1]: Finished systemd-sysusers.service - Create System Users. Aug 19 00:24:39.899953 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Aug 19 00:24:39.902007 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Aug 19 00:24:39.903289 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Aug 19 00:24:39.920903 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Aug 19 00:24:39.932297 systemd-tmpfiles[1211]: ACLs are not supported, ignoring. Aug 19 00:24:39.932340 systemd-tmpfiles[1211]: ACLs are not supported, ignoring. Aug 19 00:24:39.935865 kernel: loop1: detected capacity change from 0 to 100608 Aug 19 00:24:39.937175 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Aug 19 00:24:39.967308 kernel: loop2: detected capacity change from 0 to 211168 Aug 19 00:24:39.989775 kernel: loop3: detected capacity change from 0 to 119320 Aug 19 00:24:39.997756 kernel: loop4: detected capacity change from 0 to 100608 Aug 19 00:24:40.004762 kernel: loop5: detected capacity change from 0 to 211168 Aug 19 00:24:40.010584 (sd-merge)[1219]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Aug 19 00:24:40.011055 (sd-merge)[1219]: Merged extensions into '/usr'. Aug 19 00:24:40.016402 systemd[1]: Reload requested from client PID 1194 ('systemd-sysext') (unit systemd-sysext.service)... Aug 19 00:24:40.016525 systemd[1]: Reloading... Aug 19 00:24:40.082828 zram_generator::config[1245]: No configuration found. Aug 19 00:24:40.145315 ldconfig[1189]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Aug 19 00:24:40.228380 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Aug 19 00:24:40.228485 systemd[1]: Reloading finished in 211 ms. Aug 19 00:24:40.266420 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Aug 19 00:24:40.268023 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Aug 19 00:24:40.278997 systemd[1]: Starting ensure-sysext.service... Aug 19 00:24:40.280888 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Aug 19 00:24:40.302404 systemd[1]: Reload requested from client PID 1279 ('systemctl') (unit ensure-sysext.service)... Aug 19 00:24:40.302422 systemd[1]: Reloading... Aug 19 00:24:40.312156 systemd-tmpfiles[1280]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Aug 19 00:24:40.312641 systemd-tmpfiles[1280]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Aug 19 00:24:40.313004 systemd-tmpfiles[1280]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Aug 19 00:24:40.313292 systemd-tmpfiles[1280]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Aug 19 00:24:40.314009 systemd-tmpfiles[1280]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Aug 19 00:24:40.314325 systemd-tmpfiles[1280]: ACLs are not supported, ignoring. Aug 19 00:24:40.314441 systemd-tmpfiles[1280]: ACLs are not supported, ignoring. Aug 19 00:24:40.317720 systemd-tmpfiles[1280]: Detected autofs mount point /boot during canonicalization of boot. Aug 19 00:24:40.317864 systemd-tmpfiles[1280]: Skipping /boot Aug 19 00:24:40.323692 systemd-tmpfiles[1280]: Detected autofs mount point /boot during canonicalization of boot. Aug 19 00:24:40.323797 systemd-tmpfiles[1280]: Skipping /boot Aug 19 00:24:40.351767 zram_generator::config[1308]: No configuration found. Aug 19 00:24:40.485625 systemd[1]: Reloading finished in 182 ms. Aug 19 00:24:40.510050 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Aug 19 00:24:40.515966 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 19 00:24:40.522788 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 19 00:24:40.525405 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Aug 19 00:24:40.528196 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... 
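Earlier in this block, (sd-merge) reports systemd-sysext overlaying the containerd-flatcar, docker-flatcar and kubernetes extension images onto /usr. Before merging, each image's extension-release file has to be compatible with the host's os-release (matching ID, plus SYSEXT_LEVEL or VERSION_ID when the extension pins one). The sketch below is a simplified reading of that check, not the actual systemd implementation, and the release values are made up:

def parse_release(text: str) -> dict:
    """Parse KEY=value lines as found in os-release / extension-release files."""
    fields = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            fields[key] = value.strip().strip('"')
    return fields

def extension_matches(host: dict, ext: dict) -> bool:
    # Simplified rule: ID must match the host (ID=_any matches anything);
    # if the extension pins SYSEXT_LEVEL or VERSION_ID, that must match too.
    if ext.get("ID") not in ("_any", host.get("ID")):
        return False
    if "SYSEXT_LEVEL" in ext:
        return ext["SYSEXT_LEVEL"] == host.get("SYSEXT_LEVEL", host.get("VERSION_ID"))
    if "VERSION_ID" in ext:
        return ext["VERSION_ID"] == host.get("VERSION_ID")
    return True

host = parse_release('ID=flatcar\nVERSION_ID=0000.0.0\n')   # placeholder version
ext  = parse_release('ID=flatcar\n')                         # e.g. extension-release.kubernetes
print(extension_matches(host, ext))                          # True -> would be merged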
Aug 19 00:24:40.531254 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 19 00:24:40.536910 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 19 00:24:40.539438 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Aug 19 00:24:40.546341 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 00:24:40.552587 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 19 00:24:40.557011 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 19 00:24:40.560661 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 19 00:24:40.561853 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 00:24:40.561979 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 00:24:40.565387 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Aug 19 00:24:40.571557 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Aug 19 00:24:40.573757 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 19 00:24:40.573917 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 19 00:24:40.575653 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 19 00:24:40.581988 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 19 00:24:40.587474 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 19 00:24:40.588343 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 19 00:24:40.590064 systemd-udevd[1348]: Using default interface naming scheme 'v255'. Aug 19 00:24:40.593577 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Aug 19 00:24:40.596251 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 00:24:40.599180 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 19 00:24:40.601600 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 19 00:24:40.604231 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 19 00:24:40.605518 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 00:24:40.605658 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 00:24:40.607941 augenrules[1379]: No rules Aug 19 00:24:40.611109 systemd[1]: Starting systemd-update-done.service - Update is Completed... Aug 19 00:24:40.612858 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Aug 19 00:24:40.614099 systemd[1]: audit-rules.service: Deactivated successfully. 
Aug 19 00:24:40.615778 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 19 00:24:40.617604 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Aug 19 00:24:40.619384 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 19 00:24:40.622314 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 19 00:24:40.622479 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 19 00:24:40.624216 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 19 00:24:40.624400 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 19 00:24:40.626103 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 19 00:24:40.627049 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 19 00:24:40.634397 systemd[1]: Finished systemd-update-done.service - Update is Completed. Aug 19 00:24:40.648767 systemd[1]: Finished ensure-sysext.service. Aug 19 00:24:40.650977 systemd[1]: Started systemd-userdbd.service - User Database Manager. Aug 19 00:24:40.656847 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 19 00:24:40.660022 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 00:24:40.661907 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 19 00:24:40.665434 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Aug 19 00:24:40.679135 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 19 00:24:40.683227 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 19 00:24:40.684565 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 00:24:40.684637 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 00:24:40.688434 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 19 00:24:40.692459 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Aug 19 00:24:40.693690 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Aug 19 00:24:40.694459 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 19 00:24:40.694672 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 19 00:24:40.696259 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 19 00:24:40.697804 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 19 00:24:40.709328 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 19 00:24:40.709511 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 19 00:24:40.712613 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Aug 19 00:24:40.712676 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 19 00:24:40.715102 systemd[1]: modprobe@loop.service: Deactivated successfully. 
Aug 19 00:24:40.715285 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 19 00:24:40.721703 augenrules[1425]: /sbin/augenrules: No change Aug 19 00:24:40.723547 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 19 00:24:40.737226 augenrules[1460]: No rules Aug 19 00:24:40.739091 systemd[1]: audit-rules.service: Deactivated successfully. Aug 19 00:24:40.741599 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 19 00:24:40.763540 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Aug 19 00:24:40.766683 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Aug 19 00:24:40.818841 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Aug 19 00:24:40.827867 systemd-networkd[1433]: lo: Link UP Aug 19 00:24:40.827877 systemd-networkd[1433]: lo: Gained carrier Aug 19 00:24:40.828984 systemd-networkd[1433]: Enumeration completed Aug 19 00:24:40.829126 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 19 00:24:40.829490 systemd-networkd[1433]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 00:24:40.829502 systemd-networkd[1433]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 19 00:24:40.830097 systemd-networkd[1433]: eth0: Link UP Aug 19 00:24:40.830212 systemd-networkd[1433]: eth0: Gained carrier Aug 19 00:24:40.830227 systemd-networkd[1433]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 00:24:40.832896 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Aug 19 00:24:40.837364 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Aug 19 00:24:40.838643 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Aug 19 00:24:40.839945 systemd[1]: Reached target time-set.target - System Time Set. Aug 19 00:24:40.841984 systemd-networkd[1433]: eth0: DHCPv4 address 10.0.0.116/16, gateway 10.0.0.1 acquired from 10.0.0.1 Aug 19 00:24:40.842587 systemd-timesyncd[1435]: Network configuration changed, trying to establish connection. Aug 19 00:24:40.848267 systemd-resolved[1347]: Positive Trust Anchors: Aug 19 00:24:40.848292 systemd-resolved[1347]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 19 00:24:40.848326 systemd-resolved[1347]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 19 00:24:40.442495 systemd-timesyncd[1435]: Contacted time server 10.0.0.1:123 (10.0.0.1). Aug 19 00:24:40.450982 systemd-journald[1143]: Time jumped backwards, rotating. Aug 19 00:24:40.442569 systemd-timesyncd[1435]: Initial clock synchronization to Tue 2025-08-19 00:24:40.442341 UTC. 
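A few entries back, systemd-networkd reports eth0 coming up with a DHCPv4 lease of 10.0.0.116/16 and gateway 10.0.0.1. The snippet below only restates what that lease implies, using Python's ipaddress module:

import ipaddress

# Address/prefix and gateway exactly as logged by systemd-networkd above.
iface   = ipaddress.ip_interface("10.0.0.116/16")
gateway = ipaddress.ip_address("10.0.0.1")

print(iface.network)                     # 10.0.0.0/16
print(iface.netmask)                     # 255.255.0.0
print(iface.network.broadcast_address)   # 10.0.255.255
print(gateway in iface.network)          # True: the gateway is on-link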
Aug 19 00:24:40.462980 systemd-resolved[1347]: Defaulting to hostname 'linux'. Aug 19 00:24:40.463630 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Aug 19 00:24:40.469412 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 19 00:24:40.471631 systemd[1]: Reached target network.target - Network. Aug 19 00:24:40.472600 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 19 00:24:40.473815 systemd[1]: Reached target sysinit.target - System Initialization. Aug 19 00:24:40.476570 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Aug 19 00:24:40.477972 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Aug 19 00:24:40.479441 systemd[1]: Started logrotate.timer - Daily rotation of log files. Aug 19 00:24:40.480710 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Aug 19 00:24:40.483368 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Aug 19 00:24:40.484711 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Aug 19 00:24:40.484749 systemd[1]: Reached target paths.target - Path Units. Aug 19 00:24:40.485737 systemd[1]: Reached target timers.target - Timer Units. Aug 19 00:24:40.487915 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Aug 19 00:24:40.491494 systemd[1]: Starting docker.socket - Docker Socket for the API... Aug 19 00:24:40.495424 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Aug 19 00:24:40.498151 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Aug 19 00:24:40.500780 systemd[1]: Reached target ssh-access.target - SSH Access Available. Aug 19 00:24:40.504418 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Aug 19 00:24:40.506993 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Aug 19 00:24:40.509198 systemd[1]: Listening on docker.socket - Docker Socket for the API. Aug 19 00:24:40.517392 systemd[1]: Reached target sockets.target - Socket Units. Aug 19 00:24:40.518470 systemd[1]: Reached target basic.target - Basic System. Aug 19 00:24:40.519589 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Aug 19 00:24:40.519618 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Aug 19 00:24:40.520734 systemd[1]: Starting containerd.service - containerd container runtime... Aug 19 00:24:40.522975 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Aug 19 00:24:40.524997 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Aug 19 00:24:40.533108 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Aug 19 00:24:40.535302 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Aug 19 00:24:40.536474 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Aug 19 00:24:40.537517 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... 
Aug 19 00:24:40.541309 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Aug 19 00:24:40.543438 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Aug 19 00:24:40.546701 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Aug 19 00:24:40.554475 jq[1496]: false Aug 19 00:24:40.551634 systemd[1]: Starting systemd-logind.service - User Login Management... Aug 19 00:24:40.554218 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 19 00:24:40.556282 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Aug 19 00:24:40.556792 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Aug 19 00:24:40.557665 extend-filesystems[1497]: Found /dev/vda6 Aug 19 00:24:40.559977 systemd[1]: Starting update-engine.service - Update Engine... Aug 19 00:24:40.568481 extend-filesystems[1497]: Found /dev/vda9 Aug 19 00:24:40.568824 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Aug 19 00:24:40.575051 extend-filesystems[1497]: Checking size of /dev/vda9 Aug 19 00:24:40.576347 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Aug 19 00:24:40.582005 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Aug 19 00:24:40.582252 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Aug 19 00:24:40.582495 systemd[1]: motdgen.service: Deactivated successfully. Aug 19 00:24:40.582668 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Aug 19 00:24:40.587514 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Aug 19 00:24:40.592884 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Aug 19 00:24:40.602877 jq[1517]: true Aug 19 00:24:40.604505 (ntainerd)[1525]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Aug 19 00:24:40.613755 extend-filesystems[1497]: Resized partition /dev/vda9 Aug 19 00:24:40.617251 extend-filesystems[1538]: resize2fs 1.47.2 (1-Jan-2025) Aug 19 00:24:40.623694 jq[1536]: true Aug 19 00:24:40.635241 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Aug 19 00:24:40.644304 update_engine[1510]: I20250819 00:24:40.643744 1510 main.cc:92] Flatcar Update Engine starting Aug 19 00:24:40.669107 dbus-daemon[1494]: [system] SELinux support is enabled Aug 19 00:24:40.671035 systemd[1]: Started dbus.service - D-Bus System Message Bus. Aug 19 00:24:40.672648 update_engine[1510]: I20250819 00:24:40.672266 1510 update_check_scheduler.cc:74] Next update check in 8m39s Aug 19 00:24:40.674834 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Aug 19 00:24:40.674861 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Aug 19 00:24:40.675287 tar[1522]: linux-arm64/LICENSE Aug 19 00:24:40.675731 tar[1522]: linux-arm64/helm Aug 19 00:24:40.677539 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Aug 19 00:24:40.677565 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Aug 19 00:24:40.679274 systemd[1]: Started update-engine.service - Update Engine. Aug 19 00:24:40.689457 systemd[1]: Started locksmithd.service - Cluster reboot manager. Aug 19 00:24:40.693233 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 00:24:40.697372 systemd-logind[1507]: Watching system buttons on /dev/input/event0 (Power Button) Aug 19 00:24:40.698696 systemd-logind[1507]: New seat seat0. Aug 19 00:24:40.706461 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Aug 19 00:24:40.706948 systemd[1]: Started systemd-logind.service - User Login Management. Aug 19 00:24:40.739278 extend-filesystems[1538]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Aug 19 00:24:40.739278 extend-filesystems[1538]: old_desc_blocks = 1, new_desc_blocks = 1 Aug 19 00:24:40.739278 extend-filesystems[1538]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Aug 19 00:24:40.743258 extend-filesystems[1497]: Resized filesystem in /dev/vda9 Aug 19 00:24:40.741484 systemd[1]: extend-filesystems.service: Deactivated successfully. Aug 19 00:24:40.741894 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Aug 19 00:24:40.748886 bash[1560]: Updated "/home/core/.ssh/authorized_keys" Aug 19 00:24:40.750775 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Aug 19 00:24:40.755053 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
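The resize2fs/extend-filesystems exchange above grew the root filesystem on /dev/vda9 online from 553472 to 1864699 4 KiB blocks. The quick arithmetic below puts those block counts into GiB:

BLOCK = 4096                               # "(4k) blocks" per the resize2fs output
old_blocks, new_blocks = 553_472, 1_864_699

def to_gib(blocks: int) -> float:
    return blocks * BLOCK / 2**30

print(f"before: {to_gib(old_blocks):.2f} GiB")                 # ~2.11 GiB
print(f"after:  {to_gib(new_blocks):.2f} GiB")                 # ~7.11 GiB
print(f"growth: {to_gib(new_blocks - old_blocks):.2f} GiB")    # ~5.00 GiB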
Aug 19 00:24:40.799469 locksmithd[1559]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Aug 19 00:24:40.955277 containerd[1525]: time="2025-08-19T00:24:40Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Aug 19 00:24:40.956713 containerd[1525]: time="2025-08-19T00:24:40.956664385Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Aug 19 00:24:40.968238 containerd[1525]: time="2025-08-19T00:24:40.967785625Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.76µs" Aug 19 00:24:40.968238 containerd[1525]: time="2025-08-19T00:24:40.967834065Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Aug 19 00:24:40.968238 containerd[1525]: time="2025-08-19T00:24:40.967853025Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Aug 19 00:24:40.968238 containerd[1525]: time="2025-08-19T00:24:40.968158585Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Aug 19 00:24:40.968238 containerd[1525]: time="2025-08-19T00:24:40.968177865Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Aug 19 00:24:40.968443 containerd[1525]: time="2025-08-19T00:24:40.968422625Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 19 00:24:40.968596 containerd[1525]: time="2025-08-19T00:24:40.968572945Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 19 00:24:40.968653 containerd[1525]: time="2025-08-19T00:24:40.968639505Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 19 00:24:40.968962 containerd[1525]: time="2025-08-19T00:24:40.968936545Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 19 00:24:40.969021 containerd[1525]: time="2025-08-19T00:24:40.969006705Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 19 00:24:40.969070 containerd[1525]: time="2025-08-19T00:24:40.969056545Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 19 00:24:40.969129 containerd[1525]: time="2025-08-19T00:24:40.969114665Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Aug 19 00:24:40.969279 containerd[1525]: time="2025-08-19T00:24:40.969259105Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Aug 19 00:24:40.969579 containerd[1525]: time="2025-08-19T00:24:40.969553905Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 19 00:24:40.969663 containerd[1525]: time="2025-08-19T00:24:40.969647625Z" level=info msg="skip loading plugin" error="lstat 
/var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 19 00:24:40.969726 containerd[1525]: time="2025-08-19T00:24:40.969711985Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Aug 19 00:24:40.969803 containerd[1525]: time="2025-08-19T00:24:40.969788785Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Aug 19 00:24:40.970261 containerd[1525]: time="2025-08-19T00:24:40.970151985Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Aug 19 00:24:40.970422 containerd[1525]: time="2025-08-19T00:24:40.970402305Z" level=info msg="metadata content store policy set" policy=shared Aug 19 00:24:40.975175 containerd[1525]: time="2025-08-19T00:24:40.975134265Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Aug 19 00:24:40.975338 containerd[1525]: time="2025-08-19T00:24:40.975321745Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Aug 19 00:24:40.976128 containerd[1525]: time="2025-08-19T00:24:40.975455465Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Aug 19 00:24:40.976128 containerd[1525]: time="2025-08-19T00:24:40.975476585Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Aug 19 00:24:40.976128 containerd[1525]: time="2025-08-19T00:24:40.975489385Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Aug 19 00:24:40.976128 containerd[1525]: time="2025-08-19T00:24:40.975500905Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Aug 19 00:24:40.976128 containerd[1525]: time="2025-08-19T00:24:40.975514465Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Aug 19 00:24:40.976128 containerd[1525]: time="2025-08-19T00:24:40.975536065Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Aug 19 00:24:40.976128 containerd[1525]: time="2025-08-19T00:24:40.975550425Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Aug 19 00:24:40.976128 containerd[1525]: time="2025-08-19T00:24:40.975560625Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Aug 19 00:24:40.976128 containerd[1525]: time="2025-08-19T00:24:40.975570905Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Aug 19 00:24:40.976128 containerd[1525]: time="2025-08-19T00:24:40.975583305Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Aug 19 00:24:40.976128 containerd[1525]: time="2025-08-19T00:24:40.975728185Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Aug 19 00:24:40.976128 containerd[1525]: time="2025-08-19T00:24:40.975753025Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Aug 19 00:24:40.976128 containerd[1525]: time="2025-08-19T00:24:40.975767945Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Aug 19 
00:24:40.976128 containerd[1525]: time="2025-08-19T00:24:40.975779185Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Aug 19 00:24:40.976428 containerd[1525]: time="2025-08-19T00:24:40.975789865Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Aug 19 00:24:40.976428 containerd[1525]: time="2025-08-19T00:24:40.975801625Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Aug 19 00:24:40.976428 containerd[1525]: time="2025-08-19T00:24:40.975814665Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Aug 19 00:24:40.976428 containerd[1525]: time="2025-08-19T00:24:40.975825345Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Aug 19 00:24:40.976428 containerd[1525]: time="2025-08-19T00:24:40.975836705Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Aug 19 00:24:40.976428 containerd[1525]: time="2025-08-19T00:24:40.975850065Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Aug 19 00:24:40.976428 containerd[1525]: time="2025-08-19T00:24:40.975861225Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Aug 19 00:24:40.976428 containerd[1525]: time="2025-08-19T00:24:40.976076025Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Aug 19 00:24:40.976428 containerd[1525]: time="2025-08-19T00:24:40.976093105Z" level=info msg="Start snapshots syncer" Aug 19 00:24:40.976640 containerd[1525]: time="2025-08-19T00:24:40.976619145Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Aug 19 00:24:40.976928 containerd[1525]: time="2025-08-19T00:24:40.976889185Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Aug 19 00:24:40.977082 containerd[1525]: time="2025-08-19T00:24:40.977064985Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Aug 19 00:24:40.977229 containerd[1525]: time="2025-08-19T00:24:40.977195305Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Aug 19 00:24:40.977462 containerd[1525]: time="2025-08-19T00:24:40.977440545Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Aug 19 00:24:40.977560 containerd[1525]: time="2025-08-19T00:24:40.977542785Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Aug 19 00:24:40.977616 containerd[1525]: time="2025-08-19T00:24:40.977602385Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Aug 19 00:24:40.977668 containerd[1525]: time="2025-08-19T00:24:40.977654825Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Aug 19 00:24:40.977725 containerd[1525]: time="2025-08-19T00:24:40.977710665Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Aug 19 00:24:40.977778 containerd[1525]: time="2025-08-19T00:24:40.977763385Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Aug 19 00:24:40.977835 containerd[1525]: time="2025-08-19T00:24:40.977820025Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Aug 19 00:24:40.977916 containerd[1525]: time="2025-08-19T00:24:40.977901065Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Aug 19 00:24:40.977970 containerd[1525]: 
time="2025-08-19T00:24:40.977957625Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Aug 19 00:24:40.978044 containerd[1525]: time="2025-08-19T00:24:40.978029825Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Aug 19 00:24:40.978139 containerd[1525]: time="2025-08-19T00:24:40.978122265Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 19 00:24:40.978218 containerd[1525]: time="2025-08-19T00:24:40.978185465Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 19 00:24:40.979231 containerd[1525]: time="2025-08-19T00:24:40.978253425Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 19 00:24:40.979231 containerd[1525]: time="2025-08-19T00:24:40.978271425Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 19 00:24:40.979231 containerd[1525]: time="2025-08-19T00:24:40.978279785Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Aug 19 00:24:40.979231 containerd[1525]: time="2025-08-19T00:24:40.978291345Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Aug 19 00:24:40.979231 containerd[1525]: time="2025-08-19T00:24:40.978304385Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Aug 19 00:24:40.979231 containerd[1525]: time="2025-08-19T00:24:40.978386145Z" level=info msg="runtime interface created" Aug 19 00:24:40.979231 containerd[1525]: time="2025-08-19T00:24:40.978391065Z" level=info msg="created NRI interface" Aug 19 00:24:40.979231 containerd[1525]: time="2025-08-19T00:24:40.978399425Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Aug 19 00:24:40.979231 containerd[1525]: time="2025-08-19T00:24:40.978410745Z" level=info msg="Connect containerd service" Aug 19 00:24:40.979231 containerd[1525]: time="2025-08-19T00:24:40.978442745Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Aug 19 00:24:40.979231 containerd[1525]: time="2025-08-19T00:24:40.979171025Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 19 00:24:40.994266 tar[1522]: linux-arm64/README.md Aug 19 00:24:41.011271 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Aug 19 00:24:41.069050 containerd[1525]: time="2025-08-19T00:24:41.068638745Z" level=info msg="Start subscribing containerd event" Aug 19 00:24:41.069050 containerd[1525]: time="2025-08-19T00:24:41.068711105Z" level=info msg="Start recovering state" Aug 19 00:24:41.069050 containerd[1525]: time="2025-08-19T00:24:41.068800665Z" level=info msg="Start event monitor" Aug 19 00:24:41.069050 containerd[1525]: time="2025-08-19T00:24:41.068812385Z" level=info msg="Start cni network conf syncer for default" Aug 19 00:24:41.069050 containerd[1525]: time="2025-08-19T00:24:41.068824025Z" level=info msg="Start streaming server" Aug 19 00:24:41.069050 containerd[1525]: time="2025-08-19T00:24:41.068833905Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Aug 19 00:24:41.069050 containerd[1525]: time="2025-08-19T00:24:41.068841145Z" level=info msg="runtime interface starting up..." Aug 19 00:24:41.069050 containerd[1525]: time="2025-08-19T00:24:41.068846345Z" level=info msg="starting plugins..." Aug 19 00:24:41.069050 containerd[1525]: time="2025-08-19T00:24:41.068860945Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Aug 19 00:24:41.069292 containerd[1525]: time="2025-08-19T00:24:41.069093185Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Aug 19 00:24:41.069292 containerd[1525]: time="2025-08-19T00:24:41.069144305Z" level=info msg=serving... address=/run/containerd/containerd.sock Aug 19 00:24:41.069302 systemd[1]: Started containerd.service - containerd container runtime. Aug 19 00:24:41.070494 containerd[1525]: time="2025-08-19T00:24:41.070450265Z" level=info msg="containerd successfully booted in 0.115747s" Aug 19 00:24:41.317083 sshd_keygen[1514]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Aug 19 00:24:41.342285 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Aug 19 00:24:41.345953 systemd[1]: Starting issuegen.service - Generate /run/issue... Aug 19 00:24:41.366096 systemd[1]: issuegen.service: Deactivated successfully. Aug 19 00:24:41.366362 systemd[1]: Finished issuegen.service - Generate /run/issue. Aug 19 00:24:41.369338 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Aug 19 00:24:41.400789 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Aug 19 00:24:41.405719 systemd[1]: Started getty@tty1.service - Getty on tty1. Aug 19 00:24:41.408066 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Aug 19 00:24:41.409608 systemd[1]: Reached target getty.target - Login Prompts. Aug 19 00:24:42.325683 systemd-networkd[1433]: eth0: Gained IPv6LL Aug 19 00:24:42.328829 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Aug 19 00:24:42.330549 systemd[1]: Reached target network-online.target - Network is Online. Aug 19 00:24:42.334709 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Aug 19 00:24:42.338807 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:24:42.343168 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Aug 19 00:24:42.364989 systemd[1]: coreos-metadata.service: Deactivated successfully. Aug 19 00:24:42.365316 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Aug 19 00:24:42.367323 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Aug 19 00:24:42.375714 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
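The ordering visible here, systemd-networkd-wait-online finishing, network-online.target being reached, and only then the metadata agent and kubelet starting, is the kind of ordering a unit requests explicitly rather than something timing provides. A minimal sketch of a drop-in that asks for it (unit name is illustrative):

  mkdir -p /etc/systemd/system/example.service.d
  cat <<'EOF' >/etc/systemd/system/example.service.d/10-wait-online.conf
  [Unit]
  Wants=network-online.target
  After=network-online.target
  EOF
  systemctl daemon-reload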
Aug 19 00:24:42.933914 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:24:42.936072 systemd[1]: Reached target multi-user.target - Multi-User System. Aug 19 00:24:42.938393 (kubelet)[1633]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 00:24:42.942312 systemd[1]: Startup finished in 2.073s (kernel) + 5.445s (initrd) + 4.329s (userspace) = 11.848s. Aug 19 00:24:43.472998 kubelet[1633]: E0819 00:24:43.472927 1633 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 00:24:43.475978 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 00:24:43.476126 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 00:24:43.476491 systemd[1]: kubelet.service: Consumed 839ms CPU time, 257.1M memory peak. Aug 19 00:24:46.496282 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Aug 19 00:24:46.497303 systemd[1]: Started sshd@0-10.0.0.116:22-10.0.0.1:53008.service - OpenSSH per-connection server daemon (10.0.0.1:53008). Aug 19 00:24:46.667789 sshd[1646]: Accepted publickey for core from 10.0.0.1 port 53008 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:24:46.672131 sshd-session[1646]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:24:46.681291 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Aug 19 00:24:46.682882 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Aug 19 00:24:46.692958 systemd-logind[1507]: New session 1 of user core. Aug 19 00:24:46.729736 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Aug 19 00:24:46.739574 systemd[1]: Starting user@500.service - User Manager for UID 500... Aug 19 00:24:46.766263 (systemd)[1651]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Aug 19 00:24:46.770827 systemd-logind[1507]: New session c1 of user core. Aug 19 00:24:46.914154 systemd[1651]: Queued start job for default target default.target. Aug 19 00:24:46.933361 systemd[1651]: Created slice app.slice - User Application Slice. Aug 19 00:24:46.933393 systemd[1651]: Reached target paths.target - Paths. Aug 19 00:24:46.933433 systemd[1651]: Reached target timers.target - Timers. Aug 19 00:24:46.934714 systemd[1651]: Starting dbus.socket - D-Bus User Message Bus Socket... Aug 19 00:24:46.950567 systemd[1651]: Listening on dbus.socket - D-Bus User Message Bus Socket. Aug 19 00:24:46.950702 systemd[1651]: Reached target sockets.target - Sockets. Aug 19 00:24:46.950761 systemd[1651]: Reached target basic.target - Basic System. Aug 19 00:24:46.950790 systemd[1651]: Reached target default.target - Main User Target. Aug 19 00:24:46.950819 systemd[1651]: Startup finished in 166ms. Aug 19 00:24:46.950901 systemd[1]: Started user@500.service - User Manager for UID 500. Aug 19 00:24:46.952281 systemd[1]: Started session-1.scope - Session 1 of User core. Aug 19 00:24:47.021515 systemd[1]: Started sshd@1-10.0.0.116:22-10.0.0.1:53012.service - OpenSSH per-connection server daemon (10.0.0.1:53012). 
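The kubelet failure above is the normal pre-bootstrap state for this kind of node: the unit is enabled at boot, but /var/lib/kubelet/config.yaml only exists after kubeadm init or kubeadm join has run, so the service exits and waits for its scheduled restart. A quick way to confirm that the missing config file is the only problem (illustrative operator commands, not taken from the log):

  systemctl status kubelet --no-pager      # shows the exit-code/FAILURE result logged above
  ls -l /var/lib/kubelet/config.yaml       # absent until kubeadm init/join writes it
  journalctl -u kubelet -n 20 --no-pager   # repeats the "no such file or directory" error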
Aug 19 00:24:47.086786 sshd[1662]: Accepted publickey for core from 10.0.0.1 port 53012 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:24:47.088261 sshd-session[1662]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:24:47.092756 systemd-logind[1507]: New session 2 of user core. Aug 19 00:24:47.103446 systemd[1]: Started session-2.scope - Session 2 of User core. Aug 19 00:24:47.157776 sshd[1665]: Connection closed by 10.0.0.1 port 53012 Aug 19 00:24:47.158333 sshd-session[1662]: pam_unix(sshd:session): session closed for user core Aug 19 00:24:47.168632 systemd[1]: sshd@1-10.0.0.116:22-10.0.0.1:53012.service: Deactivated successfully. Aug 19 00:24:47.172442 systemd[1]: session-2.scope: Deactivated successfully. Aug 19 00:24:47.176553 systemd-logind[1507]: Session 2 logged out. Waiting for processes to exit. Aug 19 00:24:47.182165 systemd[1]: Started sshd@2-10.0.0.116:22-10.0.0.1:53022.service - OpenSSH per-connection server daemon (10.0.0.1:53022). Aug 19 00:24:47.183358 systemd-logind[1507]: Removed session 2. Aug 19 00:24:47.246850 sshd[1671]: Accepted publickey for core from 10.0.0.1 port 53022 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:24:47.248752 sshd-session[1671]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:24:47.256653 systemd-logind[1507]: New session 3 of user core. Aug 19 00:24:47.267434 systemd[1]: Started session-3.scope - Session 3 of User core. Aug 19 00:24:47.318588 sshd[1674]: Connection closed by 10.0.0.1 port 53022 Aug 19 00:24:47.319123 sshd-session[1671]: pam_unix(sshd:session): session closed for user core Aug 19 00:24:47.332725 systemd[1]: sshd@2-10.0.0.116:22-10.0.0.1:53022.service: Deactivated successfully. Aug 19 00:24:47.334231 systemd[1]: session-3.scope: Deactivated successfully. Aug 19 00:24:47.335864 systemd-logind[1507]: Session 3 logged out. Waiting for processes to exit. Aug 19 00:24:47.338368 systemd[1]: Started sshd@3-10.0.0.116:22-10.0.0.1:53028.service - OpenSSH per-connection server daemon (10.0.0.1:53028). Aug 19 00:24:47.339296 systemd-logind[1507]: Removed session 3. Aug 19 00:24:47.394875 sshd[1680]: Accepted publickey for core from 10.0.0.1 port 53028 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:24:47.396199 sshd-session[1680]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:24:47.400640 systemd-logind[1507]: New session 4 of user core. Aug 19 00:24:47.414408 systemd[1]: Started session-4.scope - Session 4 of User core. Aug 19 00:24:47.471694 sshd[1683]: Connection closed by 10.0.0.1 port 53028 Aug 19 00:24:47.472282 sshd-session[1680]: pam_unix(sshd:session): session closed for user core Aug 19 00:24:47.494159 systemd[1]: sshd@3-10.0.0.116:22-10.0.0.1:53028.service: Deactivated successfully. Aug 19 00:24:47.498010 systemd[1]: session-4.scope: Deactivated successfully. Aug 19 00:24:47.499025 systemd-logind[1507]: Session 4 logged out. Waiting for processes to exit. Aug 19 00:24:47.501584 systemd[1]: Started sshd@4-10.0.0.116:22-10.0.0.1:53044.service - OpenSSH per-connection server daemon (10.0.0.1:53044). Aug 19 00:24:47.502715 systemd-logind[1507]: Removed session 4. 
Aug 19 00:24:47.579138 sshd[1689]: Accepted publickey for core from 10.0.0.1 port 53044 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:24:47.580480 sshd-session[1689]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:24:47.585286 systemd-logind[1507]: New session 5 of user core. Aug 19 00:24:47.602455 systemd[1]: Started session-5.scope - Session 5 of User core. Aug 19 00:24:47.668473 sudo[1693]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Aug 19 00:24:47.668749 sudo[1693]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 00:24:47.692269 sudo[1693]: pam_unix(sudo:session): session closed for user root Aug 19 00:24:47.701271 sshd[1692]: Connection closed by 10.0.0.1 port 53044 Aug 19 00:24:47.701937 sshd-session[1689]: pam_unix(sshd:session): session closed for user core Aug 19 00:24:47.714838 systemd[1]: sshd@4-10.0.0.116:22-10.0.0.1:53044.service: Deactivated successfully. Aug 19 00:24:47.716794 systemd[1]: session-5.scope: Deactivated successfully. Aug 19 00:24:47.719744 systemd-logind[1507]: Session 5 logged out. Waiting for processes to exit. Aug 19 00:24:47.722481 systemd[1]: Started sshd@5-10.0.0.116:22-10.0.0.1:53054.service - OpenSSH per-connection server daemon (10.0.0.1:53054). Aug 19 00:24:47.723796 systemd-logind[1507]: Removed session 5. Aug 19 00:24:47.792611 sshd[1699]: Accepted publickey for core from 10.0.0.1 port 53054 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:24:47.794110 sshd-session[1699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:24:47.798582 systemd-logind[1507]: New session 6 of user core. Aug 19 00:24:47.813418 systemd[1]: Started session-6.scope - Session 6 of User core. Aug 19 00:24:47.865372 sudo[1704]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Aug 19 00:24:47.865662 sudo[1704]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 00:24:47.961893 sudo[1704]: pam_unix(sudo:session): session closed for user root Aug 19 00:24:47.967741 sudo[1703]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Aug 19 00:24:47.968007 sudo[1703]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 00:24:47.981189 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 19 00:24:48.031613 augenrules[1726]: No rules Aug 19 00:24:48.033241 systemd[1]: audit-rules.service: Deactivated successfully. Aug 19 00:24:48.033501 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 19 00:24:48.035543 sudo[1703]: pam_unix(sudo:session): session closed for user root Aug 19 00:24:48.039299 sshd[1702]: Connection closed by 10.0.0.1 port 53054 Aug 19 00:24:48.039757 sshd-session[1699]: pam_unix(sshd:session): session closed for user core Aug 19 00:24:48.049985 systemd[1]: sshd@5-10.0.0.116:22-10.0.0.1:53054.service: Deactivated successfully. Aug 19 00:24:48.052165 systemd[1]: session-6.scope: Deactivated successfully. Aug 19 00:24:48.053104 systemd-logind[1507]: Session 6 logged out. Waiting for processes to exit. Aug 19 00:24:48.055598 systemd-logind[1507]: Removed session 6. Aug 19 00:24:48.057036 systemd[1]: Started sshd@6-10.0.0.116:22-10.0.0.1:53060.service - OpenSSH per-connection server daemon (10.0.0.1:53060). 
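The two sudo calls above delete rule fragments and restart audit-rules, and augenrules then correctly reports "No rules": the active rule set is built by concatenating /etc/audit/rules.d/*.rules, and the directory no longer contains any. A minimal sketch of putting a fragment back, assuming standard auditctl rule syntax (file name and watch path are illustrative):

  cat <<'EOF' >/etc/audit/rules.d/99-local.rules
  ## -D flushes existing rules, -b sizes the kernel audit backlog
  -D
  -b 8192
  ## watch writes and attribute changes under /etc/kubernetes
  -w /etc/kubernetes/ -p wa -k kube-config
  EOF
  systemctl restart audit-rules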
Aug 19 00:24:48.119396 sshd[1735]: Accepted publickey for core from 10.0.0.1 port 53060 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:24:48.120771 sshd-session[1735]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:24:48.126943 systemd-logind[1507]: New session 7 of user core. Aug 19 00:24:48.141428 systemd[1]: Started session-7.scope - Session 7 of User core. Aug 19 00:24:48.193309 sudo[1739]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Aug 19 00:24:48.193590 sudo[1739]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 00:24:48.574689 systemd[1]: Starting docker.service - Docker Application Container Engine... Aug 19 00:24:48.591604 (dockerd)[1760]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Aug 19 00:24:48.931928 dockerd[1760]: time="2025-08-19T00:24:48.931797985Z" level=info msg="Starting up" Aug 19 00:24:48.933072 dockerd[1760]: time="2025-08-19T00:24:48.932998345Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Aug 19 00:24:48.947018 dockerd[1760]: time="2025-08-19T00:24:48.946845305Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Aug 19 00:24:49.047723 dockerd[1760]: time="2025-08-19T00:24:49.047477945Z" level=info msg="Loading containers: start." Aug 19 00:24:49.062251 kernel: Initializing XFRM netlink socket Aug 19 00:24:49.321791 systemd-networkd[1433]: docker0: Link UP Aug 19 00:24:49.330052 dockerd[1760]: time="2025-08-19T00:24:49.329993905Z" level=info msg="Loading containers: done." Aug 19 00:24:49.354651 dockerd[1760]: time="2025-08-19T00:24:49.354585705Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Aug 19 00:24:49.354815 dockerd[1760]: time="2025-08-19T00:24:49.354778905Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Aug 19 00:24:49.354928 dockerd[1760]: time="2025-08-19T00:24:49.354889265Z" level=info msg="Initializing buildkit" Aug 19 00:24:49.385576 dockerd[1760]: time="2025-08-19T00:24:49.385508865Z" level=info msg="Completed buildkit initialization" Aug 19 00:24:49.391158 dockerd[1760]: time="2025-08-19T00:24:49.391089425Z" level=info msg="Daemon has completed initialization" Aug 19 00:24:49.391442 dockerd[1760]: time="2025-08-19T00:24:49.391223505Z" level=info msg="API listen on /run/docker.sock" Aug 19 00:24:49.391577 systemd[1]: Started docker.service - Docker Application Container Engine. Aug 19 00:24:50.004636 containerd[1525]: time="2025-08-19T00:24:50.004525545Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\"" Aug 19 00:24:50.740963 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount510205778.mount: Deactivated successfully. 
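The dockerd warning about not using native diff is informational: with CONFIG_OVERLAY_FS_REDIRECT_DIR enabled in the kernel, the overlay2 driver falls back to the slower naive diff path, which the message itself scopes to image builds. A hedged way to confirm the active storage driver and the kernel option (the /boot config path varies by distro, and /proc/config.gz only exists on kernels built with IKCONFIG_PROC):

  docker info --format '{{.Driver}}'        # expect: overlay2
  zgrep CONFIG_OVERLAY_FS_REDIRECT_DIR /proc/config.gz 2>/dev/null \
    || grep CONFIG_OVERLAY_FS_REDIRECT_DIR /boot/config-"$(uname -r)"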
Aug 19 00:24:51.722584 containerd[1525]: time="2025-08-19T00:24:51.722511865Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:24:51.723603 containerd[1525]: time="2025-08-19T00:24:51.723369345Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.4: active requests=0, bytes read=27352615" Aug 19 00:24:51.724879 containerd[1525]: time="2025-08-19T00:24:51.724839905Z" level=info msg="ImageCreate event name:\"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:24:51.728272 containerd[1525]: time="2025-08-19T00:24:51.727421145Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:24:51.729530 containerd[1525]: time="2025-08-19T00:24:51.729468105Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.4\" with image id \"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\", size \"27349413\" in 1.7248518s" Aug 19 00:24:51.729584 containerd[1525]: time="2025-08-19T00:24:51.729533105Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\" returns image reference \"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\"" Aug 19 00:24:51.736510 containerd[1525]: time="2025-08-19T00:24:51.736235625Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\"" Aug 19 00:24:52.879177 containerd[1525]: time="2025-08-19T00:24:52.879128745Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:24:52.881748 containerd[1525]: time="2025-08-19T00:24:52.881687545Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.4: active requests=0, bytes read=23536979" Aug 19 00:24:52.882816 containerd[1525]: time="2025-08-19T00:24:52.882759265Z" level=info msg="ImageCreate event name:\"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:24:52.885010 containerd[1525]: time="2025-08-19T00:24:52.884976505Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:24:52.886711 containerd[1525]: time="2025-08-19T00:24:52.886675825Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.4\" with image id \"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\", size \"25093155\" in 1.15039408s" Aug 19 00:24:52.886744 containerd[1525]: time="2025-08-19T00:24:52.886711705Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\" returns image reference \"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\"" Aug 19 00:24:52.887210 
containerd[1525]: time="2025-08-19T00:24:52.887149305Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\"" Aug 19 00:24:53.726662 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Aug 19 00:24:53.730046 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:24:53.941411 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:24:53.957794 (kubelet)[2048]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 00:24:53.998092 kubelet[2048]: E0819 00:24:53.997961 2048 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 00:24:54.001478 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 00:24:54.001612 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 00:24:54.003307 systemd[1]: kubelet.service: Consumed 159ms CPU time, 106.5M memory peak. Aug 19 00:24:54.371535 containerd[1525]: time="2025-08-19T00:24:54.371424185Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:24:54.387600 containerd[1525]: time="2025-08-19T00:24:54.387549425Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.4: active requests=0, bytes read=18292016" Aug 19 00:24:54.401438 containerd[1525]: time="2025-08-19T00:24:54.401378185Z" level=info msg="ImageCreate event name:\"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:24:54.418911 containerd[1525]: time="2025-08-19T00:24:54.418828905Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:24:54.420028 containerd[1525]: time="2025-08-19T00:24:54.419901265Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.4\" with image id \"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\", size \"19848210\" in 1.53271052s" Aug 19 00:24:54.420028 containerd[1525]: time="2025-08-19T00:24:54.419933825Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\" returns image reference \"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\"" Aug 19 00:24:54.420803 containerd[1525]: time="2025-08-19T00:24:54.420671345Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\"" Aug 19 00:24:55.646587 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount334103647.mount: Deactivated successfully. 
Aug 19 00:24:56.297938 containerd[1525]: time="2025-08-19T00:24:56.297875705Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:24:56.299230 containerd[1525]: time="2025-08-19T00:24:56.299175225Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.4: active requests=0, bytes read=28199961" Aug 19 00:24:56.301134 containerd[1525]: time="2025-08-19T00:24:56.301095225Z" level=info msg="ImageCreate event name:\"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:24:56.304378 containerd[1525]: time="2025-08-19T00:24:56.304338705Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:24:56.305462 containerd[1525]: time="2025-08-19T00:24:56.305411785Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.4\" with image id \"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\", repo tag \"registry.k8s.io/kube-proxy:v1.33.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\", size \"28198978\" in 1.884608s" Aug 19 00:24:56.305462 containerd[1525]: time="2025-08-19T00:24:56.305447985Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\" returns image reference \"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\"" Aug 19 00:24:56.305883 containerd[1525]: time="2025-08-19T00:24:56.305852145Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Aug 19 00:24:57.220753 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3014537990.mount: Deactivated successfully. 
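The image pulls in this stretch land in containerd's k8s.io namespace (the namespace registered with NRI earlier), not in the separate containerd instance dockerd started for itself, so docker images will not list them. To see them from the node (illustrative commands; crictl has to be pointed at the CRI socket, here via the flag rather than /etc/crictl.yaml):

  ctr -n k8s.io images ls -q | head
  crictl --runtime-endpoint unix:///run/containerd/containerd.sock images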
Aug 19 00:24:58.589004 containerd[1525]: time="2025-08-19T00:24:58.588949905Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:24:58.600774 containerd[1525]: time="2025-08-19T00:24:58.600722625Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152119" Aug 19 00:24:58.604228 containerd[1525]: time="2025-08-19T00:24:58.604162265Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:24:58.625557 containerd[1525]: time="2025-08-19T00:24:58.625476545Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:24:58.627057 containerd[1525]: time="2025-08-19T00:24:58.627005505Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 2.3211228s" Aug 19 00:24:58.627057 containerd[1525]: time="2025-08-19T00:24:58.627053665Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Aug 19 00:24:58.627747 containerd[1525]: time="2025-08-19T00:24:58.627693905Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Aug 19 00:24:59.571474 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3222022520.mount: Deactivated successfully. 
Aug 19 00:24:59.635230 containerd[1525]: time="2025-08-19T00:24:59.635024785Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 00:24:59.650232 containerd[1525]: time="2025-08-19T00:24:59.650170385Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705" Aug 19 00:24:59.665626 containerd[1525]: time="2025-08-19T00:24:59.665575705Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 00:24:59.679527 containerd[1525]: time="2025-08-19T00:24:59.679435465Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 00:24:59.680253 containerd[1525]: time="2025-08-19T00:24:59.680101825Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 1.0523674s" Aug 19 00:24:59.680253 containerd[1525]: time="2025-08-19T00:24:59.680151305Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Aug 19 00:24:59.680718 containerd[1525]: time="2025-08-19T00:24:59.680680505Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Aug 19 00:25:00.410274 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount203393965.mount: Deactivated successfully. 
Aug 19 00:25:02.520601 containerd[1525]: time="2025-08-19T00:25:02.520541825Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:02.535514 containerd[1525]: time="2025-08-19T00:25:02.535446465Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=69465297" Aug 19 00:25:02.542030 containerd[1525]: time="2025-08-19T00:25:02.541956345Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:02.559391 containerd[1525]: time="2025-08-19T00:25:02.559347705Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:02.561029 containerd[1525]: time="2025-08-19T00:25:02.560926425Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 2.88020928s" Aug 19 00:25:02.561029 containerd[1525]: time="2025-08-19T00:25:02.560964265Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Aug 19 00:25:04.252016 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Aug 19 00:25:04.253585 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:25:04.408975 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:25:04.420628 (kubelet)[2207]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 00:25:04.461101 kubelet[2207]: E0819 00:25:04.461019 2207 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 00:25:04.463878 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 00:25:04.464115 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 00:25:04.466292 systemd[1]: kubelet.service: Consumed 143ms CPU time, 107.4M memory peak. Aug 19 00:25:07.426733 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:25:07.427294 systemd[1]: kubelet.service: Consumed 143ms CPU time, 107.4M memory peak. Aug 19 00:25:07.430141 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:25:07.460860 systemd[1]: Reload requested from client PID 2223 ('systemctl') (unit session-7.scope)... Aug 19 00:25:07.460876 systemd[1]: Reloading... Aug 19 00:25:07.544469 zram_generator::config[2269]: No configuration found. Aug 19 00:25:07.905734 systemd[1]: Reloading finished in 444 ms. Aug 19 00:25:07.956172 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:25:07.958798 systemd[1]: kubelet.service: Deactivated successfully. 
Aug 19 00:25:07.959161 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:25:07.959234 systemd[1]: kubelet.service: Consumed 98ms CPU time, 95.2M memory peak. Aug 19 00:25:07.960981 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:25:08.086020 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:25:08.091057 (kubelet)[2313]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 19 00:25:08.128777 kubelet[2313]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 00:25:08.128777 kubelet[2313]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Aug 19 00:25:08.128777 kubelet[2313]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 00:25:08.129143 kubelet[2313]: I0819 00:25:08.128876 2313 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 19 00:25:08.758569 kubelet[2313]: I0819 00:25:08.758519 2313 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Aug 19 00:25:08.758569 kubelet[2313]: I0819 00:25:08.758555 2313 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 19 00:25:08.758858 kubelet[2313]: I0819 00:25:08.758826 2313 server.go:956] "Client rotation is on, will bootstrap in background" Aug 19 00:25:08.813718 kubelet[2313]: E0819 00:25:08.813649 2313 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.116:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.116:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Aug 19 00:25:08.814355 kubelet[2313]: I0819 00:25:08.814331 2313 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 19 00:25:08.828361 kubelet[2313]: I0819 00:25:08.828318 2313 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 19 00:25:08.831339 kubelet[2313]: I0819 00:25:08.831304 2313 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Aug 19 00:25:08.832526 kubelet[2313]: I0819 00:25:08.832461 2313 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 19 00:25:08.832700 kubelet[2313]: I0819 00:25:08.832516 2313 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 19 00:25:08.832795 kubelet[2313]: I0819 00:25:08.832773 2313 topology_manager.go:138] "Creating topology manager with none policy" Aug 19 00:25:08.832795 kubelet[2313]: I0819 00:25:08.832783 2313 container_manager_linux.go:303] "Creating device plugin manager" Aug 19 00:25:08.833515 kubelet[2313]: I0819 00:25:08.833484 2313 state_mem.go:36] "Initialized new in-memory state store" Aug 19 00:25:08.836510 kubelet[2313]: I0819 00:25:08.836480 2313 kubelet.go:480] "Attempting to sync node with API server" Aug 19 00:25:08.836510 kubelet[2313]: I0819 00:25:08.836509 2313 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 19 00:25:08.836558 kubelet[2313]: I0819 00:25:08.836532 2313 kubelet.go:386] "Adding apiserver pod source" Aug 19 00:25:08.836558 kubelet[2313]: I0819 00:25:08.836546 2313 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 19 00:25:08.838223 kubelet[2313]: I0819 00:25:08.837655 2313 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Aug 19 00:25:08.838377 kubelet[2313]: I0819 00:25:08.838349 2313 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Aug 19 00:25:08.838507 kubelet[2313]: W0819 00:25:08.838490 2313 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Aug 19 00:25:08.838543 kubelet[2313]: E0819 00:25:08.838501 2313 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.116:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.116:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Aug 19 00:25:08.841665 kubelet[2313]: E0819 00:25:08.841614 2313 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.116:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.116:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Aug 19 00:25:08.843715 kubelet[2313]: I0819 00:25:08.843686 2313 watchdog_linux.go:99] "Systemd watchdog is not enabled" Aug 19 00:25:08.843759 kubelet[2313]: I0819 00:25:08.843744 2313 server.go:1289] "Started kubelet" Aug 19 00:25:08.843972 kubelet[2313]: I0819 00:25:08.843927 2313 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Aug 19 00:25:08.845737 kubelet[2313]: I0819 00:25:08.845697 2313 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 19 00:25:08.847072 kubelet[2313]: I0819 00:25:08.847049 2313 server.go:317] "Adding debug handlers to kubelet server" Aug 19 00:25:08.850564 kubelet[2313]: I0819 00:25:08.850527 2313 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 19 00:25:08.850857 kubelet[2313]: E0819 00:25:08.850824 2313 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 00:25:08.850857 kubelet[2313]: I0819 00:25:08.850857 2313 volume_manager.go:297] "Starting Kubelet Volume Manager" Aug 19 00:25:08.851045 kubelet[2313]: I0819 00:25:08.851024 2313 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Aug 19 00:25:08.851119 kubelet[2313]: I0819 00:25:08.851102 2313 reconciler.go:26] "Reconciler: start to sync state" Aug 19 00:25:08.851614 kubelet[2313]: E0819 00:25:08.851574 2313 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.116:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.116:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Aug 19 00:25:08.852126 kubelet[2313]: I0819 00:25:08.852095 2313 factory.go:223] Registration of the systemd container factory successfully Aug 19 00:25:08.852256 kubelet[2313]: I0819 00:25:08.852183 2313 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 19 00:25:08.853051 kubelet[2313]: I0819 00:25:08.852881 2313 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 19 00:25:08.853258 kubelet[2313]: E0819 00:25:08.853192 2313 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.116:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.116:6443: connect: connection refused" interval="200ms" Aug 19 00:25:08.853464 kubelet[2313]: I0819 00:25:08.853434 2313 factory.go:223] Registration of the containerd 
container factory successfully Aug 19 00:25:08.853575 kubelet[2313]: I0819 00:25:08.853557 2313 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 19 00:25:08.853818 kubelet[2313]: E0819 00:25:08.853796 2313 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 19 00:25:08.863578 kubelet[2313]: E0819 00:25:08.853900 2313 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.116:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.116:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.185d03589b68d9f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-08-19 00:25:08.843706865 +0000 UTC m=+0.749313441,LastTimestamp:2025-08-19 00:25:08.843706865 +0000 UTC m=+0.749313441,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Aug 19 00:25:08.866036 kubelet[2313]: I0819 00:25:08.866003 2313 cpu_manager.go:221] "Starting CPU manager" policy="none" Aug 19 00:25:08.866036 kubelet[2313]: I0819 00:25:08.866025 2313 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Aug 19 00:25:08.866120 kubelet[2313]: I0819 00:25:08.866044 2313 state_mem.go:36] "Initialized new in-memory state store" Aug 19 00:25:08.870189 kubelet[2313]: I0819 00:25:08.870123 2313 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Aug 19 00:25:08.871239 kubelet[2313]: I0819 00:25:08.871196 2313 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Aug 19 00:25:08.871239 kubelet[2313]: I0819 00:25:08.871240 2313 status_manager.go:230] "Starting to sync pod status with apiserver" Aug 19 00:25:08.871304 kubelet[2313]: I0819 00:25:08.871264 2313 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Aug 19 00:25:08.871304 kubelet[2313]: I0819 00:25:08.871272 2313 kubelet.go:2436] "Starting kubelet main sync loop" Aug 19 00:25:08.871363 kubelet[2313]: E0819 00:25:08.871312 2313 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 19 00:25:08.872044 kubelet[2313]: E0819 00:25:08.872013 2313 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.116:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.116:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Aug 19 00:25:08.951858 kubelet[2313]: E0819 00:25:08.951811 2313 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 00:25:08.972158 kubelet[2313]: E0819 00:25:08.972126 2313 kubelet.go:2460] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Aug 19 00:25:08.983940 kubelet[2313]: I0819 00:25:08.983898 2313 policy_none.go:49] "None policy: Start" Aug 19 00:25:08.983940 kubelet[2313]: I0819 00:25:08.983931 2313 memory_manager.go:186] "Starting memorymanager" policy="None" Aug 19 00:25:08.984027 kubelet[2313]: I0819 00:25:08.983948 2313 state_mem.go:35] "Initializing new in-memory state store" Aug 19 00:25:08.991298 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Aug 19 00:25:09.003139 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Aug 19 00:25:09.006131 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Aug 19 00:25:09.016145 kubelet[2313]: E0819 00:25:09.016056 2313 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Aug 19 00:25:09.016500 kubelet[2313]: I0819 00:25:09.016480 2313 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 19 00:25:09.016539 kubelet[2313]: I0819 00:25:09.016497 2313 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 19 00:25:09.017177 kubelet[2313]: I0819 00:25:09.016891 2313 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 19 00:25:09.019034 kubelet[2313]: E0819 00:25:09.019000 2313 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Aug 19 00:25:09.019093 kubelet[2313]: E0819 00:25:09.019065 2313 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Aug 19 00:25:09.054494 kubelet[2313]: E0819 00:25:09.054442 2313 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.116:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.116:6443: connect: connection refused" interval="400ms" Aug 19 00:25:09.117802 kubelet[2313]: I0819 00:25:09.117737 2313 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Aug 19 00:25:09.118269 kubelet[2313]: E0819 00:25:09.118229 2313 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.116:6443/api/v1/nodes\": dial tcp 10.0.0.116:6443: connect: connection refused" node="localhost" Aug 19 00:25:09.187553 systemd[1]: Created slice kubepods-burstable-pod8de7187202bee21b84740a213836f615.slice - libcontainer container kubepods-burstable-pod8de7187202bee21b84740a213836f615.slice. Aug 19 00:25:09.204983 kubelet[2313]: E0819 00:25:09.204954 2313 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 00:25:09.208704 systemd[1]: Created slice kubepods-burstable-podd75e6f6978d9f275ea19380916c9cccd.slice - libcontainer container kubepods-burstable-podd75e6f6978d9f275ea19380916c9cccd.slice. Aug 19 00:25:09.230683 kubelet[2313]: E0819 00:25:09.230626 2313 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 00:25:09.233221 systemd[1]: Created slice kubepods-burstable-podde500244ccd11df6d010757ff3aa8f1c.slice - libcontainer container kubepods-burstable-podde500244ccd11df6d010757ff3aa8f1c.slice. 
Aug 19 00:25:09.234899 kubelet[2313]: E0819 00:25:09.234847 2313 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 00:25:09.253380 kubelet[2313]: I0819 00:25:09.253331 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 00:25:09.253380 kubelet[2313]: I0819 00:25:09.253371 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 00:25:09.253499 kubelet[2313]: I0819 00:25:09.253395 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d75e6f6978d9f275ea19380916c9cccd-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d75e6f6978d9f275ea19380916c9cccd\") " pod="kube-system/kube-scheduler-localhost" Aug 19 00:25:09.253499 kubelet[2313]: I0819 00:25:09.253420 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/de500244ccd11df6d010757ff3aa8f1c-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"de500244ccd11df6d010757ff3aa8f1c\") " pod="kube-system/kube-apiserver-localhost" Aug 19 00:25:09.253499 kubelet[2313]: I0819 00:25:09.253438 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/de500244ccd11df6d010757ff3aa8f1c-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"de500244ccd11df6d010757ff3aa8f1c\") " pod="kube-system/kube-apiserver-localhost" Aug 19 00:25:09.253499 kubelet[2313]: I0819 00:25:09.253451 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 00:25:09.253499 kubelet[2313]: I0819 00:25:09.253464 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/de500244ccd11df6d010757ff3aa8f1c-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"de500244ccd11df6d010757ff3aa8f1c\") " pod="kube-system/kube-apiserver-localhost" Aug 19 00:25:09.253598 kubelet[2313]: I0819 00:25:09.253487 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 00:25:09.253598 kubelet[2313]: I0819 00:25:09.253502 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 00:25:09.320260 kubelet[2313]: I0819 00:25:09.320165 2313 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Aug 19 00:25:09.320576 kubelet[2313]: E0819 00:25:09.320531 2313 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.116:6443/api/v1/nodes\": dial tcp 10.0.0.116:6443: connect: connection refused" node="localhost" Aug 19 00:25:09.455166 kubelet[2313]: E0819 00:25:09.455095 2313 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.116:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.116:6443: connect: connection refused" interval="800ms" Aug 19 00:25:09.507257 containerd[1525]: time="2025-08-19T00:25:09.507040585Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:8de7187202bee21b84740a213836f615,Namespace:kube-system,Attempt:0,}" Aug 19 00:25:09.531909 containerd[1525]: time="2025-08-19T00:25:09.531831145Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d75e6f6978d9f275ea19380916c9cccd,Namespace:kube-system,Attempt:0,}" Aug 19 00:25:09.536855 containerd[1525]: time="2025-08-19T00:25:09.536801945Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:de500244ccd11df6d010757ff3aa8f1c,Namespace:kube-system,Attempt:0,}" Aug 19 00:25:09.553227 containerd[1525]: time="2025-08-19T00:25:09.552512745Z" level=info msg="connecting to shim d5dfdd12bd3147b86a9627a794871835fe1e8cd67b0fc1da67d4766cee9483cd" address="unix:///run/containerd/s/d24ab11f6f6cba5432b894cc3e539a22654fdad8c391b278bed926f440809d94" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:25:09.578102 containerd[1525]: time="2025-08-19T00:25:09.577980625Z" level=info msg="connecting to shim f4c52a959f9d4b275d8010ec1e3da69696019a1a4478e941e7bf23cf7f42a93b" address="unix:///run/containerd/s/b700587408dd8a855b2e3cfa6b229c0ef7ff997a9571de182278cb68df619f89" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:25:09.586116 containerd[1525]: time="2025-08-19T00:25:09.586068825Z" level=info msg="connecting to shim 438231ae26074f1f247910cc62996ff95d6c76a61985294a17640a985df44188" address="unix:///run/containerd/s/987a814866221354ca254d4cbe8ab3640aad326c84c9a318656dbf2be139eab4" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:25:09.596426 systemd[1]: Started cri-containerd-d5dfdd12bd3147b86a9627a794871835fe1e8cd67b0fc1da67d4766cee9483cd.scope - libcontainer container d5dfdd12bd3147b86a9627a794871835fe1e8cd67b0fc1da67d4766cee9483cd. Aug 19 00:25:09.617821 systemd[1]: Started cri-containerd-f4c52a959f9d4b275d8010ec1e3da69696019a1a4478e941e7bf23cf7f42a93b.scope - libcontainer container f4c52a959f9d4b275d8010ec1e3da69696019a1a4478e941e7bf23cf7f42a93b. Aug 19 00:25:09.623663 systemd[1]: Started cri-containerd-438231ae26074f1f247910cc62996ff95d6c76a61985294a17640a985df44188.scope - libcontainer container 438231ae26074f1f247910cc62996ff95d6c76a61985294a17640a985df44188. 
Aug 19 00:25:09.655784 containerd[1525]: time="2025-08-19T00:25:09.655741025Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:8de7187202bee21b84740a213836f615,Namespace:kube-system,Attempt:0,} returns sandbox id \"d5dfdd12bd3147b86a9627a794871835fe1e8cd67b0fc1da67d4766cee9483cd\"" Aug 19 00:25:09.667604 containerd[1525]: time="2025-08-19T00:25:09.667562305Z" level=info msg="CreateContainer within sandbox \"d5dfdd12bd3147b86a9627a794871835fe1e8cd67b0fc1da67d4766cee9483cd\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Aug 19 00:25:09.672820 containerd[1525]: time="2025-08-19T00:25:09.672760345Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:de500244ccd11df6d010757ff3aa8f1c,Namespace:kube-system,Attempt:0,} returns sandbox id \"438231ae26074f1f247910cc62996ff95d6c76a61985294a17640a985df44188\"" Aug 19 00:25:09.677444 containerd[1525]: time="2025-08-19T00:25:09.677389025Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d75e6f6978d9f275ea19380916c9cccd,Namespace:kube-system,Attempt:0,} returns sandbox id \"f4c52a959f9d4b275d8010ec1e3da69696019a1a4478e941e7bf23cf7f42a93b\"" Aug 19 00:25:09.679317 containerd[1525]: time="2025-08-19T00:25:09.679279265Z" level=info msg="CreateContainer within sandbox \"438231ae26074f1f247910cc62996ff95d6c76a61985294a17640a985df44188\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Aug 19 00:25:09.685159 containerd[1525]: time="2025-08-19T00:25:09.685116145Z" level=info msg="CreateContainer within sandbox \"f4c52a959f9d4b275d8010ec1e3da69696019a1a4478e941e7bf23cf7f42a93b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Aug 19 00:25:09.698909 containerd[1525]: time="2025-08-19T00:25:09.698841185Z" level=info msg="Container 5a2c47e24b8d9a31b51ad814a2fe729cfa52d9477a8560a588bcd9614a4d023e: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:25:09.721920 kubelet[2313]: I0819 00:25:09.721871 2313 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Aug 19 00:25:09.722264 kubelet[2313]: E0819 00:25:09.722241 2313 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.116:6443/api/v1/nodes\": dial tcp 10.0.0.116:6443: connect: connection refused" node="localhost" Aug 19 00:25:09.738689 containerd[1525]: time="2025-08-19T00:25:09.738636625Z" level=info msg="Container ff8c43b07969efbd76de11096725f317305bfa7b0395a5154dcdf0ecaacb645b: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:25:09.754000 kubelet[2313]: E0819 00:25:09.753954 2313 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.116:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.116:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Aug 19 00:25:09.772644 containerd[1525]: time="2025-08-19T00:25:09.772594825Z" level=info msg="Container da52f887a58a1b91fdce50ae9ec3b57abb70cbe42da4bf2033b97e9132112b5e: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:25:09.772955 containerd[1525]: time="2025-08-19T00:25:09.772911545Z" level=info msg="CreateContainer within sandbox \"d5dfdd12bd3147b86a9627a794871835fe1e8cd67b0fc1da67d4766cee9483cd\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"5a2c47e24b8d9a31b51ad814a2fe729cfa52d9477a8560a588bcd9614a4d023e\"" 
Aug 19 00:25:09.773871 containerd[1525]: time="2025-08-19T00:25:09.773837705Z" level=info msg="StartContainer for \"5a2c47e24b8d9a31b51ad814a2fe729cfa52d9477a8560a588bcd9614a4d023e\"" Aug 19 00:25:09.775029 containerd[1525]: time="2025-08-19T00:25:09.774991625Z" level=info msg="connecting to shim 5a2c47e24b8d9a31b51ad814a2fe729cfa52d9477a8560a588bcd9614a4d023e" address="unix:///run/containerd/s/d24ab11f6f6cba5432b894cc3e539a22654fdad8c391b278bed926f440809d94" protocol=ttrpc version=3 Aug 19 00:25:09.780753 containerd[1525]: time="2025-08-19T00:25:09.780705425Z" level=info msg="CreateContainer within sandbox \"438231ae26074f1f247910cc62996ff95d6c76a61985294a17640a985df44188\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"ff8c43b07969efbd76de11096725f317305bfa7b0395a5154dcdf0ecaacb645b\"" Aug 19 00:25:09.781548 containerd[1525]: time="2025-08-19T00:25:09.781517505Z" level=info msg="StartContainer for \"ff8c43b07969efbd76de11096725f317305bfa7b0395a5154dcdf0ecaacb645b\"" Aug 19 00:25:09.782939 containerd[1525]: time="2025-08-19T00:25:09.782903985Z" level=info msg="connecting to shim ff8c43b07969efbd76de11096725f317305bfa7b0395a5154dcdf0ecaacb645b" address="unix:///run/containerd/s/987a814866221354ca254d4cbe8ab3640aad326c84c9a318656dbf2be139eab4" protocol=ttrpc version=3 Aug 19 00:25:09.785578 containerd[1525]: time="2025-08-19T00:25:09.785541065Z" level=info msg="CreateContainer within sandbox \"f4c52a959f9d4b275d8010ec1e3da69696019a1a4478e941e7bf23cf7f42a93b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"da52f887a58a1b91fdce50ae9ec3b57abb70cbe42da4bf2033b97e9132112b5e\"" Aug 19 00:25:09.786539 containerd[1525]: time="2025-08-19T00:25:09.786286545Z" level=info msg="StartContainer for \"da52f887a58a1b91fdce50ae9ec3b57abb70cbe42da4bf2033b97e9132112b5e\"" Aug 19 00:25:09.788010 containerd[1525]: time="2025-08-19T00:25:09.787977105Z" level=info msg="connecting to shim da52f887a58a1b91fdce50ae9ec3b57abb70cbe42da4bf2033b97e9132112b5e" address="unix:///run/containerd/s/b700587408dd8a855b2e3cfa6b229c0ef7ff997a9571de182278cb68df619f89" protocol=ttrpc version=3 Aug 19 00:25:09.800428 systemd[1]: Started cri-containerd-5a2c47e24b8d9a31b51ad814a2fe729cfa52d9477a8560a588bcd9614a4d023e.scope - libcontainer container 5a2c47e24b8d9a31b51ad814a2fe729cfa52d9477a8560a588bcd9614a4d023e. Aug 19 00:25:09.804252 systemd[1]: Started cri-containerd-ff8c43b07969efbd76de11096725f317305bfa7b0395a5154dcdf0ecaacb645b.scope - libcontainer container ff8c43b07969efbd76de11096725f317305bfa7b0395a5154dcdf0ecaacb645b. Aug 19 00:25:09.809152 systemd[1]: Started cri-containerd-da52f887a58a1b91fdce50ae9ec3b57abb70cbe42da4bf2033b97e9132112b5e.scope - libcontainer container da52f887a58a1b91fdce50ae9ec3b57abb70cbe42da4bf2033b97e9132112b5e. 
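Each "connecting to shim" entry above pairs an ID with a ttrpc socket under /run/containerd/s/, and the containers started here dial the same socket as their sandbox (container 5a2c47e2… shares d24ab11f… with sandbox d5dfdd12…, and so on). Grouping the entries by address therefore recovers the sandbox-to-container layout; a throwaway parsing sketch for these exact messages:

    import re
    from collections import defaultdict

    SHIM = re.compile(r'connecting to shim (?P<id>[0-9a-f]+)" address="(?P<addr>[^"]+)"')

    def ids_by_socket(journal_text):
        """Map each ttrpc socket address to the sandbox/container IDs that dial it."""
        grouped = defaultdict(list)
        for m in SHIM.finditer(journal_text):
            grouped[m.group("addr")].append(m.group("id"))
        return dict(grouped)

    # IDs sharing a socket belong to the same pod: the kube-controller-manager
    # sandbox d5dfdd12... and its container 5a2c47e2... both use the socket
    # unix:///run/containerd/s/d24ab11f6f6cba5432b894cc3e539a22654fdad8c391b278bed926f440809d94
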
Aug 19 00:25:09.850568 kubelet[2313]: E0819 00:25:09.850454 2313 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.116:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.116:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Aug 19 00:25:09.859140 containerd[1525]: time="2025-08-19T00:25:09.859103705Z" level=info msg="StartContainer for \"5a2c47e24b8d9a31b51ad814a2fe729cfa52d9477a8560a588bcd9614a4d023e\" returns successfully" Aug 19 00:25:09.869574 containerd[1525]: time="2025-08-19T00:25:09.869532945Z" level=info msg="StartContainer for \"da52f887a58a1b91fdce50ae9ec3b57abb70cbe42da4bf2033b97e9132112b5e\" returns successfully" Aug 19 00:25:09.871193 containerd[1525]: time="2025-08-19T00:25:09.871138585Z" level=info msg="StartContainer for \"ff8c43b07969efbd76de11096725f317305bfa7b0395a5154dcdf0ecaacb645b\" returns successfully" Aug 19 00:25:09.882728 kubelet[2313]: E0819 00:25:09.882691 2313 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 00:25:09.885423 kubelet[2313]: E0819 00:25:09.885359 2313 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 00:25:09.893494 kubelet[2313]: E0819 00:25:09.893464 2313 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 00:25:09.979249 kubelet[2313]: E0819 00:25:09.979130 2313 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.116:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.116:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.185d03589b68d9f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-08-19 00:25:08.843706865 +0000 UTC m=+0.749313441,LastTimestamp:2025-08-19 00:25:08.843706865 +0000 UTC m=+0.749313441,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Aug 19 00:25:10.067757 kubelet[2313]: E0819 00:25:10.067706 2313 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.116:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.116:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Aug 19 00:25:10.256406 kubelet[2313]: E0819 00:25:10.256272 2313 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.116:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.116:6443: connect: connection refused" interval="1.6s" Aug 19 00:25:10.325550 kubelet[2313]: E0819 00:25:10.325486 2313 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.116:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.116:6443: connect: connection refused" 
logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Aug 19 00:25:10.524212 kubelet[2313]: I0819 00:25:10.523589 2313 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Aug 19 00:25:10.890007 kubelet[2313]: E0819 00:25:10.889853 2313 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 00:25:10.891226 kubelet[2313]: E0819 00:25:10.891136 2313 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 00:25:12.131223 kubelet[2313]: E0819 00:25:12.130281 2313 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 00:25:12.242690 kubelet[2313]: E0819 00:25:12.242649 2313 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Aug 19 00:25:12.356808 kubelet[2313]: I0819 00:25:12.355608 2313 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Aug 19 00:25:12.356808 kubelet[2313]: E0819 00:25:12.355654 2313 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Aug 19 00:25:12.368616 kubelet[2313]: E0819 00:25:12.368564 2313 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 00:25:12.454030 kubelet[2313]: I0819 00:25:12.453876 2313 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Aug 19 00:25:12.460692 kubelet[2313]: E0819 00:25:12.460625 2313 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Aug 19 00:25:12.460692 kubelet[2313]: I0819 00:25:12.460665 2313 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Aug 19 00:25:12.462458 kubelet[2313]: E0819 00:25:12.462422 2313 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Aug 19 00:25:12.462458 kubelet[2313]: I0819 00:25:12.462448 2313 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Aug 19 00:25:12.464720 kubelet[2313]: E0819 00:25:12.464670 2313 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Aug 19 00:25:12.841075 kubelet[2313]: I0819 00:25:12.840829 2313 apiserver.go:52] "Watching apiserver" Aug 19 00:25:12.852087 kubelet[2313]: I0819 00:25:12.852045 2313 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Aug 19 00:25:13.697684 kubelet[2313]: I0819 00:25:13.697627 2313 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Aug 19 00:25:14.562832 systemd[1]: Reload requested from client PID 2598 ('systemctl') (unit session-7.scope)... Aug 19 00:25:14.562854 systemd[1]: Reloading... 
Aug 19 00:25:14.649251 zram_generator::config[2641]: No configuration found. Aug 19 00:25:14.848691 systemd[1]: Reloading finished in 285 ms. Aug 19 00:25:14.870560 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:25:14.871690 kubelet[2313]: I0819 00:25:14.871565 2313 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 19 00:25:14.887663 systemd[1]: kubelet.service: Deactivated successfully. Aug 19 00:25:14.887952 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:25:14.888017 systemd[1]: kubelet.service: Consumed 1.169s CPU time, 128.5M memory peak. Aug 19 00:25:14.890525 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:25:15.074400 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:25:15.081110 (kubelet)[2683]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 19 00:25:15.122347 kubelet[2683]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 00:25:15.122347 kubelet[2683]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Aug 19 00:25:15.122347 kubelet[2683]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 00:25:15.122347 kubelet[2683]: I0819 00:25:15.121768 2683 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 19 00:25:15.129274 kubelet[2683]: I0819 00:25:15.128884 2683 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Aug 19 00:25:15.129274 kubelet[2683]: I0819 00:25:15.128915 2683 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 19 00:25:15.129274 kubelet[2683]: I0819 00:25:15.129156 2683 server.go:956] "Client rotation is on, will bootstrap in background" Aug 19 00:25:15.130842 kubelet[2683]: I0819 00:25:15.130797 2683 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Aug 19 00:25:15.133750 kubelet[2683]: I0819 00:25:15.133478 2683 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 19 00:25:15.138884 kubelet[2683]: I0819 00:25:15.138851 2683 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 19 00:25:15.142809 kubelet[2683]: I0819 00:25:15.142772 2683 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Aug 19 00:25:15.143047 kubelet[2683]: I0819 00:25:15.143018 2683 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 19 00:25:15.144426 kubelet[2683]: I0819 00:25:15.143047 2683 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 19 00:25:15.144426 kubelet[2683]: I0819 00:25:15.143241 2683 topology_manager.go:138] "Creating topology manager with none policy" Aug 19 00:25:15.144426 kubelet[2683]: I0819 00:25:15.143249 2683 container_manager_linux.go:303] "Creating device plugin manager" Aug 19 00:25:15.144426 kubelet[2683]: I0819 00:25:15.143295 2683 state_mem.go:36] "Initialized new in-memory state store" Aug 19 00:25:15.144426 kubelet[2683]: I0819 00:25:15.143452 2683 kubelet.go:480] "Attempting to sync node with API server" Aug 19 00:25:15.144626 kubelet[2683]: I0819 00:25:15.143464 2683 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 19 00:25:15.144626 kubelet[2683]: I0819 00:25:15.143487 2683 kubelet.go:386] "Adding apiserver pod source" Aug 19 00:25:15.144626 kubelet[2683]: I0819 00:25:15.143500 2683 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 19 00:25:15.146330 kubelet[2683]: I0819 00:25:15.144996 2683 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Aug 19 00:25:15.146330 kubelet[2683]: I0819 00:25:15.145582 2683 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Aug 19 00:25:15.148100 kubelet[2683]: I0819 00:25:15.148060 2683 watchdog_linux.go:99] "Systemd watchdog is not enabled" Aug 19 00:25:15.148100 kubelet[2683]: I0819 00:25:15.148108 2683 server.go:1289] "Started kubelet" Aug 19 00:25:15.148533 kubelet[2683]: I0819 00:25:15.148484 2683 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 19 00:25:15.150253 
kubelet[2683]: I0819 00:25:15.148807 2683 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 19 00:25:15.150253 kubelet[2683]: I0819 00:25:15.148860 2683 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Aug 19 00:25:15.150253 kubelet[2683]: I0819 00:25:15.149871 2683 server.go:317] "Adding debug handlers to kubelet server" Aug 19 00:25:15.154437 kubelet[2683]: E0819 00:25:15.153801 2683 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 19 00:25:15.156669 kubelet[2683]: I0819 00:25:15.154780 2683 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 19 00:25:15.156669 kubelet[2683]: I0819 00:25:15.156138 2683 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 19 00:25:15.158456 kubelet[2683]: I0819 00:25:15.158424 2683 volume_manager.go:297] "Starting Kubelet Volume Manager" Aug 19 00:25:15.158535 kubelet[2683]: E0819 00:25:15.158487 2683 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 00:25:15.159348 kubelet[2683]: I0819 00:25:15.159190 2683 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Aug 19 00:25:15.162005 kubelet[2683]: I0819 00:25:15.161954 2683 reconciler.go:26] "Reconciler: start to sync state" Aug 19 00:25:15.163090 kubelet[2683]: I0819 00:25:15.163061 2683 factory.go:223] Registration of the systemd container factory successfully Aug 19 00:25:15.165585 kubelet[2683]: I0819 00:25:15.165276 2683 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 19 00:25:15.177781 kubelet[2683]: I0819 00:25:15.177446 2683 factory.go:223] Registration of the containerd container factory successfully Aug 19 00:25:15.180979 kubelet[2683]: I0819 00:25:15.180913 2683 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Aug 19 00:25:15.182004 kubelet[2683]: I0819 00:25:15.181963 2683 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Aug 19 00:25:15.182004 kubelet[2683]: I0819 00:25:15.181994 2683 status_manager.go:230] "Starting to sync pod status with apiserver" Aug 19 00:25:15.182088 kubelet[2683]: I0819 00:25:15.182016 2683 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
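The nodeConfig dump a few entries above carries the hard-eviction thresholds this kubelet will enforce: memory.available below 100Mi, nodefs.available below 10%, nodefs.inodesFree below 5%, imagefs.available below 15%, imagefs.inodesFree below 5%. A small sketch of how such signals compare against observed stats (the thresholds are copied from that dump; the observed and capacity numbers in the example are invented):

    # Hard-eviction thresholds from the nodeConfig dump above; percentages are
    # fractions of capacity, the memory threshold is an absolute quantity.
    THRESHOLDS = {
        "memory.available":   {"quantity": 100 * 1024 * 1024},  # 100Mi
        "nodefs.available":   {"percentage": 0.10},
        "nodefs.inodesFree":  {"percentage": 0.05},
        "imagefs.available":  {"percentage": 0.15},
        "imagefs.inodesFree": {"percentage": 0.05},
    }

    def signals_under_pressure(observed, capacity):
        """Return the signals whose observed value falls below their threshold."""
        firing = []
        for signal, spec in THRESHOLDS.items():
            if signal not in observed:
                continue
            if "quantity" in spec:
                limit = spec["quantity"]
            else:
                limit = spec["percentage"] * capacity[signal]
            if observed[signal] < limit:
                firing.append(signal)
        return firing

    print(signals_under_pressure(
        {"memory.available": 80 * 1024**2, "nodefs.available": 50 * 1024**3},
        {"memory.available": 4 * 1024**3, "nodefs.available": 100 * 1024**3},
    ))
    # ['memory.available']  -- 80Mi free is under the 100Mi hard threshold
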
Aug 19 00:25:15.182088 kubelet[2683]: I0819 00:25:15.182025 2683 kubelet.go:2436] "Starting kubelet main sync loop" Aug 19 00:25:15.182133 kubelet[2683]: E0819 00:25:15.182070 2683 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 19 00:25:15.217965 kubelet[2683]: I0819 00:25:15.217918 2683 cpu_manager.go:221] "Starting CPU manager" policy="none" Aug 19 00:25:15.217965 kubelet[2683]: I0819 00:25:15.217940 2683 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Aug 19 00:25:15.217965 kubelet[2683]: I0819 00:25:15.217962 2683 state_mem.go:36] "Initialized new in-memory state store" Aug 19 00:25:15.218137 kubelet[2683]: I0819 00:25:15.218113 2683 state_mem.go:88] "Updated default CPUSet" cpuSet="" Aug 19 00:25:15.218137 kubelet[2683]: I0819 00:25:15.218123 2683 state_mem.go:96] "Updated CPUSet assignments" assignments={} Aug 19 00:25:15.218194 kubelet[2683]: I0819 00:25:15.218139 2683 policy_none.go:49] "None policy: Start" Aug 19 00:25:15.218194 kubelet[2683]: I0819 00:25:15.218148 2683 memory_manager.go:186] "Starting memorymanager" policy="None" Aug 19 00:25:15.218194 kubelet[2683]: I0819 00:25:15.218156 2683 state_mem.go:35] "Initializing new in-memory state store" Aug 19 00:25:15.218285 kubelet[2683]: I0819 00:25:15.218255 2683 state_mem.go:75] "Updated machine memory state" Aug 19 00:25:15.222730 kubelet[2683]: E0819 00:25:15.222687 2683 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Aug 19 00:25:15.223003 kubelet[2683]: I0819 00:25:15.222973 2683 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 19 00:25:15.223150 kubelet[2683]: I0819 00:25:15.222991 2683 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 19 00:25:15.223409 kubelet[2683]: I0819 00:25:15.223393 2683 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 19 00:25:15.228240 kubelet[2683]: E0819 00:25:15.226396 2683 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Aug 19 00:25:15.283487 kubelet[2683]: I0819 00:25:15.283435 2683 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Aug 19 00:25:15.283893 kubelet[2683]: I0819 00:25:15.283762 2683 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Aug 19 00:25:15.283893 kubelet[2683]: I0819 00:25:15.283841 2683 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Aug 19 00:25:15.325512 kubelet[2683]: I0819 00:25:15.325443 2683 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Aug 19 00:25:15.363353 kubelet[2683]: I0819 00:25:15.363288 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 00:25:15.363493 kubelet[2683]: I0819 00:25:15.363388 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 00:25:15.363493 kubelet[2683]: I0819 00:25:15.363428 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d75e6f6978d9f275ea19380916c9cccd-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d75e6f6978d9f275ea19380916c9cccd\") " pod="kube-system/kube-scheduler-localhost" Aug 19 00:25:15.363493 kubelet[2683]: I0819 00:25:15.363449 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/de500244ccd11df6d010757ff3aa8f1c-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"de500244ccd11df6d010757ff3aa8f1c\") " pod="kube-system/kube-apiserver-localhost" Aug 19 00:25:15.363493 kubelet[2683]: I0819 00:25:15.363465 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/de500244ccd11df6d010757ff3aa8f1c-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"de500244ccd11df6d010757ff3aa8f1c\") " pod="kube-system/kube-apiserver-localhost" Aug 19 00:25:15.363493 kubelet[2683]: I0819 00:25:15.363478 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 00:25:15.363636 kubelet[2683]: I0819 00:25:15.363493 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 00:25:15.363636 kubelet[2683]: I0819 00:25:15.363510 2683 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 00:25:15.363636 kubelet[2683]: I0819 00:25:15.363528 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/de500244ccd11df6d010757ff3aa8f1c-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"de500244ccd11df6d010757ff3aa8f1c\") " pod="kube-system/kube-apiserver-localhost" Aug 19 00:25:15.397216 kubelet[2683]: E0819 00:25:15.396984 2683 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Aug 19 00:25:15.414973 kubelet[2683]: I0819 00:25:15.414854 2683 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Aug 19 00:25:15.414973 kubelet[2683]: I0819 00:25:15.414949 2683 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Aug 19 00:25:16.144987 kubelet[2683]: I0819 00:25:16.144703 2683 apiserver.go:52] "Watching apiserver" Aug 19 00:25:16.159739 kubelet[2683]: I0819 00:25:16.159686 2683 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Aug 19 00:25:16.205315 kubelet[2683]: I0819 00:25:16.205055 2683 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Aug 19 00:25:16.205315 kubelet[2683]: I0819 00:25:16.205058 2683 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Aug 19 00:25:16.215224 kubelet[2683]: E0819 00:25:16.214958 2683 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Aug 19 00:25:16.215436 kubelet[2683]: E0819 00:25:16.215404 2683 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Aug 19 00:25:16.230766 kubelet[2683]: I0819 00:25:16.230695 2683 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.230681424 podStartE2EDuration="1.230681424s" podCreationTimestamp="2025-08-19 00:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:25:16.230620904 +0000 UTC m=+1.145753426" watchObservedRunningTime="2025-08-19 00:25:16.230681424 +0000 UTC m=+1.145813946" Aug 19 00:25:16.254345 kubelet[2683]: I0819 00:25:16.254286 2683 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=3.25426752 podStartE2EDuration="3.25426752s" podCreationTimestamp="2025-08-19 00:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:25:16.243415436 +0000 UTC m=+1.158547958" watchObservedRunningTime="2025-08-19 00:25:16.25426752 +0000 UTC m=+1.169400042" Aug 19 00:25:16.284738 kubelet[2683]: I0819 00:25:16.284275 2683 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.284259482 podStartE2EDuration="1.284259482s" podCreationTimestamp="2025-08-19 00:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:25:16.254769482 +0000 UTC m=+1.169902004" watchObservedRunningTime="2025-08-19 00:25:16.284259482 +0000 UTC m=+1.199392004" Aug 19 00:25:19.594055 kubelet[2683]: I0819 00:25:19.593988 2683 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Aug 19 00:25:19.594535 containerd[1525]: time="2025-08-19T00:25:19.594368684Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Aug 19 00:25:19.594710 kubelet[2683]: I0819 00:25:19.594571 2683 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Aug 19 00:25:20.567984 systemd[1]: Created slice kubepods-besteffort-pod2f423074_5b15_4f7a_96ac_47614149825d.slice - libcontainer container kubepods-besteffort-pod2f423074_5b15_4f7a_96ac_47614149825d.slice. Aug 19 00:25:20.598605 kubelet[2683]: I0819 00:25:20.598550 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2f423074-5b15-4f7a-96ac-47614149825d-lib-modules\") pod \"kube-proxy-7tm5j\" (UID: \"2f423074-5b15-4f7a-96ac-47614149825d\") " pod="kube-system/kube-proxy-7tm5j" Aug 19 00:25:20.598969 kubelet[2683]: I0819 00:25:20.598632 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhbcj\" (UniqueName: \"kubernetes.io/projected/2f423074-5b15-4f7a-96ac-47614149825d-kube-api-access-xhbcj\") pod \"kube-proxy-7tm5j\" (UID: \"2f423074-5b15-4f7a-96ac-47614149825d\") " pod="kube-system/kube-proxy-7tm5j" Aug 19 00:25:20.598969 kubelet[2683]: I0819 00:25:20.598671 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/2f423074-5b15-4f7a-96ac-47614149825d-kube-proxy\") pod \"kube-proxy-7tm5j\" (UID: \"2f423074-5b15-4f7a-96ac-47614149825d\") " pod="kube-system/kube-proxy-7tm5j" Aug 19 00:25:20.598969 kubelet[2683]: I0819 00:25:20.598690 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2f423074-5b15-4f7a-96ac-47614149825d-xtables-lock\") pod \"kube-proxy-7tm5j\" (UID: \"2f423074-5b15-4f7a-96ac-47614149825d\") " pod="kube-system/kube-proxy-7tm5j" Aug 19 00:25:20.865874 systemd[1]: Created slice kubepods-besteffort-pod31094031_1731_4d48_9829_db0600442a19.slice - libcontainer container kubepods-besteffort-pod31094031_1731_4d48_9829_db0600442a19.slice. 
Aug 19 00:25:20.884390 containerd[1525]: time="2025-08-19T00:25:20.884325819Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7tm5j,Uid:2f423074-5b15-4f7a-96ac-47614149825d,Namespace:kube-system,Attempt:0,}" Aug 19 00:25:20.903594 kubelet[2683]: I0819 00:25:20.901865 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/31094031-1731-4d48-9829-db0600442a19-var-lib-calico\") pod \"tigera-operator-747864d56d-fhfgv\" (UID: \"31094031-1731-4d48-9829-db0600442a19\") " pod="tigera-operator/tigera-operator-747864d56d-fhfgv" Aug 19 00:25:20.903594 kubelet[2683]: I0819 00:25:20.901919 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnkpr\" (UniqueName: \"kubernetes.io/projected/31094031-1731-4d48-9829-db0600442a19-kube-api-access-nnkpr\") pod \"tigera-operator-747864d56d-fhfgv\" (UID: \"31094031-1731-4d48-9829-db0600442a19\") " pod="tigera-operator/tigera-operator-747864d56d-fhfgv" Aug 19 00:25:20.913483 containerd[1525]: time="2025-08-19T00:25:20.913426070Z" level=info msg="connecting to shim 0b61df0cf32a73e62c59efc8365633c9452b5e5e5804e43b7ed482626b75ef64" address="unix:///run/containerd/s/99f6852e132439f2c258ac0be6ca58039060fadc9b033e207290a4afe0bfabc3" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:25:20.951479 systemd[1]: Started cri-containerd-0b61df0cf32a73e62c59efc8365633c9452b5e5e5804e43b7ed482626b75ef64.scope - libcontainer container 0b61df0cf32a73e62c59efc8365633c9452b5e5e5804e43b7ed482626b75ef64. Aug 19 00:25:21.050637 containerd[1525]: time="2025-08-19T00:25:21.050519411Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7tm5j,Uid:2f423074-5b15-4f7a-96ac-47614149825d,Namespace:kube-system,Attempt:0,} returns sandbox id \"0b61df0cf32a73e62c59efc8365633c9452b5e5e5804e43b7ed482626b75ef64\"" Aug 19 00:25:21.077603 containerd[1525]: time="2025-08-19T00:25:21.077542650Z" level=info msg="CreateContainer within sandbox \"0b61df0cf32a73e62c59efc8365633c9452b5e5e5804e43b7ed482626b75ef64\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Aug 19 00:25:21.108503 containerd[1525]: time="2025-08-19T00:25:21.108439581Z" level=info msg="Container 8b1d30d950f611616eebbacd211d9893e04aa846374e29dc230b37bd94d63e16: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:25:21.121256 containerd[1525]: time="2025-08-19T00:25:21.120733377Z" level=info msg="CreateContainer within sandbox \"0b61df0cf32a73e62c59efc8365633c9452b5e5e5804e43b7ed482626b75ef64\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"8b1d30d950f611616eebbacd211d9893e04aa846374e29dc230b37bd94d63e16\"" Aug 19 00:25:21.121680 containerd[1525]: time="2025-08-19T00:25:21.121534940Z" level=info msg="StartContainer for \"8b1d30d950f611616eebbacd211d9893e04aa846374e29dc230b37bd94d63e16\"" Aug 19 00:25:21.125854 containerd[1525]: time="2025-08-19T00:25:21.125800272Z" level=info msg="connecting to shim 8b1d30d950f611616eebbacd211d9893e04aa846374e29dc230b37bd94d63e16" address="unix:///run/containerd/s/99f6852e132439f2c258ac0be6ca58039060fadc9b033e207290a4afe0bfabc3" protocol=ttrpc version=3 Aug 19 00:25:21.151459 systemd[1]: Started cri-containerd-8b1d30d950f611616eebbacd211d9893e04aa846374e29dc230b37bd94d63e16.scope - libcontainer container 8b1d30d950f611616eebbacd211d9893e04aa846374e29dc230b37bd94d63e16. 
Aug 19 00:25:21.174404 containerd[1525]: time="2025-08-19T00:25:21.174340095Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-fhfgv,Uid:31094031-1731-4d48-9829-db0600442a19,Namespace:tigera-operator,Attempt:0,}" Aug 19 00:25:21.200492 containerd[1525]: time="2025-08-19T00:25:21.200437132Z" level=info msg="StartContainer for \"8b1d30d950f611616eebbacd211d9893e04aa846374e29dc230b37bd94d63e16\" returns successfully" Aug 19 00:25:21.220549 containerd[1525]: time="2025-08-19T00:25:21.220454711Z" level=info msg="connecting to shim 09fa4a50fcb2a6b62ed52b3c9424d22eab52be1a79c14c080e32e27e4b6f98bc" address="unix:///run/containerd/s/062c4856adf54a1c54fe831800747b00131c69420b2b5a96f9d82abb5b0553d3" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:25:21.235936 kubelet[2683]: I0819 00:25:21.235869 2683 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-7tm5j" podStartSLOduration=1.235849156 podStartE2EDuration="1.235849156s" podCreationTimestamp="2025-08-19 00:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:25:21.234879033 +0000 UTC m=+6.150011555" watchObservedRunningTime="2025-08-19 00:25:21.235849156 +0000 UTC m=+6.150981678" Aug 19 00:25:21.260442 systemd[1]: Started cri-containerd-09fa4a50fcb2a6b62ed52b3c9424d22eab52be1a79c14c080e32e27e4b6f98bc.scope - libcontainer container 09fa4a50fcb2a6b62ed52b3c9424d22eab52be1a79c14c080e32e27e4b6f98bc. Aug 19 00:25:21.310629 containerd[1525]: time="2025-08-19T00:25:21.310503456Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-fhfgv,Uid:31094031-1731-4d48-9829-db0600442a19,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"09fa4a50fcb2a6b62ed52b3c9424d22eab52be1a79c14c080e32e27e4b6f98bc\"" Aug 19 00:25:21.315437 containerd[1525]: time="2025-08-19T00:25:21.315390950Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Aug 19 00:25:22.529299 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3659765355.mount: Deactivated successfully. 
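Two naming conventions show up in the systemd units above: mount units escape a literal dash in the path as \x2d (the tmpmount unit maps back to /var/lib/containerd/tmpmounts/containerd-mount3659765355), and the kubepods slices carry the pod UID with its dashes turned into underscores, since '-' acts as the hierarchy separator in slice names. Two small decoding helpers, written here purely for illustration:

    def unescape_systemd_mount(name: str) -> str:
        """Undo systemd path escaping: '-' separates path components and a
        literal dash is spelled '\\x2d'."""
        body = name.removesuffix(".mount")
        return "/" + "/".join(part.replace("\\x2d", "-") for part in body.split("-"))

    def pod_uid_from_slice(slice_name: str) -> str:
        """Recover the pod UID from a kubepods slice name (dashes were swapped
        for underscores because '-' nests slices in systemd)."""
        uid_part = slice_name.rsplit("-pod", 1)[1].removesuffix(".slice")
        return uid_part.replace("_", "-")

    print(unescape_systemd_mount(r"var-lib-containerd-tmpmounts-containerd\x2dmount3659765355.mount"))
    # /var/lib/containerd/tmpmounts/containerd-mount3659765355
    print(pod_uid_from_slice("kubepods-besteffort-pod2f423074_5b15_4f7a_96ac_47614149825d.slice"))
    # 2f423074-5b15-4f7a-96ac-47614149825d
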
Aug 19 00:25:23.469420 containerd[1525]: time="2025-08-19T00:25:23.469304139Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:23.470882 containerd[1525]: time="2025-08-19T00:25:23.470845743Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=22150610" Aug 19 00:25:23.472151 containerd[1525]: time="2025-08-19T00:25:23.472098386Z" level=info msg="ImageCreate event name:\"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:23.476397 containerd[1525]: time="2025-08-19T00:25:23.476338357Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:23.477098 containerd[1525]: time="2025-08-19T00:25:23.477051839Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"22146605\" in 2.161326328s" Aug 19 00:25:23.477098 containerd[1525]: time="2025-08-19T00:25:23.477089039Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\"" Aug 19 00:25:23.488454 containerd[1525]: time="2025-08-19T00:25:23.488398188Z" level=info msg="CreateContainer within sandbox \"09fa4a50fcb2a6b62ed52b3c9424d22eab52be1a79c14c080e32e27e4b6f98bc\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Aug 19 00:25:23.504548 containerd[1525]: time="2025-08-19T00:25:23.504460230Z" level=info msg="Container 5751f38c33febdcca3986047bee34c97a4227b969d6edfaf953e39ba5ea8c1b9: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:25:23.510064 containerd[1525]: time="2025-08-19T00:25:23.510015884Z" level=info msg="CreateContainer within sandbox \"09fa4a50fcb2a6b62ed52b3c9424d22eab52be1a79c14c080e32e27e4b6f98bc\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"5751f38c33febdcca3986047bee34c97a4227b969d6edfaf953e39ba5ea8c1b9\"" Aug 19 00:25:23.510565 containerd[1525]: time="2025-08-19T00:25:23.510534085Z" level=info msg="StartContainer for \"5751f38c33febdcca3986047bee34c97a4227b969d6edfaf953e39ba5ea8c1b9\"" Aug 19 00:25:23.511688 containerd[1525]: time="2025-08-19T00:25:23.511653008Z" level=info msg="connecting to shim 5751f38c33febdcca3986047bee34c97a4227b969d6edfaf953e39ba5ea8c1b9" address="unix:///run/containerd/s/062c4856adf54a1c54fe831800747b00131c69420b2b5a96f9d82abb5b0553d3" protocol=ttrpc version=3 Aug 19 00:25:23.538449 systemd[1]: Started cri-containerd-5751f38c33febdcca3986047bee34c97a4227b969d6edfaf953e39ba5ea8c1b9.scope - libcontainer container 5751f38c33febdcca3986047bee34c97a4227b969d6edfaf953e39ba5ea8c1b9. 
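From the pull messages above, 22,150,610 bytes of the operator image arrived and the pull completed in 2.161326328s, which works out to roughly 10.2 MB/s. The arithmetic, for reference:

    # Figures from the containerd messages above.
    bytes_read = 22_150_610
    pull_seconds = 2.161326328

    rate = bytes_read / pull_seconds
    print(f"{rate / 1e6:.2f} MB/s ({rate / 2**20:.2f} MiB/s)")
    # 10.25 MB/s (9.77 MiB/s)
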
Aug 19 00:25:23.577537 containerd[1525]: time="2025-08-19T00:25:23.577489019Z" level=info msg="StartContainer for \"5751f38c33febdcca3986047bee34c97a4227b969d6edfaf953e39ba5ea8c1b9\" returns successfully" Aug 19 00:25:24.242839 kubelet[2683]: I0819 00:25:24.242736 2683 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-747864d56d-fhfgv" podStartSLOduration=2.076935641 podStartE2EDuration="4.2427171s" podCreationTimestamp="2025-08-19 00:25:20 +0000 UTC" firstStartedPulling="2025-08-19 00:25:21.314958989 +0000 UTC m=+6.230091511" lastFinishedPulling="2025-08-19 00:25:23.480740448 +0000 UTC m=+8.395872970" observedRunningTime="2025-08-19 00:25:24.241888058 +0000 UTC m=+9.157655062" watchObservedRunningTime="2025-08-19 00:25:24.2427171 +0000 UTC m=+9.157849622" Aug 19 00:25:26.097349 update_engine[1510]: I20250819 00:25:26.097262 1510 update_attempter.cc:509] Updating boot flags... Aug 19 00:25:29.729290 sudo[1739]: pam_unix(sudo:session): session closed for user root Aug 19 00:25:29.732798 sshd[1738]: Connection closed by 10.0.0.1 port 53060 Aug 19 00:25:29.733659 sshd-session[1735]: pam_unix(sshd:session): session closed for user core Aug 19 00:25:29.737681 systemd[1]: sshd@6-10.0.0.116:22-10.0.0.1:53060.service: Deactivated successfully. Aug 19 00:25:29.739622 systemd[1]: session-7.scope: Deactivated successfully. Aug 19 00:25:29.739819 systemd[1]: session-7.scope: Consumed 7.435s CPU time, 226.2M memory peak. Aug 19 00:25:29.741542 systemd-logind[1507]: Session 7 logged out. Waiting for processes to exit. Aug 19 00:25:29.743725 systemd-logind[1507]: Removed session 7. Aug 19 00:25:35.426316 systemd[1]: Created slice kubepods-besteffort-pod6a82c2a5_be9f_4402_927f_e8bb9c0e9ee7.slice - libcontainer container kubepods-besteffort-pod6a82c2a5_be9f_4402_927f_e8bb9c0e9ee7.slice. Aug 19 00:25:35.511462 kubelet[2683]: I0819 00:25:35.511343 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6a82c2a5-be9f-4402-927f-e8bb9c0e9ee7-typha-certs\") pod \"calico-typha-fc65b964b-tvv24\" (UID: \"6a82c2a5-be9f-4402-927f-e8bb9c0e9ee7\") " pod="calico-system/calico-typha-fc65b964b-tvv24" Aug 19 00:25:35.511462 kubelet[2683]: I0819 00:25:35.511410 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a82c2a5-be9f-4402-927f-e8bb9c0e9ee7-tigera-ca-bundle\") pod \"calico-typha-fc65b964b-tvv24\" (UID: \"6a82c2a5-be9f-4402-927f-e8bb9c0e9ee7\") " pod="calico-system/calico-typha-fc65b964b-tvv24" Aug 19 00:25:35.511462 kubelet[2683]: I0819 00:25:35.511432 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxq4q\" (UniqueName: \"kubernetes.io/projected/6a82c2a5-be9f-4402-927f-e8bb9c0e9ee7-kube-api-access-jxq4q\") pod \"calico-typha-fc65b964b-tvv24\" (UID: \"6a82c2a5-be9f-4402-927f-e8bb9c0e9ee7\") " pod="calico-system/calico-typha-fc65b964b-tvv24" Aug 19 00:25:35.746789 containerd[1525]: time="2025-08-19T00:25:35.746660367Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-fc65b964b-tvv24,Uid:6a82c2a5-be9f-4402-927f-e8bb9c0e9ee7,Namespace:calico-system,Attempt:0,}" Aug 19 00:25:35.866247 systemd[1]: Created slice kubepods-besteffort-podc5dab353_577d_4533_9e86_0d67591443f0.slice - libcontainer container kubepods-besteffort-podc5dab353_577d_4533_9e86_0d67591443f0.slice. 
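The startup-latency entries report two durations: podStartE2EDuration is the wall time from pod creation to the observed running state, while podStartSLOduration excludes the image-pull window between firstStartedPulling and lastFinishedPulling. The tigera-operator numbers above reproduce that relationship exactly; a worked check (not kubelet code), with the timestamps reduced to seconds within the 00:25 minute:

    # Values from the pod_startup_latency_tracker entry for tigera-operator above.
    first_started_pulling = 21.314958989
    last_finished_pulling = 23.480740448
    pod_start_e2e         = 4.2427171      # podStartE2EDuration in seconds

    image_pull_window = last_finished_pulling - first_started_pulling
    pod_start_slo = pod_start_e2e - image_pull_window
    print(f"pull window {image_pull_window:.9f}s, SLO duration {pod_start_slo:.9f}s")
    # pull window 2.165781459s, SLO duration 2.076935641s -- matching the
    # podStartSLOduration the log reports
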
Aug 19 00:25:35.873503 containerd[1525]: time="2025-08-19T00:25:35.873437678Z" level=info msg="connecting to shim ff336ae3fdf99e92843fbef28143c41377a5fd353311acc7456670fe70446f4a" address="unix:///run/containerd/s/18fa5f98ba8f817acd3e8080bf32da4e4becec96321c352da149c5a4c7f97a4c" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:25:35.914270 kubelet[2683]: I0819 00:25:35.914222 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/c5dab353-577d-4533-9e86-0d67591443f0-cni-log-dir\") pod \"calico-node-4wbjt\" (UID: \"c5dab353-577d-4533-9e86-0d67591443f0\") " pod="calico-system/calico-node-4wbjt" Aug 19 00:25:35.914270 kubelet[2683]: I0819 00:25:35.914279 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/c5dab353-577d-4533-9e86-0d67591443f0-cni-net-dir\") pod \"calico-node-4wbjt\" (UID: \"c5dab353-577d-4533-9e86-0d67591443f0\") " pod="calico-system/calico-node-4wbjt" Aug 19 00:25:35.914448 kubelet[2683]: I0819 00:25:35.914303 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/c5dab353-577d-4533-9e86-0d67591443f0-cni-bin-dir\") pod \"calico-node-4wbjt\" (UID: \"c5dab353-577d-4533-9e86-0d67591443f0\") " pod="calico-system/calico-node-4wbjt" Aug 19 00:25:35.914448 kubelet[2683]: I0819 00:25:35.914321 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c5dab353-577d-4533-9e86-0d67591443f0-var-lib-calico\") pod \"calico-node-4wbjt\" (UID: \"c5dab353-577d-4533-9e86-0d67591443f0\") " pod="calico-system/calico-node-4wbjt" Aug 19 00:25:35.914448 kubelet[2683]: I0819 00:25:35.914339 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c5dab353-577d-4533-9e86-0d67591443f0-lib-modules\") pod \"calico-node-4wbjt\" (UID: \"c5dab353-577d-4533-9e86-0d67591443f0\") " pod="calico-system/calico-node-4wbjt" Aug 19 00:25:35.914448 kubelet[2683]: I0819 00:25:35.914367 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/c5dab353-577d-4533-9e86-0d67591443f0-var-run-calico\") pod \"calico-node-4wbjt\" (UID: \"c5dab353-577d-4533-9e86-0d67591443f0\") " pod="calico-system/calico-node-4wbjt" Aug 19 00:25:35.914448 kubelet[2683]: I0819 00:25:35.914386 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/c5dab353-577d-4533-9e86-0d67591443f0-node-certs\") pod \"calico-node-4wbjt\" (UID: \"c5dab353-577d-4533-9e86-0d67591443f0\") " pod="calico-system/calico-node-4wbjt" Aug 19 00:25:35.914573 kubelet[2683]: I0819 00:25:35.914477 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c5dab353-577d-4533-9e86-0d67591443f0-xtables-lock\") pod \"calico-node-4wbjt\" (UID: \"c5dab353-577d-4533-9e86-0d67591443f0\") " pod="calico-system/calico-node-4wbjt" Aug 19 00:25:35.914573 kubelet[2683]: I0819 00:25:35.914522 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-x8fg9\" (UniqueName: \"kubernetes.io/projected/c5dab353-577d-4533-9e86-0d67591443f0-kube-api-access-x8fg9\") pod \"calico-node-4wbjt\" (UID: \"c5dab353-577d-4533-9e86-0d67591443f0\") " pod="calico-system/calico-node-4wbjt" Aug 19 00:25:35.914573 kubelet[2683]: I0819 00:25:35.914549 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/c5dab353-577d-4533-9e86-0d67591443f0-policysync\") pod \"calico-node-4wbjt\" (UID: \"c5dab353-577d-4533-9e86-0d67591443f0\") " pod="calico-system/calico-node-4wbjt" Aug 19 00:25:35.914573 kubelet[2683]: I0819 00:25:35.914565 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5dab353-577d-4533-9e86-0d67591443f0-tigera-ca-bundle\") pod \"calico-node-4wbjt\" (UID: \"c5dab353-577d-4533-9e86-0d67591443f0\") " pod="calico-system/calico-node-4wbjt" Aug 19 00:25:35.914677 kubelet[2683]: I0819 00:25:35.914596 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/c5dab353-577d-4533-9e86-0d67591443f0-flexvol-driver-host\") pod \"calico-node-4wbjt\" (UID: \"c5dab353-577d-4533-9e86-0d67591443f0\") " pod="calico-system/calico-node-4wbjt" Aug 19 00:25:35.948465 systemd[1]: Started cri-containerd-ff336ae3fdf99e92843fbef28143c41377a5fd353311acc7456670fe70446f4a.scope - libcontainer container ff336ae3fdf99e92843fbef28143c41377a5fd353311acc7456670fe70446f4a. Aug 19 00:25:36.003698 kubelet[2683]: E0819 00:25:36.002591 2683 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4ccpq" podUID="2c50faac-e631-4180-82e0-eced6cb1cc4a" Aug 19 00:25:36.042841 kubelet[2683]: E0819 00:25:36.042502 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.042841 kubelet[2683]: W0819 00:25:36.042545 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.042841 kubelet[2683]: E0819 00:25:36.042574 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.043335 kubelet[2683]: E0819 00:25:36.043305 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.043335 kubelet[2683]: W0819 00:25:36.043325 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.043335 kubelet[2683]: E0819 00:25:36.043354 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:36.043866 kubelet[2683]: E0819 00:25:36.043645 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.043866 kubelet[2683]: W0819 00:25:36.043660 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.043866 kubelet[2683]: E0819 00:25:36.043671 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.044136 kubelet[2683]: E0819 00:25:36.044112 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.044136 kubelet[2683]: W0819 00:25:36.044132 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.044136 kubelet[2683]: E0819 00:25:36.044145 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.045153 kubelet[2683]: E0819 00:25:36.045119 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.045153 kubelet[2683]: W0819 00:25:36.045145 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.045260 kubelet[2683]: E0819 00:25:36.045168 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.045500 kubelet[2683]: E0819 00:25:36.045478 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.045533 kubelet[2683]: W0819 00:25:36.045496 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.045533 kubelet[2683]: E0819 00:25:36.045524 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.046876 kubelet[2683]: E0819 00:25:36.046839 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.046876 kubelet[2683]: W0819 00:25:36.046866 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.046953 kubelet[2683]: E0819 00:25:36.046886 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:36.047177 kubelet[2683]: E0819 00:25:36.047142 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.047177 kubelet[2683]: W0819 00:25:36.047162 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.047280 kubelet[2683]: E0819 00:25:36.047189 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.072710 containerd[1525]: time="2025-08-19T00:25:36.072618190Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-fc65b964b-tvv24,Uid:6a82c2a5-be9f-4402-927f-e8bb9c0e9ee7,Namespace:calico-system,Attempt:0,} returns sandbox id \"ff336ae3fdf99e92843fbef28143c41377a5fd353311acc7456670fe70446f4a\"" Aug 19 00:25:36.084954 containerd[1525]: time="2025-08-19T00:25:36.084906044Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Aug 19 00:25:36.088420 kubelet[2683]: E0819 00:25:36.088366 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.088420 kubelet[2683]: W0819 00:25:36.088407 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.088593 kubelet[2683]: E0819 00:25:36.088432 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.088670 kubelet[2683]: E0819 00:25:36.088633 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.088707 kubelet[2683]: W0819 00:25:36.088648 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.088707 kubelet[2683]: E0819 00:25:36.088702 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.088860 kubelet[2683]: E0819 00:25:36.088837 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.088860 kubelet[2683]: W0819 00:25:36.088848 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.088860 kubelet[2683]: E0819 00:25:36.088857 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:36.089021 kubelet[2683]: E0819 00:25:36.089000 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.089054 kubelet[2683]: W0819 00:25:36.089026 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.089054 kubelet[2683]: E0819 00:25:36.089035 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.089179 kubelet[2683]: E0819 00:25:36.089168 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.089215 kubelet[2683]: W0819 00:25:36.089178 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.089215 kubelet[2683]: E0819 00:25:36.089187 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.089348 kubelet[2683]: E0819 00:25:36.089331 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.089380 kubelet[2683]: W0819 00:25:36.089341 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.089380 kubelet[2683]: E0819 00:25:36.089359 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.089626 kubelet[2683]: E0819 00:25:36.089601 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.089626 kubelet[2683]: W0819 00:25:36.089615 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.089626 kubelet[2683]: E0819 00:25:36.089625 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.089784 kubelet[2683]: E0819 00:25:36.089773 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.089807 kubelet[2683]: W0819 00:25:36.089785 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.089807 kubelet[2683]: E0819 00:25:36.089793 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:36.089954 kubelet[2683]: E0819 00:25:36.089943 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.089983 kubelet[2683]: W0819 00:25:36.089954 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.089983 kubelet[2683]: E0819 00:25:36.089961 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.090103 kubelet[2683]: E0819 00:25:36.090093 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.090103 kubelet[2683]: W0819 00:25:36.090102 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.090152 kubelet[2683]: E0819 00:25:36.090109 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.090248 kubelet[2683]: E0819 00:25:36.090237 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.090282 kubelet[2683]: W0819 00:25:36.090249 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.090282 kubelet[2683]: E0819 00:25:36.090257 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.090432 kubelet[2683]: E0819 00:25:36.090419 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.090465 kubelet[2683]: W0819 00:25:36.090431 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.090465 kubelet[2683]: E0819 00:25:36.090439 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.090589 kubelet[2683]: E0819 00:25:36.090579 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.090634 kubelet[2683]: W0819 00:25:36.090589 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.090634 kubelet[2683]: E0819 00:25:36.090597 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:36.090720 kubelet[2683]: E0819 00:25:36.090710 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.090748 kubelet[2683]: W0819 00:25:36.090720 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.090748 kubelet[2683]: E0819 00:25:36.090727 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.091043 kubelet[2683]: E0819 00:25:36.091017 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.091043 kubelet[2683]: W0819 00:25:36.091031 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.091043 kubelet[2683]: E0819 00:25:36.091040 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.091195 kubelet[2683]: E0819 00:25:36.091183 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.091195 kubelet[2683]: W0819 00:25:36.091194 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.091274 kubelet[2683]: E0819 00:25:36.091212 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.091365 kubelet[2683]: E0819 00:25:36.091352 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.091393 kubelet[2683]: W0819 00:25:36.091364 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.091393 kubelet[2683]: E0819 00:25:36.091372 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.091510 kubelet[2683]: E0819 00:25:36.091497 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.091534 kubelet[2683]: W0819 00:25:36.091509 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.091534 kubelet[2683]: E0819 00:25:36.091520 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:36.091639 kubelet[2683]: E0819 00:25:36.091629 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.091660 kubelet[2683]: W0819 00:25:36.091640 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.091660 kubelet[2683]: E0819 00:25:36.091648 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.091775 kubelet[2683]: E0819 00:25:36.091765 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.091797 kubelet[2683]: W0819 00:25:36.091774 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.091797 kubelet[2683]: E0819 00:25:36.091781 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.116405 kubelet[2683]: E0819 00:25:36.116301 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.116405 kubelet[2683]: W0819 00:25:36.116398 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.116576 kubelet[2683]: E0819 00:25:36.116424 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.116576 kubelet[2683]: I0819 00:25:36.116456 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2c50faac-e631-4180-82e0-eced6cb1cc4a-kubelet-dir\") pod \"csi-node-driver-4ccpq\" (UID: \"2c50faac-e631-4180-82e0-eced6cb1cc4a\") " pod="calico-system/csi-node-driver-4ccpq" Aug 19 00:25:36.117068 kubelet[2683]: E0819 00:25:36.116639 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.117068 kubelet[2683]: W0819 00:25:36.116656 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.117068 kubelet[2683]: E0819 00:25:36.116665 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:36.117068 kubelet[2683]: I0819 00:25:36.116692 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2c50faac-e631-4180-82e0-eced6cb1cc4a-registration-dir\") pod \"csi-node-driver-4ccpq\" (UID: \"2c50faac-e631-4180-82e0-eced6cb1cc4a\") " pod="calico-system/csi-node-driver-4ccpq" Aug 19 00:25:36.117068 kubelet[2683]: E0819 00:25:36.116974 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.117068 kubelet[2683]: W0819 00:25:36.116986 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.117068 kubelet[2683]: E0819 00:25:36.116996 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.117068 kubelet[2683]: I0819 00:25:36.117020 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2c50faac-e631-4180-82e0-eced6cb1cc4a-socket-dir\") pod \"csi-node-driver-4ccpq\" (UID: \"2c50faac-e631-4180-82e0-eced6cb1cc4a\") " pod="calico-system/csi-node-driver-4ccpq" Aug 19 00:25:36.118381 kubelet[2683]: E0819 00:25:36.118312 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.118381 kubelet[2683]: W0819 00:25:36.118338 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.118381 kubelet[2683]: E0819 00:25:36.118367 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.119387 kubelet[2683]: E0819 00:25:36.119304 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.119387 kubelet[2683]: W0819 00:25:36.119326 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.119387 kubelet[2683]: E0819 00:25:36.119351 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.119686 kubelet[2683]: E0819 00:25:36.119662 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.119686 kubelet[2683]: W0819 00:25:36.119678 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.119686 kubelet[2683]: E0819 00:25:36.119692 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:36.120790 kubelet[2683]: E0819 00:25:36.120323 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.120790 kubelet[2683]: W0819 00:25:36.120463 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.120790 kubelet[2683]: E0819 00:25:36.120491 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.120790 kubelet[2683]: I0819 00:25:36.120548 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/2c50faac-e631-4180-82e0-eced6cb1cc4a-varrun\") pod \"csi-node-driver-4ccpq\" (UID: \"2c50faac-e631-4180-82e0-eced6cb1cc4a\") " pod="calico-system/csi-node-driver-4ccpq" Aug 19 00:25:36.121786 kubelet[2683]: E0819 00:25:36.121745 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.121786 kubelet[2683]: W0819 00:25:36.121769 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.121786 kubelet[2683]: E0819 00:25:36.121789 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.122584 kubelet[2683]: E0819 00:25:36.122558 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.122584 kubelet[2683]: W0819 00:25:36.122580 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.122584 kubelet[2683]: E0819 00:25:36.122605 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.122872 kubelet[2683]: E0819 00:25:36.122853 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.122872 kubelet[2683]: W0819 00:25:36.122870 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.122918 kubelet[2683]: E0819 00:25:36.122881 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:36.122918 kubelet[2683]: I0819 00:25:36.122901 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6pm7\" (UniqueName: \"kubernetes.io/projected/2c50faac-e631-4180-82e0-eced6cb1cc4a-kube-api-access-v6pm7\") pod \"csi-node-driver-4ccpq\" (UID: \"2c50faac-e631-4180-82e0-eced6cb1cc4a\") " pod="calico-system/csi-node-driver-4ccpq" Aug 19 00:25:36.123127 kubelet[2683]: E0819 00:25:36.123108 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.123195 kubelet[2683]: W0819 00:25:36.123128 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.123195 kubelet[2683]: E0819 00:25:36.123142 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.123581 kubelet[2683]: E0819 00:25:36.123563 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.123581 kubelet[2683]: W0819 00:25:36.123580 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.123665 kubelet[2683]: E0819 00:25:36.123594 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.123825 kubelet[2683]: E0819 00:25:36.123813 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.123871 kubelet[2683]: W0819 00:25:36.123825 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.123871 kubelet[2683]: E0819 00:25:36.123836 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.123979 kubelet[2683]: E0819 00:25:36.123965 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.123979 kubelet[2683]: W0819 00:25:36.123976 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.124069 kubelet[2683]: E0819 00:25:36.123986 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:36.124193 kubelet[2683]: E0819 00:25:36.124177 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.124242 kubelet[2683]: W0819 00:25:36.124193 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.124242 kubelet[2683]: E0819 00:25:36.124224 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.172215 containerd[1525]: time="2025-08-19T00:25:36.172148542Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-4wbjt,Uid:c5dab353-577d-4533-9e86-0d67591443f0,Namespace:calico-system,Attempt:0,}" Aug 19 00:25:36.225194 kubelet[2683]: E0819 00:25:36.225145 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.225194 kubelet[2683]: W0819 00:25:36.225176 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.225386 kubelet[2683]: E0819 00:25:36.225276 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.225654 kubelet[2683]: E0819 00:25:36.225625 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.225654 kubelet[2683]: W0819 00:25:36.225643 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.225699 kubelet[2683]: E0819 00:25:36.225656 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.225913 kubelet[2683]: E0819 00:25:36.225897 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.225913 kubelet[2683]: W0819 00:25:36.225912 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.225968 kubelet[2683]: E0819 00:25:36.225924 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:36.226146 kubelet[2683]: E0819 00:25:36.226131 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.226146 kubelet[2683]: W0819 00:25:36.226144 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.226191 kubelet[2683]: E0819 00:25:36.226152 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.226335 kubelet[2683]: E0819 00:25:36.226321 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.226335 kubelet[2683]: W0819 00:25:36.226335 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.226402 kubelet[2683]: E0819 00:25:36.226351 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.226539 kubelet[2683]: E0819 00:25:36.226524 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.226539 kubelet[2683]: W0819 00:25:36.226537 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.226578 kubelet[2683]: E0819 00:25:36.226546 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.226698 kubelet[2683]: E0819 00:25:36.226672 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.226698 kubelet[2683]: W0819 00:25:36.226685 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.226698 kubelet[2683]: E0819 00:25:36.226692 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.226810 kubelet[2683]: E0819 00:25:36.226798 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.226810 kubelet[2683]: W0819 00:25:36.226808 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.226850 kubelet[2683]: E0819 00:25:36.226815 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:36.226964 kubelet[2683]: E0819 00:25:36.226953 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.226964 kubelet[2683]: W0819 00:25:36.226963 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.227015 kubelet[2683]: E0819 00:25:36.226970 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.227623 kubelet[2683]: E0819 00:25:36.227457 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.227623 kubelet[2683]: W0819 00:25:36.227478 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.227623 kubelet[2683]: E0819 00:25:36.227492 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.227728 kubelet[2683]: E0819 00:25:36.227635 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.227728 kubelet[2683]: W0819 00:25:36.227643 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.227728 kubelet[2683]: E0819 00:25:36.227651 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.227840 kubelet[2683]: E0819 00:25:36.227755 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.227840 kubelet[2683]: W0819 00:25:36.227761 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.227840 kubelet[2683]: E0819 00:25:36.227769 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.228427 kubelet[2683]: E0819 00:25:36.228304 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.228427 kubelet[2683]: W0819 00:25:36.228321 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.228427 kubelet[2683]: E0819 00:25:36.228333 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:36.228920 kubelet[2683]: E0819 00:25:36.228763 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.228920 kubelet[2683]: W0819 00:25:36.228780 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.228920 kubelet[2683]: E0819 00:25:36.228792 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.229263 kubelet[2683]: E0819 00:25:36.229235 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.229470 kubelet[2683]: W0819 00:25:36.229323 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.229470 kubelet[2683]: E0819 00:25:36.229354 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.229758 kubelet[2683]: E0819 00:25:36.229603 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.229758 kubelet[2683]: W0819 00:25:36.229618 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.229758 kubelet[2683]: E0819 00:25:36.229629 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.229900 kubelet[2683]: E0819 00:25:36.229887 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.229948 kubelet[2683]: W0819 00:25:36.229938 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.230011 kubelet[2683]: E0819 00:25:36.229998 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.230377 kubelet[2683]: E0819 00:25:36.230261 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.230377 kubelet[2683]: W0819 00:25:36.230275 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.230377 kubelet[2683]: E0819 00:25:36.230286 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:36.230601 kubelet[2683]: E0819 00:25:36.230582 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.230656 kubelet[2683]: W0819 00:25:36.230643 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.230731 kubelet[2683]: E0819 00:25:36.230718 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.231275 kubelet[2683]: E0819 00:25:36.231252 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.232240 kubelet[2683]: W0819 00:25:36.231605 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.232240 kubelet[2683]: E0819 00:25:36.231643 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.235913 kubelet[2683]: E0819 00:25:36.235685 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.235913 kubelet[2683]: W0819 00:25:36.235710 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.235913 kubelet[2683]: E0819 00:25:36.235735 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.236136 kubelet[2683]: E0819 00:25:36.236121 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.236229 kubelet[2683]: W0819 00:25:36.236179 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.237245 kubelet[2683]: E0819 00:25:36.237230 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.237617 kubelet[2683]: E0819 00:25:36.237508 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.237617 kubelet[2683]: W0819 00:25:36.237521 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.237617 kubelet[2683]: E0819 00:25:36.237531 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:36.237786 kubelet[2683]: E0819 00:25:36.237773 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.237975 kubelet[2683]: W0819 00:25:36.237825 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.237975 kubelet[2683]: E0819 00:25:36.237840 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.238090 kubelet[2683]: E0819 00:25:36.238078 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.238138 kubelet[2683]: W0819 00:25:36.238127 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.238212 kubelet[2683]: E0819 00:25:36.238188 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.242431 kubelet[2683]: E0819 00:25:36.242405 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:36.242594 kubelet[2683]: W0819 00:25:36.242531 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:36.242594 kubelet[2683]: E0819 00:25:36.242555 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:36.256263 containerd[1525]: time="2025-08-19T00:25:36.253742993Z" level=info msg="connecting to shim 70503f8034840a0eff2c52a457c803020aed096670b28b86f5650c25f2b5fdc5" address="unix:///run/containerd/s/0a261ec516a7782e61a8e43d6ced28a1a0fdbd2929663ff3a68ec7ae2760de7b" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:25:36.296460 systemd[1]: Started cri-containerd-70503f8034840a0eff2c52a457c803020aed096670b28b86f5650c25f2b5fdc5.scope - libcontainer container 70503f8034840a0eff2c52a457c803020aed096670b28b86f5650c25f2b5fdc5. Aug 19 00:25:36.347247 containerd[1525]: time="2025-08-19T00:25:36.347109337Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-4wbjt,Uid:c5dab353-577d-4533-9e86-0d67591443f0,Namespace:calico-system,Attempt:0,} returns sandbox id \"70503f8034840a0eff2c52a457c803020aed096670b28b86f5650c25f2b5fdc5\"" Aug 19 00:25:37.086076 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1622676408.mount: Deactivated successfully. 
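[Editor's note] The repeated kubelet messages above (and the further burst below at 00:25:38) come from the FlexVolume plugin prober: kubelet executes each driver it finds under the volume-plugin directory with the `init` argument and parses stdout as JSON, so a missing executable at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds yields empty output and the "unexpected end of JSON input" error. The calico-node pod registered above mounts that directory as the flexvol-driver-host volume, and the pod2daemon-flexvol image pulled below is what normally installs the real uds binary, after which these probes stop failing. For illustration only, a minimal hypothetical stand-in (Calico's actual driver is a compiled binary, not this script) showing the init handshake kubelet expects:

```python
#!/usr/bin/env python3
# Minimal stand-in for a FlexVolume driver's "init" handshake (illustration only).
# kubelet runs `<driver> init` and parses stdout as JSON; empty stdout is what
# produces the "unexpected end of JSON input" errors seen in the log above.
import json
import sys

def main() -> int:
    op = sys.argv[1] if len(sys.argv) > 1 else ""
    if op == "init":
        # capabilities.attach=false tells kubelet no attach/detach calls are needed.
        print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
        return 0
    # Any operation this sketch does not implement.
    print(json.dumps({"status": "Not supported",
                      "message": f"operation {op!r} not implemented"}))
    return 1

if __name__ == "__main__":
    sys.exit(main())
```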
Aug 19 00:25:37.976948 containerd[1525]: time="2025-08-19T00:25:37.976878451Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:37.978589 containerd[1525]: time="2025-08-19T00:25:37.978548853Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=33087207" Aug 19 00:25:37.980819 containerd[1525]: time="2025-08-19T00:25:37.980759535Z" level=info msg="ImageCreate event name:\"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:37.983538 containerd[1525]: time="2025-08-19T00:25:37.983492578Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:37.984190 containerd[1525]: time="2025-08-19T00:25:37.984047378Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"33087061\" in 1.899096974s" Aug 19 00:25:37.984190 containerd[1525]: time="2025-08-19T00:25:37.984087978Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\"" Aug 19 00:25:37.985396 containerd[1525]: time="2025-08-19T00:25:37.985343660Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Aug 19 00:25:38.003011 containerd[1525]: time="2025-08-19T00:25:38.002960438Z" level=info msg="CreateContainer within sandbox \"ff336ae3fdf99e92843fbef28143c41377a5fd353311acc7456670fe70446f4a\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Aug 19 00:25:38.019478 containerd[1525]: time="2025-08-19T00:25:38.019415614Z" level=info msg="Container f015d7d3ae1c2ede44cd3acd82c854e8a83a06c85a55665869de74509fe3e3ea: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:25:38.039516 containerd[1525]: time="2025-08-19T00:25:38.039462434Z" level=info msg="CreateContainer within sandbox \"ff336ae3fdf99e92843fbef28143c41377a5fd353311acc7456670fe70446f4a\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"f015d7d3ae1c2ede44cd3acd82c854e8a83a06c85a55665869de74509fe3e3ea\"" Aug 19 00:25:38.040153 containerd[1525]: time="2025-08-19T00:25:38.039990474Z" level=info msg="StartContainer for \"f015d7d3ae1c2ede44cd3acd82c854e8a83a06c85a55665869de74509fe3e3ea\"" Aug 19 00:25:38.041119 containerd[1525]: time="2025-08-19T00:25:38.041091596Z" level=info msg="connecting to shim f015d7d3ae1c2ede44cd3acd82c854e8a83a06c85a55665869de74509fe3e3ea" address="unix:///run/containerd/s/18fa5f98ba8f817acd3e8080bf32da4e4becec96321c352da149c5a4c7f97a4c" protocol=ttrpc version=3 Aug 19 00:25:38.061431 systemd[1]: Started cri-containerd-f015d7d3ae1c2ede44cd3acd82c854e8a83a06c85a55665869de74509fe3e3ea.scope - libcontainer container f015d7d3ae1c2ede44cd3acd82c854e8a83a06c85a55665869de74509fe3e3ea. 
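[Editor's note] containerd reports pulling ghcr.io/flatcar/calico/typha:v3.30.2 "in 1.899096974s". That figure is consistent with the gap between the PullImage request logged at 00:25:36.084906044Z and the Pulled event at 00:25:37.984047378Z; the roughly 44 microsecond difference is presumably log-emission overhead around containerd's internal measurement. A quick recheck of the log-line gap (timestamps truncated to microseconds, since Python's datetime does not carry nanoseconds):

```python
from datetime import datetime, timezone

# RFC 3339 timestamps from the containerd entries, truncated to microsecond precision.
def parse(ts: str) -> datetime:
    return datetime.strptime(ts[:26], "%Y-%m-%dT%H:%M:%S.%f").replace(tzinfo=timezone.utc)

pull_requested = parse("2025-08-19T00:25:36.084906044Z")
pull_finished  = parse("2025-08-19T00:25:37.984047378Z")

gap = (pull_finished - pull_requested).total_seconds()
print(f"log-line gap: {gap:.6f}s")  # ~1.899141s, vs containerd's measured 1.899096974s
```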
Aug 19 00:25:38.116223 containerd[1525]: time="2025-08-19T00:25:38.116175629Z" level=info msg="StartContainer for \"f015d7d3ae1c2ede44cd3acd82c854e8a83a06c85a55665869de74509fe3e3ea\" returns successfully" Aug 19 00:25:38.183447 kubelet[2683]: E0819 00:25:38.183327 2683 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4ccpq" podUID="2c50faac-e631-4180-82e0-eced6cb1cc4a" Aug 19 00:25:38.308140 kubelet[2683]: E0819 00:25:38.308045 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.308140 kubelet[2683]: W0819 00:25:38.308070 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.308140 kubelet[2683]: E0819 00:25:38.308093 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.308314 kubelet[2683]: E0819 00:25:38.308254 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.308314 kubelet[2683]: W0819 00:25:38.308261 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.308381 kubelet[2683]: E0819 00:25:38.308301 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.308735 kubelet[2683]: E0819 00:25:38.308704 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.308735 kubelet[2683]: W0819 00:25:38.308721 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.308735 kubelet[2683]: E0819 00:25:38.308732 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.309728 kubelet[2683]: E0819 00:25:38.309293 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.309728 kubelet[2683]: W0819 00:25:38.309394 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.309728 kubelet[2683]: E0819 00:25:38.309410 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:38.309728 kubelet[2683]: E0819 00:25:38.309606 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.309728 kubelet[2683]: W0819 00:25:38.309615 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.309728 kubelet[2683]: E0819 00:25:38.309624 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.309938 kubelet[2683]: E0819 00:25:38.309803 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.309938 kubelet[2683]: W0819 00:25:38.309812 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.309938 kubelet[2683]: E0819 00:25:38.309819 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.310001 kubelet[2683]: E0819 00:25:38.309945 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.310001 kubelet[2683]: W0819 00:25:38.309955 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.310001 kubelet[2683]: E0819 00:25:38.309962 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.310138 kubelet[2683]: E0819 00:25:38.310123 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.310174 kubelet[2683]: W0819 00:25:38.310133 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.310199 kubelet[2683]: E0819 00:25:38.310176 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.310633 kubelet[2683]: E0819 00:25:38.310616 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.310633 kubelet[2683]: W0819 00:25:38.310630 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.310702 kubelet[2683]: E0819 00:25:38.310642 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:38.310794 kubelet[2683]: E0819 00:25:38.310781 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.310827 kubelet[2683]: W0819 00:25:38.310793 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.310827 kubelet[2683]: E0819 00:25:38.310801 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.311067 kubelet[2683]: E0819 00:25:38.311050 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.311112 kubelet[2683]: W0819 00:25:38.311098 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.311138 kubelet[2683]: E0819 00:25:38.311114 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.311285 kubelet[2683]: E0819 00:25:38.311269 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.311526 kubelet[2683]: W0819 00:25:38.311502 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.311582 kubelet[2683]: E0819 00:25:38.311527 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.312403 kubelet[2683]: E0819 00:25:38.312378 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.312403 kubelet[2683]: W0819 00:25:38.312393 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.312800 kubelet[2683]: E0819 00:25:38.312750 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.313178 kubelet[2683]: E0819 00:25:38.313163 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.313231 kubelet[2683]: W0819 00:25:38.313178 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.313231 kubelet[2683]: E0819 00:25:38.313190 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:38.313624 kubelet[2683]: E0819 00:25:38.313603 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.313624 kubelet[2683]: W0819 00:25:38.313617 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.313624 kubelet[2683]: E0819 00:25:38.313627 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.348090 kubelet[2683]: E0819 00:25:38.348038 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.348090 kubelet[2683]: W0819 00:25:38.348065 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.348090 kubelet[2683]: E0819 00:25:38.348087 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.348411 kubelet[2683]: E0819 00:25:38.348385 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.348411 kubelet[2683]: W0819 00:25:38.348404 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.348484 kubelet[2683]: E0819 00:25:38.348416 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.348833 kubelet[2683]: E0819 00:25:38.348793 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.348833 kubelet[2683]: W0819 00:25:38.348812 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.348833 kubelet[2683]: E0819 00:25:38.348826 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.349733 kubelet[2683]: E0819 00:25:38.349702 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.349733 kubelet[2683]: W0819 00:25:38.349720 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.349733 kubelet[2683]: E0819 00:25:38.349734 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:38.350244 kubelet[2683]: E0819 00:25:38.350035 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.350244 kubelet[2683]: W0819 00:25:38.350228 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.350244 kubelet[2683]: E0819 00:25:38.350245 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.351414 kubelet[2683]: E0819 00:25:38.351383 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.351414 kubelet[2683]: W0819 00:25:38.351400 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.351414 kubelet[2683]: E0819 00:25:38.351413 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.351928 kubelet[2683]: E0819 00:25:38.351889 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.351928 kubelet[2683]: W0819 00:25:38.351911 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.351928 kubelet[2683]: E0819 00:25:38.351924 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.353070 kubelet[2683]: E0819 00:25:38.353039 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.353070 kubelet[2683]: W0819 00:25:38.353058 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.353184 kubelet[2683]: E0819 00:25:38.353091 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.354009 kubelet[2683]: E0819 00:25:38.353971 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.354009 kubelet[2683]: W0819 00:25:38.353992 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.354009 kubelet[2683]: E0819 00:25:38.354005 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:38.354598 kubelet[2683]: E0819 00:25:38.354579 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.354598 kubelet[2683]: W0819 00:25:38.354596 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.354672 kubelet[2683]: E0819 00:25:38.354607 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.355176 kubelet[2683]: E0819 00:25:38.354963 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.355176 kubelet[2683]: W0819 00:25:38.354998 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.355176 kubelet[2683]: E0819 00:25:38.355011 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.357531 kubelet[2683]: E0819 00:25:38.356281 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.357531 kubelet[2683]: W0819 00:25:38.356296 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.357531 kubelet[2683]: E0819 00:25:38.356306 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.357745 kubelet[2683]: E0819 00:25:38.357719 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.357794 kubelet[2683]: W0819 00:25:38.357747 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.357794 kubelet[2683]: E0819 00:25:38.357761 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.359301 kubelet[2683]: E0819 00:25:38.359269 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.359301 kubelet[2683]: W0819 00:25:38.359287 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.359301 kubelet[2683]: E0819 00:25:38.359302 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:38.362317 kubelet[2683]: E0819 00:25:38.362276 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.362317 kubelet[2683]: W0819 00:25:38.362300 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.362317 kubelet[2683]: E0819 00:25:38.362315 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.362656 kubelet[2683]: E0819 00:25:38.362635 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.362792 kubelet[2683]: W0819 00:25:38.362656 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.362792 kubelet[2683]: E0819 00:25:38.362668 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.363092 kubelet[2683]: E0819 00:25:38.363063 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.363092 kubelet[2683]: W0819 00:25:38.363079 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.363092 kubelet[2683]: E0819 00:25:38.363091 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:38.364465 kubelet[2683]: E0819 00:25:38.364431 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:38.364465 kubelet[2683]: W0819 00:25:38.364449 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:38.364465 kubelet[2683]: E0819 00:25:38.364462 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.276628 kubelet[2683]: I0819 00:25:39.276594 2683 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 00:25:39.319696 kubelet[2683]: E0819 00:25:39.319639 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.319696 kubelet[2683]: W0819 00:25:39.319667 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.319696 kubelet[2683]: E0819 00:25:39.319693 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:39.319967 kubelet[2683]: E0819 00:25:39.319884 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.319967 kubelet[2683]: W0819 00:25:39.319893 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.319967 kubelet[2683]: E0819 00:25:39.319904 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.320063 kubelet[2683]: E0819 00:25:39.320057 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.320092 kubelet[2683]: W0819 00:25:39.320064 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.320092 kubelet[2683]: E0819 00:25:39.320073 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.378794 kubelet[2683]: E0819 00:25:39.378719 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.378794 kubelet[2683]: W0819 00:25:39.378771 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.378794 kubelet[2683]: E0819 00:25:39.378795 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.379077 kubelet[2683]: E0819 00:25:39.379024 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.379077 kubelet[2683]: W0819 00:25:39.379033 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.379077 kubelet[2683]: E0819 00:25:39.379041 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.379235 kubelet[2683]: E0819 00:25:39.379219 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.379235 kubelet[2683]: W0819 00:25:39.379230 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.379302 kubelet[2683]: E0819 00:25:39.379240 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:39.379396 kubelet[2683]: E0819 00:25:39.379369 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.379396 kubelet[2683]: W0819 00:25:39.379382 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.379396 kubelet[2683]: E0819 00:25:39.379390 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.379565 kubelet[2683]: E0819 00:25:39.379553 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.379641 kubelet[2683]: W0819 00:25:39.379567 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.379641 kubelet[2683]: E0819 00:25:39.379575 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.379740 kubelet[2683]: E0819 00:25:39.379727 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.379740 kubelet[2683]: W0819 00:25:39.379738 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.379810 kubelet[2683]: E0819 00:25:39.379747 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.379935 kubelet[2683]: E0819 00:25:39.379912 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.379935 kubelet[2683]: W0819 00:25:39.379926 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.379935 kubelet[2683]: E0819 00:25:39.379934 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.380077 kubelet[2683]: E0819 00:25:39.380067 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.380077 kubelet[2683]: W0819 00:25:39.380076 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.380131 kubelet[2683]: E0819 00:25:39.380083 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:39.380228 kubelet[2683]: E0819 00:25:39.380218 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.380266 kubelet[2683]: W0819 00:25:39.380228 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.380266 kubelet[2683]: E0819 00:25:39.380236 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.380444 kubelet[2683]: E0819 00:25:39.380425 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.380444 kubelet[2683]: W0819 00:25:39.380434 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.380539 kubelet[2683]: E0819 00:25:39.380452 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.380695 kubelet[2683]: E0819 00:25:39.380671 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.380695 kubelet[2683]: W0819 00:25:39.380683 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.380695 kubelet[2683]: E0819 00:25:39.380693 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.381040 kubelet[2683]: E0819 00:25:39.381024 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.381040 kubelet[2683]: W0819 00:25:39.381037 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.381123 kubelet[2683]: E0819 00:25:39.381047 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.381379 kubelet[2683]: E0819 00:25:39.381360 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.381412 kubelet[2683]: W0819 00:25:39.381379 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.381412 kubelet[2683]: E0819 00:25:39.381394 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:39.381619 kubelet[2683]: E0819 00:25:39.381605 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.381619 kubelet[2683]: W0819 00:25:39.381615 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.381756 kubelet[2683]: E0819 00:25:39.381624 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.381802 kubelet[2683]: E0819 00:25:39.381792 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.381802 kubelet[2683]: W0819 00:25:39.381800 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.381974 kubelet[2683]: E0819 00:25:39.381808 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.383564 kubelet[2683]: E0819 00:25:39.383542 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.383781 kubelet[2683]: W0819 00:25:39.383653 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.383781 kubelet[2683]: E0819 00:25:39.383674 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.383941 kubelet[2683]: E0819 00:25:39.383925 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.384007 kubelet[2683]: W0819 00:25:39.383996 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.384056 kubelet[2683]: E0819 00:25:39.384046 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.384410 kubelet[2683]: E0819 00:25:39.384283 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.384410 kubelet[2683]: W0819 00:25:39.384297 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.384410 kubelet[2683]: E0819 00:25:39.384307 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:39.384605 kubelet[2683]: E0819 00:25:39.384590 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.384665 kubelet[2683]: W0819 00:25:39.384653 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.384741 kubelet[2683]: E0819 00:25:39.384730 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.384975 kubelet[2683]: E0819 00:25:39.384961 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.385186 kubelet[2683]: W0819 00:25:39.385049 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.385186 kubelet[2683]: E0819 00:25:39.385067 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.385351 kubelet[2683]: E0819 00:25:39.385327 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.385484 kubelet[2683]: W0819 00:25:39.385468 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.385547 kubelet[2683]: E0819 00:25:39.385535 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.385987 kubelet[2683]: E0819 00:25:39.385965 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.385987 kubelet[2683]: W0819 00:25:39.385984 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.386078 kubelet[2683]: E0819 00:25:39.385999 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.386199 kubelet[2683]: E0819 00:25:39.386186 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.386259 kubelet[2683]: W0819 00:25:39.386199 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.386259 kubelet[2683]: E0819 00:25:39.386221 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:39.386439 kubelet[2683]: E0819 00:25:39.386420 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.386439 kubelet[2683]: W0819 00:25:39.386437 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.386537 kubelet[2683]: E0819 00:25:39.386448 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.389410 kubelet[2683]: E0819 00:25:39.389379 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.389410 kubelet[2683]: W0819 00:25:39.389405 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.389553 kubelet[2683]: E0819 00:25:39.389445 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.390190 kubelet[2683]: E0819 00:25:39.390153 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.390190 kubelet[2683]: W0819 00:25:39.390174 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.390317 kubelet[2683]: E0819 00:25:39.390234 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.390668 kubelet[2683]: E0819 00:25:39.390639 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.390747 kubelet[2683]: W0819 00:25:39.390732 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.390808 kubelet[2683]: E0819 00:25:39.390795 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.391108 kubelet[2683]: E0819 00:25:39.391087 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.391236 kubelet[2683]: W0819 00:25:39.391190 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.391301 kubelet[2683]: E0819 00:25:39.391288 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:25:39.391908 kubelet[2683]: E0819 00:25:39.391863 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.391908 kubelet[2683]: W0819 00:25:39.391886 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.391908 kubelet[2683]: E0819 00:25:39.391899 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.392122 kubelet[2683]: E0819 00:25:39.392096 2683 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:25:39.392122 kubelet[2683]: W0819 00:25:39.392108 2683 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:25:39.392122 kubelet[2683]: E0819 00:25:39.392118 2683 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:25:39.413428 containerd[1525]: time="2025-08-19T00:25:39.413372438Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:39.414353 containerd[1525]: time="2025-08-19T00:25:39.414305799Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4266981" Aug 19 00:25:39.419633 containerd[1525]: time="2025-08-19T00:25:39.419296484Z" level=info msg="ImageCreate event name:\"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:39.421853 containerd[1525]: time="2025-08-19T00:25:39.421812806Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:39.422327 containerd[1525]: time="2025-08-19T00:25:39.422295327Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5636182\" in 1.436921707s" Aug 19 00:25:39.422406 containerd[1525]: time="2025-08-19T00:25:39.422330127Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\"" Aug 19 00:25:39.428199 containerd[1525]: time="2025-08-19T00:25:39.428138372Z" level=info msg="CreateContainer within sandbox \"70503f8034840a0eff2c52a457c803020aed096670b28b86f5650c25f2b5fdc5\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Aug 19 00:25:39.440235 containerd[1525]: time="2025-08-19T00:25:39.438485622Z" level=info msg="Container ba23b22c424df42ef04f6bf0748b0d8e2f6e8c3af266273cf9aa35ff1596cd8a: 
CDI devices from CRI Config.CDIDevices: []" Aug 19 00:25:39.448286 containerd[1525]: time="2025-08-19T00:25:39.448231111Z" level=info msg="CreateContainer within sandbox \"70503f8034840a0eff2c52a457c803020aed096670b28b86f5650c25f2b5fdc5\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ba23b22c424df42ef04f6bf0748b0d8e2f6e8c3af266273cf9aa35ff1596cd8a\"" Aug 19 00:25:39.449323 containerd[1525]: time="2025-08-19T00:25:39.449290192Z" level=info msg="StartContainer for \"ba23b22c424df42ef04f6bf0748b0d8e2f6e8c3af266273cf9aa35ff1596cd8a\"" Aug 19 00:25:39.453336 containerd[1525]: time="2025-08-19T00:25:39.453272595Z" level=info msg="connecting to shim ba23b22c424df42ef04f6bf0748b0d8e2f6e8c3af266273cf9aa35ff1596cd8a" address="unix:///run/containerd/s/0a261ec516a7782e61a8e43d6ced28a1a0fdbd2929663ff3a68ec7ae2760de7b" protocol=ttrpc version=3 Aug 19 00:25:39.485437 systemd[1]: Started cri-containerd-ba23b22c424df42ef04f6bf0748b0d8e2f6e8c3af266273cf9aa35ff1596cd8a.scope - libcontainer container ba23b22c424df42ef04f6bf0748b0d8e2f6e8c3af266273cf9aa35ff1596cd8a. Aug 19 00:25:39.538760 containerd[1525]: time="2025-08-19T00:25:39.538514434Z" level=info msg="StartContainer for \"ba23b22c424df42ef04f6bf0748b0d8e2f6e8c3af266273cf9aa35ff1596cd8a\" returns successfully" Aug 19 00:25:39.572236 systemd[1]: cri-containerd-ba23b22c424df42ef04f6bf0748b0d8e2f6e8c3af266273cf9aa35ff1596cd8a.scope: Deactivated successfully. Aug 19 00:25:39.594155 containerd[1525]: time="2025-08-19T00:25:39.593238124Z" level=info msg="received exit event container_id:\"ba23b22c424df42ef04f6bf0748b0d8e2f6e8c3af266273cf9aa35ff1596cd8a\" id:\"ba23b22c424df42ef04f6bf0748b0d8e2f6e8c3af266273cf9aa35ff1596cd8a\" pid:3425 exited_at:{seconds:1755563139 nanos:589342561}" Aug 19 00:25:39.595496 containerd[1525]: time="2025-08-19T00:25:39.595454446Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ba23b22c424df42ef04f6bf0748b0d8e2f6e8c3af266273cf9aa35ff1596cd8a\" id:\"ba23b22c424df42ef04f6bf0748b0d8e2f6e8c3af266273cf9aa35ff1596cd8a\" pid:3425 exited_at:{seconds:1755563139 nanos:589342561}" Aug 19 00:25:39.702684 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ba23b22c424df42ef04f6bf0748b0d8e2f6e8c3af266273cf9aa35ff1596cd8a-rootfs.mount: Deactivated successfully. 
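The repeated driver-call.go and plugins.go errors above are the kubelet probing the FlexVolume plugin directory /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/ before the uds executable exists there; the flexvol-driver container created just above from the pod2daemon-flexvol image is the component that installs that binary, which is why the probe errors are expected to stop once it has run. A FlexVolume driver is simply an executable invoked with a subcommand (init, mount, unmount, ...) that must print a JSON status object to stdout, and "unexpected end of JSON input" is the kubelet failing to parse the empty output of a binary it could not execute. The stub below is a minimal illustration of that calling convention only, written in Python for clarity; it is not Calico's real uds driver.

    #!/usr/bin/env python3
    """Minimal FlexVolume-style stub, for illustration only; it is NOT the real
    'uds' driver that Calico's flexvol-driver container installs. It only shows
    the calling convention behind the kubelet errors above: the kubelet runs
    .../volume/exec/nodeagent~uds/uds init and expects a JSON status on stdout."""
    import json
    import sys

    def main() -> int:
        cmd = sys.argv[1] if len(sys.argv) > 1 else ""
        if cmd == "init":
            # Empty stdout here is exactly what yields "unexpected end of JSON
            # input" in the kubelet log above; a driver must answer with JSON.
            print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
            return 0
        print(json.dumps({"status": "Not supported", "message": f"unsupported call: {cmd}"}))
        return 1

    if __name__ == "__main__":
        sys.exit(main())

An executable answering init like this would satisfy the kubelet's probe, but on this node the real driver is expected to be put in place by the flexvol-driver init container rather than by hand.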
Aug 19 00:25:40.183718 kubelet[2683]: E0819 00:25:40.183299 2683 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4ccpq" podUID="2c50faac-e631-4180-82e0-eced6cb1cc4a" Aug 19 00:25:40.287644 containerd[1525]: time="2025-08-19T00:25:40.287602027Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Aug 19 00:25:40.320321 kubelet[2683]: I0819 00:25:40.320200 2683 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-fc65b964b-tvv24" podStartSLOduration=3.419193158 podStartE2EDuration="5.320176215s" podCreationTimestamp="2025-08-19 00:25:35 +0000 UTC" firstStartedPulling="2025-08-19 00:25:36.084230883 +0000 UTC m=+20.999363405" lastFinishedPulling="2025-08-19 00:25:37.98521394 +0000 UTC m=+22.900346462" observedRunningTime="2025-08-19 00:25:38.295532766 +0000 UTC m=+23.210665288" watchObservedRunningTime="2025-08-19 00:25:40.320176215 +0000 UTC m=+25.235308737" Aug 19 00:25:42.141313 containerd[1525]: time="2025-08-19T00:25:42.141259399Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:42.142178 containerd[1525]: time="2025-08-19T00:25:42.142131120Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=65888320" Aug 19 00:25:42.143040 containerd[1525]: time="2025-08-19T00:25:42.142999801Z" level=info msg="ImageCreate event name:\"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:42.145824 containerd[1525]: time="2025-08-19T00:25:42.145784123Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:42.146462 containerd[1525]: time="2025-08-19T00:25:42.146423563Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"67257561\" in 1.858778296s" Aug 19 00:25:42.146503 containerd[1525]: time="2025-08-19T00:25:42.146481363Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\"" Aug 19 00:25:42.157408 containerd[1525]: time="2025-08-19T00:25:42.157359971Z" level=info msg="CreateContainer within sandbox \"70503f8034840a0eff2c52a457c803020aed096670b28b86f5650c25f2b5fdc5\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Aug 19 00:25:42.177804 containerd[1525]: time="2025-08-19T00:25:42.177469387Z" level=info msg="Container 66fe61917e24f0f0407c789c2ad3d581a260f61759156d0fe1cbb8424cd5709d: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:25:42.182872 kubelet[2683]: E0819 00:25:42.182788 2683 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-4ccpq" podUID="2c50faac-e631-4180-82e0-eced6cb1cc4a" Aug 19 00:25:42.192159 containerd[1525]: time="2025-08-19T00:25:42.192099518Z" level=info msg="CreateContainer within sandbox \"70503f8034840a0eff2c52a457c803020aed096670b28b86f5650c25f2b5fdc5\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"66fe61917e24f0f0407c789c2ad3d581a260f61759156d0fe1cbb8424cd5709d\"" Aug 19 00:25:42.193149 containerd[1525]: time="2025-08-19T00:25:42.193089919Z" level=info msg="StartContainer for \"66fe61917e24f0f0407c789c2ad3d581a260f61759156d0fe1cbb8424cd5709d\"" Aug 19 00:25:42.194710 containerd[1525]: time="2025-08-19T00:25:42.194649440Z" level=info msg="connecting to shim 66fe61917e24f0f0407c789c2ad3d581a260f61759156d0fe1cbb8424cd5709d" address="unix:///run/containerd/s/0a261ec516a7782e61a8e43d6ced28a1a0fdbd2929663ff3a68ec7ae2760de7b" protocol=ttrpc version=3 Aug 19 00:25:42.225439 systemd[1]: Started cri-containerd-66fe61917e24f0f0407c789c2ad3d581a260f61759156d0fe1cbb8424cd5709d.scope - libcontainer container 66fe61917e24f0f0407c789c2ad3d581a260f61759156d0fe1cbb8424cd5709d. Aug 19 00:25:42.275886 containerd[1525]: time="2025-08-19T00:25:42.275833621Z" level=info msg="StartContainer for \"66fe61917e24f0f0407c789c2ad3d581a260f61759156d0fe1cbb8424cd5709d\" returns successfully" Aug 19 00:25:43.030220 containerd[1525]: time="2025-08-19T00:25:43.030114513Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 19 00:25:43.033067 systemd[1]: cri-containerd-66fe61917e24f0f0407c789c2ad3d581a260f61759156d0fe1cbb8424cd5709d.scope: Deactivated successfully. Aug 19 00:25:43.033629 systemd[1]: cri-containerd-66fe61917e24f0f0407c789c2ad3d581a260f61759156d0fe1cbb8424cd5709d.scope: Consumed 573ms CPU time, 179.1M memory peak, 2.1M read from disk, 165.8M written to disk. Aug 19 00:25:43.042729 containerd[1525]: time="2025-08-19T00:25:43.042676201Z" level=info msg="received exit event container_id:\"66fe61917e24f0f0407c789c2ad3d581a260f61759156d0fe1cbb8424cd5709d\" id:\"66fe61917e24f0f0407c789c2ad3d581a260f61759156d0fe1cbb8424cd5709d\" pid:3487 exited_at:{seconds:1755563143 nanos:42392721}" Aug 19 00:25:43.043037 containerd[1525]: time="2025-08-19T00:25:43.042987602Z" level=info msg="TaskExit event in podsandbox handler container_id:\"66fe61917e24f0f0407c789c2ad3d581a260f61759156d0fe1cbb8424cd5709d\" id:\"66fe61917e24f0f0407c789c2ad3d581a260f61759156d0fe1cbb8424cd5709d\" pid:3487 exited_at:{seconds:1755563143 nanos:42392721}" Aug 19 00:25:43.082920 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-66fe61917e24f0f0407c789c2ad3d581a260f61759156d0fe1cbb8424cd5709d-rootfs.mount: Deactivated successfully. Aug 19 00:25:43.131041 kubelet[2683]: I0819 00:25:43.129455 2683 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Aug 19 00:25:43.315121 systemd[1]: Created slice kubepods-burstable-pod5de149ae_7e04_4bbd_9374_54fbeca24b5c.slice - libcontainer container kubepods-burstable-pod5de149ae_7e04_4bbd_9374_54fbeca24b5c.slice. 
Aug 19 00:25:43.329931 containerd[1525]: time="2025-08-19T00:25:43.329888606Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Aug 19 00:25:43.333522 systemd[1]: Created slice kubepods-besteffort-pod02913a4b_73f9_41fe_b20f_d28cc9536400.slice - libcontainer container kubepods-besteffort-pod02913a4b_73f9_41fe_b20f_d28cc9536400.slice. Aug 19 00:25:43.344050 systemd[1]: Created slice kubepods-burstable-podab6c6de2_acbf_4715_972b_2d8064a58b86.slice - libcontainer container kubepods-burstable-podab6c6de2_acbf_4715_972b_2d8064a58b86.slice. Aug 19 00:25:43.353499 systemd[1]: Created slice kubepods-besteffort-podf19009fc_e647_42cf_afbb_1965a26588f6.slice - libcontainer container kubepods-besteffort-podf19009fc_e647_42cf_afbb_1965a26588f6.slice. Aug 19 00:25:43.358879 kubelet[2683]: I0819 00:25:43.358841 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bn4j\" (UniqueName: \"kubernetes.io/projected/ab6c6de2-acbf-4715-972b-2d8064a58b86-kube-api-access-5bn4j\") pod \"coredns-674b8bbfcf-5xxhl\" (UID: \"ab6c6de2-acbf-4715-972b-2d8064a58b86\") " pod="kube-system/coredns-674b8bbfcf-5xxhl" Aug 19 00:25:43.358879 kubelet[2683]: I0819 00:25:43.358885 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvc92\" (UniqueName: \"kubernetes.io/projected/f19009fc-e647-42cf-afbb-1965a26588f6-kube-api-access-lvc92\") pod \"whisker-884474788-gdjqq\" (UID: \"f19009fc-e647-42cf-afbb-1965a26588f6\") " pod="calico-system/whisker-884474788-gdjqq" Aug 19 00:25:43.359288 kubelet[2683]: I0819 00:25:43.358910 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/92768be8-1d70-4e16-8a82-57ab317927cd-calico-apiserver-certs\") pod \"calico-apiserver-6469f8695c-d2lkb\" (UID: \"92768be8-1d70-4e16-8a82-57ab317927cd\") " pod="calico-apiserver/calico-apiserver-6469f8695c-d2lkb" Aug 19 00:25:43.359288 kubelet[2683]: I0819 00:25:43.358928 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mpdt\" (UniqueName: \"kubernetes.io/projected/8f040549-5eba-4e69-b6a1-39fea6377b08-kube-api-access-5mpdt\") pod \"goldmane-768f4c5c69-5lvzf\" (UID: \"8f040549-5eba-4e69-b6a1-39fea6377b08\") " pod="calico-system/goldmane-768f4c5c69-5lvzf" Aug 19 00:25:43.359288 kubelet[2683]: I0819 00:25:43.358986 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dwc4\" (UniqueName: \"kubernetes.io/projected/5de149ae-7e04-4bbd-9374-54fbeca24b5c-kube-api-access-5dwc4\") pod \"coredns-674b8bbfcf-fqnlp\" (UID: \"5de149ae-7e04-4bbd-9374-54fbeca24b5c\") " pod="kube-system/coredns-674b8bbfcf-fqnlp" Aug 19 00:25:43.359288 kubelet[2683]: I0819 00:25:43.359006 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f19009fc-e647-42cf-afbb-1965a26588f6-whisker-ca-bundle\") pod \"whisker-884474788-gdjqq\" (UID: \"f19009fc-e647-42cf-afbb-1965a26588f6\") " pod="calico-system/whisker-884474788-gdjqq" Aug 19 00:25:43.359288 kubelet[2683]: I0819 00:25:43.359024 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02913a4b-73f9-41fe-b20f-d28cc9536400-tigera-ca-bundle\") pod 
\"calico-kube-controllers-6ffc4747bd-f9n7m\" (UID: \"02913a4b-73f9-41fe-b20f-d28cc9536400\") " pod="calico-system/calico-kube-controllers-6ffc4747bd-f9n7m" Aug 19 00:25:43.359466 kubelet[2683]: I0819 00:25:43.359057 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm2zr\" (UniqueName: \"kubernetes.io/projected/02913a4b-73f9-41fe-b20f-d28cc9536400-kube-api-access-fm2zr\") pod \"calico-kube-controllers-6ffc4747bd-f9n7m\" (UID: \"02913a4b-73f9-41fe-b20f-d28cc9536400\") " pod="calico-system/calico-kube-controllers-6ffc4747bd-f9n7m" Aug 19 00:25:43.359466 kubelet[2683]: I0819 00:25:43.359076 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f040549-5eba-4e69-b6a1-39fea6377b08-config\") pod \"goldmane-768f4c5c69-5lvzf\" (UID: \"8f040549-5eba-4e69-b6a1-39fea6377b08\") " pod="calico-system/goldmane-768f4c5c69-5lvzf" Aug 19 00:25:43.359466 kubelet[2683]: I0819 00:25:43.359095 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5de149ae-7e04-4bbd-9374-54fbeca24b5c-config-volume\") pod \"coredns-674b8bbfcf-fqnlp\" (UID: \"5de149ae-7e04-4bbd-9374-54fbeca24b5c\") " pod="kube-system/coredns-674b8bbfcf-fqnlp" Aug 19 00:25:43.359466 kubelet[2683]: I0819 00:25:43.359111 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab6c6de2-acbf-4715-972b-2d8064a58b86-config-volume\") pod \"coredns-674b8bbfcf-5xxhl\" (UID: \"ab6c6de2-acbf-4715-972b-2d8064a58b86\") " pod="kube-system/coredns-674b8bbfcf-5xxhl" Aug 19 00:25:43.359466 kubelet[2683]: I0819 00:25:43.359128 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f040549-5eba-4e69-b6a1-39fea6377b08-goldmane-ca-bundle\") pod \"goldmane-768f4c5c69-5lvzf\" (UID: \"8f040549-5eba-4e69-b6a1-39fea6377b08\") " pod="calico-system/goldmane-768f4c5c69-5lvzf" Aug 19 00:25:43.359601 kubelet[2683]: I0819 00:25:43.359157 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f19009fc-e647-42cf-afbb-1965a26588f6-whisker-backend-key-pair\") pod \"whisker-884474788-gdjqq\" (UID: \"f19009fc-e647-42cf-afbb-1965a26588f6\") " pod="calico-system/whisker-884474788-gdjqq" Aug 19 00:25:43.359601 kubelet[2683]: I0819 00:25:43.359173 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vspxx\" (UniqueName: \"kubernetes.io/projected/92768be8-1d70-4e16-8a82-57ab317927cd-kube-api-access-vspxx\") pod \"calico-apiserver-6469f8695c-d2lkb\" (UID: \"92768be8-1d70-4e16-8a82-57ab317927cd\") " pod="calico-apiserver/calico-apiserver-6469f8695c-d2lkb" Aug 19 00:25:43.359601 kubelet[2683]: I0819 00:25:43.359198 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/8f040549-5eba-4e69-b6a1-39fea6377b08-goldmane-key-pair\") pod \"goldmane-768f4c5c69-5lvzf\" (UID: \"8f040549-5eba-4e69-b6a1-39fea6377b08\") " pod="calico-system/goldmane-768f4c5c69-5lvzf" Aug 19 00:25:43.366254 systemd[1]: Created slice 
kubepods-besteffort-pod92768be8_1d70_4e16_8a82_57ab317927cd.slice - libcontainer container kubepods-besteffort-pod92768be8_1d70_4e16_8a82_57ab317927cd.slice. Aug 19 00:25:43.391549 systemd[1]: Created slice kubepods-besteffort-pod8f040549_5eba_4e69_b6a1_39fea6377b08.slice - libcontainer container kubepods-besteffort-pod8f040549_5eba_4e69_b6a1_39fea6377b08.slice. Aug 19 00:25:43.398828 systemd[1]: Created slice kubepods-besteffort-podcae57906_475d_4518_b0c5_5f0aecff80c9.slice - libcontainer container kubepods-besteffort-podcae57906_475d_4518_b0c5_5f0aecff80c9.slice. Aug 19 00:25:43.460423 kubelet[2683]: I0819 00:25:43.460364 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/cae57906-475d-4518-b0c5-5f0aecff80c9-calico-apiserver-certs\") pod \"calico-apiserver-6469f8695c-tr54d\" (UID: \"cae57906-475d-4518-b0c5-5f0aecff80c9\") " pod="calico-apiserver/calico-apiserver-6469f8695c-tr54d" Aug 19 00:25:43.460609 kubelet[2683]: I0819 00:25:43.460498 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9c4c\" (UniqueName: \"kubernetes.io/projected/cae57906-475d-4518-b0c5-5f0aecff80c9-kube-api-access-j9c4c\") pod \"calico-apiserver-6469f8695c-tr54d\" (UID: \"cae57906-475d-4518-b0c5-5f0aecff80c9\") " pod="calico-apiserver/calico-apiserver-6469f8695c-tr54d" Aug 19 00:25:43.622917 containerd[1525]: time="2025-08-19T00:25:43.622811214Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fqnlp,Uid:5de149ae-7e04-4bbd-9374-54fbeca24b5c,Namespace:kube-system,Attempt:0,}" Aug 19 00:25:43.639814 containerd[1525]: time="2025-08-19T00:25:43.639764466Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6ffc4747bd-f9n7m,Uid:02913a4b-73f9-41fe-b20f-d28cc9536400,Namespace:calico-system,Attempt:0,}" Aug 19 00:25:43.659027 containerd[1525]: time="2025-08-19T00:25:43.658969960Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5xxhl,Uid:ab6c6de2-acbf-4715-972b-2d8064a58b86,Namespace:kube-system,Attempt:0,}" Aug 19 00:25:43.659455 containerd[1525]: time="2025-08-19T00:25:43.659415320Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-884474788-gdjqq,Uid:f19009fc-e647-42cf-afbb-1965a26588f6,Namespace:calico-system,Attempt:0,}" Aug 19 00:25:43.698778 containerd[1525]: time="2025-08-19T00:25:43.679947415Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6469f8695c-d2lkb,Uid:92768be8-1d70-4e16-8a82-57ab317927cd,Namespace:calico-apiserver,Attempt:0,}" Aug 19 00:25:43.765247 containerd[1525]: time="2025-08-19T00:25:43.705787433Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-5lvzf,Uid:8f040549-5eba-4e69-b6a1-39fea6377b08,Namespace:calico-system,Attempt:0,}" Aug 19 00:25:43.765247 containerd[1525]: time="2025-08-19T00:25:43.722159525Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6469f8695c-tr54d,Uid:cae57906-475d-4518-b0c5-5f0aecff80c9,Namespace:calico-apiserver,Attempt:0,}" Aug 19 00:25:44.390622 systemd[1]: Created slice kubepods-besteffort-pod2c50faac_e631_4180_82e0_eced6cb1cc4a.slice - libcontainer container kubepods-besteffort-pod2c50faac_e631_4180_82e0_eced6cb1cc4a.slice. 
Aug 19 00:25:44.400196 containerd[1525]: time="2025-08-19T00:25:44.399785309Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4ccpq,Uid:2c50faac-e631-4180-82e0-eced6cb1cc4a,Namespace:calico-system,Attempt:0,}" Aug 19 00:25:46.326909 containerd[1525]: time="2025-08-19T00:25:46.326849527Z" level=error msg="Failed to destroy network for sandbox \"95c6653e33c95245691289f307731dbf032ef872ed5460d268cb09001ac45a7c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.329524 systemd[1]: run-netns-cni\x2d08e08ab2\x2de745\x2d5a08\x2d6414\x2d7054259b1f4c.mount: Deactivated successfully. Aug 19 00:25:46.332512 containerd[1525]: time="2025-08-19T00:25:46.332444330Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-884474788-gdjqq,Uid:f19009fc-e647-42cf-afbb-1965a26588f6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"95c6653e33c95245691289f307731dbf032ef872ed5460d268cb09001ac45a7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.334705 containerd[1525]: time="2025-08-19T00:25:46.334658772Z" level=error msg="Failed to destroy network for sandbox \"0f0be3b90ee7bd2cc2bb96df8ab87c3352a55839e807917bcfad22352f3b3edc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.336932 systemd[1]: run-netns-cni\x2d61d2568a\x2d65a9\x2d9297\x2df12a\x2ddc9c0d8ac27e.mount: Deactivated successfully. Aug 19 00:25:46.342393 containerd[1525]: time="2025-08-19T00:25:46.338126054Z" level=error msg="Failed to destroy network for sandbox \"82ba668aa0691bab44f9cee606d525c8e87cad8a572464922229abbe836ace2e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.339996 systemd[1]: run-netns-cni\x2d7f6333e2\x2dcf58\x2dc22e\x2d75ab\x2dc0f6c18931f2.mount: Deactivated successfully. 
Aug 19 00:25:46.342891 containerd[1525]: time="2025-08-19T00:25:46.342503376Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6ffc4747bd-f9n7m,Uid:02913a4b-73f9-41fe-b20f-d28cc9536400,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f0be3b90ee7bd2cc2bb96df8ab87c3352a55839e807917bcfad22352f3b3edc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.342976 kubelet[2683]: E0819 00:25:46.342774 2683 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f0be3b90ee7bd2cc2bb96df8ab87c3352a55839e807917bcfad22352f3b3edc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.342976 kubelet[2683]: E0819 00:25:46.342859 2683 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f0be3b90ee7bd2cc2bb96df8ab87c3352a55839e807917bcfad22352f3b3edc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6ffc4747bd-f9n7m" Aug 19 00:25:46.342976 kubelet[2683]: E0819 00:25:46.342885 2683 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f0be3b90ee7bd2cc2bb96df8ab87c3352a55839e807917bcfad22352f3b3edc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6ffc4747bd-f9n7m" Aug 19 00:25:46.343471 kubelet[2683]: E0819 00:25:46.342962 2683 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6ffc4747bd-f9n7m_calico-system(02913a4b-73f9-41fe-b20f-d28cc9536400)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6ffc4747bd-f9n7m_calico-system(02913a4b-73f9-41fe-b20f-d28cc9536400)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0f0be3b90ee7bd2cc2bb96df8ab87c3352a55839e807917bcfad22352f3b3edc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6ffc4747bd-f9n7m" podUID="02913a4b-73f9-41fe-b20f-d28cc9536400" Aug 19 00:25:46.345158 kubelet[2683]: E0819 00:25:46.345101 2683 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95c6653e33c95245691289f307731dbf032ef872ed5460d268cb09001ac45a7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.345376 kubelet[2683]: E0819 00:25:46.345353 2683 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"95c6653e33c95245691289f307731dbf032ef872ed5460d268cb09001ac45a7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-884474788-gdjqq" Aug 19 00:25:46.345486 kubelet[2683]: E0819 00:25:46.345467 2683 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95c6653e33c95245691289f307731dbf032ef872ed5460d268cb09001ac45a7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-884474788-gdjqq" Aug 19 00:25:46.345687 kubelet[2683]: E0819 00:25:46.345657 2683 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-884474788-gdjqq_calico-system(f19009fc-e647-42cf-afbb-1965a26588f6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-884474788-gdjqq_calico-system(f19009fc-e647-42cf-afbb-1965a26588f6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"95c6653e33c95245691289f307731dbf032ef872ed5460d268cb09001ac45a7c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-884474788-gdjqq" podUID="f19009fc-e647-42cf-afbb-1965a26588f6" Aug 19 00:25:46.348602 containerd[1525]: time="2025-08-19T00:25:46.348343220Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5xxhl,Uid:ab6c6de2-acbf-4715-972b-2d8064a58b86,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"82ba668aa0691bab44f9cee606d525c8e87cad8a572464922229abbe836ace2e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.348769 kubelet[2683]: E0819 00:25:46.348636 2683 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"82ba668aa0691bab44f9cee606d525c8e87cad8a572464922229abbe836ace2e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.348769 kubelet[2683]: E0819 00:25:46.348703 2683 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"82ba668aa0691bab44f9cee606d525c8e87cad8a572464922229abbe836ace2e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-5xxhl" Aug 19 00:25:46.348769 kubelet[2683]: E0819 00:25:46.348725 2683 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"82ba668aa0691bab44f9cee606d525c8e87cad8a572464922229abbe836ace2e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-674b8bbfcf-5xxhl" Aug 19 00:25:46.348861 kubelet[2683]: E0819 00:25:46.348786 2683 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-5xxhl_kube-system(ab6c6de2-acbf-4715-972b-2d8064a58b86)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-5xxhl_kube-system(ab6c6de2-acbf-4715-972b-2d8064a58b86)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"82ba668aa0691bab44f9cee606d525c8e87cad8a572464922229abbe836ace2e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-5xxhl" podUID="ab6c6de2-acbf-4715-972b-2d8064a58b86" Aug 19 00:25:46.364698 containerd[1525]: time="2025-08-19T00:25:46.362222348Z" level=error msg="Failed to destroy network for sandbox \"2a02200f3c5ee7e2ed432b09359ee9228c4176417cdcacd2faba79528ebbb594\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.364698 containerd[1525]: time="2025-08-19T00:25:46.364537349Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fqnlp,Uid:5de149ae-7e04-4bbd-9374-54fbeca24b5c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2a02200f3c5ee7e2ed432b09359ee9228c4176417cdcacd2faba79528ebbb594\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.367512 kubelet[2683]: E0819 00:25:46.365327 2683 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2a02200f3c5ee7e2ed432b09359ee9228c4176417cdcacd2faba79528ebbb594\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.367512 kubelet[2683]: E0819 00:25:46.365403 2683 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2a02200f3c5ee7e2ed432b09359ee9228c4176417cdcacd2faba79528ebbb594\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-fqnlp" Aug 19 00:25:46.367512 kubelet[2683]: E0819 00:25:46.365429 2683 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2a02200f3c5ee7e2ed432b09359ee9228c4176417cdcacd2faba79528ebbb594\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-fqnlp" Aug 19 00:25:46.366375 systemd[1]: run-netns-cni\x2da5a38e4c\x2dd033\x2d2d89\x2db0d1\x2dd9a96d62f949.mount: Deactivated successfully. 
Aug 19 00:25:46.367865 kubelet[2683]: E0819 00:25:46.365494 2683 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-fqnlp_kube-system(5de149ae-7e04-4bbd-9374-54fbeca24b5c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-fqnlp_kube-system(5de149ae-7e04-4bbd-9374-54fbeca24b5c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2a02200f3c5ee7e2ed432b09359ee9228c4176417cdcacd2faba79528ebbb594\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-fqnlp" podUID="5de149ae-7e04-4bbd-9374-54fbeca24b5c" Aug 19 00:25:46.368125 containerd[1525]: time="2025-08-19T00:25:46.368084991Z" level=error msg="Failed to destroy network for sandbox \"5eb08a202f1177c6bb9f8abcebaf10752bd60ca5bf1dba4ea0e6d6cf768a48cb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.370231 systemd[1]: run-netns-cni\x2d92b4418e\x2d34ee\x2d8301\x2d4c49\x2df1f3dd7ce614.mount: Deactivated successfully. Aug 19 00:25:46.376753 containerd[1525]: time="2025-08-19T00:25:46.376691276Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-5lvzf,Uid:8f040549-5eba-4e69-b6a1-39fea6377b08,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5eb08a202f1177c6bb9f8abcebaf10752bd60ca5bf1dba4ea0e6d6cf768a48cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.377653 kubelet[2683]: E0819 00:25:46.377346 2683 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5eb08a202f1177c6bb9f8abcebaf10752bd60ca5bf1dba4ea0e6d6cf768a48cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.377653 kubelet[2683]: E0819 00:25:46.377415 2683 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5eb08a202f1177c6bb9f8abcebaf10752bd60ca5bf1dba4ea0e6d6cf768a48cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-5lvzf" Aug 19 00:25:46.377653 kubelet[2683]: E0819 00:25:46.377439 2683 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5eb08a202f1177c6bb9f8abcebaf10752bd60ca5bf1dba4ea0e6d6cf768a48cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-5lvzf" Aug 19 00:25:46.377843 kubelet[2683]: E0819 00:25:46.377492 2683 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"goldmane-768f4c5c69-5lvzf_calico-system(8f040549-5eba-4e69-b6a1-39fea6377b08)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-5lvzf_calico-system(8f040549-5eba-4e69-b6a1-39fea6377b08)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5eb08a202f1177c6bb9f8abcebaf10752bd60ca5bf1dba4ea0e6d6cf768a48cb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-5lvzf" podUID="8f040549-5eba-4e69-b6a1-39fea6377b08" Aug 19 00:25:46.381423 containerd[1525]: time="2025-08-19T00:25:46.381374759Z" level=error msg="Failed to destroy network for sandbox \"8789a20b8e014196823a012ce5244e7c63289f515ac5cc94c2d0d7196281b6e6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.383799 containerd[1525]: time="2025-08-19T00:25:46.383733000Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6469f8695c-tr54d,Uid:cae57906-475d-4518-b0c5-5f0aecff80c9,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8789a20b8e014196823a012ce5244e7c63289f515ac5cc94c2d0d7196281b6e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.384323 kubelet[2683]: E0819 00:25:46.383963 2683 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8789a20b8e014196823a012ce5244e7c63289f515ac5cc94c2d0d7196281b6e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.384323 kubelet[2683]: E0819 00:25:46.384021 2683 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8789a20b8e014196823a012ce5244e7c63289f515ac5cc94c2d0d7196281b6e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6469f8695c-tr54d" Aug 19 00:25:46.384323 kubelet[2683]: E0819 00:25:46.384042 2683 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8789a20b8e014196823a012ce5244e7c63289f515ac5cc94c2d0d7196281b6e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6469f8695c-tr54d" Aug 19 00:25:46.384499 kubelet[2683]: E0819 00:25:46.384092 2683 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6469f8695c-tr54d_calico-apiserver(cae57906-475d-4518-b0c5-5f0aecff80c9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6469f8695c-tr54d_calico-apiserver(cae57906-475d-4518-b0c5-5f0aecff80c9)\\\": rpc error: code = Unknown desc = failed to setup network for 
sandbox \\\"8789a20b8e014196823a012ce5244e7c63289f515ac5cc94c2d0d7196281b6e6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6469f8695c-tr54d" podUID="cae57906-475d-4518-b0c5-5f0aecff80c9" Aug 19 00:25:46.390482 containerd[1525]: time="2025-08-19T00:25:46.390396324Z" level=error msg="Failed to destroy network for sandbox \"59f6a1de5b2de206421b4acb99891763d17dcec3e8318eeaac05e2aa2fe5e8d2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.392691 containerd[1525]: time="2025-08-19T00:25:46.392634686Z" level=error msg="Failed to destroy network for sandbox \"9e4aac72778d91c4d82318104a04675ff3b58e53abbea54040d129e608fa5cd6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.416005 containerd[1525]: time="2025-08-19T00:25:46.415319699Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6469f8695c-d2lkb,Uid:92768be8-1d70-4e16-8a82-57ab317927cd,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e4aac72778d91c4d82318104a04675ff3b58e53abbea54040d129e608fa5cd6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.416181 kubelet[2683]: E0819 00:25:46.415569 2683 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e4aac72778d91c4d82318104a04675ff3b58e53abbea54040d129e608fa5cd6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.416181 kubelet[2683]: E0819 00:25:46.415629 2683 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e4aac72778d91c4d82318104a04675ff3b58e53abbea54040d129e608fa5cd6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6469f8695c-d2lkb" Aug 19 00:25:46.416181 kubelet[2683]: E0819 00:25:46.415649 2683 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e4aac72778d91c4d82318104a04675ff3b58e53abbea54040d129e608fa5cd6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6469f8695c-d2lkb" Aug 19 00:25:46.416301 kubelet[2683]: E0819 00:25:46.415767 2683 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6469f8695c-d2lkb_calico-apiserver(92768be8-1d70-4e16-8a82-57ab317927cd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-6469f8695c-d2lkb_calico-apiserver(92768be8-1d70-4e16-8a82-57ab317927cd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9e4aac72778d91c4d82318104a04675ff3b58e53abbea54040d129e608fa5cd6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6469f8695c-d2lkb" podUID="92768be8-1d70-4e16-8a82-57ab317927cd" Aug 19 00:25:46.427025 containerd[1525]: time="2025-08-19T00:25:46.421827703Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4ccpq,Uid:2c50faac-e631-4180-82e0-eced6cb1cc4a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"59f6a1de5b2de206421b4acb99891763d17dcec3e8318eeaac05e2aa2fe5e8d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.427270 kubelet[2683]: E0819 00:25:46.427222 2683 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"59f6a1de5b2de206421b4acb99891763d17dcec3e8318eeaac05e2aa2fe5e8d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:25:46.427359 kubelet[2683]: E0819 00:25:46.427288 2683 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"59f6a1de5b2de206421b4acb99891763d17dcec3e8318eeaac05e2aa2fe5e8d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-4ccpq" Aug 19 00:25:46.427359 kubelet[2683]: E0819 00:25:46.427315 2683 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"59f6a1de5b2de206421b4acb99891763d17dcec3e8318eeaac05e2aa2fe5e8d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-4ccpq" Aug 19 00:25:46.427413 kubelet[2683]: E0819 00:25:46.427385 2683 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-4ccpq_calico-system(2c50faac-e631-4180-82e0-eced6cb1cc4a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-4ccpq_calico-system(2c50faac-e631-4180-82e0-eced6cb1cc4a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"59f6a1de5b2de206421b4acb99891763d17dcec3e8318eeaac05e2aa2fe5e8d2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-4ccpq" podUID="2c50faac-e631-4180-82e0-eced6cb1cc4a" Aug 19 00:25:47.330011 systemd[1]: run-netns-cni\x2d8d57748b\x2da189\x2d4447\x2d0ef7\x2d8c3a5e80db06.mount: Deactivated successfully. 
Aug 19 00:25:47.330101 systemd[1]: run-netns-cni\x2de0f3e534\x2d14a0\x2d4485\x2daa54\x2d91a677d41b92.mount: Deactivated successfully. Aug 19 00:25:47.330145 systemd[1]: run-netns-cni\x2d83c1da5a\x2dc120\x2d5649\x2d853e\x2d675273ab3f44.mount: Deactivated successfully. Aug 19 00:25:48.826462 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1533369672.mount: Deactivated successfully. Aug 19 00:25:49.138452 containerd[1525]: time="2025-08-19T00:25:49.130242690Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=152544909" Aug 19 00:25:49.139072 containerd[1525]: time="2025-08-19T00:25:49.139013134Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:49.139719 containerd[1525]: time="2025-08-19T00:25:49.139683174Z" level=info msg="ImageCreate event name:\"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:49.140681 containerd[1525]: time="2025-08-19T00:25:49.140624015Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:49.141240 containerd[1525]: time="2025-08-19T00:25:49.141171895Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"152544771\" in 5.811241849s" Aug 19 00:25:49.141240 containerd[1525]: time="2025-08-19T00:25:49.141232975Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\"" Aug 19 00:25:49.185327 containerd[1525]: time="2025-08-19T00:25:49.184363316Z" level=info msg="CreateContainer within sandbox \"70503f8034840a0eff2c52a457c803020aed096670b28b86f5650c25f2b5fdc5\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Aug 19 00:25:49.210315 containerd[1525]: time="2025-08-19T00:25:49.210237168Z" level=info msg="Container 844bca4e08f7270219c02b3a41d2fef1fb6de30dcd49babbe087d3510850d608: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:25:49.232981 containerd[1525]: time="2025-08-19T00:25:49.232926539Z" level=info msg="CreateContainer within sandbox \"70503f8034840a0eff2c52a457c803020aed096670b28b86f5650c25f2b5fdc5\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"844bca4e08f7270219c02b3a41d2fef1fb6de30dcd49babbe087d3510850d608\"" Aug 19 00:25:49.235281 containerd[1525]: time="2025-08-19T00:25:49.235246540Z" level=info msg="StartContainer for \"844bca4e08f7270219c02b3a41d2fef1fb6de30dcd49babbe087d3510850d608\"" Aug 19 00:25:49.237051 containerd[1525]: time="2025-08-19T00:25:49.236949541Z" level=info msg="connecting to shim 844bca4e08f7270219c02b3a41d2fef1fb6de30dcd49babbe087d3510850d608" address="unix:///run/containerd/s/0a261ec516a7782e61a8e43d6ced28a1a0fdbd2929663ff3a68ec7ae2760de7b" protocol=ttrpc version=3 Aug 19 00:25:49.290423 systemd[1]: Started cri-containerd-844bca4e08f7270219c02b3a41d2fef1fb6de30dcd49babbe087d3510850d608.scope - libcontainer container 844bca4e08f7270219c02b3a41d2fef1fb6de30dcd49babbe087d3510850d608. 
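Editor's note on the cleanup entries above: unit names such as run-netns-cni\x2d08e08ab2\x2d….mount and var-lib-containerd-tmpmounts-containerd\x2dmount1533369672.mount are systemd's escaped form of the mounted path, with '/' turned into '-' and unsafe characters (including every literal '-') rendered as \xNN byte escapes. A rough re-implementation is sketched below; the authoritative tool is systemd-escape(1), and the exact set of characters kept verbatim here is an approximation inferred from the names in the log.

```python
r"""Rough sketch of systemd's path-to-unit-name escaping, to explain names
like run-netns-cni\x2d08e08ab2\x2d....mount above. Simplified; the kept
character set is an assumption based on the observed unit names."""

SAFE = set("abcdefghijklmnopqrstuvwxyz"
           "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
           "0123456789_.")

def escape_path(path: str, suffix: str = ".mount") -> str:
    """Turn a mount point such as /run/netns/cni-08e0... into a unit name."""
    out = []
    for ch in path.strip("/"):
        if ch == "/":
            out.append("-")                   # path separators become dashes
        elif ch in SAFE:
            out.append(ch)
        else:
            out.append("\\x%02x" % ord(ch))   # e.g. a literal '-' becomes \x2d
    return "".join(out) + suffix

print(escape_path("/run/netns/cni-08e08ab2-e745-5a08-6414-7054259b1f4c"))
# -> run-netns-cni\x2d08e08ab2\x2de745\x2d5a08\x2d6414\x2d7054259b1f4c.mount
```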
Aug 19 00:25:49.342837 containerd[1525]: time="2025-08-19T00:25:49.342764352Z" level=info msg="StartContainer for \"844bca4e08f7270219c02b3a41d2fef1fb6de30dcd49babbe087d3510850d608\" returns successfully" Aug 19 00:25:49.474775 kubelet[2683]: I0819 00:25:49.474682 2683 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-4wbjt" podStartSLOduration=1.674939177 podStartE2EDuration="14.474664856s" podCreationTimestamp="2025-08-19 00:25:35 +0000 UTC" firstStartedPulling="2025-08-19 00:25:36.3496215 +0000 UTC m=+21.264754022" lastFinishedPulling="2025-08-19 00:25:49.149347179 +0000 UTC m=+34.064479701" observedRunningTime="2025-08-19 00:25:49.473453415 +0000 UTC m=+34.388585937" watchObservedRunningTime="2025-08-19 00:25:49.474664856 +0000 UTC m=+34.389797378" Aug 19 00:25:49.599101 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Aug 19 00:25:49.599336 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Aug 19 00:25:49.814512 kubelet[2683]: I0819 00:25:49.814388 2683 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f19009fc-e647-42cf-afbb-1965a26588f6-whisker-backend-key-pair\") pod \"f19009fc-e647-42cf-afbb-1965a26588f6\" (UID: \"f19009fc-e647-42cf-afbb-1965a26588f6\") " Aug 19 00:25:49.814512 kubelet[2683]: I0819 00:25:49.814478 2683 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f19009fc-e647-42cf-afbb-1965a26588f6-whisker-ca-bundle\") pod \"f19009fc-e647-42cf-afbb-1965a26588f6\" (UID: \"f19009fc-e647-42cf-afbb-1965a26588f6\") " Aug 19 00:25:49.814512 kubelet[2683]: I0819 00:25:49.814506 2683 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvc92\" (UniqueName: \"kubernetes.io/projected/f19009fc-e647-42cf-afbb-1965a26588f6-kube-api-access-lvc92\") pod \"f19009fc-e647-42cf-afbb-1965a26588f6\" (UID: \"f19009fc-e647-42cf-afbb-1965a26588f6\") " Aug 19 00:25:49.833430 systemd[1]: var-lib-kubelet-pods-f19009fc\x2de647\x2d42cf\x2dafbb\x2d1965a26588f6-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dlvc92.mount: Deactivated successfully. Aug 19 00:25:49.833535 systemd[1]: var-lib-kubelet-pods-f19009fc\x2de647\x2d42cf\x2dafbb\x2d1965a26588f6-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Aug 19 00:25:49.841541 kubelet[2683]: I0819 00:25:49.841481 2683 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f19009fc-e647-42cf-afbb-1965a26588f6-kube-api-access-lvc92" (OuterVolumeSpecName: "kube-api-access-lvc92") pod "f19009fc-e647-42cf-afbb-1965a26588f6" (UID: "f19009fc-e647-42cf-afbb-1965a26588f6"). InnerVolumeSpecName "kube-api-access-lvc92". PluginName "kubernetes.io/projected", VolumeGIDValue "" Aug 19 00:25:49.841541 kubelet[2683]: I0819 00:25:49.841483 2683 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f19009fc-e647-42cf-afbb-1965a26588f6-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "f19009fc-e647-42cf-afbb-1965a26588f6" (UID: "f19009fc-e647-42cf-afbb-1965a26588f6"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Aug 19 00:25:49.842684 kubelet[2683]: I0819 00:25:49.842647 2683 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f19009fc-e647-42cf-afbb-1965a26588f6-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "f19009fc-e647-42cf-afbb-1965a26588f6" (UID: "f19009fc-e647-42cf-afbb-1965a26588f6"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Aug 19 00:25:49.915706 kubelet[2683]: I0819 00:25:49.915622 2683 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f19009fc-e647-42cf-afbb-1965a26588f6-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Aug 19 00:25:49.915706 kubelet[2683]: I0819 00:25:49.915668 2683 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f19009fc-e647-42cf-afbb-1965a26588f6-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Aug 19 00:25:49.915706 kubelet[2683]: I0819 00:25:49.915679 2683 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lvc92\" (UniqueName: \"kubernetes.io/projected/f19009fc-e647-42cf-afbb-1965a26588f6-kube-api-access-lvc92\") on node \"localhost\" DevicePath \"\"" Aug 19 00:25:50.412768 kubelet[2683]: I0819 00:25:50.412713 2683 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 00:25:50.417582 systemd[1]: Removed slice kubepods-besteffort-podf19009fc_e647_42cf_afbb_1965a26588f6.slice - libcontainer container kubepods-besteffort-podf19009fc_e647_42cf_afbb_1965a26588f6.slice. Aug 19 00:25:50.521003 systemd[1]: Created slice kubepods-besteffort-pod8ac52e09_354e_4423_a27b_aa5d584b9e8a.slice - libcontainer container kubepods-besteffort-pod8ac52e09_354e_4423_a27b_aa5d584b9e8a.slice. 
Aug 19 00:25:50.630930 kubelet[2683]: I0819 00:25:50.630868 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/8ac52e09-354e-4423-a27b-aa5d584b9e8a-whisker-backend-key-pair\") pod \"whisker-77b46db87b-8g8lk\" (UID: \"8ac52e09-354e-4423-a27b-aa5d584b9e8a\") " pod="calico-system/whisker-77b46db87b-8g8lk" Aug 19 00:25:50.630930 kubelet[2683]: I0819 00:25:50.630925 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f4w9\" (UniqueName: \"kubernetes.io/projected/8ac52e09-354e-4423-a27b-aa5d584b9e8a-kube-api-access-9f4w9\") pod \"whisker-77b46db87b-8g8lk\" (UID: \"8ac52e09-354e-4423-a27b-aa5d584b9e8a\") " pod="calico-system/whisker-77b46db87b-8g8lk" Aug 19 00:25:50.631398 kubelet[2683]: I0819 00:25:50.630952 2683 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ac52e09-354e-4423-a27b-aa5d584b9e8a-whisker-ca-bundle\") pod \"whisker-77b46db87b-8g8lk\" (UID: \"8ac52e09-354e-4423-a27b-aa5d584b9e8a\") " pod="calico-system/whisker-77b46db87b-8g8lk" Aug 19 00:25:50.825138 containerd[1525]: time="2025-08-19T00:25:50.825093483Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-77b46db87b-8g8lk,Uid:8ac52e09-354e-4423-a27b-aa5d584b9e8a,Namespace:calico-system,Attempt:0,}" Aug 19 00:25:51.189236 kubelet[2683]: I0819 00:25:51.187767 2683 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f19009fc-e647-42cf-afbb-1965a26588f6" path="/var/lib/kubelet/pods/f19009fc-e647-42cf-afbb-1965a26588f6/volumes" Aug 19 00:25:51.719665 systemd-networkd[1433]: cali78e763ccebc: Link UP Aug 19 00:25:51.720066 systemd-networkd[1433]: cali78e763ccebc: Gained carrier Aug 19 00:25:51.771992 containerd[1525]: 2025-08-19 00:25:50.874 [INFO][3862] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 00:25:51.771992 containerd[1525]: 2025-08-19 00:25:50.985 [INFO][3862] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--77b46db87b--8g8lk-eth0 whisker-77b46db87b- calico-system 8ac52e09-354e-4423-a27b-aa5d584b9e8a 876 0 2025-08-19 00:25:50 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:77b46db87b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-77b46db87b-8g8lk eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali78e763ccebc [] [] }} ContainerID="852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" Namespace="calico-system" Pod="whisker-77b46db87b-8g8lk" WorkloadEndpoint="localhost-k8s-whisker--77b46db87b--8g8lk-" Aug 19 00:25:51.771992 containerd[1525]: 2025-08-19 00:25:50.985 [INFO][3862] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" Namespace="calico-system" Pod="whisker-77b46db87b-8g8lk" WorkloadEndpoint="localhost-k8s-whisker--77b46db87b--8g8lk-eth0" Aug 19 00:25:51.771992 containerd[1525]: 2025-08-19 00:25:51.536 [INFO][3878] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" HandleID="k8s-pod-network.852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" 
Workload="localhost-k8s-whisker--77b46db87b--8g8lk-eth0" Aug 19 00:25:51.772265 containerd[1525]: 2025-08-19 00:25:51.546 [INFO][3878] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" HandleID="k8s-pod-network.852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" Workload="localhost-k8s-whisker--77b46db87b--8g8lk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400033c360), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-77b46db87b-8g8lk", "timestamp":"2025-08-19 00:25:51.53648799 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:25:51.772265 containerd[1525]: 2025-08-19 00:25:51.546 [INFO][3878] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:25:51.772265 containerd[1525]: 2025-08-19 00:25:51.546 [INFO][3878] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:25:51.772265 containerd[1525]: 2025-08-19 00:25:51.546 [INFO][3878] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 00:25:51.772265 containerd[1525]: 2025-08-19 00:25:51.585 [INFO][3878] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" host="localhost" Aug 19 00:25:51.772265 containerd[1525]: 2025-08-19 00:25:51.615 [INFO][3878] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 00:25:51.772265 containerd[1525]: 2025-08-19 00:25:51.653 [INFO][3878] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 00:25:51.772265 containerd[1525]: 2025-08-19 00:25:51.658 [INFO][3878] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 00:25:51.772265 containerd[1525]: 2025-08-19 00:25:51.663 [INFO][3878] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 00:25:51.772265 containerd[1525]: 2025-08-19 00:25:51.663 [INFO][3878] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" host="localhost" Aug 19 00:25:51.772489 containerd[1525]: 2025-08-19 00:25:51.682 [INFO][3878] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800 Aug 19 00:25:51.772489 containerd[1525]: 2025-08-19 00:25:51.695 [INFO][3878] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" host="localhost" Aug 19 00:25:51.772489 containerd[1525]: 2025-08-19 00:25:51.707 [INFO][3878] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" host="localhost" Aug 19 00:25:51.772489 containerd[1525]: 2025-08-19 00:25:51.707 [INFO][3878] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" host="localhost" Aug 19 00:25:51.772489 containerd[1525]: 2025-08-19 00:25:51.707 [INFO][3878] ipam/ipam_plugin.go 374: 
Released host-wide IPAM lock. Aug 19 00:25:51.772489 containerd[1525]: 2025-08-19 00:25:51.707 [INFO][3878] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" HandleID="k8s-pod-network.852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" Workload="localhost-k8s-whisker--77b46db87b--8g8lk-eth0" Aug 19 00:25:51.772595 containerd[1525]: 2025-08-19 00:25:51.711 [INFO][3862] cni-plugin/k8s.go 418: Populated endpoint ContainerID="852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" Namespace="calico-system" Pod="whisker-77b46db87b-8g8lk" WorkloadEndpoint="localhost-k8s-whisker--77b46db87b--8g8lk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--77b46db87b--8g8lk-eth0", GenerateName:"whisker-77b46db87b-", Namespace:"calico-system", SelfLink:"", UID:"8ac52e09-354e-4423-a27b-aa5d584b9e8a", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"77b46db87b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-77b46db87b-8g8lk", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali78e763ccebc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:25:51.772595 containerd[1525]: 2025-08-19 00:25:51.711 [INFO][3862] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" Namespace="calico-system" Pod="whisker-77b46db87b-8g8lk" WorkloadEndpoint="localhost-k8s-whisker--77b46db87b--8g8lk-eth0" Aug 19 00:25:51.772663 containerd[1525]: 2025-08-19 00:25:51.711 [INFO][3862] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali78e763ccebc ContainerID="852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" Namespace="calico-system" Pod="whisker-77b46db87b-8g8lk" WorkloadEndpoint="localhost-k8s-whisker--77b46db87b--8g8lk-eth0" Aug 19 00:25:51.772663 containerd[1525]: 2025-08-19 00:25:51.720 [INFO][3862] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" Namespace="calico-system" Pod="whisker-77b46db87b-8g8lk" WorkloadEndpoint="localhost-k8s-whisker--77b46db87b--8g8lk-eth0" Aug 19 00:25:51.772703 containerd[1525]: 2025-08-19 00:25:51.720 [INFO][3862] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" Namespace="calico-system" Pod="whisker-77b46db87b-8g8lk" WorkloadEndpoint="localhost-k8s-whisker--77b46db87b--8g8lk-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--77b46db87b--8g8lk-eth0", GenerateName:"whisker-77b46db87b-", Namespace:"calico-system", SelfLink:"", UID:"8ac52e09-354e-4423-a27b-aa5d584b9e8a", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"77b46db87b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800", Pod:"whisker-77b46db87b-8g8lk", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali78e763ccebc", MAC:"3e:ba:1d:d9:a0:81", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:25:51.772834 containerd[1525]: 2025-08-19 00:25:51.766 [INFO][3862] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" Namespace="calico-system" Pod="whisker-77b46db87b-8g8lk" WorkloadEndpoint="localhost-k8s-whisker--77b46db87b--8g8lk-eth0" Aug 19 00:25:51.988024 kubelet[2683]: I0819 00:25:51.987820 2683 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 00:25:52.057126 containerd[1525]: time="2025-08-19T00:25:52.056894730Z" level=info msg="connecting to shim 852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800" address="unix:///run/containerd/s/ae8288c6cae7169a3c266b6da90241d14a2cf42b27228940bc49a0a121d5829f" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:25:52.107515 systemd[1]: Started cri-containerd-852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800.scope - libcontainer container 852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800. 
Aug 19 00:25:52.121195 systemd-resolved[1347]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 00:25:52.156257 containerd[1525]: time="2025-08-19T00:25:52.156180169Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-77b46db87b-8g8lk,Uid:8ac52e09-354e-4423-a27b-aa5d584b9e8a,Namespace:calico-system,Attempt:0,} returns sandbox id \"852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800\"" Aug 19 00:25:52.157945 containerd[1525]: time="2025-08-19T00:25:52.157880890Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Aug 19 00:25:52.913307 systemd-networkd[1433]: vxlan.calico: Link UP Aug 19 00:25:52.913315 systemd-networkd[1433]: vxlan.calico: Gained carrier Aug 19 00:25:53.209302 containerd[1525]: time="2025-08-19T00:25:53.208743623Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:53.210442 containerd[1525]: time="2025-08-19T00:25:53.210407184Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4605614" Aug 19 00:25:53.211538 containerd[1525]: time="2025-08-19T00:25:53.211512024Z" level=info msg="ImageCreate event name:\"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:53.213533 containerd[1525]: time="2025-08-19T00:25:53.213489105Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:53.214534 containerd[1525]: time="2025-08-19T00:25:53.214474225Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"5974847\" in 1.056553455s" Aug 19 00:25:53.214534 containerd[1525]: time="2025-08-19T00:25:53.214509665Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\"" Aug 19 00:25:53.219326 containerd[1525]: time="2025-08-19T00:25:53.219271667Z" level=info msg="CreateContainer within sandbox \"852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Aug 19 00:25:53.228818 containerd[1525]: time="2025-08-19T00:25:53.228771791Z" level=info msg="Container 79e29cc424e154effc7a40cb1b6adbce30d25782e6de0aca72f492bcace9fb00: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:25:53.241889 containerd[1525]: time="2025-08-19T00:25:53.241819356Z" level=info msg="CreateContainer within sandbox \"852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"79e29cc424e154effc7a40cb1b6adbce30d25782e6de0aca72f492bcace9fb00\"" Aug 19 00:25:53.242620 containerd[1525]: time="2025-08-19T00:25:53.242583716Z" level=info msg="StartContainer for \"79e29cc424e154effc7a40cb1b6adbce30d25782e6de0aca72f492bcace9fb00\"" Aug 19 00:25:53.245358 containerd[1525]: time="2025-08-19T00:25:53.245308797Z" level=info msg="connecting to shim 
79e29cc424e154effc7a40cb1b6adbce30d25782e6de0aca72f492bcace9fb00" address="unix:///run/containerd/s/ae8288c6cae7169a3c266b6da90241d14a2cf42b27228940bc49a0a121d5829f" protocol=ttrpc version=3 Aug 19 00:25:53.273684 systemd[1]: Started cri-containerd-79e29cc424e154effc7a40cb1b6adbce30d25782e6de0aca72f492bcace9fb00.scope - libcontainer container 79e29cc424e154effc7a40cb1b6adbce30d25782e6de0aca72f492bcace9fb00. Aug 19 00:25:53.340927 containerd[1525]: time="2025-08-19T00:25:53.340863793Z" level=info msg="StartContainer for \"79e29cc424e154effc7a40cb1b6adbce30d25782e6de0aca72f492bcace9fb00\" returns successfully" Aug 19 00:25:53.347632 containerd[1525]: time="2025-08-19T00:25:53.347585395Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Aug 19 00:25:53.366040 systemd-networkd[1433]: cali78e763ccebc: Gained IPv6LL Aug 19 00:25:54.714765 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount36150403.mount: Deactivated successfully. Aug 19 00:25:54.744261 containerd[1525]: time="2025-08-19T00:25:54.743417539Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:54.744261 containerd[1525]: time="2025-08-19T00:25:54.744074219Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=30814581" Aug 19 00:25:54.745148 containerd[1525]: time="2025-08-19T00:25:54.745119299Z" level=info msg="ImageCreate event name:\"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:54.748289 containerd[1525]: time="2025-08-19T00:25:54.748244860Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:54.749194 containerd[1525]: time="2025-08-19T00:25:54.749116661Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"30814411\" in 1.401482186s" Aug 19 00:25:54.749194 containerd[1525]: time="2025-08-19T00:25:54.749192141Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\"" Aug 19 00:25:54.756574 containerd[1525]: time="2025-08-19T00:25:54.756515343Z" level=info msg="CreateContainer within sandbox \"852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Aug 19 00:25:54.764162 containerd[1525]: time="2025-08-19T00:25:54.763733786Z" level=info msg="Container 946ffe8da7789eb849d3d73c5affdd6b7048b649d0521b713214b4dc06d0014c: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:25:54.772062 containerd[1525]: time="2025-08-19T00:25:54.772012029Z" level=info msg="CreateContainer within sandbox \"852e8e8586021ee17baa642493d528e4cff41eb9c24b552155fc36bb5feb9800\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"946ffe8da7789eb849d3d73c5affdd6b7048b649d0521b713214b4dc06d0014c\"" Aug 19 00:25:54.772661 containerd[1525]: 
time="2025-08-19T00:25:54.772635229Z" level=info msg="StartContainer for \"946ffe8da7789eb849d3d73c5affdd6b7048b649d0521b713214b4dc06d0014c\"" Aug 19 00:25:54.773733 containerd[1525]: time="2025-08-19T00:25:54.773704789Z" level=info msg="connecting to shim 946ffe8da7789eb849d3d73c5affdd6b7048b649d0521b713214b4dc06d0014c" address="unix:///run/containerd/s/ae8288c6cae7169a3c266b6da90241d14a2cf42b27228940bc49a0a121d5829f" protocol=ttrpc version=3 Aug 19 00:25:54.801440 systemd[1]: Started cri-containerd-946ffe8da7789eb849d3d73c5affdd6b7048b649d0521b713214b4dc06d0014c.scope - libcontainer container 946ffe8da7789eb849d3d73c5affdd6b7048b649d0521b713214b4dc06d0014c. Aug 19 00:25:54.846035 containerd[1525]: time="2025-08-19T00:25:54.845984655Z" level=info msg="StartContainer for \"946ffe8da7789eb849d3d73c5affdd6b7048b649d0521b713214b4dc06d0014c\" returns successfully" Aug 19 00:25:54.901683 systemd-networkd[1433]: vxlan.calico: Gained IPv6LL Aug 19 00:25:55.458945 kubelet[2683]: I0819 00:25:55.458493 2683 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-77b46db87b-8g8lk" podStartSLOduration=2.865975768 podStartE2EDuration="5.458473579s" podCreationTimestamp="2025-08-19 00:25:50 +0000 UTC" firstStartedPulling="2025-08-19 00:25:52.15762765 +0000 UTC m=+37.072760132" lastFinishedPulling="2025-08-19 00:25:54.750125421 +0000 UTC m=+39.665257943" observedRunningTime="2025-08-19 00:25:55.457771299 +0000 UTC m=+40.372903781" watchObservedRunningTime="2025-08-19 00:25:55.458473579 +0000 UTC m=+40.373606061" Aug 19 00:25:57.186084 containerd[1525]: time="2025-08-19T00:25:57.186045558Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6ffc4747bd-f9n7m,Uid:02913a4b-73f9-41fe-b20f-d28cc9536400,Namespace:calico-system,Attempt:0,}" Aug 19 00:25:57.371094 systemd-networkd[1433]: calic60d716c224: Link UP Aug 19 00:25:57.371726 systemd-networkd[1433]: calic60d716c224: Gained carrier Aug 19 00:25:57.406533 containerd[1525]: 2025-08-19 00:25:57.265 [INFO][4262] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--6ffc4747bd--f9n7m-eth0 calico-kube-controllers-6ffc4747bd- calico-system 02913a4b-73f9-41fe-b20f-d28cc9536400 803 0 2025-08-19 00:25:36 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6ffc4747bd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-6ffc4747bd-f9n7m eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic60d716c224 [] [] }} ContainerID="9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" Namespace="calico-system" Pod="calico-kube-controllers-6ffc4747bd-f9n7m" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6ffc4747bd--f9n7m-" Aug 19 00:25:57.406533 containerd[1525]: 2025-08-19 00:25:57.266 [INFO][4262] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" Namespace="calico-system" Pod="calico-kube-controllers-6ffc4747bd-f9n7m" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6ffc4747bd--f9n7m-eth0" Aug 19 00:25:57.406533 containerd[1525]: 2025-08-19 00:25:57.312 [INFO][4276] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" HandleID="k8s-pod-network.9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" Workload="localhost-k8s-calico--kube--controllers--6ffc4747bd--f9n7m-eth0" Aug 19 00:25:57.406740 containerd[1525]: 2025-08-19 00:25:57.313 [INFO][4276] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" HandleID="k8s-pod-network.9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" Workload="localhost-k8s-calico--kube--controllers--6ffc4747bd--f9n7m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004df40), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-6ffc4747bd-f9n7m", "timestamp":"2025-08-19 00:25:57.312870354 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:25:57.406740 containerd[1525]: 2025-08-19 00:25:57.313 [INFO][4276] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:25:57.406740 containerd[1525]: 2025-08-19 00:25:57.313 [INFO][4276] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:25:57.406740 containerd[1525]: 2025-08-19 00:25:57.313 [INFO][4276] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 00:25:57.406740 containerd[1525]: 2025-08-19 00:25:57.324 [INFO][4276] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" host="localhost" Aug 19 00:25:57.406740 containerd[1525]: 2025-08-19 00:25:57.332 [INFO][4276] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 00:25:57.406740 containerd[1525]: 2025-08-19 00:25:57.337 [INFO][4276] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 00:25:57.406740 containerd[1525]: 2025-08-19 00:25:57.340 [INFO][4276] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 00:25:57.406740 containerd[1525]: 2025-08-19 00:25:57.342 [INFO][4276] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 00:25:57.406740 containerd[1525]: 2025-08-19 00:25:57.342 [INFO][4276] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" host="localhost" Aug 19 00:25:57.406940 containerd[1525]: 2025-08-19 00:25:57.344 [INFO][4276] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa Aug 19 00:25:57.406940 containerd[1525]: 2025-08-19 00:25:57.355 [INFO][4276] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" host="localhost" Aug 19 00:25:57.406940 containerd[1525]: 2025-08-19 00:25:57.364 [INFO][4276] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" host="localhost" Aug 19 00:25:57.406940 containerd[1525]: 2025-08-19 00:25:57.364 [INFO][4276] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: 
[192.168.88.130/26] handle="k8s-pod-network.9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" host="localhost" Aug 19 00:25:57.406940 containerd[1525]: 2025-08-19 00:25:57.364 [INFO][4276] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:25:57.406940 containerd[1525]: 2025-08-19 00:25:57.364 [INFO][4276] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" HandleID="k8s-pod-network.9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" Workload="localhost-k8s-calico--kube--controllers--6ffc4747bd--f9n7m-eth0" Aug 19 00:25:57.407057 containerd[1525]: 2025-08-19 00:25:57.367 [INFO][4262] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" Namespace="calico-system" Pod="calico-kube-controllers-6ffc4747bd-f9n7m" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6ffc4747bd--f9n7m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6ffc4747bd--f9n7m-eth0", GenerateName:"calico-kube-controllers-6ffc4747bd-", Namespace:"calico-system", SelfLink:"", UID:"02913a4b-73f9-41fe-b20f-d28cc9536400", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6ffc4747bd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-6ffc4747bd-f9n7m", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic60d716c224", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:25:57.407105 containerd[1525]: 2025-08-19 00:25:57.367 [INFO][4262] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" Namespace="calico-system" Pod="calico-kube-controllers-6ffc4747bd-f9n7m" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6ffc4747bd--f9n7m-eth0" Aug 19 00:25:57.407105 containerd[1525]: 2025-08-19 00:25:57.367 [INFO][4262] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic60d716c224 ContainerID="9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" Namespace="calico-system" Pod="calico-kube-controllers-6ffc4747bd-f9n7m" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6ffc4747bd--f9n7m-eth0" Aug 19 00:25:57.407105 containerd[1525]: 2025-08-19 00:25:57.369 [INFO][4262] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" 
Namespace="calico-system" Pod="calico-kube-controllers-6ffc4747bd-f9n7m" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6ffc4747bd--f9n7m-eth0" Aug 19 00:25:57.407294 containerd[1525]: 2025-08-19 00:25:57.371 [INFO][4262] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" Namespace="calico-system" Pod="calico-kube-controllers-6ffc4747bd-f9n7m" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6ffc4747bd--f9n7m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6ffc4747bd--f9n7m-eth0", GenerateName:"calico-kube-controllers-6ffc4747bd-", Namespace:"calico-system", SelfLink:"", UID:"02913a4b-73f9-41fe-b20f-d28cc9536400", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6ffc4747bd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa", Pod:"calico-kube-controllers-6ffc4747bd-f9n7m", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic60d716c224", MAC:"ae:93:12:37:56:7b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:25:57.407365 containerd[1525]: 2025-08-19 00:25:57.403 [INFO][4262] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" Namespace="calico-system" Pod="calico-kube-controllers-6ffc4747bd-f9n7m" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6ffc4747bd--f9n7m-eth0" Aug 19 00:25:57.567511 containerd[1525]: time="2025-08-19T00:25:57.567382308Z" level=info msg="connecting to shim 9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa" address="unix:///run/containerd/s/c16bd093eeb0abc041afd5f7f0fed6a0207f8b9eb4d2fa87b39403700afd382b" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:25:57.609545 systemd[1]: Started cri-containerd-9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa.scope - libcontainer container 9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa. 
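Editor's note, not part of the log: the ipam entries above show the plugin confirming affinity for the 192.168.88.128/26 block and claiming 192.168.88.130 for calico-kube-controllers-6ffc4747bd-f9n7m (the whisker pod already holds .129). As a stdlib-only illustration of that block, and not Calico's allocation logic, the sketch below walks the /26 with net/netip:

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// The block the IPAM entries above keep referring to.
	block := netip.MustParsePrefix("192.168.88.128/26")

	// Walk every address in the /26 and count them; a /26 holds 64 addresses,
	// so the sequential claims in this log (.129, .130, .131, ...) all come from here.
	n := 0
	for a := block.Addr(); block.Contains(a); a = a.Next() {
		n++
	}
	fmt.Printf("%s spans %d addresses, starting at %s\n", block, n, block.Addr())

	// The address claimed in the entry above.
	claimed := netip.MustParseAddr("192.168.88.130")
	fmt.Println("192.168.88.130 in block:", block.Contains(claimed))
}
```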
Aug 19 00:25:57.632738 systemd-resolved[1347]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 00:25:57.660190 containerd[1525]: time="2025-08-19T00:25:57.660130454Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6ffc4747bd-f9n7m,Uid:02913a4b-73f9-41fe-b20f-d28cc9536400,Namespace:calico-system,Attempt:0,} returns sandbox id \"9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa\"" Aug 19 00:25:57.663227 containerd[1525]: time="2025-08-19T00:25:57.662728215Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Aug 19 00:25:57.929642 kubelet[2683]: I0819 00:25:57.929578 2683 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 00:25:58.100420 containerd[1525]: time="2025-08-19T00:25:58.100373219Z" level=info msg="TaskExit event in podsandbox handler container_id:\"844bca4e08f7270219c02b3a41d2fef1fb6de30dcd49babbe087d3510850d608\" id:\"2343e405228be527c58f7882005cb3ad5d28eb5348d1d79cf8cc82f769e84e88\" pid:4354 exited_at:{seconds:1755563158 nanos:100025059}" Aug 19 00:25:58.185025 containerd[1525]: time="2025-08-19T00:25:58.184181202Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fqnlp,Uid:5de149ae-7e04-4bbd-9374-54fbeca24b5c,Namespace:kube-system,Attempt:0,}" Aug 19 00:25:58.185157 containerd[1525]: time="2025-08-19T00:25:58.184494042Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4ccpq,Uid:2c50faac-e631-4180-82e0-eced6cb1cc4a,Namespace:calico-system,Attempt:0,}" Aug 19 00:25:58.295812 containerd[1525]: time="2025-08-19T00:25:58.295480232Z" level=info msg="TaskExit event in podsandbox handler container_id:\"844bca4e08f7270219c02b3a41d2fef1fb6de30dcd49babbe087d3510850d608\" id:\"70ca9d2810ea953967afcc6ab746333678d7ef80912c50b82c1c3fd98a0aee02\" pid:4384 exited_at:{seconds:1755563158 nanos:294689112}" Aug 19 00:25:58.363738 systemd[1]: Started sshd@7-10.0.0.116:22-10.0.0.1:40408.service - OpenSSH per-connection server daemon (10.0.0.1:40408). 
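Editor's note, not part of the log: the TaskExit events just above record exited_at as separate seconds and nanos fields (seconds:1755563158 nanos:100025059). To turn such a pair back into a wall-clock time while reading the log, a minimal stdlib sketch with the constants copied from that entry:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// exited_at fields copied from the TaskExit entry above.
	const secs, nanos = 1755563158, 100025059

	// time.Unix interprets the pair as a Unix timestamp.
	t := time.Unix(secs, nanos).UTC()
	fmt.Println(t.Format(time.RFC3339Nano)) // 2025-08-19T00:25:58.100025059Z
}
```

The result matches the journal timestamp on the surrounding containerd entries (Aug 19 00:25:58.100), which is a quick sanity check that the pair is a plain Unix time.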
Aug 19 00:25:58.411712 systemd-networkd[1433]: calib75ed4f0ee6: Link UP Aug 19 00:25:58.412744 systemd-networkd[1433]: calib75ed4f0ee6: Gained carrier Aug 19 00:25:58.430638 containerd[1525]: 2025-08-19 00:25:58.248 [INFO][4391] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--4ccpq-eth0 csi-node-driver- calico-system 2c50faac-e631-4180-82e0-eced6cb1cc4a 702 0 2025-08-19 00:25:35 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:8967bcb6f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-4ccpq eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calib75ed4f0ee6 [] [] }} ContainerID="8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" Namespace="calico-system" Pod="csi-node-driver-4ccpq" WorkloadEndpoint="localhost-k8s-csi--node--driver--4ccpq-" Aug 19 00:25:58.430638 containerd[1525]: 2025-08-19 00:25:58.249 [INFO][4391] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" Namespace="calico-system" Pod="csi-node-driver-4ccpq" WorkloadEndpoint="localhost-k8s-csi--node--driver--4ccpq-eth0" Aug 19 00:25:58.430638 containerd[1525]: 2025-08-19 00:25:58.294 [INFO][4427] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" HandleID="k8s-pod-network.8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" Workload="localhost-k8s-csi--node--driver--4ccpq-eth0" Aug 19 00:25:58.430913 containerd[1525]: 2025-08-19 00:25:58.294 [INFO][4427] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" HandleID="k8s-pod-network.8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" Workload="localhost-k8s-csi--node--driver--4ccpq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001af400), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-4ccpq", "timestamp":"2025-08-19 00:25:58.294407792 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:25:58.430913 containerd[1525]: 2025-08-19 00:25:58.294 [INFO][4427] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:25:58.430913 containerd[1525]: 2025-08-19 00:25:58.294 [INFO][4427] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 00:25:58.430913 containerd[1525]: 2025-08-19 00:25:58.294 [INFO][4427] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 00:25:58.430913 containerd[1525]: 2025-08-19 00:25:58.334 [INFO][4427] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" host="localhost" Aug 19 00:25:58.430913 containerd[1525]: 2025-08-19 00:25:58.357 [INFO][4427] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 00:25:58.430913 containerd[1525]: 2025-08-19 00:25:58.373 [INFO][4427] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 00:25:58.430913 containerd[1525]: 2025-08-19 00:25:58.377 [INFO][4427] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 00:25:58.430913 containerd[1525]: 2025-08-19 00:25:58.381 [INFO][4427] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 00:25:58.430913 containerd[1525]: 2025-08-19 00:25:58.381 [INFO][4427] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" host="localhost" Aug 19 00:25:58.431162 containerd[1525]: 2025-08-19 00:25:58.384 [INFO][4427] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123 Aug 19 00:25:58.431162 containerd[1525]: 2025-08-19 00:25:58.389 [INFO][4427] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" host="localhost" Aug 19 00:25:58.431162 containerd[1525]: 2025-08-19 00:25:58.397 [INFO][4427] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" host="localhost" Aug 19 00:25:58.431162 containerd[1525]: 2025-08-19 00:25:58.397 [INFO][4427] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" host="localhost" Aug 19 00:25:58.431162 containerd[1525]: 2025-08-19 00:25:58.397 [INFO][4427] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 00:25:58.431162 containerd[1525]: 2025-08-19 00:25:58.398 [INFO][4427] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" HandleID="k8s-pod-network.8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" Workload="localhost-k8s-csi--node--driver--4ccpq-eth0" Aug 19 00:25:58.431313 containerd[1525]: 2025-08-19 00:25:58.407 [INFO][4391] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" Namespace="calico-system" Pod="csi-node-driver-4ccpq" WorkloadEndpoint="localhost-k8s-csi--node--driver--4ccpq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--4ccpq-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2c50faac-e631-4180-82e0-eced6cb1cc4a", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-4ccpq", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib75ed4f0ee6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:25:58.431370 containerd[1525]: 2025-08-19 00:25:58.407 [INFO][4391] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" Namespace="calico-system" Pod="csi-node-driver-4ccpq" WorkloadEndpoint="localhost-k8s-csi--node--driver--4ccpq-eth0" Aug 19 00:25:58.431370 containerd[1525]: 2025-08-19 00:25:58.407 [INFO][4391] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib75ed4f0ee6 ContainerID="8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" Namespace="calico-system" Pod="csi-node-driver-4ccpq" WorkloadEndpoint="localhost-k8s-csi--node--driver--4ccpq-eth0" Aug 19 00:25:58.431370 containerd[1525]: 2025-08-19 00:25:58.412 [INFO][4391] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" Namespace="calico-system" Pod="csi-node-driver-4ccpq" WorkloadEndpoint="localhost-k8s-csi--node--driver--4ccpq-eth0" Aug 19 00:25:58.431424 containerd[1525]: 2025-08-19 00:25:58.413 [INFO][4391] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" Namespace="calico-system" Pod="csi-node-driver-4ccpq" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--4ccpq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--4ccpq-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2c50faac-e631-4180-82e0-eced6cb1cc4a", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123", Pod:"csi-node-driver-4ccpq", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib75ed4f0ee6", MAC:"6e:64:5f:fb:11:2d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:25:58.431473 containerd[1525]: 2025-08-19 00:25:58.427 [INFO][4391] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" Namespace="calico-system" Pod="csi-node-driver-4ccpq" WorkloadEndpoint="localhost-k8s-csi--node--driver--4ccpq-eth0" Aug 19 00:25:58.446536 sshd[4449]: Accepted publickey for core from 10.0.0.1 port 40408 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:25:58.449737 sshd-session[4449]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:25:58.456407 systemd-logind[1507]: New session 8 of user core. Aug 19 00:25:58.462436 systemd[1]: Started session-8.scope - Session 8 of User core. Aug 19 00:25:58.488481 containerd[1525]: time="2025-08-19T00:25:58.488244884Z" level=info msg="connecting to shim 8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123" address="unix:///run/containerd/s/391a765eace6a48793dd8371fa9779d85e548ea3946da0e7194951371f45e5bf" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:25:58.515451 systemd[1]: Started cri-containerd-8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123.scope - libcontainer container 8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123. 
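Editor's note, not part of the log: several entries above log "connecting to shim ... address=unix:///run/containerd/s/<id> ... protocol=ttrpc version=3". The sketch below only demonstrates opening a Unix-domain socket at an address of that shape from Go; it does not implement the ttrpc protocol, and the socket path is a placeholder, so it is an illustration of the address format rather than a containerd client.

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Placeholder path in the same style as the addresses logged above
	// (unix:///run/containerd/s/<id>); substitute a real socket to test.
	const sock = "/run/containerd/s/0000000000000000000000000000000000000000000000000000000000000000"

	conn, err := net.DialTimeout("unix", sock, 2*time.Second)
	if err != nil {
		fmt.Println("dial failed (expected with the placeholder path):", err)
		return
	}
	defer conn.Close()
	fmt.Println("connected to", sock)
}
```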
Aug 19 00:25:58.548499 systemd-resolved[1347]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 00:25:58.564520 systemd-networkd[1433]: cali91ea471764f: Link UP Aug 19 00:25:58.565395 systemd-networkd[1433]: cali91ea471764f: Gained carrier Aug 19 00:25:58.656095 containerd[1525]: time="2025-08-19T00:25:58.656040650Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4ccpq,Uid:2c50faac-e631-4180-82e0-eced6cb1cc4a,Namespace:calico-system,Attempt:0,} returns sandbox id \"8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123\"" Aug 19 00:25:58.680071 containerd[1525]: 2025-08-19 00:25:58.262 [INFO][4397] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--fqnlp-eth0 coredns-674b8bbfcf- kube-system 5de149ae-7e04-4bbd-9374-54fbeca24b5c 795 0 2025-08-19 00:25:20 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-fqnlp eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali91ea471764f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" Namespace="kube-system" Pod="coredns-674b8bbfcf-fqnlp" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--fqnlp-" Aug 19 00:25:58.680071 containerd[1525]: 2025-08-19 00:25:58.262 [INFO][4397] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" Namespace="kube-system" Pod="coredns-674b8bbfcf-fqnlp" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--fqnlp-eth0" Aug 19 00:25:58.680071 containerd[1525]: 2025-08-19 00:25:58.315 [INFO][4434] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" HandleID="k8s-pod-network.4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" Workload="localhost-k8s-coredns--674b8bbfcf--fqnlp-eth0" Aug 19 00:25:58.680395 containerd[1525]: 2025-08-19 00:25:58.316 [INFO][4434] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" HandleID="k8s-pod-network.4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" Workload="localhost-k8s-coredns--674b8bbfcf--fqnlp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001379e0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-fqnlp", "timestamp":"2025-08-19 00:25:58.315929038 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:25:58.680395 containerd[1525]: 2025-08-19 00:25:58.316 [INFO][4434] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:25:58.680395 containerd[1525]: 2025-08-19 00:25:58.397 [INFO][4434] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 00:25:58.680395 containerd[1525]: 2025-08-19 00:25:58.398 [INFO][4434] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 00:25:58.680395 containerd[1525]: 2025-08-19 00:25:58.434 [INFO][4434] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" host="localhost" Aug 19 00:25:58.680395 containerd[1525]: 2025-08-19 00:25:58.457 [INFO][4434] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 00:25:58.680395 containerd[1525]: 2025-08-19 00:25:58.487 [INFO][4434] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 00:25:58.680395 containerd[1525]: 2025-08-19 00:25:58.494 [INFO][4434] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 00:25:58.680395 containerd[1525]: 2025-08-19 00:25:58.502 [INFO][4434] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 00:25:58.680395 containerd[1525]: 2025-08-19 00:25:58.502 [INFO][4434] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" host="localhost" Aug 19 00:25:58.680665 containerd[1525]: 2025-08-19 00:25:58.509 [INFO][4434] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56 Aug 19 00:25:58.680665 containerd[1525]: 2025-08-19 00:25:58.525 [INFO][4434] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" host="localhost" Aug 19 00:25:58.680665 containerd[1525]: 2025-08-19 00:25:58.544 [INFO][4434] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" host="localhost" Aug 19 00:25:58.680665 containerd[1525]: 2025-08-19 00:25:58.544 [INFO][4434] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" host="localhost" Aug 19 00:25:58.680665 containerd[1525]: 2025-08-19 00:25:58.544 [INFO][4434] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
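Editor's note, not part of the log: every CNI ADD in this log brackets its single address assignment with "About to acquire host-wide IPAM lock" and "Released host-wide IPAM lock", which is why the claims come out strictly in sequence (.129, .130, .131, .132). The sketch below is only a loose in-process analogy for that serialization, not Calico's actual locking mechanism:

```go
package main

import (
	"fmt"
	"sync"
)

// allocator hands out consecutive addresses from the block, one caller at a time.
// A loose analogy for the serialized assignments seen in this log, nothing more.
type allocator struct {
	mu   sync.Mutex
	next int
}

func (a *allocator) assign() string {
	a.mu.Lock() // "About to acquire host-wide IPAM lock"
	defer a.mu.Unlock()
	ip := fmt.Sprintf("192.168.88.%d/32", 129+a.next)
	a.next++
	return ip // unlock on return: "Released host-wide IPAM lock"
}

func main() {
	a := &allocator{}
	// Pods in the order this log assigns them addresses.
	for _, pod := range []string{
		"whisker-77b46db87b-8g8lk",
		"calico-kube-controllers-6ffc4747bd-f9n7m",
		"csi-node-driver-4ccpq",
		"coredns-674b8bbfcf-fqnlp",
	} {
		fmt.Println(pod, "->", a.assign())
	}
}
```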
Aug 19 00:25:58.680665 containerd[1525]: 2025-08-19 00:25:58.544 [INFO][4434] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" HandleID="k8s-pod-network.4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" Workload="localhost-k8s-coredns--674b8bbfcf--fqnlp-eth0" Aug 19 00:25:58.680781 containerd[1525]: 2025-08-19 00:25:58.557 [INFO][4397] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" Namespace="kube-system" Pod="coredns-674b8bbfcf-fqnlp" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--fqnlp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--fqnlp-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"5de149ae-7e04-4bbd-9374-54fbeca24b5c", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-fqnlp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali91ea471764f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:25:58.680902 containerd[1525]: 2025-08-19 00:25:58.558 [INFO][4397] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" Namespace="kube-system" Pod="coredns-674b8bbfcf-fqnlp" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--fqnlp-eth0" Aug 19 00:25:58.680902 containerd[1525]: 2025-08-19 00:25:58.559 [INFO][4397] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali91ea471764f ContainerID="4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" Namespace="kube-system" Pod="coredns-674b8bbfcf-fqnlp" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--fqnlp-eth0" Aug 19 00:25:58.680902 containerd[1525]: 2025-08-19 00:25:58.565 [INFO][4397] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" Namespace="kube-system" Pod="coredns-674b8bbfcf-fqnlp" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--fqnlp-eth0" Aug 19 00:25:58.680968 
containerd[1525]: 2025-08-19 00:25:58.566 [INFO][4397] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" Namespace="kube-system" Pod="coredns-674b8bbfcf-fqnlp" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--fqnlp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--fqnlp-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"5de149ae-7e04-4bbd-9374-54fbeca24b5c", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56", Pod:"coredns-674b8bbfcf-fqnlp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali91ea471764f", MAC:"e6:a2:00:da:e8:48", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:25:58.680968 containerd[1525]: 2025-08-19 00:25:58.676 [INFO][4397] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" Namespace="kube-system" Pod="coredns-674b8bbfcf-fqnlp" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--fqnlp-eth0" Aug 19 00:25:58.777706 containerd[1525]: time="2025-08-19T00:25:58.777505322Z" level=info msg="connecting to shim 4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56" address="unix:///run/containerd/s/4b188d9533d509f23f6c0b5150d51b839c25eddae39dcc56ddb4e530b135a39d" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:25:58.814421 systemd[1]: Started cri-containerd-4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56.scope - libcontainer container 4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56. 
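Editor's note, not part of the log: in the coredns WorkloadEndpoint dumps above, the ports are printed in hex (Port:0x35, Port:0x23c1) next to the names dns, dns-tcp and metrics. A trivial check that these are the usual CoreDNS ports 53 and 9153:

```go
package main

import "fmt"

func main() {
	// Port values copied verbatim from the WorkloadEndpointPort entries above.
	ports := map[string]int{"dns": 0x35, "dns-tcp": 0x35, "metrics": 0x23c1}
	for name, p := range ports {
		fmt.Printf("%s -> %d\n", name, p) // dns/dns-tcp -> 53, metrics -> 9153
	}
}
```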
Aug 19 00:25:58.838330 systemd-resolved[1347]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 00:25:58.875138 containerd[1525]: time="2025-08-19T00:25:58.875062949Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fqnlp,Uid:5de149ae-7e04-4bbd-9374-54fbeca24b5c,Namespace:kube-system,Attempt:0,} returns sandbox id \"4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56\"" Aug 19 00:25:58.889345 containerd[1525]: time="2025-08-19T00:25:58.889282233Z" level=info msg="CreateContainer within sandbox \"4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 19 00:25:58.897590 sshd[4462]: Connection closed by 10.0.0.1 port 40408 Aug 19 00:25:58.897990 sshd-session[4449]: pam_unix(sshd:session): session closed for user core Aug 19 00:25:58.904469 systemd-logind[1507]: Session 8 logged out. Waiting for processes to exit. Aug 19 00:25:58.904990 systemd[1]: sshd@7-10.0.0.116:22-10.0.0.1:40408.service: Deactivated successfully. Aug 19 00:25:58.906916 systemd[1]: session-8.scope: Deactivated successfully. Aug 19 00:25:58.910546 systemd-logind[1507]: Removed session 8. Aug 19 00:25:58.924491 containerd[1525]: time="2025-08-19T00:25:58.924437762Z" level=info msg="Container 07b7359200b171d0ff45682456b00fb3dd0575ddd3ebbc44d1b8d5b30a92d566: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:25:58.935969 containerd[1525]: time="2025-08-19T00:25:58.935919365Z" level=info msg="CreateContainer within sandbox \"4e18782c42b2fd90eb7a6396fd38a2023b778e62c1f7373ee2878b1c07013b56\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"07b7359200b171d0ff45682456b00fb3dd0575ddd3ebbc44d1b8d5b30a92d566\"" Aug 19 00:25:58.937114 containerd[1525]: time="2025-08-19T00:25:58.937070645Z" level=info msg="StartContainer for \"07b7359200b171d0ff45682456b00fb3dd0575ddd3ebbc44d1b8d5b30a92d566\"" Aug 19 00:25:58.939700 containerd[1525]: time="2025-08-19T00:25:58.939561166Z" level=info msg="connecting to shim 07b7359200b171d0ff45682456b00fb3dd0575ddd3ebbc44d1b8d5b30a92d566" address="unix:///run/containerd/s/4b188d9533d509f23f6c0b5150d51b839c25eddae39dcc56ddb4e530b135a39d" protocol=ttrpc version=3 Aug 19 00:25:58.961455 systemd[1]: Started cri-containerd-07b7359200b171d0ff45682456b00fb3dd0575ddd3ebbc44d1b8d5b30a92d566.scope - libcontainer container 07b7359200b171d0ff45682456b00fb3dd0575ddd3ebbc44d1b8d5b30a92d566. 
Aug 19 00:25:59.011303 containerd[1525]: time="2025-08-19T00:25:59.011244705Z" level=info msg="StartContainer for \"07b7359200b171d0ff45682456b00fb3dd0575ddd3ebbc44d1b8d5b30a92d566\" returns successfully" Aug 19 00:25:59.188877 containerd[1525]: time="2025-08-19T00:25:59.188809350Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6469f8695c-tr54d,Uid:cae57906-475d-4518-b0c5-5f0aecff80c9,Namespace:calico-apiserver,Attempt:0,}" Aug 19 00:25:59.189170 containerd[1525]: time="2025-08-19T00:25:59.189112630Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5xxhl,Uid:ab6c6de2-acbf-4715-972b-2d8064a58b86,Namespace:kube-system,Attempt:0,}" Aug 19 00:25:59.254524 systemd-networkd[1433]: calic60d716c224: Gained IPv6LL Aug 19 00:25:59.475085 containerd[1525]: time="2025-08-19T00:25:59.467848021Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:59.475085 containerd[1525]: time="2025-08-19T00:25:59.473325742Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=48128336" Aug 19 00:25:59.475424 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount581980753.mount: Deactivated successfully. Aug 19 00:25:59.476600 containerd[1525]: time="2025-08-19T00:25:59.475597183Z" level=info msg="ImageCreate event name:\"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:59.496732 containerd[1525]: time="2025-08-19T00:25:59.496661468Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:25:59.497757 containerd[1525]: time="2025-08-19T00:25:59.497720909Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"49497545\" in 1.834945334s" Aug 19 00:25:59.498072 containerd[1525]: time="2025-08-19T00:25:59.497760869Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\"" Aug 19 00:25:59.502726 containerd[1525]: time="2025-08-19T00:25:59.502687630Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Aug 19 00:25:59.523490 containerd[1525]: time="2025-08-19T00:25:59.523423315Z" level=info msg="CreateContainer within sandbox \"9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Aug 19 00:25:59.535275 kubelet[2683]: I0819 00:25:59.535181 2683 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-fqnlp" podStartSLOduration=39.535085198 podStartE2EDuration="39.535085198s" podCreationTimestamp="2025-08-19 00:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:25:59.491076547 +0000 UTC m=+44.406209069" watchObservedRunningTime="2025-08-19 
00:25:59.535085198 +0000 UTC m=+44.450217680" Aug 19 00:25:59.558921 containerd[1525]: time="2025-08-19T00:25:59.558863564Z" level=info msg="Container 8108c4b2c45ba55b477ef61465de37cc423808d1dbe85878912de796754935ea: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:25:59.573389 systemd-networkd[1433]: calib75ed4f0ee6: Gained IPv6LL Aug 19 00:25:59.581884 containerd[1525]: time="2025-08-19T00:25:59.581826690Z" level=info msg="CreateContainer within sandbox \"9e6c495806165ba7df40bd9394282d78fbc8b57819340f2e4b59372a720fefaa\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"8108c4b2c45ba55b477ef61465de37cc423808d1dbe85878912de796754935ea\"" Aug 19 00:25:59.583236 containerd[1525]: time="2025-08-19T00:25:59.583135130Z" level=info msg="StartContainer for \"8108c4b2c45ba55b477ef61465de37cc423808d1dbe85878912de796754935ea\"" Aug 19 00:25:59.587978 containerd[1525]: time="2025-08-19T00:25:59.587931411Z" level=info msg="connecting to shim 8108c4b2c45ba55b477ef61465de37cc423808d1dbe85878912de796754935ea" address="unix:///run/containerd/s/c16bd093eeb0abc041afd5f7f0fed6a0207f8b9eb4d2fa87b39403700afd382b" protocol=ttrpc version=3 Aug 19 00:25:59.623462 systemd[1]: Started cri-containerd-8108c4b2c45ba55b477ef61465de37cc423808d1dbe85878912de796754935ea.scope - libcontainer container 8108c4b2c45ba55b477ef61465de37cc423808d1dbe85878912de796754935ea. Aug 19 00:25:59.659916 systemd-networkd[1433]: calia86aa4f3ff0: Link UP Aug 19 00:25:59.661771 systemd-networkd[1433]: calia86aa4f3ff0: Gained carrier Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.515 [INFO][4615] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--5xxhl-eth0 coredns-674b8bbfcf- kube-system ab6c6de2-acbf-4715-972b-2d8064a58b86 800 0 2025-08-19 00:25:20 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-5xxhl eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia86aa4f3ff0 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" Namespace="kube-system" Pod="coredns-674b8bbfcf-5xxhl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5xxhl-" Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.515 [INFO][4615] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" Namespace="kube-system" Pod="coredns-674b8bbfcf-5xxhl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5xxhl-eth0" Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.570 [INFO][4648] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" HandleID="k8s-pod-network.6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" Workload="localhost-k8s-coredns--674b8bbfcf--5xxhl-eth0" Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.570 [INFO][4648] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" HandleID="k8s-pod-network.6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" Workload="localhost-k8s-coredns--674b8bbfcf--5xxhl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0x400004d770), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-5xxhl", "timestamp":"2025-08-19 00:25:59.570578087 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.570 [INFO][4648] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.570 [INFO][4648] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.570 [INFO][4648] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.586 [INFO][4648] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" host="localhost" Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.594 [INFO][4648] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.604 [INFO][4648] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.609 [INFO][4648] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.619 [INFO][4648] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.619 [INFO][4648] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" host="localhost" Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.624 [INFO][4648] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.632 [INFO][4648] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" host="localhost" Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.643 [INFO][4648] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" host="localhost" Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.643 [INFO][4648] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" host="localhost" Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.643 [INFO][4648] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
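Editor's note, not part of the log: the kubelet "Observed pod startup duration" entry for coredns-674b8bbfcf-fqnlp a few lines earlier reports podStartE2EDuration="39.535085198s" alongside podCreationTimestamp and watchObservedRunningTime. At least for the numbers in this log, that duration is the plain difference of those two timestamps; the stdlib sketch below reproduces it. The time layout is an assumption that matches Go's default time formatting, which is what these fields appear to use.

```go
package main

import (
	"fmt"
	"strings"
	"time"
)

// layout matches Go's default time.Time string format, which these kubelet
// fields appear to use (an assumption, but it parses the values cleanly).
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func parse(s string) time.Time {
	// Drop the monotonic-clock suffix ("m=+44.45...") that kubelet appends.
	if i := strings.Index(s, " m="); i >= 0 {
		s = s[:i]
	}
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values copied from the coredns-674b8bbfcf-fqnlp startup-latency entry above.
	created := parse("2025-08-19 00:25:20 +0000 UTC")
	observed := parse("2025-08-19 00:25:59.535085198 +0000 UTC m=+44.450217680")

	fmt.Println(observed.Sub(created)) // 39.535085198s, the reported podStartE2EDuration
}
```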
Aug 19 00:25:59.693112 containerd[1525]: 2025-08-19 00:25:59.643 [INFO][4648] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" HandleID="k8s-pod-network.6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" Workload="localhost-k8s-coredns--674b8bbfcf--5xxhl-eth0" Aug 19 00:25:59.694104 containerd[1525]: 2025-08-19 00:25:59.653 [INFO][4615] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" Namespace="kube-system" Pod="coredns-674b8bbfcf-5xxhl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5xxhl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--5xxhl-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"ab6c6de2-acbf-4715-972b-2d8064a58b86", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-5xxhl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia86aa4f3ff0", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:25:59.694104 containerd[1525]: 2025-08-19 00:25:59.654 [INFO][4615] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" Namespace="kube-system" Pod="coredns-674b8bbfcf-5xxhl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5xxhl-eth0" Aug 19 00:25:59.694104 containerd[1525]: 2025-08-19 00:25:59.654 [INFO][4615] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia86aa4f3ff0 ContainerID="6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" Namespace="kube-system" Pod="coredns-674b8bbfcf-5xxhl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5xxhl-eth0" Aug 19 00:25:59.694104 containerd[1525]: 2025-08-19 00:25:59.662 [INFO][4615] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" Namespace="kube-system" Pod="coredns-674b8bbfcf-5xxhl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5xxhl-eth0" Aug 19 00:25:59.694104 
containerd[1525]: 2025-08-19 00:25:59.664 [INFO][4615] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" Namespace="kube-system" Pod="coredns-674b8bbfcf-5xxhl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5xxhl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--5xxhl-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"ab6c6de2-acbf-4715-972b-2d8064a58b86", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a", Pod:"coredns-674b8bbfcf-5xxhl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia86aa4f3ff0", MAC:"8e:a0:ad:97:51:ca", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:25:59.694104 containerd[1525]: 2025-08-19 00:25:59.687 [INFO][4615] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" Namespace="kube-system" Pod="coredns-674b8bbfcf-5xxhl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5xxhl-eth0" Aug 19 00:25:59.695992 containerd[1525]: time="2025-08-19T00:25:59.695867719Z" level=info msg="StartContainer for \"8108c4b2c45ba55b477ef61465de37cc423808d1dbe85878912de796754935ea\" returns successfully" Aug 19 00:25:59.737067 containerd[1525]: time="2025-08-19T00:25:59.736946889Z" level=info msg="connecting to shim 6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a" address="unix:///run/containerd/s/f70787956b1a5e791d76e2aaa2d70e3bafb0bcc9df02125cb6bfd097b8b31087" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:25:59.759895 systemd-networkd[1433]: calia507e9ea002: Link UP Aug 19 00:25:59.760047 systemd-networkd[1433]: calia507e9ea002: Gained carrier Aug 19 00:25:59.789511 systemd[1]: Started cri-containerd-6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a.scope - libcontainer container 6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a. 
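In the WorkloadEndpoint dumps above, container ports are printed as Go hex literals: 0x35 is port 53 (the dns and dns-tcp ports) and 0x23c1 is 9153, the conventional CoreDNS metrics port. A trivial conversion snippet, purely as a reading aid:

```go
// hexports.go — decode the hex port values printed in the WorkloadEndpoint
// dump above (Port:0x35, Port:0x23c1) into decimal.
package main

import "fmt"

func main() {
	fmt.Println(0x35)   // 53   (dns / dns-tcp)
	fmt.Println(0x23c1) // 9153 (metrics)
}
```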
Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.520 [INFO][4627] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6469f8695c--tr54d-eth0 calico-apiserver-6469f8695c- calico-apiserver cae57906-475d-4518-b0c5-5f0aecff80c9 805 0 2025-08-19 00:25:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6469f8695c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6469f8695c-tr54d eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia507e9ea002 [] [] }} ContainerID="e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-tr54d" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--tr54d-" Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.520 [INFO][4627] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-tr54d" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--tr54d-eth0" Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.589 [INFO][4646] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" HandleID="k8s-pod-network.e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" Workload="localhost-k8s-calico--apiserver--6469f8695c--tr54d-eth0" Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.589 [INFO][4646] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" HandleID="k8s-pod-network.e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" Workload="localhost-k8s-calico--apiserver--6469f8695c--tr54d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001375d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6469f8695c-tr54d", "timestamp":"2025-08-19 00:25:59.589696612 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.590 [INFO][4646] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.643 [INFO][4646] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.643 [INFO][4646] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.687 [INFO][4646] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" host="localhost" Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.700 [INFO][4646] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.708 [INFO][4646] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.714 [INFO][4646] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.720 [INFO][4646] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.720 [INFO][4646] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" host="localhost" Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.727 [INFO][4646] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2 Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.736 [INFO][4646] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" host="localhost" Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.751 [INFO][4646] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" host="localhost" Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.751 [INFO][4646] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" host="localhost" Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.751 [INFO][4646] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
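The two interleaved IPAM flows above show the host-wide IPAM lock serializing concurrent requests: process [4646] (calico-apiserver pod) logs "About to acquire" at 00:25:59.590 but only logs "Acquired" at 00:25:59.643, the moment [4648] (coredns pod) releases. The sketch below reproduces that serialization pattern with a plain mutex; it is an illustration of the logged behaviour only, not Calico's actual host-wide lock, and the pod names and starting address are taken from the log.

```go
// hostlock.go — sketch of two concurrent address requests serialized by a
// host-wide lock, mirroring the interleaving of [4648] and [4646] above.
// A plain sync.Mutex stands in for Calico's real host-wide IPAM lock.
package main

import (
	"fmt"
	"sync"
)

var (
	hostWide sync.Mutex
	next     = 133 // next free host part in 192.168.88.128/26, per the log
)

func assign(pod string, wg *sync.WaitGroup) {
	defer wg.Done()
	fmt.Printf("[%s] about to acquire host-wide IPAM lock\n", pod)
	hostWide.Lock()
	fmt.Printf("[%s] acquired host-wide IPAM lock\n", pod)
	ip := fmt.Sprintf("192.168.88.%d/26", next)
	next++
	hostWide.Unlock()
	fmt.Printf("[%s] released lock, assigned %s\n", pod, ip)
}

func main() {
	var wg sync.WaitGroup
	wg.Add(2)
	go assign("coredns-674b8bbfcf-5xxhl", &wg)          // [4648] in the log
	go assign("calico-apiserver-6469f8695c-tr54d", &wg) // [4646] in the log
	wg.Wait()
}
```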
Aug 19 00:25:59.798637 containerd[1525]: 2025-08-19 00:25:59.751 [INFO][4646] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" HandleID="k8s-pod-network.e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" Workload="localhost-k8s-calico--apiserver--6469f8695c--tr54d-eth0" Aug 19 00:25:59.799773 containerd[1525]: 2025-08-19 00:25:59.756 [INFO][4627] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-tr54d" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--tr54d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6469f8695c--tr54d-eth0", GenerateName:"calico-apiserver-6469f8695c-", Namespace:"calico-apiserver", SelfLink:"", UID:"cae57906-475d-4518-b0c5-5f0aecff80c9", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6469f8695c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6469f8695c-tr54d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia507e9ea002", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:25:59.799773 containerd[1525]: 2025-08-19 00:25:59.756 [INFO][4627] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-tr54d" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--tr54d-eth0" Aug 19 00:25:59.799773 containerd[1525]: 2025-08-19 00:25:59.756 [INFO][4627] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia507e9ea002 ContainerID="e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-tr54d" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--tr54d-eth0" Aug 19 00:25:59.799773 containerd[1525]: 2025-08-19 00:25:59.758 [INFO][4627] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-tr54d" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--tr54d-eth0" Aug 19 00:25:59.799773 containerd[1525]: 2025-08-19 00:25:59.759 [INFO][4627] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-tr54d" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--tr54d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6469f8695c--tr54d-eth0", GenerateName:"calico-apiserver-6469f8695c-", Namespace:"calico-apiserver", SelfLink:"", UID:"cae57906-475d-4518-b0c5-5f0aecff80c9", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6469f8695c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2", Pod:"calico-apiserver-6469f8695c-tr54d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia507e9ea002", MAC:"ee:c2:d8:2b:2c:73", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:25:59.799773 containerd[1525]: 2025-08-19 00:25:59.787 [INFO][4627] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-tr54d" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--tr54d-eth0" Aug 19 00:25:59.824524 systemd-resolved[1347]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 00:25:59.838933 containerd[1525]: time="2025-08-19T00:25:59.838872835Z" level=info msg="connecting to shim e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2" address="unix:///run/containerd/s/0c02957db49bce4106f2dbd5eda4cfc4cc05071f26c55100ba55d0015975546d" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:25:59.871480 systemd[1]: Started cri-containerd-e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2.scope - libcontainer container e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2. 
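The endpoint updates above record veth MACs for the workloads (ee:c2:d8:2b:2c:73 here, 8e:a0:ad:97:51:ca for the coredns endpoint earlier). Both have the locally-administered bit set in the first octet, as is typical for software-generated veth MACs. A quick stdlib check, for illustration only:

```go
// macbits.go — parse the veth MACs from the log above and report whether the
// locally-administered bit (0x02 in the first octet) is set.
package main

import (
	"fmt"
	"net"
)

func main() {
	for _, s := range []string{"ee:c2:d8:2b:2c:73", "8e:a0:ad:97:51:ca"} {
		hw, err := net.ParseMAC(s)
		if err != nil {
			panic(err)
		}
		local := hw[0]&0x02 != 0
		unicast := hw[0]&0x01 == 0
		fmt.Printf("%s locally-administered=%v unicast=%v\n", hw, local, unicast)
	}
}
```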
Aug 19 00:25:59.880613 containerd[1525]: time="2025-08-19T00:25:59.880511886Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5xxhl,Uid:ab6c6de2-acbf-4715-972b-2d8064a58b86,Namespace:kube-system,Attempt:0,} returns sandbox id \"6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a\"" Aug 19 00:25:59.889533 containerd[1525]: time="2025-08-19T00:25:59.889488688Z" level=info msg="CreateContainer within sandbox \"6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 19 00:25:59.900317 systemd-resolved[1347]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 00:25:59.915259 containerd[1525]: time="2025-08-19T00:25:59.915194894Z" level=info msg="Container f1647a3379a451aacc0d69712b38b14c2c6d132a9122a85ab29cc6625d72a645: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:25:59.930998 containerd[1525]: time="2025-08-19T00:25:59.930407138Z" level=info msg="CreateContainer within sandbox \"6cf2c2592a7ad2eedb125bdc7208cb334c21126487684ba04b6fde12e0fbe27a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f1647a3379a451aacc0d69712b38b14c2c6d132a9122a85ab29cc6625d72a645\"" Aug 19 00:25:59.932331 containerd[1525]: time="2025-08-19T00:25:59.931055458Z" level=info msg="StartContainer for \"f1647a3379a451aacc0d69712b38b14c2c6d132a9122a85ab29cc6625d72a645\"" Aug 19 00:25:59.934686 containerd[1525]: time="2025-08-19T00:25:59.934183419Z" level=info msg="connecting to shim f1647a3379a451aacc0d69712b38b14c2c6d132a9122a85ab29cc6625d72a645" address="unix:///run/containerd/s/f70787956b1a5e791d76e2aaa2d70e3bafb0bcc9df02125cb6bfd097b8b31087" protocol=ttrpc version=3 Aug 19 00:25:59.937758 containerd[1525]: time="2025-08-19T00:25:59.937510620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6469f8695c-tr54d,Uid:cae57906-475d-4518-b0c5-5f0aecff80c9,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2\"" Aug 19 00:25:59.974488 systemd[1]: Started cri-containerd-f1647a3379a451aacc0d69712b38b14c2c6d132a9122a85ab29cc6625d72a645.scope - libcontainer container f1647a3379a451aacc0d69712b38b14c2c6d132a9122a85ab29cc6625d72a645. 
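The repeated "connecting to shim … address=unix:///run/containerd/s/… protocol=ttrpc version=3" entries above describe containerd dialing a per-sandbox shim socket. The sketch below shows only the transport step (connecting to a unix socket path) with the standard library; the real client then speaks ttrpc over that connection, which is not reproduced here, and the socket path is copied from the log, so the dial will fail on any other machine.

```go
// shimdial.go — sketch of the transport step behind the "connecting to shim"
// entries above: dial the unix socket path. The real containerd client layers
// the ttrpc protocol on top of this connection; that part is omitted.
package main

import (
	"fmt"
	"net"
	"strings"
	"time"
)

func main() {
	// Address as printed in the log; off this host the path will not exist.
	addr := "unix:///run/containerd/s/f70787956b1a5e791d76e2aaa2d70e3bafb0bcc9df02125cb6bfd097b8b31087"
	path := strings.TrimPrefix(addr, "unix://")

	conn, err := net.DialTimeout("unix", path, 2*time.Second)
	if err != nil {
		fmt.Println("dial failed (expected off-host):", err)
		return
	}
	defer conn.Close()
	fmt.Println("connected to shim socket", path)
}
```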
Aug 19 00:26:00.014417 containerd[1525]: time="2025-08-19T00:26:00.014297639Z" level=info msg="StartContainer for \"f1647a3379a451aacc0d69712b38b14c2c6d132a9122a85ab29cc6625d72a645\" returns successfully" Aug 19 00:26:00.213541 systemd-networkd[1433]: cali91ea471764f: Gained IPv6LL Aug 19 00:26:00.561590 containerd[1525]: time="2025-08-19T00:26:00.561549209Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8108c4b2c45ba55b477ef61465de37cc423808d1dbe85878912de796754935ea\" id:\"96dbef722a88491cab3871a606efc9ee1e17545e04e3fa9f4fe832a7f7674bec\" pid:4878 exited_at:{seconds:1755563160 nanos:546650806}" Aug 19 00:26:00.613231 containerd[1525]: time="2025-08-19T00:26:00.612301301Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:26:00.615325 containerd[1525]: time="2025-08-19T00:26:00.615275942Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8225702" Aug 19 00:26:00.617045 containerd[1525]: time="2025-08-19T00:26:00.616994702Z" level=info msg="ImageCreate event name:\"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:26:00.620189 containerd[1525]: time="2025-08-19T00:26:00.620145983Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:26:00.621090 containerd[1525]: time="2025-08-19T00:26:00.620925903Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"9594943\" in 1.118197273s" Aug 19 00:26:00.621090 containerd[1525]: time="2025-08-19T00:26:00.620961383Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\"" Aug 19 00:26:00.622073 containerd[1525]: time="2025-08-19T00:26:00.622044984Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 00:26:00.625429 containerd[1525]: time="2025-08-19T00:26:00.625246544Z" level=info msg="CreateContainer within sandbox \"8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Aug 19 00:26:00.632847 kubelet[2683]: I0819 00:26:00.632768 2683 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6ffc4747bd-f9n7m" podStartSLOduration=22.792466731 podStartE2EDuration="24.632750466s" podCreationTimestamp="2025-08-19 00:25:36 +0000 UTC" firstStartedPulling="2025-08-19 00:25:57.662256095 +0000 UTC m=+42.577388617" lastFinishedPulling="2025-08-19 00:25:59.50253983 +0000 UTC m=+44.417672352" observedRunningTime="2025-08-19 00:26:00.498354234 +0000 UTC m=+45.413486756" watchObservedRunningTime="2025-08-19 00:26:00.632750466 +0000 UTC m=+45.547882988" Aug 19 00:26:00.633312 kubelet[2683]: I0819 00:26:00.633107 2683 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-5xxhl" podStartSLOduration=40.633102146 podStartE2EDuration="40.633102146s" 
podCreationTimestamp="2025-08-19 00:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:26:00.632401386 +0000 UTC m=+45.547533948" watchObservedRunningTime="2025-08-19 00:26:00.633102146 +0000 UTC m=+45.548234668" Aug 19 00:26:00.646587 containerd[1525]: time="2025-08-19T00:26:00.646542069Z" level=info msg="Container e850ae55556c9dfac1ed9a33da4f309f4d0b4e35bc0383b2aa62fb6393fe5281: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:26:00.676876 containerd[1525]: time="2025-08-19T00:26:00.676133596Z" level=info msg="CreateContainer within sandbox \"8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"e850ae55556c9dfac1ed9a33da4f309f4d0b4e35bc0383b2aa62fb6393fe5281\"" Aug 19 00:26:00.677632 containerd[1525]: time="2025-08-19T00:26:00.677179317Z" level=info msg="StartContainer for \"e850ae55556c9dfac1ed9a33da4f309f4d0b4e35bc0383b2aa62fb6393fe5281\"" Aug 19 00:26:00.681793 containerd[1525]: time="2025-08-19T00:26:00.681166838Z" level=info msg="connecting to shim e850ae55556c9dfac1ed9a33da4f309f4d0b4e35bc0383b2aa62fb6393fe5281" address="unix:///run/containerd/s/391a765eace6a48793dd8371fa9779d85e548ea3946da0e7194951371f45e5bf" protocol=ttrpc version=3 Aug 19 00:26:00.710437 systemd[1]: Started cri-containerd-e850ae55556c9dfac1ed9a33da4f309f4d0b4e35bc0383b2aa62fb6393fe5281.scope - libcontainer container e850ae55556c9dfac1ed9a33da4f309f4d0b4e35bc0383b2aa62fb6393fe5281. Aug 19 00:26:00.751230 containerd[1525]: time="2025-08-19T00:26:00.751161854Z" level=info msg="StartContainer for \"e850ae55556c9dfac1ed9a33da4f309f4d0b4e35bc0383b2aa62fb6393fe5281\" returns successfully" Aug 19 00:26:01.173542 systemd-networkd[1433]: calia507e9ea002: Gained IPv6LL Aug 19 00:26:01.185446 containerd[1525]: time="2025-08-19T00:26:01.185393675Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-5lvzf,Uid:8f040549-5eba-4e69-b6a1-39fea6377b08,Namespace:calico-system,Attempt:0,}" Aug 19 00:26:01.185676 containerd[1525]: time="2025-08-19T00:26:01.185410475Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6469f8695c-d2lkb,Uid:92768be8-1d70-4e16-8a82-57ab317927cd,Namespace:calico-apiserver,Attempt:0,}" Aug 19 00:26:01.375565 systemd-networkd[1433]: calib20575573b8: Link UP Aug 19 00:26:01.376904 systemd-networkd[1433]: calib20575573b8: Gained carrier Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.247 [INFO][4922] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--768f4c5c69--5lvzf-eth0 goldmane-768f4c5c69- calico-system 8f040549-5eba-4e69-b6a1-39fea6377b08 804 0 2025-08-19 00:25:35 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:768f4c5c69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-768f4c5c69-5lvzf eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calib20575573b8 [] [] }} ContainerID="185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" Namespace="calico-system" Pod="goldmane-768f4c5c69-5lvzf" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5lvzf-" Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.248 [INFO][4922] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" Namespace="calico-system" Pod="goldmane-768f4c5c69-5lvzf" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5lvzf-eth0" Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.289 [INFO][4950] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" HandleID="k8s-pod-network.185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" Workload="localhost-k8s-goldmane--768f4c5c69--5lvzf-eth0" Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.289 [INFO][4950] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" HandleID="k8s-pod-network.185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" Workload="localhost-k8s-goldmane--768f4c5c69--5lvzf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000223400), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-768f4c5c69-5lvzf", "timestamp":"2025-08-19 00:26:01.289599898 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.289 [INFO][4950] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.289 [INFO][4950] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.289 [INFO][4950] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.303 [INFO][4950] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" host="localhost" Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.315 [INFO][4950] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.322 [INFO][4950] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.325 [INFO][4950] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.331 [INFO][4950] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.332 [INFO][4950] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" host="localhost" Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.334 [INFO][4950] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5 Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.341 [INFO][4950] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" host="localhost" Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.363 [INFO][4950] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" host="localhost" Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.363 [INFO][4950] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" host="localhost" Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.363 [INFO][4950] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:26:01.399313 containerd[1525]: 2025-08-19 00:26:01.363 [INFO][4950] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" HandleID="k8s-pod-network.185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" Workload="localhost-k8s-goldmane--768f4c5c69--5lvzf-eth0" Aug 19 00:26:01.401149 containerd[1525]: 2025-08-19 00:26:01.371 [INFO][4922] cni-plugin/k8s.go 418: Populated endpoint ContainerID="185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" Namespace="calico-system" Pod="goldmane-768f4c5c69-5lvzf" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5lvzf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--768f4c5c69--5lvzf-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"8f040549-5eba-4e69-b6a1-39fea6377b08", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-768f4c5c69-5lvzf", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib20575573b8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:26:01.401149 containerd[1525]: 2025-08-19 00:26:01.371 [INFO][4922] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" Namespace="calico-system" Pod="goldmane-768f4c5c69-5lvzf" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5lvzf-eth0" Aug 19 00:26:01.401149 containerd[1525]: 2025-08-19 00:26:01.371 [INFO][4922] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib20575573b8 ContainerID="185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" Namespace="calico-system" Pod="goldmane-768f4c5c69-5lvzf" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5lvzf-eth0" Aug 19 00:26:01.401149 containerd[1525]: 2025-08-19 00:26:01.377 [INFO][4922] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" Namespace="calico-system" Pod="goldmane-768f4c5c69-5lvzf" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5lvzf-eth0" Aug 19 00:26:01.401149 containerd[1525]: 2025-08-19 00:26:01.378 [INFO][4922] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" Namespace="calico-system" Pod="goldmane-768f4c5c69-5lvzf" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5lvzf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--768f4c5c69--5lvzf-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"8f040549-5eba-4e69-b6a1-39fea6377b08", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5", Pod:"goldmane-768f4c5c69-5lvzf", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib20575573b8", MAC:"be:6c:cd:3b:94:16", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:26:01.401149 containerd[1525]: 2025-08-19 00:26:01.395 [INFO][4922] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" Namespace="calico-system" Pod="goldmane-768f4c5c69-5lvzf" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5lvzf-eth0" Aug 19 00:26:01.429399 systemd-networkd[1433]: calia86aa4f3ff0: Gained IPv6LL Aug 19 00:26:01.448781 containerd[1525]: time="2025-08-19T00:26:01.447996293Z" level=info msg="connecting to shim 185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5" address="unix:///run/containerd/s/4f03c3e8204abad51a7c978a0876e2ec169b4fe23395611d620f057ff8696567" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:26:01.468916 systemd-networkd[1433]: cali35712aeb5a0: Link UP Aug 19 00:26:01.470106 systemd-networkd[1433]: cali35712aeb5a0: Gained carrier Aug 19 00:26:01.499957 systemd[1]: Started cri-containerd-185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5.scope - libcontainer container 185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5. 
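systemd-networkd reports "Link UP" and "Gained carrier" for the host-side Calico veths as each workload is wired up (calia507e9ea002, calib20575573b8 and cali35712aeb5a0 above). As a small stdlib illustration, the sketch below lists interfaces whose name starts with "cali" and reports whether they are up; it is only meaningful when run on the node that produced this log.

```go
// caliup.go — list host-side Calico veths (cali*) and whether they are up,
// mirroring the "Link UP" / "Gained carrier" entries above. Only meaningful
// on the node that produced this log.
package main

import (
	"fmt"
	"net"
	"strings"
)

func main() {
	ifaces, err := net.Interfaces()
	if err != nil {
		panic(err)
	}
	for _, ifc := range ifaces {
		if !strings.HasPrefix(ifc.Name, "cali") {
			continue
		}
		up := ifc.Flags&net.FlagUp != 0
		fmt.Printf("%-16s up=%v mtu=%d mac=%s\n", ifc.Name, up, ifc.MTU, ifc.HardwareAddr)
	}
}
```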
Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.254 [INFO][4932] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6469f8695c--d2lkb-eth0 calico-apiserver-6469f8695c- calico-apiserver 92768be8-1d70-4e16-8a82-57ab317927cd 806 0 2025-08-19 00:25:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6469f8695c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6469f8695c-d2lkb eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali35712aeb5a0 [] [] }} ContainerID="0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-d2lkb" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--d2lkb-" Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.255 [INFO][4932] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-d2lkb" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--d2lkb-eth0" Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.294 [INFO][4957] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" HandleID="k8s-pod-network.0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" Workload="localhost-k8s-calico--apiserver--6469f8695c--d2lkb-eth0" Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.295 [INFO][4957] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" HandleID="k8s-pod-network.0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" Workload="localhost-k8s-calico--apiserver--6469f8695c--d2lkb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400034b9b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6469f8695c-d2lkb", "timestamp":"2025-08-19 00:26:01.294908859 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.295 [INFO][4957] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.364 [INFO][4957] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.364 [INFO][4957] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.402 [INFO][4957] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" host="localhost" Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.417 [INFO][4957] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.425 [INFO][4957] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.430 [INFO][4957] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.436 [INFO][4957] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.436 [INFO][4957] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" host="localhost" Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.440 [INFO][4957] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.445 [INFO][4957] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" host="localhost" Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.458 [INFO][4957] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" host="localhost" Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.459 [INFO][4957] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" host="localhost" Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.459 [INFO][4957] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 00:26:01.505535 containerd[1525]: 2025-08-19 00:26:01.459 [INFO][4957] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" HandleID="k8s-pod-network.0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" Workload="localhost-k8s-calico--apiserver--6469f8695c--d2lkb-eth0" Aug 19 00:26:01.506368 containerd[1525]: 2025-08-19 00:26:01.464 [INFO][4932] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-d2lkb" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--d2lkb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6469f8695c--d2lkb-eth0", GenerateName:"calico-apiserver-6469f8695c-", Namespace:"calico-apiserver", SelfLink:"", UID:"92768be8-1d70-4e16-8a82-57ab317927cd", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6469f8695c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6469f8695c-d2lkb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali35712aeb5a0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:26:01.506368 containerd[1525]: 2025-08-19 00:26:01.464 [INFO][4932] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-d2lkb" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--d2lkb-eth0" Aug 19 00:26:01.506368 containerd[1525]: 2025-08-19 00:26:01.464 [INFO][4932] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali35712aeb5a0 ContainerID="0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-d2lkb" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--d2lkb-eth0" Aug 19 00:26:01.506368 containerd[1525]: 2025-08-19 00:26:01.469 [INFO][4932] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-d2lkb" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--d2lkb-eth0" Aug 19 00:26:01.506368 containerd[1525]: 2025-08-19 00:26:01.472 [INFO][4932] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-d2lkb" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--d2lkb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6469f8695c--d2lkb-eth0", GenerateName:"calico-apiserver-6469f8695c-", Namespace:"calico-apiserver", SelfLink:"", UID:"92768be8-1d70-4e16-8a82-57ab317927cd", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 25, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6469f8695c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed", Pod:"calico-apiserver-6469f8695c-d2lkb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali35712aeb5a0", MAC:"3a:2e:41:0d:8a:7a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:26:01.506368 containerd[1525]: 2025-08-19 00:26:01.500 [INFO][4932] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" Namespace="calico-apiserver" Pod="calico-apiserver-6469f8695c-d2lkb" WorkloadEndpoint="localhost-k8s-calico--apiserver--6469f8695c--d2lkb-eth0" Aug 19 00:26:01.521238 systemd-resolved[1347]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 00:26:01.580785 containerd[1525]: time="2025-08-19T00:26:01.580679323Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-5lvzf,Uid:8f040549-5eba-4e69-b6a1-39fea6377b08,Namespace:calico-system,Attempt:0,} returns sandbox id \"185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5\"" Aug 19 00:26:01.610889 containerd[1525]: time="2025-08-19T00:26:01.610737129Z" level=info msg="connecting to shim 0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed" address="unix:///run/containerd/s/0a08042813d81b44c7e83bb1456a34dcf1b9fe3800d0e8bc117a6b32625492a8" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:26:01.648473 systemd[1]: Started cri-containerd-0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed.scope - libcontainer container 0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed. 
Aug 19 00:26:01.669810 systemd-resolved[1347]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 00:26:01.707024 containerd[1525]: time="2025-08-19T00:26:01.706895271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6469f8695c-d2lkb,Uid:92768be8-1d70-4e16-8a82-57ab317927cd,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed\"" Aug 19 00:26:02.576650 containerd[1525]: time="2025-08-19T00:26:02.576599936Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:26:02.577741 containerd[1525]: time="2025-08-19T00:26:02.577707097Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=44517149" Aug 19 00:26:02.579480 containerd[1525]: time="2025-08-19T00:26:02.579420857Z" level=info msg="ImageCreate event name:\"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:26:02.581805 containerd[1525]: time="2025-08-19T00:26:02.581774058Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:26:02.582741 containerd[1525]: time="2025-08-19T00:26:02.582473298Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 1.960387114s" Aug 19 00:26:02.582741 containerd[1525]: time="2025-08-19T00:26:02.582515778Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Aug 19 00:26:02.583562 containerd[1525]: time="2025-08-19T00:26:02.583538178Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Aug 19 00:26:02.588422 containerd[1525]: time="2025-08-19T00:26:02.588384059Z" level=info msg="CreateContainer within sandbox \"e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 00:26:02.607233 containerd[1525]: time="2025-08-19T00:26:02.606489943Z" level=info msg="Container 72df6a55e5998ef528f7b960fc66f2a4fbaf810711b7360fd296d793fe03d60e: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:26:02.616960 containerd[1525]: time="2025-08-19T00:26:02.616903745Z" level=info msg="CreateContainer within sandbox \"e990a1f8ec7dabd85302c7fbd787a707496ef164e08b63cd29e2ab75117feed2\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"72df6a55e5998ef528f7b960fc66f2a4fbaf810711b7360fd296d793fe03d60e\"" Aug 19 00:26:02.617949 containerd[1525]: time="2025-08-19T00:26:02.617886665Z" level=info msg="StartContainer for \"72df6a55e5998ef528f7b960fc66f2a4fbaf810711b7360fd296d793fe03d60e\"" Aug 19 00:26:02.627941 containerd[1525]: time="2025-08-19T00:26:02.627899187Z" level=info msg="connecting to shim 72df6a55e5998ef528f7b960fc66f2a4fbaf810711b7360fd296d793fe03d60e" 
address="unix:///run/containerd/s/0c02957db49bce4106f2dbd5eda4cfc4cc05071f26c55100ba55d0015975546d" protocol=ttrpc version=3 Aug 19 00:26:02.650402 systemd[1]: Started cri-containerd-72df6a55e5998ef528f7b960fc66f2a4fbaf810711b7360fd296d793fe03d60e.scope - libcontainer container 72df6a55e5998ef528f7b960fc66f2a4fbaf810711b7360fd296d793fe03d60e. Aug 19 00:26:02.795568 containerd[1525]: time="2025-08-19T00:26:02.795526582Z" level=info msg="StartContainer for \"72df6a55e5998ef528f7b960fc66f2a4fbaf810711b7360fd296d793fe03d60e\" returns successfully" Aug 19 00:26:02.965421 systemd-networkd[1433]: cali35712aeb5a0: Gained IPv6LL Aug 19 00:26:03.349814 systemd-networkd[1433]: calib20575573b8: Gained IPv6LL Aug 19 00:26:03.529301 kubelet[2683]: I0819 00:26:03.529169 2683 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6469f8695c-tr54d" podStartSLOduration=29.885960451 podStartE2EDuration="32.529129728s" podCreationTimestamp="2025-08-19 00:25:31 +0000 UTC" firstStartedPulling="2025-08-19 00:25:59.940184781 +0000 UTC m=+44.855317303" lastFinishedPulling="2025-08-19 00:26:02.583354098 +0000 UTC m=+47.498486580" observedRunningTime="2025-08-19 00:26:03.524777248 +0000 UTC m=+48.439909770" watchObservedRunningTime="2025-08-19 00:26:03.529129728 +0000 UTC m=+48.444262250" Aug 19 00:26:03.916062 systemd[1]: Started sshd@8-10.0.0.116:22-10.0.0.1:47942.service - OpenSSH per-connection server daemon (10.0.0.1:47942). Aug 19 00:26:03.994878 sshd[5133]: Accepted publickey for core from 10.0.0.1 port 47942 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:26:04.007472 sshd-session[5133]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:26:04.015599 systemd-logind[1507]: New session 9 of user core. Aug 19 00:26:04.025507 systemd[1]: Started session-9.scope - Session 9 of User core. 
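The sshd entry above identifies the accepted key by an OpenSSH-style fingerprint: "SHA256:" followed by the unpadded base64 of the SHA-256 digest of the raw public-key blob. The sketch below computes such a fingerprint from an authorized_keys-style line; the key material in the example is a placeholder, not the key from this log, so the printed fingerprint will of course differ.

```go
// fingerprint.go — compute an OpenSSH-style SHA256 key fingerprint
// ("SHA256:" + unpadded base64 of sha256(raw key blob)) from an
// authorized_keys-style line. The key below is a placeholder, not the key
// behind the fingerprint in the log above.
package main

import (
	"crypto/sha256"
	"encoding/base64"
	"fmt"
	"strings"
)

func fingerprint(authorizedKeysLine string) (string, error) {
	fields := strings.Fields(authorizedKeysLine)
	if len(fields) < 2 {
		return "", fmt.Errorf("malformed key line")
	}
	blob, err := base64.StdEncoding.DecodeString(fields[1]) // raw wire-format key
	if err != nil {
		return "", err
	}
	sum := sha256.Sum256(blob)
	return "SHA256:" + base64.RawStdEncoding.EncodeToString(sum[:]), nil
}

func main() {
	// Placeholder key material, for illustration only.
	blob := []byte("placeholder-public-key-blob")
	line := "ssh-ed25519 " + base64.StdEncoding.EncodeToString(blob) + " core@example"
	fp, err := fingerprint(line)
	if err != nil {
		panic(err)
	}
	fmt.Println(fp)
}
```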
Aug 19 00:26:04.113933 containerd[1525]: time="2025-08-19T00:26:04.113875001Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:26:04.115445 containerd[1525]: time="2025-08-19T00:26:04.115406842Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=13754366" Aug 19 00:26:04.116573 containerd[1525]: time="2025-08-19T00:26:04.116508162Z" level=info msg="ImageCreate event name:\"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:26:04.121307 containerd[1525]: time="2025-08-19T00:26:04.121253243Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:26:04.122819 containerd[1525]: time="2025-08-19T00:26:04.122771843Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"15123559\" in 1.539197985s" Aug 19 00:26:04.122819 containerd[1525]: time="2025-08-19T00:26:04.122816003Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\"" Aug 19 00:26:04.128253 containerd[1525]: time="2025-08-19T00:26:04.128174444Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Aug 19 00:26:04.137941 containerd[1525]: time="2025-08-19T00:26:04.137550366Z" level=info msg="CreateContainer within sandbox \"8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Aug 19 00:26:04.182832 containerd[1525]: time="2025-08-19T00:26:04.182764574Z" level=info msg="Container bf0a598cfab591f52a6b1b808ce7236626b5a44b038df6b4e7bdb6f98d9d514b: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:26:04.205798 containerd[1525]: time="2025-08-19T00:26:04.205636138Z" level=info msg="CreateContainer within sandbox \"8597cc50fef1bac69978b9701c301bd5584a1724d2435b0c27023174daa0b123\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"bf0a598cfab591f52a6b1b808ce7236626b5a44b038df6b4e7bdb6f98d9d514b\"" Aug 19 00:26:04.206669 containerd[1525]: time="2025-08-19T00:26:04.206545659Z" level=info msg="StartContainer for \"bf0a598cfab591f52a6b1b808ce7236626b5a44b038df6b4e7bdb6f98d9d514b\"" Aug 19 00:26:04.209459 containerd[1525]: time="2025-08-19T00:26:04.209217179Z" level=info msg="connecting to shim bf0a598cfab591f52a6b1b808ce7236626b5a44b038df6b4e7bdb6f98d9d514b" address="unix:///run/containerd/s/391a765eace6a48793dd8371fa9779d85e548ea3946da0e7194951371f45e5bf" protocol=ttrpc version=3 Aug 19 00:26:04.248441 systemd[1]: Started cri-containerd-bf0a598cfab591f52a6b1b808ce7236626b5a44b038df6b4e7bdb6f98d9d514b.scope - libcontainer container bf0a598cfab591f52a6b1b808ce7236626b5a44b038df6b4e7bdb6f98d9d514b. 
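The pull records above name images both by tag ("ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2") and by repo digest ("…@sha256:8fec2de1…"). The sketch below is a deliberately naive splitter for such references, just to make the registry/repository/tag/digest parts visible; real tooling should use a proper reference parser rather than string surgery like this.

```go
// imageref.go — naive split of the image references seen in the pull records
// above into registry, repository, tag and digest parts. Illustrative only.
package main

import (
	"fmt"
	"strings"
)

func split(ref string) (registry, repo, tag, digest string) {
	if i := strings.Index(ref, "@"); i >= 0 {
		ref, digest = ref[:i], ref[i+1:]
	}
	if i := strings.LastIndex(ref, ":"); i > strings.LastIndex(ref, "/") {
		ref, tag = ref[:i], ref[i+1:]
	}
	if i := strings.Index(ref, "/"); i >= 0 && strings.ContainsAny(ref[:i], ".:") {
		registry, repo = ref[:i], ref[i+1:]
	} else {
		repo = ref
	}
	return
}

func main() {
	for _, ref := range []string{
		"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2",
		"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b",
	} {
		reg, repo, tag, dig := split(ref)
		fmt.Printf("registry=%s repository=%s tag=%s digest=%s\n", reg, repo, tag, dig)
	}
}
```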
Aug 19 00:26:04.353498 containerd[1525]: time="2025-08-19T00:26:04.353454845Z" level=info msg="StartContainer for \"bf0a598cfab591f52a6b1b808ce7236626b5a44b038df6b4e7bdb6f98d9d514b\" returns successfully" Aug 19 00:26:04.361425 sshd[5136]: Connection closed by 10.0.0.1 port 47942 Aug 19 00:26:04.361955 sshd-session[5133]: pam_unix(sshd:session): session closed for user core Aug 19 00:26:04.366785 systemd-logind[1507]: Session 9 logged out. Waiting for processes to exit. Aug 19 00:26:04.367503 systemd[1]: sshd@8-10.0.0.116:22-10.0.0.1:47942.service: Deactivated successfully. Aug 19 00:26:04.370077 systemd[1]: session-9.scope: Deactivated successfully. Aug 19 00:26:04.371874 systemd-logind[1507]: Removed session 9. Aug 19 00:26:04.535987 kubelet[2683]: I0819 00:26:04.535678 2683 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-4ccpq" podStartSLOduration=24.069308286 podStartE2EDuration="29.535659519s" podCreationTimestamp="2025-08-19 00:25:35 +0000 UTC" firstStartedPulling="2025-08-19 00:25:58.65762353 +0000 UTC m=+43.572756052" lastFinishedPulling="2025-08-19 00:26:04.123974763 +0000 UTC m=+49.039107285" observedRunningTime="2025-08-19 00:26:04.535172119 +0000 UTC m=+49.450304641" watchObservedRunningTime="2025-08-19 00:26:04.535659519 +0000 UTC m=+49.450792041" Aug 19 00:26:05.320417 kubelet[2683]: I0819 00:26:05.320357 2683 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Aug 19 00:26:05.320417 kubelet[2683]: I0819 00:26:05.320425 2683 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Aug 19 00:26:05.706701 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3358093576.mount: Deactivated successfully. 
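The kubelet pod_startup_latency_tracker entries above log both an end-to-end duration and an SLO duration. For the csi-node-driver-4ccpq entry just above, the logged numbers are consistent with E2E = watchObservedRunningTime − podCreationTimestamp and SLO = E2E minus the image-pull window (lastFinishedPulling − firstStartedPulling). The sketch below redoes that arithmetic from the timestamps exactly as logged and reproduces 29.535659519s and 24.069308286s; this is a reading of the logged fields, not a claim about kubelet internals.

```go
// podlatency.go — redo the arithmetic behind the pod_startup_latency_tracker
// entry for csi-node-driver-4ccpq above, using the timestamps as logged.
// Interprets the logged fields only; this is not kubelet's code.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-08-19 00:25:35 +0000 UTC")
	firstPull := mustParse("2025-08-19 00:25:58.65762353 +0000 UTC")
	lastPull := mustParse("2025-08-19 00:26:04.123974763 +0000 UTC")
	observed := mustParse("2025-08-19 00:26:04.535659519 +0000 UTC")

	e2e := observed.Sub(created)    // podStartE2EDuration
	pull := lastPull.Sub(firstPull) // time spent pulling images
	slo := e2e - pull               // matches podStartSLOduration

	fmt.Println("E2E :", e2e)  // 29.535659519s
	fmt.Println("pull:", pull) // 5.466351233s
	fmt.Println("SLO :", slo)  // 24.069308286s
}
```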
Aug 19 00:26:06.300125 containerd[1525]: time="2025-08-19T00:26:06.300071185Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:26:06.301357 containerd[1525]: time="2025-08-19T00:26:06.301315465Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=61838790" Aug 19 00:26:06.302364 containerd[1525]: time="2025-08-19T00:26:06.302298345Z" level=info msg="ImageCreate event name:\"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:26:06.304794 containerd[1525]: time="2025-08-19T00:26:06.304750825Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:26:06.305482 containerd[1525]: time="2025-08-19T00:26:06.305451265Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"61838636\" in 2.177160941s" Aug 19 00:26:06.305529 containerd[1525]: time="2025-08-19T00:26:06.305489625Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\"" Aug 19 00:26:06.306886 containerd[1525]: time="2025-08-19T00:26:06.306638986Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 00:26:06.310663 containerd[1525]: time="2025-08-19T00:26:06.310620506Z" level=info msg="CreateContainer within sandbox \"185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Aug 19 00:26:06.340541 containerd[1525]: time="2025-08-19T00:26:06.340486031Z" level=info msg="Container 128d7f6a60fb3901a10bc81eaefb68365aa4d406e4661fa2a25f1a6f54fc6815: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:26:06.355913 containerd[1525]: time="2025-08-19T00:26:06.355857954Z" level=info msg="CreateContainer within sandbox \"185c7d4fd587cb60a18ee27e9735c076a0bfa4062a230165a10b4a5294eac2d5\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"128d7f6a60fb3901a10bc81eaefb68365aa4d406e4661fa2a25f1a6f54fc6815\"" Aug 19 00:26:06.356581 containerd[1525]: time="2025-08-19T00:26:06.356542754Z" level=info msg="StartContainer for \"128d7f6a60fb3901a10bc81eaefb68365aa4d406e4661fa2a25f1a6f54fc6815\"" Aug 19 00:26:06.358075 containerd[1525]: time="2025-08-19T00:26:06.358032434Z" level=info msg="connecting to shim 128d7f6a60fb3901a10bc81eaefb68365aa4d406e4661fa2a25f1a6f54fc6815" address="unix:///run/containerd/s/4f03c3e8204abad51a7c978a0876e2ec169b4fe23395611d620f057ff8696567" protocol=ttrpc version=3 Aug 19 00:26:06.381483 systemd[1]: Started cri-containerd-128d7f6a60fb3901a10bc81eaefb68365aa4d406e4661fa2a25f1a6f54fc6815.scope - libcontainer container 128d7f6a60fb3901a10bc81eaefb68365aa4d406e4661fa2a25f1a6f54fc6815. 
Aug 19 00:26:06.428503 containerd[1525]: time="2025-08-19T00:26:06.428455685Z" level=info msg="StartContainer for \"128d7f6a60fb3901a10bc81eaefb68365aa4d406e4661fa2a25f1a6f54fc6815\" returns successfully" Aug 19 00:26:06.563883 kubelet[2683]: I0819 00:26:06.563720 2683 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-768f4c5c69-5lvzf" podStartSLOduration=26.840461165 podStartE2EDuration="31.563704027s" podCreationTimestamp="2025-08-19 00:25:35 +0000 UTC" firstStartedPulling="2025-08-19 00:26:01.582906203 +0000 UTC m=+46.498038725" lastFinishedPulling="2025-08-19 00:26:06.306149065 +0000 UTC m=+51.221281587" observedRunningTime="2025-08-19 00:26:06.562809427 +0000 UTC m=+51.477941949" watchObservedRunningTime="2025-08-19 00:26:06.563704027 +0000 UTC m=+51.478836549" Aug 19 00:26:06.643345 containerd[1525]: time="2025-08-19T00:26:06.643251200Z" level=info msg="TaskExit event in podsandbox handler container_id:\"128d7f6a60fb3901a10bc81eaefb68365aa4d406e4661fa2a25f1a6f54fc6815\" id:\"dcc3d8fbc816a45d3e0cf28b7a8c14f92652e26f228f21843a8231df653a48bf\" pid:5248 exit_status:1 exited_at:{seconds:1755563166 nanos:642355000}" Aug 19 00:26:06.845948 containerd[1525]: time="2025-08-19T00:26:06.845821793Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:26:06.846985 containerd[1525]: time="2025-08-19T00:26:06.846944313Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Aug 19 00:26:06.848765 containerd[1525]: time="2025-08-19T00:26:06.848708873Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 542.033647ms" Aug 19 00:26:06.848765 containerd[1525]: time="2025-08-19T00:26:06.848756873Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Aug 19 00:26:06.854533 containerd[1525]: time="2025-08-19T00:26:06.854481954Z" level=info msg="CreateContainer within sandbox \"0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 00:26:06.865238 containerd[1525]: time="2025-08-19T00:26:06.863566955Z" level=info msg="Container 7df0122c6c150b3e98517d68fbaa754da793c93aaeae2b4f8ad31bbf22048a21: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:26:06.878076 containerd[1525]: time="2025-08-19T00:26:06.877949198Z" level=info msg="CreateContainer within sandbox \"0dda6e5b329f8ac2152ced6e983cb4d1080349c86a4564ffd08ed9f43dcba2ed\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"7df0122c6c150b3e98517d68fbaa754da793c93aaeae2b4f8ad31bbf22048a21\"" Aug 19 00:26:06.879553 containerd[1525]: time="2025-08-19T00:26:06.878910118Z" level=info msg="StartContainer for \"7df0122c6c150b3e98517d68fbaa754da793c93aaeae2b4f8ad31bbf22048a21\"" Aug 19 00:26:06.880076 containerd[1525]: time="2025-08-19T00:26:06.880043438Z" level=info msg="connecting to shim 7df0122c6c150b3e98517d68fbaa754da793c93aaeae2b4f8ad31bbf22048a21" 
address="unix:///run/containerd/s/0a08042813d81b44c7e83bb1456a34dcf1b9fe3800d0e8bc117a6b32625492a8" protocol=ttrpc version=3 Aug 19 00:26:06.911459 systemd[1]: Started cri-containerd-7df0122c6c150b3e98517d68fbaa754da793c93aaeae2b4f8ad31bbf22048a21.scope - libcontainer container 7df0122c6c150b3e98517d68fbaa754da793c93aaeae2b4f8ad31bbf22048a21. Aug 19 00:26:06.960062 containerd[1525]: time="2025-08-19T00:26:06.959989771Z" level=info msg="StartContainer for \"7df0122c6c150b3e98517d68fbaa754da793c93aaeae2b4f8ad31bbf22048a21\" returns successfully" Aug 19 00:26:07.557177 kubelet[2683]: I0819 00:26:07.556984 2683 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6469f8695c-d2lkb" podStartSLOduration=31.41686222 podStartE2EDuration="36.556961022s" podCreationTimestamp="2025-08-19 00:25:31 +0000 UTC" firstStartedPulling="2025-08-19 00:26:01.709383591 +0000 UTC m=+46.624516113" lastFinishedPulling="2025-08-19 00:26:06.849482393 +0000 UTC m=+51.764614915" observedRunningTime="2025-08-19 00:26:07.556836822 +0000 UTC m=+52.471969344" watchObservedRunningTime="2025-08-19 00:26:07.556961022 +0000 UTC m=+52.472093544" Aug 19 00:26:07.662822 containerd[1525]: time="2025-08-19T00:26:07.662766358Z" level=info msg="TaskExit event in podsandbox handler container_id:\"128d7f6a60fb3901a10bc81eaefb68365aa4d406e4661fa2a25f1a6f54fc6815\" id:\"d4b1b8b908640cbfe73cc9b96783e6933ed69ad9c5b52b650887499c973685bc\" pid:5310 exit_status:1 exited_at:{seconds:1755563167 nanos:662482358}" Aug 19 00:26:08.529248 kubelet[2683]: I0819 00:26:08.528811 2683 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 00:26:09.380551 systemd[1]: Started sshd@9-10.0.0.116:22-10.0.0.1:47948.service - OpenSSH per-connection server daemon (10.0.0.1:47948). Aug 19 00:26:09.450903 sshd[5324]: Accepted publickey for core from 10.0.0.1 port 47948 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:26:09.452726 sshd-session[5324]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:26:09.459283 systemd-logind[1507]: New session 10 of user core. Aug 19 00:26:09.470446 systemd[1]: Started session-10.scope - Session 10 of User core. Aug 19 00:26:09.738420 sshd[5327]: Connection closed by 10.0.0.1 port 47948 Aug 19 00:26:09.739851 sshd-session[5324]: pam_unix(sshd:session): session closed for user core Aug 19 00:26:09.752877 systemd[1]: sshd@9-10.0.0.116:22-10.0.0.1:47948.service: Deactivated successfully. Aug 19 00:26:09.756956 systemd[1]: session-10.scope: Deactivated successfully. Aug 19 00:26:09.758427 systemd-logind[1507]: Session 10 logged out. Waiting for processes to exit. Aug 19 00:26:09.762814 systemd[1]: Started sshd@10-10.0.0.116:22-10.0.0.1:47954.service - OpenSSH per-connection server daemon (10.0.0.1:47954). Aug 19 00:26:09.767286 systemd-logind[1507]: Removed session 10. Aug 19 00:26:09.845694 sshd[5342]: Accepted publickey for core from 10.0.0.1 port 47954 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:26:09.847519 sshd-session[5342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:26:09.853040 systemd-logind[1507]: New session 11 of user core. Aug 19 00:26:09.858613 systemd[1]: Started session-11.scope - Session 11 of User core. 
Aug 19 00:26:10.114050 sshd[5345]: Connection closed by 10.0.0.1 port 47954 Aug 19 00:26:10.118016 sshd-session[5342]: pam_unix(sshd:session): session closed for user core Aug 19 00:26:10.132117 systemd[1]: sshd@10-10.0.0.116:22-10.0.0.1:47954.service: Deactivated successfully. Aug 19 00:26:10.137059 systemd[1]: session-11.scope: Deactivated successfully. Aug 19 00:26:10.138553 systemd-logind[1507]: Session 11 logged out. Waiting for processes to exit. Aug 19 00:26:10.142852 systemd[1]: Started sshd@11-10.0.0.116:22-10.0.0.1:47968.service - OpenSSH per-connection server daemon (10.0.0.1:47968). Aug 19 00:26:10.146097 systemd-logind[1507]: Removed session 11. Aug 19 00:26:10.217248 sshd[5357]: Accepted publickey for core from 10.0.0.1 port 47968 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:26:10.219233 sshd-session[5357]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:26:10.224355 systemd-logind[1507]: New session 12 of user core. Aug 19 00:26:10.240452 systemd[1]: Started session-12.scope - Session 12 of User core. Aug 19 00:26:10.432259 sshd[5360]: Connection closed by 10.0.0.1 port 47968 Aug 19 00:26:10.434240 sshd-session[5357]: pam_unix(sshd:session): session closed for user core Aug 19 00:26:10.440352 systemd-logind[1507]: Session 12 logged out. Waiting for processes to exit. Aug 19 00:26:10.440430 systemd[1]: sshd@11-10.0.0.116:22-10.0.0.1:47968.service: Deactivated successfully. Aug 19 00:26:10.443504 systemd[1]: session-12.scope: Deactivated successfully. Aug 19 00:26:10.445522 systemd-logind[1507]: Removed session 12. Aug 19 00:26:15.455361 systemd[1]: Started sshd@12-10.0.0.116:22-10.0.0.1:37080.service - OpenSSH per-connection server daemon (10.0.0.1:37080). Aug 19 00:26:15.523000 sshd[5388]: Accepted publickey for core from 10.0.0.1 port 37080 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:26:15.527663 sshd-session[5388]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:26:15.537160 systemd-logind[1507]: New session 13 of user core. Aug 19 00:26:15.556488 systemd[1]: Started session-13.scope - Session 13 of User core. Aug 19 00:26:15.744676 sshd[5391]: Connection closed by 10.0.0.1 port 37080 Aug 19 00:26:15.745488 sshd-session[5388]: pam_unix(sshd:session): session closed for user core Aug 19 00:26:15.750383 systemd[1]: sshd@12-10.0.0.116:22-10.0.0.1:37080.service: Deactivated successfully. Aug 19 00:26:15.753836 systemd[1]: session-13.scope: Deactivated successfully. Aug 19 00:26:15.757192 systemd-logind[1507]: Session 13 logged out. Waiting for processes to exit. Aug 19 00:26:15.760192 systemd-logind[1507]: Removed session 13. Aug 19 00:26:20.765321 systemd[1]: Started sshd@13-10.0.0.116:22-10.0.0.1:37084.service - OpenSSH per-connection server daemon (10.0.0.1:37084). Aug 19 00:26:20.824152 sshd[5406]: Accepted publickey for core from 10.0.0.1 port 37084 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:26:20.826848 sshd-session[5406]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:26:20.839500 systemd-logind[1507]: New session 14 of user core. Aug 19 00:26:20.854473 systemd[1]: Started session-14.scope - Session 14 of User core. 
Aug 19 00:26:21.056866 sshd[5409]: Connection closed by 10.0.0.1 port 37084 Aug 19 00:26:21.056166 sshd-session[5406]: pam_unix(sshd:session): session closed for user core Aug 19 00:26:21.062602 systemd[1]: sshd@13-10.0.0.116:22-10.0.0.1:37084.service: Deactivated successfully. Aug 19 00:26:21.066474 systemd[1]: session-14.scope: Deactivated successfully. Aug 19 00:26:21.073127 systemd-logind[1507]: Session 14 logged out. Waiting for processes to exit. Aug 19 00:26:21.074666 systemd-logind[1507]: Removed session 14. Aug 19 00:26:26.081010 systemd[1]: Started sshd@14-10.0.0.116:22-10.0.0.1:50400.service - OpenSSH per-connection server daemon (10.0.0.1:50400). Aug 19 00:26:26.146405 sshd[5428]: Accepted publickey for core from 10.0.0.1 port 50400 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:26:26.147908 sshd-session[5428]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:26:26.152343 systemd-logind[1507]: New session 15 of user core. Aug 19 00:26:26.164499 systemd[1]: Started session-15.scope - Session 15 of User core. Aug 19 00:26:26.334845 sshd[5431]: Connection closed by 10.0.0.1 port 50400 Aug 19 00:26:26.335356 sshd-session[5428]: pam_unix(sshd:session): session closed for user core Aug 19 00:26:26.353293 systemd[1]: sshd@14-10.0.0.116:22-10.0.0.1:50400.service: Deactivated successfully. Aug 19 00:26:26.356064 systemd[1]: session-15.scope: Deactivated successfully. Aug 19 00:26:26.360290 systemd-logind[1507]: Session 15 logged out. Waiting for processes to exit. Aug 19 00:26:26.364189 systemd[1]: Started sshd@15-10.0.0.116:22-10.0.0.1:50414.service - OpenSSH per-connection server daemon (10.0.0.1:50414). Aug 19 00:26:26.368195 systemd-logind[1507]: Removed session 15. Aug 19 00:26:26.452235 sshd[5444]: Accepted publickey for core from 10.0.0.1 port 50414 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:26:26.453802 sshd-session[5444]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:26:26.459058 systemd-logind[1507]: New session 16 of user core. Aug 19 00:26:26.470458 systemd[1]: Started session-16.scope - Session 16 of User core. Aug 19 00:26:26.748362 sshd[5447]: Connection closed by 10.0.0.1 port 50414 Aug 19 00:26:26.753538 sshd-session[5444]: pam_unix(sshd:session): session closed for user core Aug 19 00:26:26.770911 systemd[1]: sshd@15-10.0.0.116:22-10.0.0.1:50414.service: Deactivated successfully. Aug 19 00:26:26.772992 systemd[1]: session-16.scope: Deactivated successfully. Aug 19 00:26:26.775002 systemd-logind[1507]: Session 16 logged out. Waiting for processes to exit. Aug 19 00:26:26.777037 systemd[1]: Started sshd@16-10.0.0.116:22-10.0.0.1:50420.service - OpenSSH per-connection server daemon (10.0.0.1:50420). Aug 19 00:26:26.778890 systemd-logind[1507]: Removed session 16. Aug 19 00:26:26.865349 sshd[5459]: Accepted publickey for core from 10.0.0.1 port 50420 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:26:26.866405 sshd-session[5459]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:26:26.874000 systemd-logind[1507]: New session 17 of user core. Aug 19 00:26:26.886424 systemd[1]: Started session-17.scope - Session 17 of User core. 
Aug 19 00:26:27.645388 sshd[5462]: Connection closed by 10.0.0.1 port 50420 Aug 19 00:26:27.645338 sshd-session[5459]: pam_unix(sshd:session): session closed for user core Aug 19 00:26:27.657832 systemd[1]: sshd@16-10.0.0.116:22-10.0.0.1:50420.service: Deactivated successfully. Aug 19 00:26:27.664847 systemd[1]: session-17.scope: Deactivated successfully. Aug 19 00:26:27.667728 systemd-logind[1507]: Session 17 logged out. Waiting for processes to exit. Aug 19 00:26:27.672392 systemd[1]: Started sshd@17-10.0.0.116:22-10.0.0.1:50424.service - OpenSSH per-connection server daemon (10.0.0.1:50424). Aug 19 00:26:27.674411 systemd-logind[1507]: Removed session 17. Aug 19 00:26:27.735270 sshd[5481]: Accepted publickey for core from 10.0.0.1 port 50424 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:26:27.736662 sshd-session[5481]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:26:27.742820 systemd-logind[1507]: New session 18 of user core. Aug 19 00:26:27.762705 systemd[1]: Started session-18.scope - Session 18 of User core. Aug 19 00:26:28.171014 sshd[5484]: Connection closed by 10.0.0.1 port 50424 Aug 19 00:26:28.171467 sshd-session[5481]: pam_unix(sshd:session): session closed for user core Aug 19 00:26:28.181856 systemd[1]: sshd@17-10.0.0.116:22-10.0.0.1:50424.service: Deactivated successfully. Aug 19 00:26:28.186818 systemd[1]: session-18.scope: Deactivated successfully. Aug 19 00:26:28.188400 systemd-logind[1507]: Session 18 logged out. Waiting for processes to exit. Aug 19 00:26:28.194355 systemd[1]: Started sshd@18-10.0.0.116:22-10.0.0.1:50430.service - OpenSSH per-connection server daemon (10.0.0.1:50430). Aug 19 00:26:28.194880 systemd-logind[1507]: Removed session 18. Aug 19 00:26:28.269019 containerd[1525]: time="2025-08-19T00:26:28.268974314Z" level=info msg="TaskExit event in podsandbox handler container_id:\"844bca4e08f7270219c02b3a41d2fef1fb6de30dcd49babbe087d3510850d608\" id:\"0e8aa3cbddba08d848e4030d9f47a9e0ca6611a2f756ba12718474f9ff0ea25b\" pid:5505 exited_at:{seconds:1755563188 nanos:268573911}" Aug 19 00:26:28.274226 sshd[5520]: Accepted publickey for core from 10.0.0.1 port 50430 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:26:28.275704 sshd-session[5520]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:26:28.287306 systemd-logind[1507]: New session 19 of user core. Aug 19 00:26:28.296484 systemd[1]: Started session-19.scope - Session 19 of User core. Aug 19 00:26:28.506038 sshd[5525]: Connection closed by 10.0.0.1 port 50430 Aug 19 00:26:28.506850 sshd-session[5520]: pam_unix(sshd:session): session closed for user core Aug 19 00:26:28.511503 systemd-logind[1507]: Session 19 logged out. Waiting for processes to exit. Aug 19 00:26:28.511745 systemd[1]: sshd@18-10.0.0.116:22-10.0.0.1:50430.service: Deactivated successfully. Aug 19 00:26:28.513979 systemd[1]: session-19.scope: Deactivated successfully. Aug 19 00:26:28.515746 systemd-logind[1507]: Removed session 19. Aug 19 00:26:30.524096 containerd[1525]: time="2025-08-19T00:26:30.523423188Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8108c4b2c45ba55b477ef61465de37cc423808d1dbe85878912de796754935ea\" id:\"5ed433c372733f66661a5307d6c99ea7fc950175a3ceb95ceca9dceda3167d3e\" pid:5552 exited_at:{seconds:1755563190 nanos:523036945}" Aug 19 00:26:33.517822 systemd[1]: Started sshd@19-10.0.0.116:22-10.0.0.1:38152.service - OpenSSH per-connection server daemon (10.0.0.1:38152). 
Aug 19 00:26:33.574561 sshd[5572]: Accepted publickey for core from 10.0.0.1 port 38152 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:26:33.576007 sshd-session[5572]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:26:33.580276 systemd-logind[1507]: New session 20 of user core. Aug 19 00:26:33.594424 systemd[1]: Started session-20.scope - Session 20 of User core. Aug 19 00:26:33.711193 sshd[5575]: Connection closed by 10.0.0.1 port 38152 Aug 19 00:26:33.711579 sshd-session[5572]: pam_unix(sshd:session): session closed for user core Aug 19 00:26:33.715563 systemd[1]: sshd@19-10.0.0.116:22-10.0.0.1:38152.service: Deactivated successfully. Aug 19 00:26:33.718813 systemd[1]: session-20.scope: Deactivated successfully. Aug 19 00:26:33.719504 systemd-logind[1507]: Session 20 logged out. Waiting for processes to exit. Aug 19 00:26:33.720789 systemd-logind[1507]: Removed session 20. Aug 19 00:26:37.621099 containerd[1525]: time="2025-08-19T00:26:37.621055261Z" level=info msg="TaskExit event in podsandbox handler container_id:\"128d7f6a60fb3901a10bc81eaefb68365aa4d406e4661fa2a25f1a6f54fc6815\" id:\"1c513518c233d1bc9b57134e63d19e999097f7cf6b1394c0fd6ce6afef8cc142\" pid:5599 exited_at:{seconds:1755563197 nanos:620481778}" Aug 19 00:26:37.931051 containerd[1525]: time="2025-08-19T00:26:37.931014825Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8108c4b2c45ba55b477ef61465de37cc423808d1dbe85878912de796754935ea\" id:\"ff9dcd9fbb3d76322560cda9627617355acae19ba220090b2a6fa42208e578f2\" pid:5625 exited_at:{seconds:1755563197 nanos:930791463}" Aug 19 00:26:38.186194 kubelet[2683]: I0819 00:26:38.185923 2683 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 00:26:38.726033 systemd[1]: Started sshd@20-10.0.0.116:22-10.0.0.1:38160.service - OpenSSH per-connection server daemon (10.0.0.1:38160). Aug 19 00:26:38.805854 sshd[5638]: Accepted publickey for core from 10.0.0.1 port 38160 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:26:38.807481 sshd-session[5638]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:26:38.811881 systemd-logind[1507]: New session 21 of user core. Aug 19 00:26:38.823464 systemd[1]: Started session-21.scope - Session 21 of User core. Aug 19 00:26:39.014356 sshd[5641]: Connection closed by 10.0.0.1 port 38160 Aug 19 00:26:39.015288 sshd-session[5638]: pam_unix(sshd:session): session closed for user core Aug 19 00:26:39.027633 systemd-logind[1507]: Session 21 logged out. Waiting for processes to exit. Aug 19 00:26:39.028676 systemd[1]: sshd@20-10.0.0.116:22-10.0.0.1:38160.service: Deactivated successfully. Aug 19 00:26:39.035809 systemd[1]: session-21.scope: Deactivated successfully. Aug 19 00:26:39.038642 systemd-logind[1507]: Removed session 21. Aug 19 00:26:44.029063 systemd[1]: Started sshd@21-10.0.0.116:22-10.0.0.1:44798.service - OpenSSH per-connection server daemon (10.0.0.1:44798). Aug 19 00:26:44.121100 sshd[5656]: Accepted publickey for core from 10.0.0.1 port 44798 ssh2: RSA SHA256:KtdM7F0JALreH0qQbeHxcUClgTXNHNzWeYwdEyvS3QA Aug 19 00:26:44.123077 sshd-session[5656]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:26:44.129602 systemd-logind[1507]: New session 22 of user core. Aug 19 00:26:44.143448 systemd[1]: Started session-22.scope - Session 22 of User core. 
Aug 19 00:26:44.304715 sshd[5659]: Connection closed by 10.0.0.1 port 44798 Aug 19 00:26:44.305472 sshd-session[5656]: pam_unix(sshd:session): session closed for user core Aug 19 00:26:44.309466 systemd[1]: sshd@21-10.0.0.116:22-10.0.0.1:44798.service: Deactivated successfully. Aug 19 00:26:44.311665 systemd[1]: session-22.scope: Deactivated successfully. Aug 19 00:26:44.312980 systemd-logind[1507]: Session 22 logged out. Waiting for processes to exit. Aug 19 00:26:44.316983 systemd-logind[1507]: Removed session 22.