Sep 6 09:55:12.831963 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Sat Sep 6 08:10:27 -00 2025 Sep 6 09:55:12.831986 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=c69e1af3de5da13c6be143e3ab518bb528f3fa67a9a488f7fa132987ebb99256 Sep 6 09:55:12.831996 kernel: BIOS-provided physical RAM map: Sep 6 09:55:12.832002 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable Sep 6 09:55:12.832009 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved Sep 6 09:55:12.832015 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved Sep 6 09:55:12.832023 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000009cfdbfff] usable Sep 6 09:55:12.832030 kernel: BIOS-e820: [mem 0x000000009cfdc000-0x000000009cffffff] reserved Sep 6 09:55:12.832042 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved Sep 6 09:55:12.832048 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved Sep 6 09:55:12.832055 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Sep 6 09:55:12.832062 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved Sep 6 09:55:12.832068 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Sep 6 09:55:12.832075 kernel: NX (Execute Disable) protection: active Sep 6 09:55:12.832085 kernel: APIC: Static calls initialized Sep 6 09:55:12.832092 kernel: SMBIOS 2.8 present. Sep 6 09:55:12.832102 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 1.16.2-debian-1.16.2-1 04/01/2014 Sep 6 09:55:12.832110 kernel: DMI: Memory slots populated: 1/1 Sep 6 09:55:12.832117 kernel: Hypervisor detected: KVM Sep 6 09:55:12.832124 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Sep 6 09:55:12.832131 kernel: kvm-clock: using sched offset of 4059303513 cycles Sep 6 09:55:12.832138 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Sep 6 09:55:12.832146 kernel: tsc: Detected 2794.750 MHz processor Sep 6 09:55:12.832156 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 6 09:55:12.832177 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 6 09:55:12.832184 kernel: last_pfn = 0x9cfdc max_arch_pfn = 0x400000000 Sep 6 09:55:12.832192 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs Sep 6 09:55:12.832219 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 6 09:55:12.832236 kernel: Using GB pages for direct mapping Sep 6 09:55:12.832245 kernel: ACPI: Early table checksum verification disabled Sep 6 09:55:12.832253 kernel: ACPI: RSDP 0x00000000000F59D0 000014 (v00 BOCHS ) Sep 6 09:55:12.832261 kernel: ACPI: RSDT 0x000000009CFE241A 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 6 09:55:12.832274 kernel: ACPI: FACP 0x000000009CFE21FA 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Sep 6 09:55:12.832291 kernel: ACPI: DSDT 0x000000009CFE0040 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 6 09:55:12.832308 kernel: ACPI: FACS 0x000000009CFE0000 000040 Sep 6 09:55:12.832317 kernel: ACPI: APIC 0x000000009CFE22EE 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 6 09:55:12.832324 kernel: ACPI: HPET 
0x000000009CFE237E 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 6 09:55:12.832331 kernel: ACPI: MCFG 0x000000009CFE23B6 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 6 09:55:12.832338 kernel: ACPI: WAET 0x000000009CFE23F2 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 6 09:55:12.832346 kernel: ACPI: Reserving FACP table memory at [mem 0x9cfe21fa-0x9cfe22ed] Sep 6 09:55:12.832359 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cfe0040-0x9cfe21f9] Sep 6 09:55:12.832367 kernel: ACPI: Reserving FACS table memory at [mem 0x9cfe0000-0x9cfe003f] Sep 6 09:55:12.832374 kernel: ACPI: Reserving APIC table memory at [mem 0x9cfe22ee-0x9cfe237d] Sep 6 09:55:12.832382 kernel: ACPI: Reserving HPET table memory at [mem 0x9cfe237e-0x9cfe23b5] Sep 6 09:55:12.832389 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cfe23b6-0x9cfe23f1] Sep 6 09:55:12.832397 kernel: ACPI: Reserving WAET table memory at [mem 0x9cfe23f2-0x9cfe2419] Sep 6 09:55:12.832406 kernel: No NUMA configuration found Sep 6 09:55:12.832413 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cfdbfff] Sep 6 09:55:12.832421 kernel: NODE_DATA(0) allocated [mem 0x9cfd4dc0-0x9cfdbfff] Sep 6 09:55:12.832428 kernel: Zone ranges: Sep 6 09:55:12.832436 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 6 09:55:12.832443 kernel: DMA32 [mem 0x0000000001000000-0x000000009cfdbfff] Sep 6 09:55:12.832450 kernel: Normal empty Sep 6 09:55:12.832458 kernel: Device empty Sep 6 09:55:12.832465 kernel: Movable zone start for each node Sep 6 09:55:12.832472 kernel: Early memory node ranges Sep 6 09:55:12.832482 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] Sep 6 09:55:12.832489 kernel: node 0: [mem 0x0000000000100000-0x000000009cfdbfff] Sep 6 09:55:12.832497 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cfdbfff] Sep 6 09:55:12.832504 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 6 09:55:12.832512 kernel: On node 0, zone DMA: 97 pages in unavailable ranges Sep 6 09:55:12.832521 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges Sep 6 09:55:12.832529 kernel: ACPI: PM-Timer IO Port: 0x608 Sep 6 09:55:12.832538 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Sep 6 09:55:12.832546 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Sep 6 09:55:12.832556 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Sep 6 09:55:12.832564 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Sep 6 09:55:12.832573 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Sep 6 09:55:12.832581 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Sep 6 09:55:12.832589 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Sep 6 09:55:12.832596 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 6 09:55:12.832604 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Sep 6 09:55:12.832611 kernel: TSC deadline timer available Sep 6 09:55:12.832618 kernel: CPU topo: Max. logical packages: 1 Sep 6 09:55:12.832628 kernel: CPU topo: Max. logical dies: 1 Sep 6 09:55:12.832635 kernel: CPU topo: Max. dies per package: 1 Sep 6 09:55:12.832642 kernel: CPU topo: Max. threads per core: 1 Sep 6 09:55:12.832649 kernel: CPU topo: Num. cores per package: 4 Sep 6 09:55:12.832657 kernel: CPU topo: Num. 
threads per package: 4 Sep 6 09:55:12.832664 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs Sep 6 09:55:12.832672 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Sep 6 09:55:12.832679 kernel: kvm-guest: KVM setup pv remote TLB flush Sep 6 09:55:12.832686 kernel: kvm-guest: setup PV sched yield Sep 6 09:55:12.832694 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices Sep 6 09:55:12.832706 kernel: Booting paravirtualized kernel on KVM Sep 6 09:55:12.832713 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 6 09:55:12.832721 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1 Sep 6 09:55:12.832729 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288 Sep 6 09:55:12.832736 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152 Sep 6 09:55:12.832744 kernel: pcpu-alloc: [0] 0 1 2 3 Sep 6 09:55:12.832751 kernel: kvm-guest: PV spinlocks enabled Sep 6 09:55:12.832758 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Sep 6 09:55:12.832767 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=c69e1af3de5da13c6be143e3ab518bb528f3fa67a9a488f7fa132987ebb99256 Sep 6 09:55:12.832777 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 6 09:55:12.832784 kernel: random: crng init done Sep 6 09:55:12.832792 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Sep 6 09:55:12.832799 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 6 09:55:12.832807 kernel: Fallback order for Node 0: 0 Sep 6 09:55:12.832814 kernel: Built 1 zonelists, mobility grouping on. Total pages: 642938 Sep 6 09:55:12.832821 kernel: Policy zone: DMA32 Sep 6 09:55:12.832829 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 6 09:55:12.832841 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Sep 6 09:55:12.832848 kernel: ftrace: allocating 40102 entries in 157 pages Sep 6 09:55:12.832855 kernel: ftrace: allocated 157 pages with 5 groups Sep 6 09:55:12.832863 kernel: Dynamic Preempt: voluntary Sep 6 09:55:12.832870 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 6 09:55:12.832878 kernel: rcu: RCU event tracing is enabled. Sep 6 09:55:12.832886 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Sep 6 09:55:12.832893 kernel: Trampoline variant of Tasks RCU enabled. Sep 6 09:55:12.832903 kernel: Rude variant of Tasks RCU enabled. Sep 6 09:55:12.832913 kernel: Tracing variant of Tasks RCU enabled. Sep 6 09:55:12.833014 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 6 09:55:12.833022 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Sep 6 09:55:12.833029 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 6 09:55:12.833037 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 6 09:55:12.833044 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. 
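
The "Kernel command line:" entry above lists every boot parameter the kernel received (dracut prepends rootflags/mount.usrflags a second time, which is why they appear twice). As a purely illustrative sketch of how such a line splits into key/value options, here is a small Python snippet fed with a copy of the logged string; the real parsing and option semantics live in the kernel, dracut and systemd:

# Illustrative only: split a kernel command line of the kind logged above
# into (key, value) pairs. Duplicate keys (e.g. rootflags) are preserved.
cmdline = (
    "rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a "
    "mount.usr=/dev/mapper/usr "
    "verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 "
    "rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT "
    "console=ttyS0,115200 flatcar.first_boot=detected "
    "verity.usrhash=c69e1af3de5da13c6be143e3ab518bb528f3fa67a9a488f7fa132987ebb99256"
)

def parse_cmdline(line):
    """Return a list of (key, value) pairs; bare flags get value None."""
    pairs = []
    for token in line.split():
        key, sep, value = token.partition("=")
        pairs.append((key, value if sep else None))
    return pairs

for key, value in parse_cmdline(cmdline):
    print(f"{key} = {value}")
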
Sep 6 09:55:12.833052 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16 Sep 6 09:55:12.833060 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 6 09:55:12.833080 kernel: Console: colour VGA+ 80x25 Sep 6 09:55:12.833088 kernel: printk: legacy console [ttyS0] enabled Sep 6 09:55:12.833096 kernel: ACPI: Core revision 20240827 Sep 6 09:55:12.833104 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns Sep 6 09:55:12.833115 kernel: APIC: Switch to symmetric I/O mode setup Sep 6 09:55:12.833122 kernel: x2apic enabled Sep 6 09:55:12.833130 kernel: APIC: Switched APIC routing to: physical x2apic Sep 6 09:55:12.833140 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask() Sep 6 09:55:12.833148 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself() Sep 6 09:55:12.833158 kernel: kvm-guest: setup PV IPIs Sep 6 09:55:12.833166 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Sep 6 09:55:12.833174 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848e100549, max_idle_ns: 440795215505 ns Sep 6 09:55:12.833182 kernel: Calibrating delay loop (skipped) preset value.. 5589.50 BogoMIPS (lpj=2794750) Sep 6 09:55:12.833189 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Sep 6 09:55:12.833197 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 Sep 6 09:55:12.833205 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 Sep 6 09:55:12.833220 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 6 09:55:12.833230 kernel: Spectre V2 : Mitigation: Retpolines Sep 6 09:55:12.833239 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Sep 6 09:55:12.833246 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls Sep 6 09:55:12.833254 kernel: active return thunk: retbleed_return_thunk Sep 6 09:55:12.833262 kernel: RETBleed: Mitigation: untrained return thunk Sep 6 09:55:12.833270 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 6 09:55:12.833278 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 6 09:55:12.833286 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied! Sep 6 09:55:12.833299 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options. Sep 6 09:55:12.833317 kernel: active return thunk: srso_return_thunk Sep 6 09:55:12.833337 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode Sep 6 09:55:12.833357 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 6 09:55:12.833365 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 6 09:55:12.833372 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 6 09:55:12.833380 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Sep 6 09:55:12.833388 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Sep 6 09:55:12.833396 kernel: Freeing SMP alternatives memory: 32K Sep 6 09:55:12.833404 kernel: pid_max: default: 32768 minimum: 301 Sep 6 09:55:12.833414 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 6 09:55:12.833422 kernel: landlock: Up and running. Sep 6 09:55:12.833429 kernel: SELinux: Initializing. 
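
The Spectre V1/V2, RETBleed and Speculative Return Stack Overflow lines above are the kernel's boot-time mitigation report for this vCPU. On a running system the same status is exposed read-only under sysfs; a small sketch (which entries exist varies by CPU and kernel version):

# Read-only sketch: the mitigation status reported at boot is also exposed
# under sysfs on a running kernel. Treat the output as informational.
import pathlib

vuln_dir = pathlib.Path("/sys/devices/system/cpu/vulnerabilities")
if vuln_dir.is_dir():
    for entry in sorted(vuln_dir.iterdir()):
        print(f"{entry.name}: {entry.read_text().strip()}")
else:
    print("no vulnerabilities directory on this kernel")
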
Sep 6 09:55:12.833440 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 6 09:55:12.833449 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 6 09:55:12.833457 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0) Sep 6 09:55:12.833465 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. Sep 6 09:55:12.833472 kernel: ... version: 0 Sep 6 09:55:12.833480 kernel: ... bit width: 48 Sep 6 09:55:12.833490 kernel: ... generic registers: 6 Sep 6 09:55:12.833498 kernel: ... value mask: 0000ffffffffffff Sep 6 09:55:12.833505 kernel: ... max period: 00007fffffffffff Sep 6 09:55:12.833513 kernel: ... fixed-purpose events: 0 Sep 6 09:55:12.833520 kernel: ... event mask: 000000000000003f Sep 6 09:55:12.833528 kernel: signal: max sigframe size: 1776 Sep 6 09:55:12.833536 kernel: rcu: Hierarchical SRCU implementation. Sep 6 09:55:12.833544 kernel: rcu: Max phase no-delay instances is 400. Sep 6 09:55:12.833551 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Sep 6 09:55:12.833561 kernel: smp: Bringing up secondary CPUs ... Sep 6 09:55:12.833569 kernel: smpboot: x86: Booting SMP configuration: Sep 6 09:55:12.833577 kernel: .... node #0, CPUs: #1 #2 #3 Sep 6 09:55:12.833584 kernel: smp: Brought up 1 node, 4 CPUs Sep 6 09:55:12.833592 kernel: smpboot: Total of 4 processors activated (22358.00 BogoMIPS) Sep 6 09:55:12.833600 kernel: Memory: 2428920K/2571752K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54076K init, 2892K bss, 136904K reserved, 0K cma-reserved) Sep 6 09:55:12.833608 kernel: devtmpfs: initialized Sep 6 09:55:12.833616 kernel: x86/mm: Memory block size: 128MB Sep 6 09:55:12.833624 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 6 09:55:12.833634 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Sep 6 09:55:12.833641 kernel: pinctrl core: initialized pinctrl subsystem Sep 6 09:55:12.833649 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 6 09:55:12.833668 kernel: audit: initializing netlink subsys (disabled) Sep 6 09:55:12.833686 kernel: audit: type=2000 audit(1757152510.176:1): state=initialized audit_enabled=0 res=1 Sep 6 09:55:12.833694 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 6 09:55:12.833702 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 6 09:55:12.833710 kernel: cpuidle: using governor menu Sep 6 09:55:12.833717 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 6 09:55:12.833729 kernel: dca service started, version 1.12.1 Sep 6 09:55:12.833737 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff] Sep 6 09:55:12.833744 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry Sep 6 09:55:12.833752 kernel: PCI: Using configuration type 1 for base access Sep 6 09:55:12.833760 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
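
The "Memory:" line above summarises how much of the guest's RAM survives early reservations. A throwaway sketch that pulls the available/total figures out of that exact line; the wording of this message changes between kernel versions, so the regex is illustrative rather than a stable interface:

import re

# Sample copied verbatim from the boot log above.
line = ("Memory: 2428920K/2571752K available (14336K kernel code, 2428K rwdata, "
        "9988K rodata, 54076K init, 2892K bss, 136904K reserved, 0K cma-reserved)")

match = re.search(r"Memory: (\d+)K/(\d+)K available", line)
available_kib, total_kib = map(int, match.groups())
print(f"{available_kib} KiB of {total_kib} KiB usable "
      f"({100 * available_kib / total_kib:.1f}%)")
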
Sep 6 09:55:12.833768 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 6 09:55:12.833776 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 6 09:55:12.833783 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 6 09:55:12.833791 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 6 09:55:12.833802 kernel: ACPI: Added _OSI(Module Device) Sep 6 09:55:12.833809 kernel: ACPI: Added _OSI(Processor Device) Sep 6 09:55:12.833817 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 6 09:55:12.833825 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 6 09:55:12.833832 kernel: ACPI: Interpreter enabled Sep 6 09:55:12.833840 kernel: ACPI: PM: (supports S0 S3 S5) Sep 6 09:55:12.833848 kernel: ACPI: Using IOAPIC for interrupt routing Sep 6 09:55:12.833856 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 6 09:55:12.833863 kernel: PCI: Using E820 reservations for host bridge windows Sep 6 09:55:12.833873 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Sep 6 09:55:12.833881 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Sep 6 09:55:12.834154 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 6 09:55:12.834291 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Sep 6 09:55:12.834412 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Sep 6 09:55:12.834423 kernel: PCI host bridge to bus 0000:00 Sep 6 09:55:12.834559 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 6 09:55:12.834677 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Sep 6 09:55:12.834788 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 6 09:55:12.834930 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xafffffff window] Sep 6 09:55:12.835048 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Sep 6 09:55:12.835158 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window] Sep 6 09:55:12.835278 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Sep 6 09:55:12.835604 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Sep 6 09:55:12.835750 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint Sep 6 09:55:12.835873 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfd000000-0xfdffffff pref] Sep 6 09:55:12.836029 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfebd0000-0xfebd0fff] Sep 6 09:55:12.836155 kernel: pci 0000:00:01.0: ROM [mem 0xfebc0000-0xfebcffff pref] Sep 6 09:55:12.836283 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 6 09:55:12.836421 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint Sep 6 09:55:12.836549 kernel: pci 0000:00:02.0: BAR 0 [io 0xc0c0-0xc0df] Sep 6 09:55:12.836669 kernel: pci 0000:00:02.0: BAR 1 [mem 0xfebd1000-0xfebd1fff] Sep 6 09:55:12.836865 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfe000000-0xfe003fff 64bit pref] Sep 6 09:55:12.837027 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint Sep 6 09:55:12.837156 kernel: pci 0000:00:03.0: BAR 0 [io 0xc000-0xc07f] Sep 6 09:55:12.837287 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfebd2000-0xfebd2fff] Sep 6 09:55:12.837407 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe004000-0xfe007fff 64bit pref] Sep 6 09:55:12.837554 kernel: pci 
0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint Sep 6 09:55:12.837676 kernel: pci 0000:00:04.0: BAR 0 [io 0xc0e0-0xc0ff] Sep 6 09:55:12.837796 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfebd3000-0xfebd3fff] Sep 6 09:55:12.837929 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe008000-0xfe00bfff 64bit pref] Sep 6 09:55:12.838055 kernel: pci 0000:00:04.0: ROM [mem 0xfeb80000-0xfebbffff pref] Sep 6 09:55:12.838199 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Sep 6 09:55:12.838330 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Sep 6 09:55:12.838470 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Sep 6 09:55:12.838590 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc100-0xc11f] Sep 6 09:55:12.838709 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfebd4000-0xfebd4fff] Sep 6 09:55:12.838845 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Sep 6 09:55:12.838985 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f] Sep 6 09:55:12.838997 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Sep 6 09:55:12.839010 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Sep 6 09:55:12.839018 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 6 09:55:12.839026 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Sep 6 09:55:12.839034 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Sep 6 09:55:12.839042 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Sep 6 09:55:12.839050 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Sep 6 09:55:12.839058 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Sep 6 09:55:12.839066 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Sep 6 09:55:12.839074 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Sep 6 09:55:12.839085 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Sep 6 09:55:12.839093 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Sep 6 09:55:12.839101 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Sep 6 09:55:12.839109 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Sep 6 09:55:12.839117 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Sep 6 09:55:12.839125 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Sep 6 09:55:12.839133 kernel: iommu: Default domain type: Translated Sep 6 09:55:12.839141 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 6 09:55:12.839149 kernel: PCI: Using ACPI for IRQ routing Sep 6 09:55:12.839159 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 6 09:55:12.839168 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] Sep 6 09:55:12.839176 kernel: e820: reserve RAM buffer [mem 0x9cfdc000-0x9fffffff] Sep 6 09:55:12.839306 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Sep 6 09:55:12.839426 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Sep 6 09:55:12.839546 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 6 09:55:12.839556 kernel: vgaarb: loaded Sep 6 09:55:12.839565 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 Sep 6 09:55:12.839573 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter Sep 6 09:55:12.839585 kernel: clocksource: Switched to clocksource kvm-clock Sep 6 09:55:12.839593 kernel: VFS: Disk quotas dquot_6.6.0 Sep 6 09:55:12.839601 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 6 
09:55:12.839610 kernel: pnp: PnP ACPI init Sep 6 09:55:12.839789 kernel: system 00:05: [mem 0xb0000000-0xbfffffff window] has been reserved Sep 6 09:55:12.839802 kernel: pnp: PnP ACPI: found 6 devices Sep 6 09:55:12.839810 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 6 09:55:12.839818 kernel: NET: Registered PF_INET protocol family Sep 6 09:55:12.839830 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 6 09:55:12.839838 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Sep 6 09:55:12.839846 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 6 09:55:12.839854 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 6 09:55:12.839863 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Sep 6 09:55:12.839871 kernel: TCP: Hash tables configured (established 32768 bind 32768) Sep 6 09:55:12.839879 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 6 09:55:12.839887 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 6 09:55:12.839898 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 6 09:55:12.839906 kernel: NET: Registered PF_XDP protocol family Sep 6 09:55:12.840061 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 6 09:55:12.840174 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Sep 6 09:55:12.840295 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 6 09:55:12.840422 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xafffffff window] Sep 6 09:55:12.840534 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] Sep 6 09:55:12.840644 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window] Sep 6 09:55:12.840654 kernel: PCI: CLS 0 bytes, default 64 Sep 6 09:55:12.840667 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848e100549, max_idle_ns: 440795215505 ns Sep 6 09:55:12.840676 kernel: Initialise system trusted keyrings Sep 6 09:55:12.840684 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Sep 6 09:55:12.840692 kernel: Key type asymmetric registered Sep 6 09:55:12.840700 kernel: Asymmetric key parser 'x509' registered Sep 6 09:55:12.840708 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 6 09:55:12.840716 kernel: io scheduler mq-deadline registered Sep 6 09:55:12.840724 kernel: io scheduler kyber registered Sep 6 09:55:12.840732 kernel: io scheduler bfq registered Sep 6 09:55:12.840743 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 6 09:55:12.840751 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Sep 6 09:55:12.840760 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Sep 6 09:55:12.840768 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Sep 6 09:55:12.840776 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 6 09:55:12.840784 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 6 09:55:12.840792 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Sep 6 09:55:12.840800 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 6 09:55:12.840808 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 6 09:55:12.840966 kernel: rtc_cmos 00:04: RTC can wake from S4 Sep 6 09:55:12.841085 kernel: rtc_cmos 00:04: registered as rtc0 Sep 6 09:55:12.841200 kernel: rtc_cmos 00:04: setting system clock to 
2025-09-06T09:55:12 UTC (1757152512) Sep 6 09:55:12.841323 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Sep 6 09:55:12.841334 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Sep 6 09:55:12.841343 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 6 09:55:12.841351 kernel: NET: Registered PF_INET6 protocol family Sep 6 09:55:12.841360 kernel: Segment Routing with IPv6 Sep 6 09:55:12.841373 kernel: In-situ OAM (IOAM) with IPv6 Sep 6 09:55:12.841381 kernel: NET: Registered PF_PACKET protocol family Sep 6 09:55:12.841389 kernel: Key type dns_resolver registered Sep 6 09:55:12.841397 kernel: IPI shorthand broadcast: enabled Sep 6 09:55:12.841405 kernel: sched_clock: Marking stable (3151003115, 109503038)->(3276229820, -15723667) Sep 6 09:55:12.841413 kernel: registered taskstats version 1 Sep 6 09:55:12.841421 kernel: Loading compiled-in X.509 certificates Sep 6 09:55:12.841429 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: d54a04c0c6d7404ed8dd26757b3e0037e8128454' Sep 6 09:55:12.841437 kernel: Demotion targets for Node 0: null Sep 6 09:55:12.841448 kernel: Key type .fscrypt registered Sep 6 09:55:12.841455 kernel: Key type fscrypt-provisioning registered Sep 6 09:55:12.841463 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 6 09:55:12.841471 kernel: ima: Allocated hash algorithm: sha1 Sep 6 09:55:12.841480 kernel: ima: No architecture policies found Sep 6 09:55:12.841487 kernel: clk: Disabling unused clocks Sep 6 09:55:12.841495 kernel: Warning: unable to open an initial console. Sep 6 09:55:12.841504 kernel: Freeing unused kernel image (initmem) memory: 54076K Sep 6 09:55:12.841515 kernel: Write protecting the kernel read-only data: 24576k Sep 6 09:55:12.841523 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K Sep 6 09:55:12.841531 kernel: Run /init as init process Sep 6 09:55:12.841539 kernel: with arguments: Sep 6 09:55:12.841547 kernel: /init Sep 6 09:55:12.841555 kernel: with environment: Sep 6 09:55:12.841563 kernel: HOME=/ Sep 6 09:55:12.841571 kernel: TERM=linux Sep 6 09:55:12.841579 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 6 09:55:12.841588 systemd[1]: Successfully made /usr/ read-only. Sep 6 09:55:12.841612 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 6 09:55:12.841624 systemd[1]: Detected virtualization kvm. Sep 6 09:55:12.841632 systemd[1]: Detected architecture x86-64. Sep 6 09:55:12.841641 systemd[1]: Running in initrd. Sep 6 09:55:12.841650 systemd[1]: No hostname configured, using default hostname. Sep 6 09:55:12.841661 systemd[1]: Hostname set to . Sep 6 09:55:12.841669 systemd[1]: Initializing machine ID from VM UUID. Sep 6 09:55:12.841678 systemd[1]: Queued start job for default target initrd.target. Sep 6 09:55:12.841687 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 6 09:55:12.841696 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 6 09:55:12.841705 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... 
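
systemd reports above that it initializes the machine ID from the VM UUID. On the booted guest both values can be read back directly; a small sketch (the DMI product_uuid file normally requires root to read):

import pathlib

# Sketch: read back the machine ID systemd derives here and the VM UUID it
# is derived from.
for name in ("/etc/machine-id", "/sys/class/dmi/id/product_uuid"):
    path = pathlib.Path(name)
    try:
        print(f"{name}: {path.read_text().strip()}")
    except OSError as exc:
        print(f"{name}: unreadable ({exc})")
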
Sep 6 09:55:12.841714 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 6 09:55:12.841723 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 6 09:55:12.841735 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 6 09:55:12.841745 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 6 09:55:12.841756 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 6 09:55:12.841764 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 6 09:55:12.841773 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 6 09:55:12.841782 systemd[1]: Reached target paths.target - Path Units. Sep 6 09:55:12.841792 systemd[1]: Reached target slices.target - Slice Units. Sep 6 09:55:12.841801 systemd[1]: Reached target swap.target - Swaps. Sep 6 09:55:12.841810 systemd[1]: Reached target timers.target - Timer Units. Sep 6 09:55:12.841818 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 6 09:55:12.841827 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 6 09:55:12.841836 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 6 09:55:12.841845 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 6 09:55:12.841854 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 6 09:55:12.841862 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 6 09:55:12.841873 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 6 09:55:12.841882 systemd[1]: Reached target sockets.target - Socket Units. Sep 6 09:55:12.841891 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 6 09:55:12.841900 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 6 09:55:12.841910 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 6 09:55:12.841940 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 6 09:55:12.841949 systemd[1]: Starting systemd-fsck-usr.service... Sep 6 09:55:12.841958 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 6 09:55:12.841967 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 6 09:55:12.841976 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 6 09:55:12.841984 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 6 09:55:12.841996 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 6 09:55:12.842005 systemd[1]: Finished systemd-fsck-usr.service. Sep 6 09:55:12.842014 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 6 09:55:12.842023 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 6 09:55:12.842032 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 6 09:55:12.842087 systemd-journald[219]: Collecting audit messages is disabled. 
Sep 6 09:55:12.842126 systemd-journald[219]: Journal started Sep 6 09:55:12.842147 systemd-journald[219]: Runtime Journal (/run/log/journal/193fa681259f4720b1e52fcf20133e46) is 6M, max 48.6M, 42.5M free. Sep 6 09:55:12.833712 systemd-modules-load[220]: Inserted module 'overlay' Sep 6 09:55:12.878910 systemd[1]: Started systemd-journald.service - Journal Service. Sep 6 09:55:12.878962 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 6 09:55:12.878983 kernel: Bridge firewalling registered Sep 6 09:55:12.863968 systemd-modules-load[220]: Inserted module 'br_netfilter' Sep 6 09:55:12.877859 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 6 09:55:12.880316 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 6 09:55:12.882283 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 6 09:55:12.886487 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 6 09:55:12.888853 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 6 09:55:12.891948 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 6 09:55:12.903614 systemd-tmpfiles[240]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 6 09:55:12.908572 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 6 09:55:12.911495 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 6 09:55:12.913622 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 6 09:55:12.930135 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 6 09:55:12.932673 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 6 09:55:12.957790 dracut-cmdline[263]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=c69e1af3de5da13c6be143e3ab518bb528f3fa67a9a488f7fa132987ebb99256 Sep 6 09:55:12.975721 systemd-resolved[258]: Positive Trust Anchors: Sep 6 09:55:12.975743 systemd-resolved[258]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 6 09:55:12.975782 systemd-resolved[258]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 6 09:55:12.979010 systemd-resolved[258]: Defaulting to hostname 'linux'. Sep 6 09:55:12.985169 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 6 09:55:12.986475 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Sep 6 09:55:13.081972 kernel: SCSI subsystem initialized Sep 6 09:55:13.091958 kernel: Loading iSCSI transport class v2.0-870. Sep 6 09:55:13.101959 kernel: iscsi: registered transport (tcp) Sep 6 09:55:13.124034 kernel: iscsi: registered transport (qla4xxx) Sep 6 09:55:13.124068 kernel: QLogic iSCSI HBA Driver Sep 6 09:55:13.147068 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 6 09:55:13.170884 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 6 09:55:13.173228 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 6 09:55:13.235607 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 6 09:55:13.238165 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 6 09:55:13.301949 kernel: raid6: avx2x4 gen() 29673 MB/s Sep 6 09:55:13.318947 kernel: raid6: avx2x2 gen() 29455 MB/s Sep 6 09:55:13.335994 kernel: raid6: avx2x1 gen() 25814 MB/s Sep 6 09:55:13.336013 kernel: raid6: using algorithm avx2x4 gen() 29673 MB/s Sep 6 09:55:13.354439 kernel: raid6: .... xor() 7617 MB/s, rmw enabled Sep 6 09:55:13.354486 kernel: raid6: using avx2x2 recovery algorithm Sep 6 09:55:13.375948 kernel: xor: automatically using best checksumming function avx Sep 6 09:55:13.537965 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 6 09:55:13.547170 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 6 09:55:13.550053 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 6 09:55:13.578062 systemd-udevd[471]: Using default interface naming scheme 'v255'. Sep 6 09:55:13.583719 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 6 09:55:13.584656 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 6 09:55:13.608498 dracut-pre-trigger[473]: rd.md=0: removing MD RAID activation Sep 6 09:55:13.640801 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 6 09:55:13.642375 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 6 09:55:13.740126 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 6 09:55:13.746421 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 6 09:55:13.783958 kernel: cryptd: max_cpu_qlen set to 1000 Sep 6 09:55:13.807104 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Sep 6 09:55:13.817168 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Sep 6 09:55:13.813679 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 6 09:55:13.813805 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 6 09:55:13.818911 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 6 09:55:13.824683 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 6 09:55:13.826938 kernel: libata version 3.00 loaded. Sep 6 09:55:13.829089 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 6 09:55:13.829116 kernel: GPT:9289727 != 19775487 Sep 6 09:55:13.829127 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 6 09:55:13.829137 kernel: AES CTR mode by8 optimization enabled Sep 6 09:55:13.830424 kernel: GPT:9289727 != 19775487 Sep 6 09:55:13.830946 kernel: GPT: Use GNU Parted to correct GPT errors. 
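
The raid6 gen() lines above show the kernel benchmarking its available implementations and settling on avx2x4. A toy reproduction of that selection step, using the throughput figures from the log:

# Toy reproduction of the raid6 algorithm selection shown above: benchmark
# results in, fastest implementation out.
results_mb_s = {"avx2x4": 29673, "avx2x2": 29455, "avx2x1": 25814}  # from the log
best = max(results_mb_s, key=results_mb_s.get)
print(f"raid6: using algorithm {best} gen() {results_mb_s[best]} MB/s")
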
Sep 6 09:55:13.831014 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 6 09:55:13.832530 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 6 09:55:13.845066 kernel: ahci 0000:00:1f.2: version 3.0 Sep 6 09:55:13.846943 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 6 09:55:13.853099 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 6 09:55:13.858513 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 6 09:55:13.858666 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 6 09:55:13.862964 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Sep 6 09:55:13.877200 kernel: scsi host0: ahci Sep 6 09:55:13.881035 kernel: scsi host1: ahci Sep 6 09:55:13.881253 kernel: scsi host2: ahci Sep 6 09:55:13.881462 kernel: scsi host3: ahci Sep 6 09:55:13.881693 kernel: scsi host4: ahci Sep 6 09:55:13.881900 kernel: scsi host5: ahci Sep 6 09:55:13.882143 kernel: ata1: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4100 irq 34 lpm-pol 1 Sep 6 09:55:13.882156 kernel: ata2: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4180 irq 34 lpm-pol 1 Sep 6 09:55:13.882166 kernel: ata3: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4200 irq 34 lpm-pol 1 Sep 6 09:55:13.882176 kernel: ata4: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4280 irq 34 lpm-pol 1 Sep 6 09:55:13.882197 kernel: ata5: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4300 irq 34 lpm-pol 1 Sep 6 09:55:13.882207 kernel: ata6: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4380 irq 34 lpm-pol 1 Sep 6 09:55:13.892520 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 6 09:55:13.927586 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 6 09:55:13.950518 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 6 09:55:13.960998 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 6 09:55:13.969122 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 6 09:55:13.970349 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 6 09:55:13.973508 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 6 09:55:14.002422 disk-uuid[632]: Primary Header is updated. Sep 6 09:55:14.002422 disk-uuid[632]: Secondary Entries is updated. Sep 6 09:55:14.002422 disk-uuid[632]: Secondary Header is updated. 
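
The "Found device dev-disk-by…" entries above correspond to udev-maintained symlinks under /dev/disk. A short sketch that lists the same by-label/by-partlabel/by-partuuid mappings on a live system:

import os

# Sketch: enumerate the symlink directories udev maintains for the device
# units resolved above and show where each link points.
for subdir in ("by-label", "by-partlabel", "by-partuuid"):
    directory = os.path.join("/dev/disk", subdir)
    if not os.path.isdir(directory):
        continue
    for name in sorted(os.listdir(directory)):
        target = os.path.realpath(os.path.join(directory, name))
        print(f"{subdir}/{name} -> {target}")
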
Sep 6 09:55:14.006947 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 6 09:55:14.011949 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 6 09:55:14.189482 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 6 09:55:14.189576 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 6 09:55:14.189587 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 6 09:55:14.190944 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 6 09:55:14.191940 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Sep 6 09:55:14.192951 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 6 09:55:14.192967 kernel: ata3.00: LPM support broken, forcing max_power Sep 6 09:55:14.193979 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Sep 6 09:55:14.194008 kernel: ata3.00: applying bridge limits Sep 6 09:55:14.195266 kernel: ata3.00: LPM support broken, forcing max_power Sep 6 09:55:14.195290 kernel: ata3.00: configured for UDMA/100 Sep 6 09:55:14.197946 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Sep 6 09:55:14.254969 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Sep 6 09:55:14.255299 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 6 09:55:14.280953 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Sep 6 09:55:14.674739 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 6 09:55:14.675378 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 6 09:55:14.677993 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 6 09:55:14.680187 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 6 09:55:14.683245 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 6 09:55:14.716854 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 6 09:55:15.012948 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 6 09:55:15.014151 disk-uuid[633]: The operation has completed successfully. Sep 6 09:55:15.043893 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 6 09:55:15.044055 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 6 09:55:15.082535 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 6 09:55:15.104723 sh[662]: Success Sep 6 09:55:15.122972 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 6 09:55:15.123061 kernel: device-mapper: uevent: version 1.0.3 Sep 6 09:55:15.123075 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 6 09:55:15.133946 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Sep 6 09:55:15.163607 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 6 09:55:15.167622 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 6 09:55:15.180907 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Sep 6 09:55:15.186955 kernel: BTRFS: device fsid d01fb51c-249d-484b-98c9-d7ac47264f4b devid 1 transid 35 /dev/mapper/usr (253:0) scanned by mount (674) Sep 6 09:55:15.186989 kernel: BTRFS info (device dm-0): first mount of filesystem d01fb51c-249d-484b-98c9-d7ac47264f4b Sep 6 09:55:15.188672 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 6 09:55:15.193445 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 6 09:55:15.193466 kernel: BTRFS info (device dm-0): enabling free space tree Sep 6 09:55:15.194706 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 6 09:55:15.196831 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 6 09:55:15.199025 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 6 09:55:15.201574 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 6 09:55:15.204299 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 6 09:55:15.231948 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (706) Sep 6 09:55:15.234471 kernel: BTRFS info (device vda6): first mount of filesystem 5df41a5c-7aec-412a-8efb-42c0335a0fb4 Sep 6 09:55:15.234496 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 6 09:55:15.237362 kernel: BTRFS info (device vda6): turning on async discard Sep 6 09:55:15.237420 kernel: BTRFS info (device vda6): enabling free space tree Sep 6 09:55:15.242938 kernel: BTRFS info (device vda6): last unmount of filesystem 5df41a5c-7aec-412a-8efb-42c0335a0fb4 Sep 6 09:55:15.243842 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 6 09:55:15.245263 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 6 09:55:15.375062 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 6 09:55:15.378061 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 6 09:55:15.444053 systemd-networkd[843]: lo: Link UP Sep 6 09:55:15.444066 systemd-networkd[843]: lo: Gained carrier Sep 6 09:55:15.446070 systemd-networkd[843]: Enumeration completed Sep 6 09:55:15.446450 systemd-networkd[843]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 6 09:55:15.446454 systemd-networkd[843]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 6 09:55:15.446858 systemd-networkd[843]: eth0: Link UP Sep 6 09:55:15.447822 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 6 09:55:15.448009 systemd-networkd[843]: eth0: Gained carrier Sep 6 09:55:15.448018 systemd-networkd[843]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 6 09:55:15.451857 systemd[1]: Reached target network.target - Network. 
Sep 6 09:55:15.465070 ignition[748]: Ignition 2.22.0 Sep 6 09:55:15.465097 ignition[748]: Stage: fetch-offline Sep 6 09:55:15.465134 ignition[748]: no configs at "/usr/lib/ignition/base.d" Sep 6 09:55:15.465143 ignition[748]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 6 09:55:15.465251 ignition[748]: parsed url from cmdline: "" Sep 6 09:55:15.465256 ignition[748]: no config URL provided Sep 6 09:55:15.465263 ignition[748]: reading system config file "/usr/lib/ignition/user.ign" Sep 6 09:55:15.468959 systemd-networkd[843]: eth0: DHCPv4 address 10.0.0.36/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 6 09:55:15.465275 ignition[748]: no config at "/usr/lib/ignition/user.ign" Sep 6 09:55:15.465307 ignition[748]: op(1): [started] loading QEMU firmware config module Sep 6 09:55:15.465313 ignition[748]: op(1): executing: "modprobe" "qemu_fw_cfg" Sep 6 09:55:15.478497 ignition[748]: op(1): [finished] loading QEMU firmware config module Sep 6 09:55:15.518318 ignition[748]: parsing config with SHA512: d869605d5467333f9e46759427fcc89ed31ef27e962a652e183e09e6ac7a889552deaa1b85b3048d3a7f8b482519aa3a35a4728c014f470cda41ca626be19b0b Sep 6 09:55:15.524612 unknown[748]: fetched base config from "system" Sep 6 09:55:15.524626 unknown[748]: fetched user config from "qemu" Sep 6 09:55:15.525051 ignition[748]: fetch-offline: fetch-offline passed Sep 6 09:55:15.525537 systemd-resolved[258]: Detected conflict on linux IN A 10.0.0.36 Sep 6 09:55:15.525181 ignition[748]: Ignition finished successfully Sep 6 09:55:15.525546 systemd-resolved[258]: Hostname conflict, changing published hostname from 'linux' to 'linux6'. Sep 6 09:55:15.528864 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 6 09:55:15.530705 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 6 09:55:15.531645 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 6 09:55:15.576373 ignition[857]: Ignition 2.22.0 Sep 6 09:55:15.576386 ignition[857]: Stage: kargs Sep 6 09:55:15.576519 ignition[857]: no configs at "/usr/lib/ignition/base.d" Sep 6 09:55:15.576530 ignition[857]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 6 09:55:15.578717 ignition[857]: kargs: kargs passed Sep 6 09:55:15.578801 ignition[857]: Ignition finished successfully Sep 6 09:55:15.583028 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 6 09:55:15.584189 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 6 09:55:15.628016 ignition[865]: Ignition 2.22.0 Sep 6 09:55:15.628029 ignition[865]: Stage: disks Sep 6 09:55:15.628192 ignition[865]: no configs at "/usr/lib/ignition/base.d" Sep 6 09:55:15.628204 ignition[865]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 6 09:55:15.628959 ignition[865]: disks: disks passed Sep 6 09:55:15.629010 ignition[865]: Ignition finished successfully Sep 6 09:55:15.632089 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 6 09:55:15.633884 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 6 09:55:15.635566 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 6 09:55:15.637662 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 6 09:55:15.637733 systemd[1]: Reached target sysinit.target - System Initialization. Sep 6 09:55:15.638199 systemd[1]: Reached target basic.target - Basic System. 
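
Ignition logs above that it is "parsing config with SHA512: …". A minimal sketch of computing that kind of digest for a local config file, using the /usr/lib/ignition/user.ign path named earlier in the log (the file may not exist outside the initrd environment):

import hashlib

# Sketch: SHA512 of a config file, the same digest family Ignition reports.
def sha512_of(path):
    digest = hashlib.sha512()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

try:
    print(sha512_of("/usr/lib/ignition/user.ign"))
except OSError as exc:
    print(f"config not readable: {exc}")
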
Sep 6 09:55:15.639422 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 6 09:55:15.671365 systemd-fsck[875]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 6 09:55:15.679611 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 6 09:55:15.682061 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 6 09:55:15.816965 kernel: EXT4-fs (vda9): mounted filesystem 9a4cce02-a1df-4d9f-a25f-08e044692442 r/w with ordered data mode. Quota mode: none. Sep 6 09:55:15.817904 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 6 09:55:15.820253 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 6 09:55:15.823541 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 6 09:55:15.826049 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 6 09:55:15.827936 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 6 09:55:15.829012 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 6 09:55:15.829036 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 6 09:55:15.848899 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 6 09:55:15.852353 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 6 09:55:15.856378 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (883) Sep 6 09:55:15.856405 kernel: BTRFS info (device vda6): first mount of filesystem 5df41a5c-7aec-412a-8efb-42c0335a0fb4 Sep 6 09:55:15.856416 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 6 09:55:15.858010 kernel: BTRFS info (device vda6): turning on async discard Sep 6 09:55:15.858028 kernel: BTRFS info (device vda6): enabling free space tree Sep 6 09:55:15.860302 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 6 09:55:15.890335 initrd-setup-root[907]: cut: /sysroot/etc/passwd: No such file or directory Sep 6 09:55:15.894928 initrd-setup-root[914]: cut: /sysroot/etc/group: No such file or directory Sep 6 09:55:15.899615 initrd-setup-root[921]: cut: /sysroot/etc/shadow: No such file or directory Sep 6 09:55:15.904569 initrd-setup-root[928]: cut: /sysroot/etc/gshadow: No such file or directory Sep 6 09:55:15.997022 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 6 09:55:15.998083 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 6 09:55:15.998851 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 6 09:55:16.016950 kernel: BTRFS info (device vda6): last unmount of filesystem 5df41a5c-7aec-412a-8efb-42c0335a0fb4 Sep 6 09:55:16.030068 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 6 09:55:16.057360 ignition[997]: INFO : Ignition 2.22.0 Sep 6 09:55:16.057360 ignition[997]: INFO : Stage: mount Sep 6 09:55:16.059209 ignition[997]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 6 09:55:16.059209 ignition[997]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 6 09:55:16.059209 ignition[997]: INFO : mount: mount passed Sep 6 09:55:16.059209 ignition[997]: INFO : Ignition finished successfully Sep 6 09:55:16.063067 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 6 09:55:16.066629 systemd[1]: Starting ignition-files.service - Ignition (files)... 
Sep 6 09:55:16.186652 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 6 09:55:16.188234 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 6 09:55:16.218952 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1009) Sep 6 09:55:16.219023 kernel: BTRFS info (device vda6): first mount of filesystem 5df41a5c-7aec-412a-8efb-42c0335a0fb4 Sep 6 09:55:16.220555 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 6 09:55:16.223399 kernel: BTRFS info (device vda6): turning on async discard Sep 6 09:55:16.223459 kernel: BTRFS info (device vda6): enabling free space tree Sep 6 09:55:16.225020 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 6 09:55:16.258659 ignition[1026]: INFO : Ignition 2.22.0 Sep 6 09:55:16.258659 ignition[1026]: INFO : Stage: files Sep 6 09:55:16.260611 ignition[1026]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 6 09:55:16.260611 ignition[1026]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 6 09:55:16.260611 ignition[1026]: DEBUG : files: compiled without relabeling support, skipping Sep 6 09:55:16.260611 ignition[1026]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 6 09:55:16.260611 ignition[1026]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 6 09:55:16.267123 ignition[1026]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 6 09:55:16.267123 ignition[1026]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 6 09:55:16.267123 ignition[1026]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 6 09:55:16.267123 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Sep 6 09:55:16.267123 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Sep 6 09:55:16.263451 unknown[1026]: wrote ssh authorized keys file for user: core Sep 6 09:55:16.310241 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 6 09:55:16.448705 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Sep 6 09:55:16.450732 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 6 09:55:16.452511 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 6 09:55:16.452511 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 6 09:55:16.452511 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 6 09:55:16.457577 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 6 09:55:16.459215 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 6 09:55:16.459215 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 6 09:55:16.462520 ignition[1026]: INFO : files: createFilesystemsFiles: 
createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 6 09:55:16.468480 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 6 09:55:16.470622 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 6 09:55:16.472384 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 6 09:55:16.474995 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 6 09:55:16.474995 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 6 09:55:16.474995 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Sep 6 09:55:16.895980 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 6 09:55:17.294091 systemd-networkd[843]: eth0: Gained IPv6LL Sep 6 09:55:17.305282 ignition[1026]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 6 09:55:17.307715 ignition[1026]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 6 09:55:17.307715 ignition[1026]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 6 09:55:17.311701 ignition[1026]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 6 09:55:17.311701 ignition[1026]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 6 09:55:17.311701 ignition[1026]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 6 09:55:17.315902 ignition[1026]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 6 09:55:17.315902 ignition[1026]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 6 09:55:17.315902 ignition[1026]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 6 09:55:17.315902 ignition[1026]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Sep 6 09:55:17.340744 ignition[1026]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Sep 6 09:55:17.349060 ignition[1026]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Sep 6 09:55:17.350596 ignition[1026]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Sep 6 09:55:17.350596 ignition[1026]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Sep 6 09:55:17.350596 ignition[1026]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Sep 6 09:55:17.350596 ignition[1026]: INFO : files: createResultFile: createFiles: op(12): 
[started] writing file "/sysroot/etc/.ignition-result.json" Sep 6 09:55:17.350596 ignition[1026]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 6 09:55:17.350596 ignition[1026]: INFO : files: files passed Sep 6 09:55:17.350596 ignition[1026]: INFO : Ignition finished successfully Sep 6 09:55:17.358132 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 6 09:55:17.362316 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 6 09:55:17.365148 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 6 09:55:17.388298 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 6 09:55:17.388547 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 6 09:55:17.391638 initrd-setup-root-after-ignition[1055]: grep: /sysroot/oem/oem-release: No such file or directory Sep 6 09:55:17.394205 initrd-setup-root-after-ignition[1057]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 6 09:55:17.396006 initrd-setup-root-after-ignition[1057]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 6 09:55:17.397596 initrd-setup-root-after-ignition[1061]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 6 09:55:17.398089 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 6 09:55:17.401003 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 6 09:55:17.402554 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 6 09:55:17.444854 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 6 09:55:17.445057 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 6 09:55:17.446233 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 6 09:55:17.448300 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 6 09:55:17.451085 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 6 09:55:17.452089 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 6 09:55:17.493042 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 6 09:55:17.495874 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 6 09:55:17.522255 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 6 09:55:17.522411 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 6 09:55:17.524520 systemd[1]: Stopped target timers.target - Timer Units. Sep 6 09:55:17.526600 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 6 09:55:17.526719 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 6 09:55:17.531169 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 6 09:55:17.531303 systemd[1]: Stopped target basic.target - Basic System. Sep 6 09:55:17.533129 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 6 09:55:17.533438 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 6 09:55:17.533758 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 6 09:55:17.534250 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. 
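
The files stage above fetches the helm tarball and the kubernetes sysext image over HTTPS, logging each request as "attempt #1", which hints at a retry loop around every GET. A rough Python equivalent of such a loop (illustrative only, not Ignition's fetcher; the URL and destination are the ones from the log, the attempt count and backoff are assumptions):

    import time
    import urllib.request

    def fetch_with_retries(url: str, dest: str, attempts: int = 3, backoff: float = 2.0) -> None:
        """Download url to dest, retrying a fixed number of times on failure."""
        for attempt in range(1, attempts + 1):
            try:
                print(f"GET {url}: attempt #{attempt}")
                urllib.request.urlretrieve(url, dest)
                return
            except OSError:
                if attempt == attempts:
                    raise
                time.sleep(backoff)

    # fetch_with_retries("https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz",
    #                    "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz")
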
Sep 6 09:55:17.534565 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 6 09:55:17.534881 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 6 09:55:17.535379 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 6 09:55:17.535686 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 6 09:55:17.536165 systemd[1]: Stopped target swap.target - Swaps. Sep 6 09:55:17.536455 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 6 09:55:17.536559 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 6 09:55:17.553677 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 6 09:55:17.553806 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 6 09:55:17.554269 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 6 09:55:17.557706 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 6 09:55:17.558662 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 6 09:55:17.558772 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 6 09:55:17.561303 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 6 09:55:17.561412 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 6 09:55:17.563756 systemd[1]: Stopped target paths.target - Path Units. Sep 6 09:55:17.564316 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 6 09:55:17.570991 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 6 09:55:17.572290 systemd[1]: Stopped target slices.target - Slice Units. Sep 6 09:55:17.573778 systemd[1]: Stopped target sockets.target - Socket Units. Sep 6 09:55:17.574228 systemd[1]: iscsid.socket: Deactivated successfully. Sep 6 09:55:17.574315 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 6 09:55:17.576941 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 6 09:55:17.577026 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 6 09:55:17.577419 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 6 09:55:17.577533 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 6 09:55:17.580259 systemd[1]: ignition-files.service: Deactivated successfully. Sep 6 09:55:17.580364 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 6 09:55:17.585043 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 6 09:55:17.588673 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 6 09:55:17.590819 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 6 09:55:17.592758 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 6 09:55:17.595089 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 6 09:55:17.595218 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 6 09:55:17.601719 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 6 09:55:17.601836 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 6 09:55:17.624955 systemd[1]: sysroot-boot.mount: Deactivated successfully. 
Sep 6 09:55:17.734033 ignition[1083]: INFO : Ignition 2.22.0 Sep 6 09:55:17.734033 ignition[1083]: INFO : Stage: umount Sep 6 09:55:17.735965 ignition[1083]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 6 09:55:17.735965 ignition[1083]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 6 09:55:17.738189 ignition[1083]: INFO : umount: umount passed Sep 6 09:55:17.738189 ignition[1083]: INFO : Ignition finished successfully Sep 6 09:55:17.742021 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 6 09:55:17.742182 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 6 09:55:17.743364 systemd[1]: Stopped target network.target - Network. Sep 6 09:55:17.744909 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 6 09:55:17.744988 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 6 09:55:17.745422 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 6 09:55:17.745465 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 6 09:55:17.745733 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 6 09:55:17.745783 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 6 09:55:17.746229 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 6 09:55:17.746272 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 6 09:55:17.746643 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 6 09:55:17.746957 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 6 09:55:17.761491 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 6 09:55:17.761675 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 6 09:55:17.767611 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 6 09:55:17.767872 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 6 09:55:17.768040 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 6 09:55:17.773061 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 6 09:55:17.773698 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 6 09:55:17.774691 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 6 09:55:17.774734 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 6 09:55:17.780608 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 6 09:55:17.782659 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 6 09:55:17.782725 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 6 09:55:17.785071 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 6 09:55:17.785127 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 6 09:55:17.788413 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 6 09:55:17.788460 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 6 09:55:17.789431 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 6 09:55:17.789480 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 6 09:55:17.793629 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 6 09:55:17.795550 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. 
Sep 6 09:55:17.795617 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 6 09:55:17.819733 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 6 09:55:17.824104 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 6 09:55:17.825839 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 6 09:55:17.825888 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 6 09:55:17.828271 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 6 09:55:17.828308 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 6 09:55:17.829779 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 6 09:55:17.829840 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 6 09:55:17.834386 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 6 09:55:17.834446 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 6 09:55:17.837358 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 6 09:55:17.837418 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 6 09:55:17.843886 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 6 09:55:17.846142 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 6 09:55:17.846213 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 6 09:55:17.850199 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 6 09:55:17.850325 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 6 09:55:17.854137 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 6 09:55:17.854214 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 6 09:55:17.858193 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 6 09:55:17.858277 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 6 09:55:17.861786 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 6 09:55:17.861863 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 6 09:55:17.866494 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 6 09:55:17.866578 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Sep 6 09:55:17.866638 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 6 09:55:17.866698 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 6 09:55:17.867291 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 6 09:55:17.867436 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 6 09:55:17.868431 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 6 09:55:17.868561 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 6 09:55:17.870734 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 6 09:55:17.870862 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 6 09:55:17.876539 systemd[1]: Reached target initrd-switch-root.target - Switch Root. 
Sep 6 09:55:17.877133 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 6 09:55:17.877226 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 6 09:55:17.878597 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 6 09:55:17.910484 systemd[1]: Switching root. Sep 6 09:55:17.959843 systemd-journald[219]: Journal stopped Sep 6 09:55:19.052262 systemd-journald[219]: Received SIGTERM from PID 1 (systemd). Sep 6 09:55:19.052325 kernel: SELinux: policy capability network_peer_controls=1 Sep 6 09:55:19.052339 kernel: SELinux: policy capability open_perms=1 Sep 6 09:55:19.052350 kernel: SELinux: policy capability extended_socket_class=1 Sep 6 09:55:19.052366 kernel: SELinux: policy capability always_check_network=0 Sep 6 09:55:19.052378 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 6 09:55:19.052393 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 6 09:55:19.052404 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 6 09:55:19.052415 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 6 09:55:19.052426 kernel: SELinux: policy capability userspace_initial_context=0 Sep 6 09:55:19.052438 kernel: audit: type=1403 audit(1757152518.254:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 6 09:55:19.052458 systemd[1]: Successfully loaded SELinux policy in 69.136ms. Sep 6 09:55:19.052477 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 8.808ms. Sep 6 09:55:19.052490 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 6 09:55:19.052502 systemd[1]: Detected virtualization kvm. Sep 6 09:55:19.052516 systemd[1]: Detected architecture x86-64. Sep 6 09:55:19.052528 systemd[1]: Detected first boot. Sep 6 09:55:19.052540 systemd[1]: Initializing machine ID from VM UUID. Sep 6 09:55:19.052552 zram_generator::config[1127]: No configuration found. Sep 6 09:55:19.052565 kernel: Guest personality initialized and is inactive Sep 6 09:55:19.052577 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 6 09:55:19.052588 kernel: Initialized host personality Sep 6 09:55:19.052600 kernel: NET: Registered PF_VSOCK protocol family Sep 6 09:55:19.052613 systemd[1]: Populated /etc with preset unit settings. Sep 6 09:55:19.052626 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 6 09:55:19.052638 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 6 09:55:19.052651 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 6 09:55:19.052663 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 6 09:55:19.052676 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 6 09:55:19.052688 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 6 09:55:19.052706 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 6 09:55:19.052717 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 6 09:55:19.052737 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 6 09:55:19.052755 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. 
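
The systemd banner above (version 256.8, SELinux policy loaded in 69.136ms) lists its compile-time options as a single "+FOO -BAR" string. A small helper to split that string into enabled and disabled features, shown here only as an illustration of the format:

    def split_features(feature_string: str) -> tuple[list[str], list[str]]:
        """Split a systemd feature string into (enabled, disabled) option names."""
        tokens = feature_string.split()
        enabled = [t[1:] for t in tokens if t.startswith("+")]
        disabled = [t[1:] for t in tokens if t.startswith("-")]
        return enabled, disabled

    # Shortened input; the full string is in the log entry above.
    on, off = split_features("+PAM +AUDIT +SELINUX -APPARMOR +OPENSSL -ACL +TPM2")
    print(on)   # ['PAM', 'AUDIT', 'SELINUX', 'OPENSSL', 'TPM2']
    print(off)  # ['APPARMOR', 'ACL']
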
Sep 6 09:55:19.052768 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 6 09:55:19.052780 systemd[1]: Created slice user.slice - User and Session Slice. Sep 6 09:55:19.052792 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 6 09:55:19.052804 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 6 09:55:19.052817 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 6 09:55:19.052830 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 6 09:55:19.052845 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 6 09:55:19.052857 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 6 09:55:19.052869 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 6 09:55:19.052881 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 6 09:55:19.052893 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 6 09:55:19.052905 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 6 09:55:19.052984 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 6 09:55:19.053010 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 6 09:55:19.053025 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 6 09:55:19.053038 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 6 09:55:19.053050 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 6 09:55:19.053070 systemd[1]: Reached target slices.target - Slice Units. Sep 6 09:55:19.053082 systemd[1]: Reached target swap.target - Swaps. Sep 6 09:55:19.053094 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 6 09:55:19.053116 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 6 09:55:19.053128 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 6 09:55:19.053140 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 6 09:55:19.053153 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 6 09:55:19.053167 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 6 09:55:19.053180 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 6 09:55:19.053192 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 6 09:55:19.053204 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 6 09:55:19.053215 systemd[1]: Mounting media.mount - External Media Directory... Sep 6 09:55:19.053227 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 6 09:55:19.053239 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 6 09:55:19.053251 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 6 09:55:19.053265 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 6 09:55:19.053278 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). 
Sep 6 09:55:19.053291 systemd[1]: Reached target machines.target - Containers. Sep 6 09:55:19.053302 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 6 09:55:19.053315 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 6 09:55:19.053327 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 6 09:55:19.053339 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 6 09:55:19.053351 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 6 09:55:19.053364 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 6 09:55:19.053378 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 6 09:55:19.053390 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 6 09:55:19.053402 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 6 09:55:19.053414 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 6 09:55:19.053426 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 6 09:55:19.053438 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 6 09:55:19.053450 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 6 09:55:19.053462 systemd[1]: Stopped systemd-fsck-usr.service. Sep 6 09:55:19.053477 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 6 09:55:19.053490 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 6 09:55:19.053502 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 6 09:55:19.053514 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 6 09:55:19.053526 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 6 09:55:19.053539 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 6 09:55:19.053556 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 6 09:55:19.053568 systemd[1]: verity-setup.service: Deactivated successfully. Sep 6 09:55:19.053580 systemd[1]: Stopped verity-setup.service. Sep 6 09:55:19.053592 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 6 09:55:19.053612 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 6 09:55:19.053624 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 6 09:55:19.053636 systemd[1]: Mounted media.mount - External Media Directory. Sep 6 09:55:19.053648 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 6 09:55:19.053660 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 6 09:55:19.053672 kernel: loop: module loaded Sep 6 09:55:19.053683 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 6 09:55:19.053695 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. 
Sep 6 09:55:19.053707 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 6 09:55:19.053722 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 6 09:55:19.053734 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 6 09:55:19.053746 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 6 09:55:19.053778 systemd-journald[1198]: Collecting audit messages is disabled. Sep 6 09:55:19.053802 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 6 09:55:19.053814 kernel: fuse: init (API version 7.41) Sep 6 09:55:19.053826 systemd-journald[1198]: Journal started Sep 6 09:55:19.053853 systemd-journald[1198]: Runtime Journal (/run/log/journal/193fa681259f4720b1e52fcf20133e46) is 6M, max 48.6M, 42.5M free. Sep 6 09:55:18.818223 systemd[1]: Queued start job for default target multi-user.target. Sep 6 09:55:18.830269 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 6 09:55:18.830871 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 6 09:55:19.054949 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 6 09:55:19.057949 systemd[1]: Started systemd-journald.service - Journal Service. Sep 6 09:55:19.059840 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 6 09:55:19.060112 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 6 09:55:19.061680 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 6 09:55:19.063462 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 6 09:55:19.065396 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 6 09:55:19.068412 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 6 09:55:19.068657 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 6 09:55:19.070096 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 6 09:55:19.073875 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 6 09:55:19.076937 kernel: ACPI: bus type drm_connector registered Sep 6 09:55:19.078038 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 6 09:55:19.078297 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 6 09:55:19.089074 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 6 09:55:19.091642 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 6 09:55:19.093768 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 6 09:55:19.094863 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 6 09:55:19.094892 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 6 09:55:19.096802 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 6 09:55:19.099609 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 6 09:55:19.101169 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 6 09:55:19.108567 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 6 09:55:19.112756 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... 
Sep 6 09:55:19.114270 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 6 09:55:19.117212 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 6 09:55:19.118565 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 6 09:55:19.122777 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 6 09:55:19.125043 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 6 09:55:19.135522 systemd-journald[1198]: Time spent on flushing to /var/log/journal/193fa681259f4720b1e52fcf20133e46 is 19.039ms for 987 entries. Sep 6 09:55:19.135522 systemd-journald[1198]: System Journal (/var/log/journal/193fa681259f4720b1e52fcf20133e46) is 8M, max 195.6M, 187.6M free. Sep 6 09:55:19.165502 systemd-journald[1198]: Received client request to flush runtime journal. Sep 6 09:55:19.165535 kernel: loop0: detected capacity change from 0 to 110984 Sep 6 09:55:19.129037 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 6 09:55:19.132450 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 6 09:55:19.133949 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 6 09:55:19.144143 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 6 09:55:19.145620 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 6 09:55:19.150809 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 6 09:55:19.167750 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 6 09:55:19.171500 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 6 09:55:19.174572 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 6 09:55:19.181511 systemd-tmpfiles[1247]: ACLs are not supported, ignoring. Sep 6 09:55:19.181967 systemd-tmpfiles[1247]: ACLs are not supported, ignoring. Sep 6 09:55:19.188024 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 6 09:55:19.193716 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 6 09:55:19.194081 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 6 09:55:19.206331 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 6 09:55:19.212565 kernel: loop1: detected capacity change from 0 to 229808 Sep 6 09:55:19.240945 kernel: loop2: detected capacity change from 0 to 128016 Sep 6 09:55:19.241186 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 6 09:55:19.246167 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 6 09:55:19.304013 kernel: loop3: detected capacity change from 0 to 110984 Sep 6 09:55:19.307714 systemd-tmpfiles[1268]: ACLs are not supported, ignoring. Sep 6 09:55:19.308071 systemd-tmpfiles[1268]: ACLs are not supported, ignoring. Sep 6 09:55:19.312818 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
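
The journald flush report above works out to a very small per-entry cost: 19.039 ms for 987 entries is roughly 19 µs each:

    # Figures copied from the systemd-journald flush report above.
    flush_ms, entries = 19.039, 987
    print(f"{flush_ms / entries * 1000:.1f} µs per entry")  # ~19.3 µs
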
Sep 6 09:55:19.319960 kernel: loop4: detected capacity change from 0 to 229808 Sep 6 09:55:19.373185 kernel: loop5: detected capacity change from 0 to 128016 Sep 6 09:55:19.384138 (sd-merge)[1271]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Sep 6 09:55:19.384841 (sd-merge)[1271]: Merged extensions into '/usr'. Sep 6 09:55:19.390880 systemd[1]: Reload requested from client PID 1246 ('systemd-sysext') (unit systemd-sysext.service)... Sep 6 09:55:19.390901 systemd[1]: Reloading... Sep 6 09:55:19.487945 zram_generator::config[1294]: No configuration found. Sep 6 09:55:19.713724 ldconfig[1241]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 6 09:55:19.765196 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 6 09:55:19.765610 systemd[1]: Reloading finished in 374 ms. Sep 6 09:55:19.798658 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 6 09:55:19.800833 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 6 09:55:19.819520 systemd[1]: Starting ensure-sysext.service... Sep 6 09:55:19.821511 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 6 09:55:19.843198 systemd[1]: Reload requested from client PID 1335 ('systemctl') (unit ensure-sysext.service)... Sep 6 09:55:19.843217 systemd[1]: Reloading... Sep 6 09:55:19.888253 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 6 09:55:19.888300 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 6 09:55:19.888605 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 6 09:55:19.888884 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 6 09:55:19.890356 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 6 09:55:19.890757 systemd-tmpfiles[1336]: ACLs are not supported, ignoring. Sep 6 09:55:19.890893 systemd-tmpfiles[1336]: ACLs are not supported, ignoring. Sep 6 09:55:19.896072 systemd-tmpfiles[1336]: Detected autofs mount point /boot during canonicalization of boot. Sep 6 09:55:19.898022 systemd-tmpfiles[1336]: Skipping /boot Sep 6 09:55:19.921800 systemd-tmpfiles[1336]: Detected autofs mount point /boot during canonicalization of boot. Sep 6 09:55:19.921817 systemd-tmpfiles[1336]: Skipping /boot Sep 6 09:55:19.969956 zram_generator::config[1369]: No configuration found. Sep 6 09:55:20.142151 systemd[1]: Reloading finished in 298 ms. Sep 6 09:55:20.163586 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 6 09:55:20.193813 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 6 09:55:20.203160 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 6 09:55:20.205701 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 6 09:55:20.228558 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 6 09:55:20.232331 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 6 09:55:20.235104 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
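
sd-merge above picks up the 'containerd-flatcar', 'docker-flatcar' and 'kubernetes' extensions and merges them into /usr. A sketch of how one might list candidate sysext images from /etc/extensions (where Ignition placed the kubernetes.raw symlink earlier); /var/lib/extensions and /run/extensions are other standard search paths, assumed here rather than taken from this log:

    from pathlib import Path

    def list_sysext_images(root: str = "/etc/extensions") -> list[str]:
        """Return the *.raw image names found in one sysext search directory."""
        path = Path(root)
        return sorted(p.name for p in path.glob("*.raw")) if path.is_dir() else []

    print(list_sysext_images())  # e.g. ['kubernetes.raw']
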
Sep 6 09:55:20.238189 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 6 09:55:20.242736 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 6 09:55:20.242951 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 6 09:55:20.244973 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 6 09:55:20.248996 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 6 09:55:20.251749 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 6 09:55:20.253158 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 6 09:55:20.254202 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 6 09:55:20.259190 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 6 09:55:20.260222 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 6 09:55:20.263007 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 6 09:55:20.263177 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 6 09:55:20.263334 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 6 09:55:20.263412 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 6 09:55:20.263500 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 6 09:55:20.273484 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 6 09:55:20.273734 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 6 09:55:20.277330 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 6 09:55:20.278479 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 6 09:55:20.278583 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 6 09:55:20.278708 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 6 09:55:20.281054 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 6 09:55:20.293695 systemd[1]: Finished ensure-sysext.service. Sep 6 09:55:20.295140 systemd-udevd[1409]: Using default interface naming scheme 'v255'. 
Sep 6 09:55:20.298111 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 6 09:55:20.299764 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 6 09:55:20.299998 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 6 09:55:20.301479 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 6 09:55:20.302119 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 6 09:55:20.303763 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 6 09:55:20.304223 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 6 09:55:20.307900 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 6 09:55:20.308241 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 6 09:55:20.312604 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 6 09:55:20.312667 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 6 09:55:20.315105 augenrules[1439]: No rules Sep 6 09:55:20.315830 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 6 09:55:20.319272 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 6 09:55:20.321130 systemd[1]: audit-rules.service: Deactivated successfully. Sep 6 09:55:20.321427 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 6 09:55:20.336081 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 6 09:55:20.341818 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 6 09:55:20.351478 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 6 09:55:20.353017 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 6 09:55:20.357258 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 6 09:55:20.362445 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 6 09:55:20.418842 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 6 09:55:20.473969 kernel: mousedev: PS/2 mouse device common for all mice Sep 6 09:55:20.477544 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 6 09:55:20.481053 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 6 09:55:20.510235 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 6 09:55:20.514949 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Sep 6 09:55:20.533585 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 6 09:55:20.533910 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 6 09:55:20.536944 kernel: ACPI: button: Power Button [PWRF] Sep 6 09:55:20.581300 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 6 09:55:20.582908 systemd[1]: Reached target time-set.target - System Time Set. Sep 6 09:55:20.593979 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Sep 6 09:55:20.598761 systemd-networkd[1450]: lo: Link UP Sep 6 09:55:20.598776 systemd-networkd[1450]: lo: Gained carrier Sep 6 09:55:20.606408 systemd-networkd[1450]: Enumeration completed Sep 6 09:55:20.606649 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 6 09:55:20.607733 systemd-networkd[1450]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 6 09:55:20.609813 systemd-networkd[1450]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 6 09:55:20.610818 systemd-networkd[1450]: eth0: Link UP Sep 6 09:55:20.611323 systemd-networkd[1450]: eth0: Gained carrier Sep 6 09:55:20.611633 systemd-networkd[1450]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 6 09:55:20.619348 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 6 09:55:20.621898 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 6 09:55:20.653982 systemd-networkd[1450]: eth0: DHCPv4 address 10.0.0.36/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 6 09:55:20.657411 systemd-timesyncd[1431]: Network configuration changed, trying to establish connection. Sep 6 09:55:20.658601 systemd-resolved[1405]: Positive Trust Anchors: Sep 6 09:55:20.658620 systemd-resolved[1405]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 6 09:55:20.658652 systemd-resolved[1405]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 6 09:55:21.491616 systemd-timesyncd[1431]: Contacted time server 10.0.0.1:123 (10.0.0.1). Sep 6 09:55:21.491686 systemd-timesyncd[1431]: Initial clock synchronization to Sat 2025-09-06 09:55:21.491467 UTC. Sep 6 09:55:21.496705 systemd-resolved[1405]: Defaulting to hostname 'linux'. Sep 6 09:55:21.499311 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 6 09:55:21.499513 systemd[1]: Reached target network.target - Network. Sep 6 09:55:21.499807 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 6 09:55:21.504417 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 6 09:55:21.562433 kernel: kvm_amd: TSC scaling supported Sep 6 09:55:21.562561 kernel: kvm_amd: Nested Virtualization enabled Sep 6 09:55:21.562589 kernel: kvm_amd: Nested Paging enabled Sep 6 09:55:21.562602 kernel: kvm_amd: LBR virtualization supported Sep 6 09:55:21.562615 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported Sep 6 09:55:21.562627 kernel: kvm_amd: Virtual GIF supported Sep 6 09:55:21.599430 kernel: EDAC MC: Ver: 3.0.0 Sep 6 09:55:21.604818 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 6 09:55:21.606299 systemd[1]: Reached target sysinit.target - System Initialization. 
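
systemd-timesyncd above reaches 10.0.0.1 on port 123 and steps the clock. For illustration only (this is not how timesyncd is implemented), a bare SNTP query against the same server fits in a few lines:

    import socket
    import struct
    import time

    NTP_EPOCH_OFFSET = 2_208_988_800  # seconds between 1900-01-01 and 1970-01-01

    def sntp_time(server: str = "10.0.0.1", timeout: float = 2.0) -> float:
        """Return the server's transmit timestamp as a Unix epoch float."""
        packet = b"\x1b" + 47 * b"\0"  # LI=0, VN=3, Mode=3 (client request)
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.settimeout(timeout)
            sock.sendto(packet, (server, 123))
            data, _ = sock.recvfrom(48)
        seconds, fraction = struct.unpack("!II", data[40:48])
        return seconds - NTP_EPOCH_OFFSET + fraction / 2**32

    print(time.ctime(sntp_time()))
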
Sep 6 09:55:21.607466 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 6 09:55:21.608749 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 6 09:55:21.609979 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 6 09:55:21.611288 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 6 09:55:21.612417 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 6 09:55:21.613634 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 6 09:55:21.614845 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 6 09:55:21.614876 systemd[1]: Reached target paths.target - Path Units. Sep 6 09:55:21.615755 systemd[1]: Reached target timers.target - Timer Units. Sep 6 09:55:21.617879 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 6 09:55:21.620832 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 6 09:55:21.624032 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 6 09:55:21.625469 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 6 09:55:21.626749 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 6 09:55:21.630504 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 6 09:55:21.631946 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 6 09:55:21.633846 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 6 09:55:21.635695 systemd[1]: Reached target sockets.target - Socket Units. Sep 6 09:55:21.636648 systemd[1]: Reached target basic.target - Basic System. Sep 6 09:55:21.637629 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 6 09:55:21.637658 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 6 09:55:21.638706 systemd[1]: Starting containerd.service - containerd container runtime... Sep 6 09:55:21.640771 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 6 09:55:21.642781 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 6 09:55:21.654898 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 6 09:55:21.657213 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 6 09:55:21.659584 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 6 09:55:21.661106 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 6 09:55:21.663844 jq[1530]: false Sep 6 09:55:21.664315 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 6 09:55:21.666694 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 6 09:55:21.669646 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 6 09:55:21.671909 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 6 09:55:21.677166 systemd[1]: Starting systemd-logind.service - User Login Management... 
Sep 6 09:55:21.680106 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 6 09:55:21.680596 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 6 09:55:21.682607 systemd[1]: Starting update-engine.service - Update Engine... Sep 6 09:55:21.684687 oslogin_cache_refresh[1532]: Refreshing passwd entry cache Sep 6 09:55:21.685740 google_oslogin_nss_cache[1532]: oslogin_cache_refresh[1532]: Refreshing passwd entry cache Sep 6 09:55:21.686600 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 6 09:55:21.690993 extend-filesystems[1531]: Found /dev/vda6 Sep 6 09:55:21.693633 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 6 09:55:21.694418 oslogin_cache_refresh[1532]: Failure getting users, quitting Sep 6 09:55:21.695265 google_oslogin_nss_cache[1532]: oslogin_cache_refresh[1532]: Failure getting users, quitting Sep 6 09:55:21.695265 google_oslogin_nss_cache[1532]: oslogin_cache_refresh[1532]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 6 09:55:21.695265 google_oslogin_nss_cache[1532]: oslogin_cache_refresh[1532]: Refreshing group entry cache Sep 6 09:55:21.694437 oslogin_cache_refresh[1532]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 6 09:55:21.695518 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 6 09:55:21.694518 oslogin_cache_refresh[1532]: Refreshing group entry cache Sep 6 09:55:21.696168 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 6 09:55:21.696882 systemd[1]: motdgen.service: Deactivated successfully. Sep 6 09:55:21.697897 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 6 09:55:21.702503 extend-filesystems[1531]: Found /dev/vda9 Sep 6 09:55:21.703369 jq[1548]: true Sep 6 09:55:21.702947 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 6 09:55:21.703687 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 6 09:55:21.705091 extend-filesystems[1531]: Checking size of /dev/vda9 Sep 6 09:55:21.716868 google_oslogin_nss_cache[1532]: oslogin_cache_refresh[1532]: Failure getting groups, quitting Sep 6 09:55:21.716868 google_oslogin_nss_cache[1532]: oslogin_cache_refresh[1532]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 6 09:55:21.716775 oslogin_cache_refresh[1532]: Failure getting groups, quitting Sep 6 09:55:21.716790 oslogin_cache_refresh[1532]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 6 09:55:21.718842 (ntainerd)[1557]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 6 09:55:21.721485 update_engine[1546]: I20250906 09:55:21.717376 1546 main.cc:92] Flatcar Update Engine starting Sep 6 09:55:21.724345 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 6 09:55:21.724716 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. 
Sep 6 09:55:21.732286 jq[1556]: true Sep 6 09:55:21.733793 extend-filesystems[1531]: Resized partition /dev/vda9 Sep 6 09:55:21.746611 extend-filesystems[1573]: resize2fs 1.47.3 (8-Jul-2025) Sep 6 09:55:21.752433 tar[1553]: linux-amd64/LICENSE Sep 6 09:55:21.752433 tar[1553]: linux-amd64/helm Sep 6 09:55:21.769112 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Sep 6 09:55:21.782751 dbus-daemon[1528]: [system] SELinux support is enabled Sep 6 09:55:21.783008 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 6 09:55:21.786593 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 6 09:55:21.786621 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 6 09:55:21.788260 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 6 09:55:21.788282 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 6 09:55:21.795452 systemd[1]: Started update-engine.service - Update Engine. Sep 6 09:55:21.796058 update_engine[1546]: I20250906 09:55:21.795699 1546 update_check_scheduler.cc:74] Next update check in 10m33s Sep 6 09:55:21.802745 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 6 09:55:21.813923 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Sep 6 09:55:21.840774 systemd-logind[1541]: Watching system buttons on /dev/input/event2 (Power Button) Sep 6 09:55:21.840863 systemd-logind[1541]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 6 09:55:21.841527 systemd-logind[1541]: New seat seat0. Sep 6 09:55:21.843302 extend-filesystems[1573]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 6 09:55:21.843302 extend-filesystems[1573]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 6 09:55:21.843302 extend-filesystems[1573]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Sep 6 09:55:21.850685 extend-filesystems[1531]: Resized filesystem in /dev/vda9 Sep 6 09:55:21.844935 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 6 09:55:21.846075 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 6 09:55:21.856040 systemd[1]: Started systemd-logind.service - User Login Management. Sep 6 09:55:21.864988 bash[1590]: Updated "/home/core/.ssh/authorized_keys" Sep 6 09:55:21.888624 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 6 09:55:21.891009 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 6 09:55:21.935533 locksmithd[1582]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 6 09:55:22.002117 sshd_keygen[1552]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 6 09:55:22.028318 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 6 09:55:22.032653 systemd[1]: Starting issuegen.service - Generate /run/issue... 
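The resize2fs numbers above can be sanity-checked directly: both block counts are in the 4 KiB blocks EXT4 reports ("(4k) blocks"), so the online resize grows the root filesystem from roughly 2.1 GiB to 7.1 GiB. The arithmetic, as a tiny Python check:

BLOCK = 4096                               # EXT4 block size from the log
old_blocks, new_blocks = 553_472, 1_864_699

to_gib = lambda blocks: blocks * BLOCK / 2**30
print(f"{to_gib(old_blocks):.2f} GiB -> {to_gib(new_blocks):.2f} GiB")   # 2.11 GiB -> 7.11 GiB
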
Sep 6 09:55:22.034700 containerd[1557]: time="2025-09-06T09:55:22Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 6 09:55:22.035489 containerd[1557]: time="2025-09-06T09:55:22.035445730Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 6 09:55:22.045923 containerd[1557]: time="2025-09-06T09:55:22.045875756Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.761µs" Sep 6 09:55:22.046144 containerd[1557]: time="2025-09-06T09:55:22.045994318Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 6 09:55:22.046144 containerd[1557]: time="2025-09-06T09:55:22.046018524Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 6 09:55:22.046384 containerd[1557]: time="2025-09-06T09:55:22.046351508Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 6 09:55:22.046477 containerd[1557]: time="2025-09-06T09:55:22.046462827Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 6 09:55:22.046557 containerd[1557]: time="2025-09-06T09:55:22.046543788Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 6 09:55:22.046770 containerd[1557]: time="2025-09-06T09:55:22.046720390Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 6 09:55:22.046856 containerd[1557]: time="2025-09-06T09:55:22.046842799Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 6 09:55:22.047449 containerd[1557]: time="2025-09-06T09:55:22.047402128Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 6 09:55:22.047519 containerd[1557]: time="2025-09-06T09:55:22.047504911Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 6 09:55:22.047587 containerd[1557]: time="2025-09-06T09:55:22.047573289Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 6 09:55:22.047635 containerd[1557]: time="2025-09-06T09:55:22.047623533Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 6 09:55:22.047815 containerd[1557]: time="2025-09-06T09:55:22.047797219Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 6 09:55:22.048139 containerd[1557]: time="2025-09-06T09:55:22.048119012Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 6 09:55:22.048219 containerd[1557]: time="2025-09-06T09:55:22.048205484Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs 
type=io.containerd.snapshotter.v1 Sep 6 09:55:22.048281 containerd[1557]: time="2025-09-06T09:55:22.048268592Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 6 09:55:22.048379 containerd[1557]: time="2025-09-06T09:55:22.048363180Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 6 09:55:22.048794 containerd[1557]: time="2025-09-06T09:55:22.048774561Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 6 09:55:22.048925 containerd[1557]: time="2025-09-06T09:55:22.048910155Z" level=info msg="metadata content store policy set" policy=shared Sep 6 09:55:22.051779 systemd[1]: issuegen.service: Deactivated successfully. Sep 6 09:55:22.052312 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 6 09:55:22.055202 containerd[1557]: time="2025-09-06T09:55:22.055148092Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 6 09:55:22.055254 containerd[1557]: time="2025-09-06T09:55:22.055243380Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 6 09:55:22.055276 containerd[1557]: time="2025-09-06T09:55:22.055260713Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 6 09:55:22.055371 containerd[1557]: time="2025-09-06T09:55:22.055279638Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 6 09:55:22.055608 containerd[1557]: time="2025-09-06T09:55:22.055372863Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 6 09:55:22.055608 containerd[1557]: time="2025-09-06T09:55:22.055386619Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 6 09:55:22.055608 containerd[1557]: time="2025-09-06T09:55:22.055437113Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 6 09:55:22.055608 containerd[1557]: time="2025-09-06T09:55:22.055451410Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 6 09:55:22.055608 containerd[1557]: time="2025-09-06T09:55:22.055462211Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 6 09:55:22.055608 containerd[1557]: time="2025-09-06T09:55:22.055489472Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 6 09:55:22.055608 containerd[1557]: time="2025-09-06T09:55:22.055501664Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 6 09:55:22.055608 containerd[1557]: time="2025-09-06T09:55:22.055514749Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 6 09:55:22.055752 containerd[1557]: time="2025-09-06T09:55:22.055664580Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 6 09:55:22.055752 containerd[1557]: time="2025-09-06T09:55:22.055684998Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 6 09:55:22.055752 containerd[1557]: time="2025-09-06T09:55:22.055700938Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 
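containerd skips the btrfs snapshotter above because /var/lib/containerd sits on ext4. A minimal sketch of the same kind of check done from Python, by matching the directory against /proc/self/mounts and taking the longest mount-point prefix; this is a simplification of what containerd does internally:

import os

def fs_type(path: str) -> str:
    path = os.path.realpath(path)
    best, best_type = "", "unknown"
    with open("/proc/self/mounts") as fh:
        for line in fh:
            mount_point, fstype = line.split()[1:3]
            # Longest mount point that is a prefix of the path wins.
            if path == mount_point or path.startswith(mount_point.rstrip("/") + "/"):
                if len(mount_point) > len(best):
                    best, best_type = mount_point, fstype
    return best_type

print(fs_type("/var/lib/containerd"))   # "ext4" on the host in this log
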
Sep 6 09:55:22.055752 containerd[1557]: time="2025-09-06T09:55:22.055712329Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 6 09:55:22.055752 containerd[1557]: time="2025-09-06T09:55:22.055722919Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 6 09:55:22.055752 containerd[1557]: time="2025-09-06T09:55:22.055732758Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 6 09:55:22.055752 containerd[1557]: time="2025-09-06T09:55:22.055743167Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 6 09:55:22.055752 containerd[1557]: time="2025-09-06T09:55:22.055752835Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 6 09:55:22.055897 containerd[1557]: time="2025-09-06T09:55:22.055763646Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 6 09:55:22.055897 containerd[1557]: time="2025-09-06T09:55:22.055774476Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 6 09:55:22.055897 containerd[1557]: time="2025-09-06T09:55:22.055784675Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 6 09:55:22.055897 containerd[1557]: time="2025-09-06T09:55:22.055872941Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 6 09:55:22.055897 containerd[1557]: time="2025-09-06T09:55:22.055885133Z" level=info msg="Start snapshots syncer" Sep 6 09:55:22.055997 containerd[1557]: time="2025-09-06T09:55:22.055932182Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 6 09:55:22.056777 containerd[1557]: time="2025-09-06T09:55:22.056205103Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 6 09:55:22.056777 containerd[1557]: time="2025-09-06T09:55:22.056279924Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 6 09:55:22.056715 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Sep 6 09:55:22.058353 containerd[1557]: time="2025-09-06T09:55:22.056352109Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 6 09:55:22.058353 containerd[1557]: time="2025-09-06T09:55:22.056469609Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 6 09:55:22.058353 containerd[1557]: time="2025-09-06T09:55:22.056497682Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 6 09:55:22.058353 containerd[1557]: time="2025-09-06T09:55:22.056508061Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 6 09:55:22.058353 containerd[1557]: time="2025-09-06T09:55:22.056518471Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 6 09:55:22.058353 containerd[1557]: time="2025-09-06T09:55:22.056530824Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 6 09:55:22.058353 containerd[1557]: time="2025-09-06T09:55:22.056542606Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 6 09:55:22.058353 containerd[1557]: time="2025-09-06T09:55:22.056555711Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 6 09:55:22.058353 containerd[1557]: time="2025-09-06T09:55:22.056582872Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 6 09:55:22.058353 containerd[1557]: time="2025-09-06T09:55:22.056601557Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 6 09:55:22.058353 containerd[1557]: time="2025-09-06T09:55:22.056614251Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 6 09:55:22.058353 containerd[1557]: time="2025-09-06T09:55:22.056645930Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 6 09:55:22.058353 containerd[1557]: time="2025-09-06T09:55:22.056659265Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 6 09:55:22.058353 containerd[1557]: time="2025-09-06T09:55:22.056668082Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 6 09:55:22.058880 containerd[1557]: time="2025-09-06T09:55:22.056678130Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 6 09:55:22.058880 containerd[1557]: time="2025-09-06T09:55:22.056685675Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 6 09:55:22.058880 containerd[1557]: time="2025-09-06T09:55:22.056695273Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 6 09:55:22.058880 containerd[1557]: time="2025-09-06T09:55:22.056711142Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 6 09:55:22.058880 containerd[1557]: time="2025-09-06T09:55:22.056728725Z" level=info msg="runtime interface created" Sep 6 09:55:22.058880 containerd[1557]: time="2025-09-06T09:55:22.056737692Z" level=info msg="created NRI 
interface" Sep 6 09:55:22.058880 containerd[1557]: time="2025-09-06T09:55:22.056745747Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 6 09:55:22.058880 containerd[1557]: time="2025-09-06T09:55:22.056755696Z" level=info msg="Connect containerd service" Sep 6 09:55:22.058880 containerd[1557]: time="2025-09-06T09:55:22.056777156Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 6 09:55:22.058880 containerd[1557]: time="2025-09-06T09:55:22.057466589Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 6 09:55:22.128421 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 6 09:55:22.132598 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 6 09:55:22.136625 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 6 09:55:22.137943 systemd[1]: Reached target getty.target - Login Prompts. Sep 6 09:55:22.219264 containerd[1557]: time="2025-09-06T09:55:22.219183257Z" level=info msg="Start subscribing containerd event" Sep 6 09:55:22.219422 containerd[1557]: time="2025-09-06T09:55:22.219303914Z" level=info msg="Start recovering state" Sep 6 09:55:22.219535 containerd[1557]: time="2025-09-06T09:55:22.219487328Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 6 09:55:22.219600 containerd[1557]: time="2025-09-06T09:55:22.219488169Z" level=info msg="Start event monitor" Sep 6 09:55:22.219600 containerd[1557]: time="2025-09-06T09:55:22.219591323Z" level=info msg="Start cni network conf syncer for default" Sep 6 09:55:22.219600 containerd[1557]: time="2025-09-06T09:55:22.219577186Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 6 09:55:22.219600 containerd[1557]: time="2025-09-06T09:55:22.219617532Z" level=info msg="Start streaming server" Sep 6 09:55:22.219797 containerd[1557]: time="2025-09-06T09:55:22.219628933Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 6 09:55:22.219797 containerd[1557]: time="2025-09-06T09:55:22.219637148Z" level=info msg="runtime interface starting up..." Sep 6 09:55:22.219797 containerd[1557]: time="2025-09-06T09:55:22.219642779Z" level=info msg="starting plugins..." Sep 6 09:55:22.219877 containerd[1557]: time="2025-09-06T09:55:22.219858734Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 6 09:55:22.220261 containerd[1557]: time="2025-09-06T09:55:22.220016580Z" level=info msg="containerd successfully booted in 0.185919s" Sep 6 09:55:22.220144 systemd[1]: Started containerd.service - containerd container runtime. Sep 6 09:55:22.230437 tar[1553]: linux-amd64/README.md Sep 6 09:55:22.253265 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 6 09:55:23.502915 systemd-networkd[1450]: eth0: Gained IPv6LL Sep 6 09:55:23.508966 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 6 09:55:23.511261 systemd[1]: Reached target network-online.target - Network is Online. Sep 6 09:55:23.515179 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Sep 6 09:55:23.518817 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 6 09:55:23.523108 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... 
Sep 6 09:55:23.555830 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 6 09:55:23.557849 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 6 09:55:23.558156 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Sep 6 09:55:23.560998 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 6 09:55:24.634368 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 6 09:55:24.636575 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 6 09:55:24.638366 systemd[1]: Startup finished in 3.212s (kernel) + 5.613s (initrd) + 5.618s (userspace) = 14.443s. Sep 6 09:55:24.666991 (kubelet)[1662]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 6 09:55:25.104725 kubelet[1662]: E0906 09:55:25.104588 1662 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 6 09:55:25.108769 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 6 09:55:25.108983 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 6 09:55:25.109434 systemd[1]: kubelet.service: Consumed 1.360s CPU time, 266.8M memory peak. Sep 6 09:55:27.246148 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 6 09:55:27.247547 systemd[1]: Started sshd@0-10.0.0.36:22-10.0.0.1:35160.service - OpenSSH per-connection server daemon (10.0.0.1:35160). Sep 6 09:55:27.315426 sshd[1675]: Accepted publickey for core from 10.0.0.1 port 35160 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:55:27.316913 sshd-session[1675]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:55:27.323657 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 6 09:55:27.324791 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 6 09:55:27.332104 systemd-logind[1541]: New session 1 of user core. Sep 6 09:55:27.347977 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 6 09:55:27.350935 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 6 09:55:27.378947 (systemd)[1680]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 6 09:55:27.381636 systemd-logind[1541]: New session c1 of user core. Sep 6 09:55:27.528969 systemd[1680]: Queued start job for default target default.target. Sep 6 09:55:27.551707 systemd[1680]: Created slice app.slice - User Application Slice. Sep 6 09:55:27.551734 systemd[1680]: Reached target paths.target - Paths. Sep 6 09:55:27.551778 systemd[1680]: Reached target timers.target - Timers. Sep 6 09:55:27.553328 systemd[1680]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 6 09:55:27.564767 systemd[1680]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 6 09:55:27.564897 systemd[1680]: Reached target sockets.target - Sockets. Sep 6 09:55:27.564935 systemd[1680]: Reached target basic.target - Basic System. Sep 6 09:55:27.564977 systemd[1680]: Reached target default.target - Main User Target. Sep 6 09:55:27.565009 systemd[1680]: Startup finished in 176ms. 
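The kubelet failure above is expected at this stage: /var/lib/kubelet/config.yaml is typically written later (by kubeadm on this kind of node), so the first start exits immediately and systemd records the unit as failed. A hedged pre-flight check for that one file:

from pathlib import Path

cfg = Path("/var/lib/kubelet/config.yaml")   # path taken from the error above
if cfg.is_file():
    print(f"{cfg} present ({cfg.stat().st_size} bytes); kubelet can load its config")
else:
    print(f"{cfg} missing - kubelet will exit with the 'failed to load Kubelet config file' error")
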
Sep 6 09:55:27.565489 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 6 09:55:27.567509 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 6 09:55:27.630784 systemd[1]: Started sshd@1-10.0.0.36:22-10.0.0.1:35164.service - OpenSSH per-connection server daemon (10.0.0.1:35164). Sep 6 09:55:27.700276 sshd[1691]: Accepted publickey for core from 10.0.0.1 port 35164 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:55:27.701830 sshd-session[1691]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:55:27.713486 systemd-logind[1541]: New session 2 of user core. Sep 6 09:55:27.724547 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 6 09:55:27.780227 sshd[1694]: Connection closed by 10.0.0.1 port 35164 Sep 6 09:55:27.780815 sshd-session[1691]: pam_unix(sshd:session): session closed for user core Sep 6 09:55:27.794291 systemd[1]: sshd@1-10.0.0.36:22-10.0.0.1:35164.service: Deactivated successfully. Sep 6 09:55:27.796170 systemd[1]: session-2.scope: Deactivated successfully. Sep 6 09:55:27.797018 systemd-logind[1541]: Session 2 logged out. Waiting for processes to exit. Sep 6 09:55:27.800087 systemd[1]: Started sshd@2-10.0.0.36:22-10.0.0.1:35170.service - OpenSSH per-connection server daemon (10.0.0.1:35170). Sep 6 09:55:27.800648 systemd-logind[1541]: Removed session 2. Sep 6 09:55:27.854270 sshd[1700]: Accepted publickey for core from 10.0.0.1 port 35170 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:55:27.855701 sshd-session[1700]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:55:27.860231 systemd-logind[1541]: New session 3 of user core. Sep 6 09:55:27.869563 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 6 09:55:27.920067 sshd[1703]: Connection closed by 10.0.0.1 port 35170 Sep 6 09:55:27.920899 sshd-session[1700]: pam_unix(sshd:session): session closed for user core Sep 6 09:55:27.932059 systemd[1]: sshd@2-10.0.0.36:22-10.0.0.1:35170.service: Deactivated successfully. Sep 6 09:55:27.933980 systemd[1]: session-3.scope: Deactivated successfully. Sep 6 09:55:27.934743 systemd-logind[1541]: Session 3 logged out. Waiting for processes to exit. Sep 6 09:55:27.937667 systemd[1]: Started sshd@3-10.0.0.36:22-10.0.0.1:35174.service - OpenSSH per-connection server daemon (10.0.0.1:35174). Sep 6 09:55:27.938452 systemd-logind[1541]: Removed session 3. Sep 6 09:55:27.986420 sshd[1709]: Accepted publickey for core from 10.0.0.1 port 35174 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:55:27.987911 sshd-session[1709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:55:27.992324 systemd-logind[1541]: New session 4 of user core. Sep 6 09:55:28.010524 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 6 09:55:28.066103 sshd[1712]: Connection closed by 10.0.0.1 port 35174 Sep 6 09:55:28.066555 sshd-session[1709]: pam_unix(sshd:session): session closed for user core Sep 6 09:55:28.076078 systemd[1]: sshd@3-10.0.0.36:22-10.0.0.1:35174.service: Deactivated successfully. Sep 6 09:55:28.078472 systemd[1]: session-4.scope: Deactivated successfully. Sep 6 09:55:28.079283 systemd-logind[1541]: Session 4 logged out. Waiting for processes to exit. Sep 6 09:55:28.082807 systemd[1]: Started sshd@4-10.0.0.36:22-10.0.0.1:35178.service - OpenSSH per-connection server daemon (10.0.0.1:35178). Sep 6 09:55:28.083503 systemd-logind[1541]: Removed session 4. 
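The block above is a series of short SSH sessions against the socket-activated sshd; sessions 2 through 4 each open and close within a second or so. If you need to pull the accepted-key details out of such a log, a sketch along these lines works; the regex mirrors the "Accepted publickey" format above and boot.log is an assumed file name:

import re

ACCEPT = re.compile(r"Accepted publickey for (?P<user>\S+) from (?P<ip>\S+) "
                    r"port (?P<port>\d+) ssh2: RSA (?P<fp>SHA256:\S+)")

with open("boot.log", encoding="utf-8", errors="replace") as fh:
    for line in fh:
        m = ACCEPT.search(line)
        if m:
            print(m["user"], m["ip"], m["port"], m["fp"])
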
Sep 6 09:55:28.141420 sshd[1718]: Accepted publickey for core from 10.0.0.1 port 35178 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:55:28.143316 sshd-session[1718]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:55:28.148377 systemd-logind[1541]: New session 5 of user core. Sep 6 09:55:28.166689 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 6 09:55:28.228559 sudo[1722]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 6 09:55:28.228976 sudo[1722]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 6 09:55:28.252546 sudo[1722]: pam_unix(sudo:session): session closed for user root Sep 6 09:55:28.254536 sshd[1721]: Connection closed by 10.0.0.1 port 35178 Sep 6 09:55:28.255082 sshd-session[1718]: pam_unix(sshd:session): session closed for user core Sep 6 09:55:28.264220 systemd[1]: sshd@4-10.0.0.36:22-10.0.0.1:35178.service: Deactivated successfully. Sep 6 09:55:28.266374 systemd[1]: session-5.scope: Deactivated successfully. Sep 6 09:55:28.267309 systemd-logind[1541]: Session 5 logged out. Waiting for processes to exit. Sep 6 09:55:28.270705 systemd[1]: Started sshd@5-10.0.0.36:22-10.0.0.1:35182.service - OpenSSH per-connection server daemon (10.0.0.1:35182). Sep 6 09:55:28.271503 systemd-logind[1541]: Removed session 5. Sep 6 09:55:28.326246 sshd[1728]: Accepted publickey for core from 10.0.0.1 port 35182 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:55:28.327970 sshd-session[1728]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:55:28.333108 systemd-logind[1541]: New session 6 of user core. Sep 6 09:55:28.342603 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 6 09:55:28.397448 sudo[1733]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 6 09:55:28.397767 sudo[1733]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 6 09:55:28.404682 sudo[1733]: pam_unix(sudo:session): session closed for user root Sep 6 09:55:28.412466 sudo[1732]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 6 09:55:28.412851 sudo[1732]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 6 09:55:28.423858 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 6 09:55:28.469904 augenrules[1755]: No rules Sep 6 09:55:28.471037 systemd[1]: audit-rules.service: Deactivated successfully. Sep 6 09:55:28.471422 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 6 09:55:28.472587 sudo[1732]: pam_unix(sudo:session): session closed for user root Sep 6 09:55:28.474202 sshd[1731]: Connection closed by 10.0.0.1 port 35182 Sep 6 09:55:28.474673 sshd-session[1728]: pam_unix(sshd:session): session closed for user core Sep 6 09:55:28.492448 systemd[1]: sshd@5-10.0.0.36:22-10.0.0.1:35182.service: Deactivated successfully. Sep 6 09:55:28.494246 systemd[1]: session-6.scope: Deactivated successfully. Sep 6 09:55:28.494963 systemd-logind[1541]: Session 6 logged out. Waiting for processes to exit. Sep 6 09:55:28.497628 systemd[1]: Started sshd@6-10.0.0.36:22-10.0.0.1:35194.service - OpenSSH per-connection server daemon (10.0.0.1:35194). Sep 6 09:55:28.498270 systemd-logind[1541]: Removed session 6. 
Sep 6 09:55:28.549928 sshd[1764]: Accepted publickey for core from 10.0.0.1 port 35194 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:55:28.551631 sshd-session[1764]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:55:28.556616 systemd-logind[1541]: New session 7 of user core. Sep 6 09:55:28.566592 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 6 09:55:28.621424 sudo[1768]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 6 09:55:28.621748 sudo[1768]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 6 09:55:29.234122 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 6 09:55:29.247745 (dockerd)[1789]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 6 09:55:29.762987 dockerd[1789]: time="2025-09-06T09:55:29.762900235Z" level=info msg="Starting up" Sep 6 09:55:29.763773 dockerd[1789]: time="2025-09-06T09:55:29.763746652Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 6 09:55:29.784849 dockerd[1789]: time="2025-09-06T09:55:29.784791160Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 6 09:55:30.013515 dockerd[1789]: time="2025-09-06T09:55:30.013353224Z" level=info msg="Loading containers: start." Sep 6 09:55:30.024414 kernel: Initializing XFRM netlink socket Sep 6 09:55:30.306621 systemd-networkd[1450]: docker0: Link UP Sep 6 09:55:30.312085 dockerd[1789]: time="2025-09-06T09:55:30.312041675Z" level=info msg="Loading containers: done." Sep 6 09:55:30.326350 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2393018118-merged.mount: Deactivated successfully. Sep 6 09:55:30.329632 dockerd[1789]: time="2025-09-06T09:55:30.329593955Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 6 09:55:30.329699 dockerd[1789]: time="2025-09-06T09:55:30.329686850Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 6 09:55:30.329813 dockerd[1789]: time="2025-09-06T09:55:30.329793079Z" level=info msg="Initializing buildkit" Sep 6 09:55:30.359506 dockerd[1789]: time="2025-09-06T09:55:30.359457980Z" level=info msg="Completed buildkit initialization" Sep 6 09:55:30.365723 dockerd[1789]: time="2025-09-06T09:55:30.365691498Z" level=info msg="Daemon has completed initialization" Sep 6 09:55:30.365826 dockerd[1789]: time="2025-09-06T09:55:30.365775095Z" level=info msg="API listen on /run/docker.sock" Sep 6 09:55:30.366009 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 6 09:55:31.260804 containerd[1557]: time="2025-09-06T09:55:31.260750784Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\"" Sep 6 09:55:32.095190 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount329230823.mount: Deactivated successfully. 
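Once dockerd reports "API listen on /run/docker.sock" above, the daemon can be health-checked over that Unix socket without any client library. A standard-library sketch hitting the /_ping endpoint; the socket path comes from the log, the rest is assumed:

import http.client
import socket

class UnixHTTPConnection(http.client.HTTPConnection):
    """HTTPConnection that talks to a Unix socket instead of TCP."""
    def __init__(self, path: str):
        super().__init__("localhost")
        self.unix_path = path

    def connect(self):
        sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        sock.connect(self.unix_path)
        self.sock = sock

conn = UnixHTTPConnection("/run/docker.sock")
conn.request("GET", "/_ping")
resp = conn.getresponse()
print(resp.status, resp.read().decode())   # expect "200 OK" once the daemon is up
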
Sep 6 09:55:33.538689 containerd[1557]: time="2025-09-06T09:55:33.538607335Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:33.539864 containerd[1557]: time="2025-09-06T09:55:33.539801363Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.4: active requests=0, bytes read=30078664" Sep 6 09:55:33.541094 containerd[1557]: time="2025-09-06T09:55:33.541024076Z" level=info msg="ImageCreate event name:\"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:33.543835 containerd[1557]: time="2025-09-06T09:55:33.543799790Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:33.544938 containerd[1557]: time="2025-09-06T09:55:33.544747758Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.4\" with image id \"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\", size \"30075464\" in 2.283953432s" Sep 6 09:55:33.544938 containerd[1557]: time="2025-09-06T09:55:33.544924950Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\" returns image reference \"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\"" Sep 6 09:55:33.545886 containerd[1557]: time="2025-09-06T09:55:33.545843823Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\"" Sep 6 09:55:35.221321 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 6 09:55:35.223817 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 6 09:55:35.480379 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 6 09:55:35.491817 (kubelet)[2075]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 6 09:55:35.581570 kubelet[2075]: E0906 09:55:35.581427 2075 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 6 09:55:35.589730 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 6 09:55:35.589971 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 6 09:55:35.590660 systemd[1]: kubelet.service: Consumed 315ms CPU time, 109.9M memory peak. 
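The completed pull above also gives a rough feel for registry throughput on this host: about 30 MB of kube-apiserver image in roughly 2.28 s. Back-of-the-envelope only:

size_bytes = 30_075_464      # size "30075464" from the Pulled message above
seconds = 2.283953432        # pull duration from the same message
print(f"{size_bytes / seconds / 2**20:.1f} MiB/s")   # ~12.6 MiB/s
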
Sep 6 09:55:35.738547 containerd[1557]: time="2025-09-06T09:55:35.738417989Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:35.739460 containerd[1557]: time="2025-09-06T09:55:35.739432792Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.4: active requests=0, bytes read=26018066" Sep 6 09:55:35.741024 containerd[1557]: time="2025-09-06T09:55:35.740980023Z" level=info msg="ImageCreate event name:\"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:35.743761 containerd[1557]: time="2025-09-06T09:55:35.743714209Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:35.745057 containerd[1557]: time="2025-09-06T09:55:35.745001613Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.4\" with image id \"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\", size \"27646961\" in 2.199116974s" Sep 6 09:55:35.745057 containerd[1557]: time="2025-09-06T09:55:35.745038773Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\" returns image reference \"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\"" Sep 6 09:55:35.745697 containerd[1557]: time="2025-09-06T09:55:35.745664586Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\"" Sep 6 09:55:37.686803 containerd[1557]: time="2025-09-06T09:55:37.686732949Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:37.687681 containerd[1557]: time="2025-09-06T09:55:37.687607769Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.4: active requests=0, bytes read=20153911" Sep 6 09:55:37.688824 containerd[1557]: time="2025-09-06T09:55:37.688773796Z" level=info msg="ImageCreate event name:\"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:37.691371 containerd[1557]: time="2025-09-06T09:55:37.691337212Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:37.692313 containerd[1557]: time="2025-09-06T09:55:37.692282234Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.4\" with image id \"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\", size \"21782824\" in 1.946586599s" Sep 6 09:55:37.692370 containerd[1557]: time="2025-09-06T09:55:37.692319093Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\" returns image reference \"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\"" Sep 6 09:55:37.693263 containerd[1557]: 
time="2025-09-06T09:55:37.693232856Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\"" Sep 6 09:55:38.985567 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount703238716.mount: Deactivated successfully. Sep 6 09:55:39.801053 containerd[1557]: time="2025-09-06T09:55:39.800965995Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:39.801763 containerd[1557]: time="2025-09-06T09:55:39.801733915Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.4: active requests=0, bytes read=31899626" Sep 6 09:55:39.803001 containerd[1557]: time="2025-09-06T09:55:39.802973159Z" level=info msg="ImageCreate event name:\"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:39.804923 containerd[1557]: time="2025-09-06T09:55:39.804889141Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:39.805481 containerd[1557]: time="2025-09-06T09:55:39.805429184Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.4\" with image id \"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\", repo tag \"registry.k8s.io/kube-proxy:v1.33.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\", size \"31898645\" in 2.112165961s" Sep 6 09:55:39.805481 containerd[1557]: time="2025-09-06T09:55:39.805478727Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\" returns image reference \"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\"" Sep 6 09:55:39.806090 containerd[1557]: time="2025-09-06T09:55:39.806055919Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Sep 6 09:55:40.328779 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3525586470.mount: Deactivated successfully. 
Sep 6 09:55:41.127173 containerd[1557]: time="2025-09-06T09:55:41.127096229Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:41.127720 containerd[1557]: time="2025-09-06T09:55:41.127651060Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942238" Sep 6 09:55:41.129037 containerd[1557]: time="2025-09-06T09:55:41.128953201Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:41.134166 containerd[1557]: time="2025-09-06T09:55:41.134113016Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:41.135204 containerd[1557]: time="2025-09-06T09:55:41.135156051Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.329071519s" Sep 6 09:55:41.135204 containerd[1557]: time="2025-09-06T09:55:41.135199122Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Sep 6 09:55:41.135874 containerd[1557]: time="2025-09-06T09:55:41.135679944Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 6 09:55:41.715786 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount590980398.mount: Deactivated successfully. 
Sep 6 09:55:41.722590 containerd[1557]: time="2025-09-06T09:55:41.722517334Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 6 09:55:41.723343 containerd[1557]: time="2025-09-06T09:55:41.723319137Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 6 09:55:41.724612 containerd[1557]: time="2025-09-06T09:55:41.724577286Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 6 09:55:41.726856 containerd[1557]: time="2025-09-06T09:55:41.726791678Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 6 09:55:41.727310 containerd[1557]: time="2025-09-06T09:55:41.727266288Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 591.557279ms" Sep 6 09:55:41.727310 containerd[1557]: time="2025-09-06T09:55:41.727303448Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 6 09:55:41.728049 containerd[1557]: time="2025-09-06T09:55:41.727821078Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Sep 6 09:55:42.159490 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4093058199.mount: Deactivated successfully. 
Sep 6 09:55:43.767106 containerd[1557]: time="2025-09-06T09:55:43.766958340Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:43.767864 containerd[1557]: time="2025-09-06T09:55:43.767789949Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58377871" Sep 6 09:55:43.769360 containerd[1557]: time="2025-09-06T09:55:43.769306112Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:43.772449 containerd[1557]: time="2025-09-06T09:55:43.772383532Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:55:43.773775 containerd[1557]: time="2025-09-06T09:55:43.773700632Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 2.045850579s" Sep 6 09:55:43.773775 containerd[1557]: time="2025-09-06T09:55:43.773746899Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Sep 6 09:55:45.721532 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 6 09:55:45.723374 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 6 09:55:45.956755 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 6 09:55:45.974699 (kubelet)[2238]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 6 09:55:46.035084 kubelet[2238]: E0906 09:55:46.035009 2238 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 6 09:55:46.039758 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 6 09:55:46.040024 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 6 09:55:46.040556 systemd[1]: kubelet.service: Consumed 242ms CPU time, 111.2M memory peak. Sep 6 09:55:47.596014 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 6 09:55:47.596186 systemd[1]: kubelet.service: Consumed 242ms CPU time, 111.2M memory peak. Sep 6 09:55:47.598545 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 6 09:55:47.623882 systemd[1]: Reload requested from client PID 2252 ('systemctl') (unit session-7.scope)... Sep 6 09:55:47.623900 systemd[1]: Reloading... Sep 6 09:55:47.726511 zram_generator::config[2295]: No configuration found. Sep 6 09:55:48.131154 systemd[1]: Reloading finished in 506 ms. Sep 6 09:55:48.202159 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 6 09:55:48.202258 systemd[1]: kubelet.service: Failed with result 'signal'. 
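By this point kubelet.service has failed twice and systemd is counting ("restart counter is at 2"); the daemon reload that follows is requested from session-7.scope, i.e. the systemctl run in the SSH session above. The counter and failure reason are queryable as ordinary unit properties, for example:

import subprocess

out = subprocess.run(
    ["systemctl", "show", "kubelet.service",
     "-p", "NRestarts", "-p", "ActiveState", "-p", "Result"],
    capture_output=True, text=True,
).stdout
print(out.strip())   # e.g. NRestarts=2 / ActiveState=activating / Result=exit-code
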
Sep 6 09:55:48.202571 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 6 09:55:48.202617 systemd[1]: kubelet.service: Consumed 162ms CPU time, 98.3M memory peak. Sep 6 09:55:48.204385 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 6 09:55:48.403252 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 6 09:55:48.407216 (kubelet)[2343]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 6 09:55:48.447851 kubelet[2343]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 6 09:55:48.447851 kubelet[2343]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 6 09:55:48.447851 kubelet[2343]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 6 09:55:48.448277 kubelet[2343]: I0906 09:55:48.447893 2343 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 6 09:55:49.319674 kubelet[2343]: I0906 09:55:49.319605 2343 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 6 09:55:49.319674 kubelet[2343]: I0906 09:55:49.319649 2343 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 6 09:55:49.319914 kubelet[2343]: I0906 09:55:49.319885 2343 server.go:956] "Client rotation is on, will bootstrap in background" Sep 6 09:55:49.352216 kubelet[2343]: E0906 09:55:49.352154 2343 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.36:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.36:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Sep 6 09:55:49.356478 kubelet[2343]: I0906 09:55:49.356445 2343 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 6 09:55:49.361828 kubelet[2343]: I0906 09:55:49.361775 2343 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 6 09:55:49.367707 kubelet[2343]: I0906 09:55:49.367674 2343 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 6 09:55:49.367949 kubelet[2343]: I0906 09:55:49.367902 2343 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 6 09:55:49.368099 kubelet[2343]: I0906 09:55:49.367933 2343 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 6 09:55:49.368208 kubelet[2343]: I0906 09:55:49.368101 2343 topology_manager.go:138] "Creating topology manager with none policy" Sep 6 09:55:49.368208 kubelet[2343]: I0906 09:55:49.368111 2343 container_manager_linux.go:303] "Creating device plugin manager" Sep 6 09:55:49.368279 kubelet[2343]: I0906 09:55:49.368255 2343 state_mem.go:36] "Initialized new in-memory state store" Sep 6 09:55:49.371335 kubelet[2343]: I0906 09:55:49.371308 2343 kubelet.go:480] "Attempting to sync node with API server" Sep 6 09:55:49.371335 kubelet[2343]: I0906 09:55:49.371330 2343 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 6 09:55:49.371421 kubelet[2343]: I0906 09:55:49.371355 2343 kubelet.go:386] "Adding apiserver pod source" Sep 6 09:55:49.371421 kubelet[2343]: I0906 09:55:49.371404 2343 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 6 09:55:49.375859 kubelet[2343]: E0906 09:55:49.375790 2343 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.36:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.36:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 6 09:55:49.375989 kubelet[2343]: E0906 09:55:49.375932 2343 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.36:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.36:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 6 09:55:49.376550 kubelet[2343]: 
I0906 09:55:49.376519 2343 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 6 09:55:49.377064 kubelet[2343]: I0906 09:55:49.377027 2343 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 6 09:55:49.377724 kubelet[2343]: W0906 09:55:49.377694 2343 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 6 09:55:49.380820 kubelet[2343]: I0906 09:55:49.380783 2343 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 6 09:55:49.380882 kubelet[2343]: I0906 09:55:49.380843 2343 server.go:1289] "Started kubelet" Sep 6 09:55:49.381504 kubelet[2343]: I0906 09:55:49.381457 2343 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 6 09:55:49.383013 kubelet[2343]: I0906 09:55:49.382987 2343 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 6 09:55:49.383075 kubelet[2343]: I0906 09:55:49.383065 2343 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 6 09:55:49.383163 kubelet[2343]: I0906 09:55:49.383124 2343 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 6 09:55:49.384720 kubelet[2343]: I0906 09:55:49.384690 2343 server.go:317] "Adding debug handlers to kubelet server" Sep 6 09:55:49.386247 kubelet[2343]: E0906 09:55:49.386221 2343 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 6 09:55:49.386319 kubelet[2343]: I0906 09:55:49.386258 2343 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 6 09:55:49.387269 kubelet[2343]: I0906 09:55:49.387237 2343 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 6 09:55:49.387331 kubelet[2343]: I0906 09:55:49.387302 2343 reconciler.go:26] "Reconciler: start to sync state" Sep 6 09:55:49.388946 kubelet[2343]: E0906 09:55:49.388887 2343 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.36:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.36:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 6 09:55:49.389305 kubelet[2343]: E0906 09:55:49.389174 2343 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.36:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.36:6443: connect: connection refused" interval="200ms" Sep 6 09:55:49.389337 kubelet[2343]: I0906 09:55:49.389311 2343 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 6 09:55:49.389494 kubelet[2343]: I0906 09:55:49.389470 2343 factory.go:223] Registration of the systemd container factory successfully Sep 6 09:55:49.389589 kubelet[2343]: I0906 09:55:49.389568 2343 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 6 09:55:49.390621 kubelet[2343]: E0906 09:55:49.388541 2343 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://10.0.0.36:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.36:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1862a8ef1203b8af default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-06 09:55:49.380798639 +0000 UTC m=+0.969575024,LastTimestamp:2025-09-06 09:55:49.380798639 +0000 UTC m=+0.969575024,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Sep 6 09:55:49.392818 kubelet[2343]: E0906 09:55:49.392766 2343 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 6 09:55:49.394413 kubelet[2343]: I0906 09:55:49.393369 2343 factory.go:223] Registration of the containerd container factory successfully Sep 6 09:55:49.407463 kubelet[2343]: I0906 09:55:49.407421 2343 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 6 09:55:49.407463 kubelet[2343]: I0906 09:55:49.407448 2343 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 6 09:55:49.407463 kubelet[2343]: I0906 09:55:49.407469 2343 state_mem.go:36] "Initialized new in-memory state store" Sep 6 09:55:49.409333 kubelet[2343]: I0906 09:55:49.409271 2343 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 6 09:55:49.411004 kubelet[2343]: I0906 09:55:49.410962 2343 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 6 09:55:49.411004 kubelet[2343]: I0906 09:55:49.410995 2343 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 6 09:55:49.411147 kubelet[2343]: I0906 09:55:49.411045 2343 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 6 09:55:49.411147 kubelet[2343]: I0906 09:55:49.411058 2343 kubelet.go:2436] "Starting kubelet main sync loop" Sep 6 09:55:49.411147 kubelet[2343]: E0906 09:55:49.411107 2343 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 6 09:55:49.411875 kubelet[2343]: E0906 09:55:49.411828 2343 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.36:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.36:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 6 09:55:49.412086 kubelet[2343]: I0906 09:55:49.412058 2343 policy_none.go:49] "None policy: Start" Sep 6 09:55:49.412118 kubelet[2343]: I0906 09:55:49.412110 2343 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 6 09:55:49.412140 kubelet[2343]: I0906 09:55:49.412128 2343 state_mem.go:35] "Initializing new in-memory state store" Sep 6 09:55:49.419659 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 6 09:55:49.432676 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 6 09:55:49.445286 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Sep 6 09:55:49.446727 kubelet[2343]: E0906 09:55:49.446646 2343 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 6 09:55:49.446959 kubelet[2343]: I0906 09:55:49.446928 2343 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 6 09:55:49.447009 kubelet[2343]: I0906 09:55:49.446950 2343 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 6 09:55:49.447208 kubelet[2343]: I0906 09:55:49.447173 2343 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 6 09:55:49.448218 kubelet[2343]: E0906 09:55:49.448183 2343 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 6 09:55:49.448218 kubelet[2343]: E0906 09:55:49.448229 2343 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 6 09:55:49.524385 systemd[1]: Created slice kubepods-burstable-podf45b6e74f7c3eb01f08c7fdfe36496a2.slice - libcontainer container kubepods-burstable-podf45b6e74f7c3eb01f08c7fdfe36496a2.slice. Sep 6 09:55:49.532281 kubelet[2343]: E0906 09:55:49.532221 2343 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 6 09:55:49.535268 systemd[1]: Created slice kubepods-burstable-pod8de7187202bee21b84740a213836f615.slice - libcontainer container kubepods-burstable-pod8de7187202bee21b84740a213836f615.slice. Sep 6 09:55:49.548187 kubelet[2343]: I0906 09:55:49.548159 2343 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 6 09:55:49.548689 kubelet[2343]: E0906 09:55:49.548648 2343 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.36:6443/api/v1/nodes\": dial tcp 10.0.0.36:6443: connect: connection refused" node="localhost" Sep 6 09:55:49.549849 kubelet[2343]: E0906 09:55:49.549815 2343 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 6 09:55:49.551515 systemd[1]: Created slice kubepods-burstable-podd75e6f6978d9f275ea19380916c9cccd.slice - libcontainer container kubepods-burstable-podd75e6f6978d9f275ea19380916c9cccd.slice. 
Sep 6 09:55:49.553511 kubelet[2343]: E0906 09:55:49.553474 2343 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 6 09:55:49.588823 kubelet[2343]: I0906 09:55:49.588721 2343 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d75e6f6978d9f275ea19380916c9cccd-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d75e6f6978d9f275ea19380916c9cccd\") " pod="kube-system/kube-scheduler-localhost" Sep 6 09:55:49.588823 kubelet[2343]: I0906 09:55:49.588756 2343 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 6 09:55:49.588823 kubelet[2343]: I0906 09:55:49.588781 2343 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 6 09:55:49.588823 kubelet[2343]: I0906 09:55:49.588799 2343 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f45b6e74f7c3eb01f08c7fdfe36496a2-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"f45b6e74f7c3eb01f08c7fdfe36496a2\") " pod="kube-system/kube-apiserver-localhost" Sep 6 09:55:49.588954 kubelet[2343]: I0906 09:55:49.588833 2343 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f45b6e74f7c3eb01f08c7fdfe36496a2-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"f45b6e74f7c3eb01f08c7fdfe36496a2\") " pod="kube-system/kube-apiserver-localhost" Sep 6 09:55:49.588954 kubelet[2343]: I0906 09:55:49.588856 2343 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f45b6e74f7c3eb01f08c7fdfe36496a2-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"f45b6e74f7c3eb01f08c7fdfe36496a2\") " pod="kube-system/kube-apiserver-localhost" Sep 6 09:55:49.588954 kubelet[2343]: I0906 09:55:49.588880 2343 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 6 09:55:49.588954 kubelet[2343]: I0906 09:55:49.588927 2343 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 6 09:55:49.588954 kubelet[2343]: I0906 09:55:49.588948 2343 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 6 09:55:49.590221 kubelet[2343]: E0906 09:55:49.590155 2343 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.36:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.36:6443: connect: connection refused" interval="400ms" Sep 6 09:55:49.750773 kubelet[2343]: I0906 09:55:49.750730 2343 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 6 09:55:49.751154 kubelet[2343]: E0906 09:55:49.751111 2343 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.36:6443/api/v1/nodes\": dial tcp 10.0.0.36:6443: connect: connection refused" node="localhost" Sep 6 09:55:49.833418 containerd[1557]: time="2025-09-06T09:55:49.833336708Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:f45b6e74f7c3eb01f08c7fdfe36496a2,Namespace:kube-system,Attempt:0,}" Sep 6 09:55:49.851556 containerd[1557]: time="2025-09-06T09:55:49.851047696Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:8de7187202bee21b84740a213836f615,Namespace:kube-system,Attempt:0,}" Sep 6 09:55:49.855039 containerd[1557]: time="2025-09-06T09:55:49.855004345Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d75e6f6978d9f275ea19380916c9cccd,Namespace:kube-system,Attempt:0,}" Sep 6 09:55:49.860600 containerd[1557]: time="2025-09-06T09:55:49.860560022Z" level=info msg="connecting to shim 4a6ca950dcf0e190babd557ca18b2e8aeb382175ea702f8f5f5fa6015e01e6c3" address="unix:///run/containerd/s/0c7812f869541ef3e84b226e097d402bbcdda691ce35e39699e9c85242b592f5" namespace=k8s.io protocol=ttrpc version=3 Sep 6 09:55:49.909524 containerd[1557]: time="2025-09-06T09:55:49.909461221Z" level=info msg="connecting to shim d782d7bc0665dc45b3204ef3f7acad24653567a8287f32bcd7fad8661541bf8a" address="unix:///run/containerd/s/05b849c883caf22c459f50fa54551d2bfc58cb0d84469bb96a59fbab939d009f" namespace=k8s.io protocol=ttrpc version=3 Sep 6 09:55:49.916203 containerd[1557]: time="2025-09-06T09:55:49.916135445Z" level=info msg="connecting to shim c4c485bae3621ca365bfeeb2b4f15cafc62418e6e80754d0ff006e35a361ee84" address="unix:///run/containerd/s/c342949ca6d7484df1e25f65702d47973c79f247e36fe640fcc6bbecd359fa9c" namespace=k8s.io protocol=ttrpc version=3 Sep 6 09:55:49.917555 systemd[1]: Started cri-containerd-4a6ca950dcf0e190babd557ca18b2e8aeb382175ea702f8f5f5fa6015e01e6c3.scope - libcontainer container 4a6ca950dcf0e190babd557ca18b2e8aeb382175ea702f8f5f5fa6015e01e6c3. Sep 6 09:55:49.956882 systemd[1]: Started cri-containerd-d782d7bc0665dc45b3204ef3f7acad24653567a8287f32bcd7fad8661541bf8a.scope - libcontainer container d782d7bc0665dc45b3204ef3f7acad24653567a8287f32bcd7fad8661541bf8a. Sep 6 09:55:49.962007 systemd[1]: Started cri-containerd-c4c485bae3621ca365bfeeb2b4f15cafc62418e6e80754d0ff006e35a361ee84.scope - libcontainer container c4c485bae3621ca365bfeeb2b4f15cafc62418e6e80754d0ff006e35a361ee84. 
Sep 6 09:55:49.980744 containerd[1557]: time="2025-09-06T09:55:49.980471005Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:f45b6e74f7c3eb01f08c7fdfe36496a2,Namespace:kube-system,Attempt:0,} returns sandbox id \"4a6ca950dcf0e190babd557ca18b2e8aeb382175ea702f8f5f5fa6015e01e6c3\"" Sep 6 09:55:49.992876 kubelet[2343]: E0906 09:55:49.992833 2343 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.36:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.36:6443: connect: connection refused" interval="800ms" Sep 6 09:55:49.993580 containerd[1557]: time="2025-09-06T09:55:49.993541081Z" level=info msg="CreateContainer within sandbox \"4a6ca950dcf0e190babd557ca18b2e8aeb382175ea702f8f5f5fa6015e01e6c3\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 6 09:55:50.005544 containerd[1557]: time="2025-09-06T09:55:50.005463776Z" level=info msg="Container 5c3bfb0c99044ab760acaf55c10505f163f99c0df6f23473273b226951e5d8b5: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:55:50.016008 containerd[1557]: time="2025-09-06T09:55:50.015937204Z" level=info msg="CreateContainer within sandbox \"4a6ca950dcf0e190babd557ca18b2e8aeb382175ea702f8f5f5fa6015e01e6c3\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"5c3bfb0c99044ab760acaf55c10505f163f99c0df6f23473273b226951e5d8b5\"" Sep 6 09:55:50.016837 containerd[1557]: time="2025-09-06T09:55:50.016775856Z" level=info msg="StartContainer for \"5c3bfb0c99044ab760acaf55c10505f163f99c0df6f23473273b226951e5d8b5\"" Sep 6 09:55:50.018686 containerd[1557]: time="2025-09-06T09:55:50.018654819Z" level=info msg="connecting to shim 5c3bfb0c99044ab760acaf55c10505f163f99c0df6f23473273b226951e5d8b5" address="unix:///run/containerd/s/0c7812f869541ef3e84b226e097d402bbcdda691ce35e39699e9c85242b592f5" protocol=ttrpc version=3 Sep 6 09:55:50.019576 containerd[1557]: time="2025-09-06T09:55:50.019518168Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d75e6f6978d9f275ea19380916c9cccd,Namespace:kube-system,Attempt:0,} returns sandbox id \"c4c485bae3621ca365bfeeb2b4f15cafc62418e6e80754d0ff006e35a361ee84\"" Sep 6 09:55:50.022132 containerd[1557]: time="2025-09-06T09:55:50.022097544Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:8de7187202bee21b84740a213836f615,Namespace:kube-system,Attempt:0,} returns sandbox id \"d782d7bc0665dc45b3204ef3f7acad24653567a8287f32bcd7fad8661541bf8a\"" Sep 6 09:55:50.027278 containerd[1557]: time="2025-09-06T09:55:50.027220831Z" level=info msg="CreateContainer within sandbox \"c4c485bae3621ca365bfeeb2b4f15cafc62418e6e80754d0ff006e35a361ee84\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 6 09:55:50.028419 containerd[1557]: time="2025-09-06T09:55:50.028367851Z" level=info msg="CreateContainer within sandbox \"d782d7bc0665dc45b3204ef3f7acad24653567a8287f32bcd7fad8661541bf8a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 6 09:55:50.043226 containerd[1557]: time="2025-09-06T09:55:50.043198408Z" level=info msg="Container b75fb1beae3c709267e969883a32df85f42a72d5ab9ea94dd1128e8651909768: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:55:50.045992 containerd[1557]: time="2025-09-06T09:55:50.045958704Z" level=info msg="Container 50b4fe64cc297bb229444dd80e1afbb559313d2edf472a4a3ff20a7d08fffcf3: CDI devices from CRI Config.CDIDevices: []" 
Sep 6 09:55:50.048572 systemd[1]: Started cri-containerd-5c3bfb0c99044ab760acaf55c10505f163f99c0df6f23473273b226951e5d8b5.scope - libcontainer container 5c3bfb0c99044ab760acaf55c10505f163f99c0df6f23473273b226951e5d8b5. Sep 6 09:55:50.051985 containerd[1557]: time="2025-09-06T09:55:50.051870859Z" level=info msg="CreateContainer within sandbox \"d782d7bc0665dc45b3204ef3f7acad24653567a8287f32bcd7fad8661541bf8a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b75fb1beae3c709267e969883a32df85f42a72d5ab9ea94dd1128e8651909768\"" Sep 6 09:55:50.052379 containerd[1557]: time="2025-09-06T09:55:50.052359185Z" level=info msg="StartContainer for \"b75fb1beae3c709267e969883a32df85f42a72d5ab9ea94dd1128e8651909768\"" Sep 6 09:55:50.053589 containerd[1557]: time="2025-09-06T09:55:50.053558043Z" level=info msg="connecting to shim b75fb1beae3c709267e969883a32df85f42a72d5ab9ea94dd1128e8651909768" address="unix:///run/containerd/s/05b849c883caf22c459f50fa54551d2bfc58cb0d84469bb96a59fbab939d009f" protocol=ttrpc version=3 Sep 6 09:55:50.054985 containerd[1557]: time="2025-09-06T09:55:50.054940134Z" level=info msg="CreateContainer within sandbox \"c4c485bae3621ca365bfeeb2b4f15cafc62418e6e80754d0ff006e35a361ee84\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"50b4fe64cc297bb229444dd80e1afbb559313d2edf472a4a3ff20a7d08fffcf3\"" Sep 6 09:55:50.055582 containerd[1557]: time="2025-09-06T09:55:50.055429402Z" level=info msg="StartContainer for \"50b4fe64cc297bb229444dd80e1afbb559313d2edf472a4a3ff20a7d08fffcf3\"" Sep 6 09:55:50.056583 containerd[1557]: time="2025-09-06T09:55:50.056563899Z" level=info msg="connecting to shim 50b4fe64cc297bb229444dd80e1afbb559313d2edf472a4a3ff20a7d08fffcf3" address="unix:///run/containerd/s/c342949ca6d7484df1e25f65702d47973c79f247e36fe640fcc6bbecd359fa9c" protocol=ttrpc version=3 Sep 6 09:55:50.075542 systemd[1]: Started cri-containerd-b75fb1beae3c709267e969883a32df85f42a72d5ab9ea94dd1128e8651909768.scope - libcontainer container b75fb1beae3c709267e969883a32df85f42a72d5ab9ea94dd1128e8651909768. Sep 6 09:55:50.088716 systemd[1]: Started cri-containerd-50b4fe64cc297bb229444dd80e1afbb559313d2edf472a4a3ff20a7d08fffcf3.scope - libcontainer container 50b4fe64cc297bb229444dd80e1afbb559313d2edf472a4a3ff20a7d08fffcf3. 
Sep 6 09:55:50.131735 containerd[1557]: time="2025-09-06T09:55:50.131577539Z" level=info msg="StartContainer for \"5c3bfb0c99044ab760acaf55c10505f163f99c0df6f23473273b226951e5d8b5\" returns successfully" Sep 6 09:55:50.152823 kubelet[2343]: I0906 09:55:50.152736 2343 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 6 09:55:50.155164 kubelet[2343]: E0906 09:55:50.155131 2343 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.36:6443/api/v1/nodes\": dial tcp 10.0.0.36:6443: connect: connection refused" node="localhost" Sep 6 09:55:50.173307 containerd[1557]: time="2025-09-06T09:55:50.173271586Z" level=info msg="StartContainer for \"50b4fe64cc297bb229444dd80e1afbb559313d2edf472a4a3ff20a7d08fffcf3\" returns successfully" Sep 6 09:55:50.178886 containerd[1557]: time="2025-09-06T09:55:50.178835839Z" level=info msg="StartContainer for \"b75fb1beae3c709267e969883a32df85f42a72d5ab9ea94dd1128e8651909768\" returns successfully" Sep 6 09:55:50.423620 kubelet[2343]: E0906 09:55:50.422770 2343 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 6 09:55:50.424186 kubelet[2343]: E0906 09:55:50.424010 2343 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 6 09:55:50.427642 kubelet[2343]: E0906 09:55:50.427619 2343 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 6 09:55:50.957533 kubelet[2343]: I0906 09:55:50.957487 2343 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 6 09:55:51.474979 kubelet[2343]: E0906 09:55:51.474929 2343 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 6 09:55:51.475505 kubelet[2343]: E0906 09:55:51.475483 2343 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 6 09:55:51.565778 kubelet[2343]: E0906 09:55:51.565720 2343 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 6 09:55:52.020826 kubelet[2343]: E0906 09:55:52.020777 2343 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 6 09:55:52.113285 kubelet[2343]: I0906 09:55:52.113232 2343 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 6 09:55:52.113285 kubelet[2343]: E0906 09:55:52.113277 2343 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Sep 6 09:55:52.198854 kubelet[2343]: I0906 09:55:52.198791 2343 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 6 09:55:52.206711 kubelet[2343]: E0906 09:55:52.206647 2343 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 6 09:55:52.206711 kubelet[2343]: I0906 09:55:52.206690 2343 kubelet.go:3309] "Creating a mirror pod for static pod" 
pod="kube-system/kube-controller-manager-localhost" Sep 6 09:55:52.209238 kubelet[2343]: E0906 09:55:52.209182 2343 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 6 09:55:52.209238 kubelet[2343]: I0906 09:55:52.209238 2343 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 6 09:55:52.211155 kubelet[2343]: E0906 09:55:52.211115 2343 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 6 09:55:52.471722 kubelet[2343]: I0906 09:55:52.471677 2343 apiserver.go:52] "Watching apiserver" Sep 6 09:55:52.488106 kubelet[2343]: I0906 09:55:52.488072 2343 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 6 09:55:53.855453 systemd[1]: Reload requested from client PID 2625 ('systemctl') (unit session-7.scope)... Sep 6 09:55:53.855469 systemd[1]: Reloading... Sep 6 09:55:53.928436 zram_generator::config[2668]: No configuration found. Sep 6 09:55:54.162564 systemd[1]: Reloading finished in 306 ms. Sep 6 09:55:54.192946 kubelet[2343]: I0906 09:55:54.192906 2343 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 6 09:55:54.193088 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 6 09:55:54.218028 systemd[1]: kubelet.service: Deactivated successfully. Sep 6 09:55:54.218411 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 6 09:55:54.218477 systemd[1]: kubelet.service: Consumed 1.022s CPU time, 130.5M memory peak. Sep 6 09:55:54.220648 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 6 09:55:54.448674 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 6 09:55:54.453008 (kubelet)[2713]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 6 09:55:54.492637 kubelet[2713]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 6 09:55:54.492637 kubelet[2713]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 6 09:55:54.492637 kubelet[2713]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 6 09:55:54.492637 kubelet[2713]: I0906 09:55:54.492565 2713 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 6 09:55:54.500807 kubelet[2713]: I0906 09:55:54.500767 2713 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 6 09:55:54.500807 kubelet[2713]: I0906 09:55:54.500795 2713 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 6 09:55:54.501022 kubelet[2713]: I0906 09:55:54.500998 2713 server.go:956] "Client rotation is on, will bootstrap in background" Sep 6 09:55:54.502209 kubelet[2713]: I0906 09:55:54.502183 2713 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Sep 6 09:55:54.504284 kubelet[2713]: I0906 09:55:54.504220 2713 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 6 09:55:54.508252 kubelet[2713]: I0906 09:55:54.508223 2713 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 6 09:55:54.514332 kubelet[2713]: I0906 09:55:54.514294 2713 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 6 09:55:54.514608 kubelet[2713]: I0906 09:55:54.514568 2713 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 6 09:55:54.514797 kubelet[2713]: I0906 09:55:54.514601 2713 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 6 09:55:54.514925 kubelet[2713]: I0906 09:55:54.514804 2713 topology_manager.go:138] "Creating topology manager with none policy" Sep 6 09:55:54.514925 kubelet[2713]: I0906 09:55:54.514815 2713 container_manager_linux.go:303] "Creating device plugin manager" Sep 6 09:55:54.514925 kubelet[2713]: I0906 09:55:54.514868 2713 state_mem.go:36] "Initialized new in-memory state store" Sep 6 09:55:54.515070 kubelet[2713]: I0906 09:55:54.515048 2713 
kubelet.go:480] "Attempting to sync node with API server" Sep 6 09:55:54.515070 kubelet[2713]: I0906 09:55:54.515066 2713 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 6 09:55:54.515141 kubelet[2713]: I0906 09:55:54.515097 2713 kubelet.go:386] "Adding apiserver pod source" Sep 6 09:55:54.515141 kubelet[2713]: I0906 09:55:54.515119 2713 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 6 09:55:54.517779 kubelet[2713]: I0906 09:55:54.517748 2713 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 6 09:55:54.518281 kubelet[2713]: I0906 09:55:54.518239 2713 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 6 09:55:54.524725 kubelet[2713]: I0906 09:55:54.524676 2713 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 6 09:55:54.524847 kubelet[2713]: I0906 09:55:54.524826 2713 server.go:1289] "Started kubelet" Sep 6 09:55:54.524940 kubelet[2713]: I0906 09:55:54.524914 2713 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 6 09:55:54.525468 kubelet[2713]: I0906 09:55:54.525424 2713 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 6 09:55:54.526240 kubelet[2713]: I0906 09:55:54.526141 2713 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 6 09:55:54.527425 kubelet[2713]: I0906 09:55:54.526326 2713 server.go:317] "Adding debug handlers to kubelet server" Sep 6 09:55:54.527425 kubelet[2713]: I0906 09:55:54.526820 2713 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 6 09:55:54.527520 kubelet[2713]: I0906 09:55:54.527447 2713 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 6 09:55:54.527973 kubelet[2713]: I0906 09:55:54.527956 2713 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 6 09:55:54.528201 kubelet[2713]: I0906 09:55:54.528180 2713 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 6 09:55:54.529422 kubelet[2713]: I0906 09:55:54.528285 2713 reconciler.go:26] "Reconciler: start to sync state" Sep 6 09:55:54.530312 kubelet[2713]: I0906 09:55:54.530277 2713 factory.go:223] Registration of the systemd container factory successfully Sep 6 09:55:54.530385 kubelet[2713]: I0906 09:55:54.530366 2713 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 6 09:55:54.532893 kubelet[2713]: E0906 09:55:54.532867 2713 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 6 09:55:54.533825 kubelet[2713]: I0906 09:55:54.533804 2713 factory.go:223] Registration of the containerd container factory successfully Sep 6 09:55:54.550576 kubelet[2713]: I0906 09:55:54.550530 2713 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 6 09:55:54.552656 kubelet[2713]: I0906 09:55:54.552634 2713 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Sep 6 09:55:54.552656 kubelet[2713]: I0906 09:55:54.552654 2713 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 6 09:55:54.552896 kubelet[2713]: I0906 09:55:54.552673 2713 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 6 09:55:54.552896 kubelet[2713]: I0906 09:55:54.552681 2713 kubelet.go:2436] "Starting kubelet main sync loop" Sep 6 09:55:54.552896 kubelet[2713]: E0906 09:55:54.552730 2713 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 6 09:55:54.569875 kubelet[2713]: I0906 09:55:54.569838 2713 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 6 09:55:54.569875 kubelet[2713]: I0906 09:55:54.569863 2713 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 6 09:55:54.569875 kubelet[2713]: I0906 09:55:54.569886 2713 state_mem.go:36] "Initialized new in-memory state store" Sep 6 09:55:54.570125 kubelet[2713]: I0906 09:55:54.570041 2713 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 6 09:55:54.570125 kubelet[2713]: I0906 09:55:54.570054 2713 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 6 09:55:54.570125 kubelet[2713]: I0906 09:55:54.570074 2713 policy_none.go:49] "None policy: Start" Sep 6 09:55:54.570125 kubelet[2713]: I0906 09:55:54.570088 2713 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 6 09:55:54.570125 kubelet[2713]: I0906 09:55:54.570103 2713 state_mem.go:35] "Initializing new in-memory state store" Sep 6 09:55:54.570291 kubelet[2713]: I0906 09:55:54.570217 2713 state_mem.go:75] "Updated machine memory state" Sep 6 09:55:54.574533 kubelet[2713]: E0906 09:55:54.574503 2713 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 6 09:55:54.575086 kubelet[2713]: I0906 09:55:54.574692 2713 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 6 09:55:54.575086 kubelet[2713]: I0906 09:55:54.574718 2713 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 6 09:55:54.575086 kubelet[2713]: I0906 09:55:54.574879 2713 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 6 09:55:54.576420 kubelet[2713]: E0906 09:55:54.576375 2713 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 6 09:55:54.653543 kubelet[2713]: I0906 09:55:54.653486 2713 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 6 09:55:54.653735 kubelet[2713]: I0906 09:55:54.653513 2713 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 6 09:55:54.653735 kubelet[2713]: I0906 09:55:54.653643 2713 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 6 09:55:54.678449 kubelet[2713]: I0906 09:55:54.678420 2713 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 6 09:55:54.683523 kubelet[2713]: I0906 09:55:54.683504 2713 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 6 09:55:54.683611 kubelet[2713]: I0906 09:55:54.683573 2713 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 6 09:55:54.829637 kubelet[2713]: I0906 09:55:54.829600 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 6 09:55:54.829637 kubelet[2713]: I0906 09:55:54.829630 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f45b6e74f7c3eb01f08c7fdfe36496a2-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"f45b6e74f7c3eb01f08c7fdfe36496a2\") " pod="kube-system/kube-apiserver-localhost" Sep 6 09:55:54.829793 kubelet[2713]: I0906 09:55:54.829647 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 6 09:55:54.829793 kubelet[2713]: I0906 09:55:54.829661 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 6 09:55:54.829793 kubelet[2713]: I0906 09:55:54.829680 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 6 09:55:54.829862 kubelet[2713]: I0906 09:55:54.829778 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 6 09:55:54.829862 kubelet[2713]: I0906 09:55:54.829821 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" 
(UniqueName: \"kubernetes.io/host-path/d75e6f6978d9f275ea19380916c9cccd-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d75e6f6978d9f275ea19380916c9cccd\") " pod="kube-system/kube-scheduler-localhost" Sep 6 09:55:54.829862 kubelet[2713]: I0906 09:55:54.829839 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f45b6e74f7c3eb01f08c7fdfe36496a2-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"f45b6e74f7c3eb01f08c7fdfe36496a2\") " pod="kube-system/kube-apiserver-localhost" Sep 6 09:55:54.829862 kubelet[2713]: I0906 09:55:54.829853 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f45b6e74f7c3eb01f08c7fdfe36496a2-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"f45b6e74f7c3eb01f08c7fdfe36496a2\") " pod="kube-system/kube-apiserver-localhost" Sep 6 09:55:55.516259 kubelet[2713]: I0906 09:55:55.516214 2713 apiserver.go:52] "Watching apiserver" Sep 6 09:55:55.529181 kubelet[2713]: I0906 09:55:55.529138 2713 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 6 09:55:55.552667 kubelet[2713]: I0906 09:55:55.552595 2713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.552561308 podStartE2EDuration="1.552561308s" podCreationTimestamp="2025-09-06 09:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-06 09:55:55.552105685 +0000 UTC m=+1.094858338" watchObservedRunningTime="2025-09-06 09:55:55.552561308 +0000 UTC m=+1.095313951" Sep 6 09:55:55.563940 kubelet[2713]: I0906 09:55:55.563898 2713 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 6 09:55:55.564255 kubelet[2713]: I0906 09:55:55.564204 2713 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 6 09:55:55.564681 kubelet[2713]: I0906 09:55:55.564625 2713 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 6 09:55:55.567005 kubelet[2713]: I0906 09:55:55.566873 2713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.566863589 podStartE2EDuration="1.566863589s" podCreationTimestamp="2025-09-06 09:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-06 09:55:55.558877093 +0000 UTC m=+1.101629746" watchObservedRunningTime="2025-09-06 09:55:55.566863589 +0000 UTC m=+1.109616242" Sep 6 09:55:55.572360 kubelet[2713]: E0906 09:55:55.572317 2713 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 6 09:55:55.572944 kubelet[2713]: E0906 09:55:55.572914 2713 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Sep 6 09:55:55.573077 kubelet[2713]: E0906 09:55:55.573042 2713 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 6 09:55:55.579127 
kubelet[2713]: I0906 09:55:55.578358 2713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.578339964 podStartE2EDuration="1.578339964s" podCreationTimestamp="2025-09-06 09:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-06 09:55:55.567029727 +0000 UTC m=+1.109782380" watchObservedRunningTime="2025-09-06 09:55:55.578339964 +0000 UTC m=+1.121092617" Sep 6 09:55:59.699276 kubelet[2713]: I0906 09:55:59.699163 2713 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 6 09:55:59.702734 containerd[1557]: time="2025-09-06T09:55:59.702670116Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 6 09:55:59.703128 kubelet[2713]: I0906 09:55:59.702989 2713 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 6 09:56:00.818082 systemd[1]: Created slice kubepods-besteffort-pod2321119c_8f00_4156_9f83_5673c780e709.slice - libcontainer container kubepods-besteffort-pod2321119c_8f00_4156_9f83_5673c780e709.slice. Sep 6 09:56:00.868315 kubelet[2713]: I0906 09:56:00.868134 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/2321119c-8f00-4156-9f83-5673c780e709-kube-proxy\") pod \"kube-proxy-8xmwn\" (UID: \"2321119c-8f00-4156-9f83-5673c780e709\") " pod="kube-system/kube-proxy-8xmwn" Sep 6 09:56:00.868315 kubelet[2713]: I0906 09:56:00.868311 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2321119c-8f00-4156-9f83-5673c780e709-xtables-lock\") pod \"kube-proxy-8xmwn\" (UID: \"2321119c-8f00-4156-9f83-5673c780e709\") " pod="kube-system/kube-proxy-8xmwn" Sep 6 09:56:00.869056 kubelet[2713]: I0906 09:56:00.868338 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2321119c-8f00-4156-9f83-5673c780e709-lib-modules\") pod \"kube-proxy-8xmwn\" (UID: \"2321119c-8f00-4156-9f83-5673c780e709\") " pod="kube-system/kube-proxy-8xmwn" Sep 6 09:56:00.869056 kubelet[2713]: I0906 09:56:00.868434 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjhdz\" (UniqueName: \"kubernetes.io/projected/2321119c-8f00-4156-9f83-5673c780e709-kube-api-access-sjhdz\") pod \"kube-proxy-8xmwn\" (UID: \"2321119c-8f00-4156-9f83-5673c780e709\") " pod="kube-system/kube-proxy-8xmwn" Sep 6 09:56:00.882887 systemd[1]: Created slice kubepods-besteffort-pod268e0815_b337_409e_ab53_bda599d9303f.slice - libcontainer container kubepods-besteffort-pod268e0815_b337_409e_ab53_bda599d9303f.slice. 
Sep 6 09:56:00.969179 kubelet[2713]: I0906 09:56:00.969107 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/268e0815-b337-409e-ab53-bda599d9303f-var-lib-calico\") pod \"tigera-operator-755d956888-886pt\" (UID: \"268e0815-b337-409e-ab53-bda599d9303f\") " pod="tigera-operator/tigera-operator-755d956888-886pt" Sep 6 09:56:00.969336 kubelet[2713]: I0906 09:56:00.969260 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxgrw\" (UniqueName: \"kubernetes.io/projected/268e0815-b337-409e-ab53-bda599d9303f-kube-api-access-cxgrw\") pod \"tigera-operator-755d956888-886pt\" (UID: \"268e0815-b337-409e-ab53-bda599d9303f\") " pod="tigera-operator/tigera-operator-755d956888-886pt" Sep 6 09:56:01.133417 containerd[1557]: time="2025-09-06T09:56:01.133237951Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8xmwn,Uid:2321119c-8f00-4156-9f83-5673c780e709,Namespace:kube-system,Attempt:0,}" Sep 6 09:56:01.156504 containerd[1557]: time="2025-09-06T09:56:01.156456310Z" level=info msg="connecting to shim 3fa5f3972f8d2ad5aa2932c4f330b195aa16cfcc466d31369e52bb4d0a97c202" address="unix:///run/containerd/s/0b4bf8213c51a9f126845fdcee2b6226f6e2b72d946fd940c124b834d2f27c3e" namespace=k8s.io protocol=ttrpc version=3 Sep 6 09:56:01.187918 containerd[1557]: time="2025-09-06T09:56:01.187868643Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-886pt,Uid:268e0815-b337-409e-ab53-bda599d9303f,Namespace:tigera-operator,Attempt:0,}" Sep 6 09:56:01.197600 systemd[1]: Started cri-containerd-3fa5f3972f8d2ad5aa2932c4f330b195aa16cfcc466d31369e52bb4d0a97c202.scope - libcontainer container 3fa5f3972f8d2ad5aa2932c4f330b195aa16cfcc466d31369e52bb4d0a97c202. Sep 6 09:56:01.213944 containerd[1557]: time="2025-09-06T09:56:01.213866473Z" level=info msg="connecting to shim 72c08b14ea2e9955b61711754139babdbc0729fb0681553387ad668f4d03989a" address="unix:///run/containerd/s/7e8b97be5a728800a3cd9f4d5108acebeaa5a8d1d5c7c607156e4d982a7bde7c" namespace=k8s.io protocol=ttrpc version=3 Sep 6 09:56:01.236234 containerd[1557]: time="2025-09-06T09:56:01.236183616Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8xmwn,Uid:2321119c-8f00-4156-9f83-5673c780e709,Namespace:kube-system,Attempt:0,} returns sandbox id \"3fa5f3972f8d2ad5aa2932c4f330b195aa16cfcc466d31369e52bb4d0a97c202\"" Sep 6 09:56:01.244959 containerd[1557]: time="2025-09-06T09:56:01.244433317Z" level=info msg="CreateContainer within sandbox \"3fa5f3972f8d2ad5aa2932c4f330b195aa16cfcc466d31369e52bb4d0a97c202\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 6 09:56:01.249548 systemd[1]: Started cri-containerd-72c08b14ea2e9955b61711754139babdbc0729fb0681553387ad668f4d03989a.scope - libcontainer container 72c08b14ea2e9955b61711754139babdbc0729fb0681553387ad668f4d03989a. 
Sep 6 09:56:01.262857 containerd[1557]: time="2025-09-06T09:56:01.262815031Z" level=info msg="Container f40e42e143deb0b9813515994533c235fd7b097ae788cbf9f8b28b720e105875: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:56:01.277427 containerd[1557]: time="2025-09-06T09:56:01.276009222Z" level=info msg="CreateContainer within sandbox \"3fa5f3972f8d2ad5aa2932c4f330b195aa16cfcc466d31369e52bb4d0a97c202\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f40e42e143deb0b9813515994533c235fd7b097ae788cbf9f8b28b720e105875\"" Sep 6 09:56:01.277427 containerd[1557]: time="2025-09-06T09:56:01.276795879Z" level=info msg="StartContainer for \"f40e42e143deb0b9813515994533c235fd7b097ae788cbf9f8b28b720e105875\"" Sep 6 09:56:01.279249 containerd[1557]: time="2025-09-06T09:56:01.279173234Z" level=info msg="connecting to shim f40e42e143deb0b9813515994533c235fd7b097ae788cbf9f8b28b720e105875" address="unix:///run/containerd/s/0b4bf8213c51a9f126845fdcee2b6226f6e2b72d946fd940c124b834d2f27c3e" protocol=ttrpc version=3 Sep 6 09:56:01.307048 systemd[1]: Started cri-containerd-f40e42e143deb0b9813515994533c235fd7b097ae788cbf9f8b28b720e105875.scope - libcontainer container f40e42e143deb0b9813515994533c235fd7b097ae788cbf9f8b28b720e105875. Sep 6 09:56:01.311991 containerd[1557]: time="2025-09-06T09:56:01.311941528Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-886pt,Uid:268e0815-b337-409e-ab53-bda599d9303f,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"72c08b14ea2e9955b61711754139babdbc0729fb0681553387ad668f4d03989a\"" Sep 6 09:56:01.313990 containerd[1557]: time="2025-09-06T09:56:01.313958447Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 6 09:56:01.358746 containerd[1557]: time="2025-09-06T09:56:01.358688286Z" level=info msg="StartContainer for \"f40e42e143deb0b9813515994533c235fd7b097ae788cbf9f8b28b720e105875\" returns successfully" Sep 6 09:56:01.624968 kubelet[2713]: I0906 09:56:01.624890 2713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-8xmwn" podStartSLOduration=1.624872135 podStartE2EDuration="1.624872135s" podCreationTimestamp="2025-09-06 09:56:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-06 09:56:01.624147276 +0000 UTC m=+7.166899929" watchObservedRunningTime="2025-09-06 09:56:01.624872135 +0000 UTC m=+7.167624788" Sep 6 09:56:02.560959 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3120842334.mount: Deactivated successfully. 
Sep 6 09:56:02.880679 containerd[1557]: time="2025-09-06T09:56:02.880521777Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:02.881405 containerd[1557]: time="2025-09-06T09:56:02.881347307Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 6 09:56:02.882657 containerd[1557]: time="2025-09-06T09:56:02.882629635Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:02.885518 containerd[1557]: time="2025-09-06T09:56:02.885479444Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:02.886124 containerd[1557]: time="2025-09-06T09:56:02.886090566Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 1.572093665s" Sep 6 09:56:02.886158 containerd[1557]: time="2025-09-06T09:56:02.886122818Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 6 09:56:02.891621 containerd[1557]: time="2025-09-06T09:56:02.891589271Z" level=info msg="CreateContainer within sandbox \"72c08b14ea2e9955b61711754139babdbc0729fb0681553387ad668f4d03989a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 6 09:56:02.898413 containerd[1557]: time="2025-09-06T09:56:02.898356790Z" level=info msg="Container 637e5e4984f36486fd37b9d570928ba99aeb3e71344c94d9cfcba01be8d8930a: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:56:02.905770 containerd[1557]: time="2025-09-06T09:56:02.905719458Z" level=info msg="CreateContainer within sandbox \"72c08b14ea2e9955b61711754139babdbc0729fb0681553387ad668f4d03989a\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"637e5e4984f36486fd37b9d570928ba99aeb3e71344c94d9cfcba01be8d8930a\"" Sep 6 09:56:02.906331 containerd[1557]: time="2025-09-06T09:56:02.906302447Z" level=info msg="StartContainer for \"637e5e4984f36486fd37b9d570928ba99aeb3e71344c94d9cfcba01be8d8930a\"" Sep 6 09:56:02.907247 containerd[1557]: time="2025-09-06T09:56:02.907213289Z" level=info msg="connecting to shim 637e5e4984f36486fd37b9d570928ba99aeb3e71344c94d9cfcba01be8d8930a" address="unix:///run/containerd/s/7e8b97be5a728800a3cd9f4d5108acebeaa5a8d1d5c7c607156e4d982a7bde7c" protocol=ttrpc version=3 Sep 6 09:56:02.971616 systemd[1]: Started cri-containerd-637e5e4984f36486fd37b9d570928ba99aeb3e71344c94d9cfcba01be8d8930a.scope - libcontainer container 637e5e4984f36486fd37b9d570928ba99aeb3e71344c94d9cfcba01be8d8930a. 
Sep 6 09:56:03.013615 containerd[1557]: time="2025-09-06T09:56:03.013552145Z" level=info msg="StartContainer for \"637e5e4984f36486fd37b9d570928ba99aeb3e71344c94d9cfcba01be8d8930a\" returns successfully"
Sep 6 09:56:04.371528 kubelet[2713]: I0906 09:56:04.371414 2713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-886pt" podStartSLOduration=2.79787958 podStartE2EDuration="4.371375463s" podCreationTimestamp="2025-09-06 09:56:00 +0000 UTC" firstStartedPulling="2025-09-06 09:56:01.313477632 +0000 UTC m=+6.856230285" lastFinishedPulling="2025-09-06 09:56:02.886973515 +0000 UTC m=+8.429726168" observedRunningTime="2025-09-06 09:56:03.600136124 +0000 UTC m=+9.142888797" watchObservedRunningTime="2025-09-06 09:56:04.371375463 +0000 UTC m=+9.914128117"
Sep 6 09:56:05.084685 systemd[1]: cri-containerd-637e5e4984f36486fd37b9d570928ba99aeb3e71344c94d9cfcba01be8d8930a.scope: Deactivated successfully.
Sep 6 09:56:05.092513 containerd[1557]: time="2025-09-06T09:56:05.092441200Z" level=info msg="received exit event container_id:\"637e5e4984f36486fd37b9d570928ba99aeb3e71344c94d9cfcba01be8d8930a\" id:\"637e5e4984f36486fd37b9d570928ba99aeb3e71344c94d9cfcba01be8d8930a\" pid:3043 exit_status:1 exited_at:{seconds:1757152565 nanos:88170429}"
Sep 6 09:56:05.092845 containerd[1557]: time="2025-09-06T09:56:05.092776085Z" level=info msg="TaskExit event in podsandbox handler container_id:\"637e5e4984f36486fd37b9d570928ba99aeb3e71344c94d9cfcba01be8d8930a\" id:\"637e5e4984f36486fd37b9d570928ba99aeb3e71344c94d9cfcba01be8d8930a\" pid:3043 exit_status:1 exited_at:{seconds:1757152565 nanos:88170429}"
Sep 6 09:56:05.117335 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-637e5e4984f36486fd37b9d570928ba99aeb3e71344c94d9cfcba01be8d8930a-rootfs.mount: Deactivated successfully.
Sep 6 09:56:05.595018 kubelet[2713]: I0906 09:56:05.594975 2713 scope.go:117] "RemoveContainer" containerID="637e5e4984f36486fd37b9d570928ba99aeb3e71344c94d9cfcba01be8d8930a"
Sep 6 09:56:05.597414 containerd[1557]: time="2025-09-06T09:56:05.596651574Z" level=info msg="CreateContainer within sandbox \"72c08b14ea2e9955b61711754139babdbc0729fb0681553387ad668f4d03989a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Sep 6 09:56:05.610664 containerd[1557]: time="2025-09-06T09:56:05.610637342Z" level=info msg="Container b96ac535c572bfff3187382b9419b9c49e0cf74ca9ca76e65a31563ff06c824f: CDI devices from CRI Config.CDIDevices: []"
Sep 6 09:56:05.614552 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount110637846.mount: Deactivated successfully.
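The operator container exits with status 1 about two seconds after starting, and the kubelet removes it and creates Attempt:1 in the same sandbox. If the new container kept failing, restarts would be throttled by crash-loop backoff; a minimal sketch of that doubling-with-cap pattern follows (the 10s initial delay and 5m cap are assumptions for illustration, not the kubelet's actual code):

package main

import (
	"fmt"
	"time"
)

// nextDelay doubles the previous restart delay up to a cap, the general shape of
// crash-loop backoff throttling. The constants are assumed here for illustration.
func nextDelay(prev time.Duration) time.Duration {
	const (
		initial = 10 * time.Second
		cap     = 5 * time.Minute
	)
	if prev <= 0 {
		return initial
	}
	if next := prev * 2; next < cap {
		return next
	}
	return cap
}

func main() {
	d := time.Duration(0)
	for i := 0; i < 7; i++ {
		d = nextDelay(d)
		fmt.Printf("restart %d delayed by %s\n", i+1, d)
	}
}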
Sep 6 09:56:05.620007 containerd[1557]: time="2025-09-06T09:56:05.619960346Z" level=info msg="CreateContainer within sandbox \"72c08b14ea2e9955b61711754139babdbc0729fb0681553387ad668f4d03989a\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"b96ac535c572bfff3187382b9419b9c49e0cf74ca9ca76e65a31563ff06c824f\""
Sep 6 09:56:05.620547 containerd[1557]: time="2025-09-06T09:56:05.620512964Z" level=info msg="StartContainer for \"b96ac535c572bfff3187382b9419b9c49e0cf74ca9ca76e65a31563ff06c824f\""
Sep 6 09:56:05.621514 containerd[1557]: time="2025-09-06T09:56:05.621481231Z" level=info msg="connecting to shim b96ac535c572bfff3187382b9419b9c49e0cf74ca9ca76e65a31563ff06c824f" address="unix:///run/containerd/s/7e8b97be5a728800a3cd9f4d5108acebeaa5a8d1d5c7c607156e4d982a7bde7c" protocol=ttrpc version=3
Sep 6 09:56:05.641535 systemd[1]: Started cri-containerd-b96ac535c572bfff3187382b9419b9c49e0cf74ca9ca76e65a31563ff06c824f.scope - libcontainer container b96ac535c572bfff3187382b9419b9c49e0cf74ca9ca76e65a31563ff06c824f.
Sep 6 09:56:05.680513 containerd[1557]: time="2025-09-06T09:56:05.680386280Z" level=info msg="StartContainer for \"b96ac535c572bfff3187382b9419b9c49e0cf74ca9ca76e65a31563ff06c824f\" returns successfully"
Sep 6 09:56:07.195976 update_engine[1546]: I20250906 09:56:07.195809 1546 update_attempter.cc:509] Updating boot flags...
Sep 6 09:56:08.430210 sudo[1768]: pam_unix(sudo:session): session closed for user root
Sep 6 09:56:08.432937 sshd[1767]: Connection closed by 10.0.0.1 port 35194
Sep 6 09:56:08.433702 sshd-session[1764]: pam_unix(sshd:session): session closed for user core
Sep 6 09:56:08.438102 systemd[1]: sshd@6-10.0.0.36:22-10.0.0.1:35194.service: Deactivated successfully.
Sep 6 09:56:08.440561 systemd[1]: session-7.scope: Deactivated successfully.
Sep 6 09:56:08.440807 systemd[1]: session-7.scope: Consumed 6.261s CPU time, 229M memory peak.
Sep 6 09:56:08.442252 systemd-logind[1541]: Session 7 logged out. Waiting for processes to exit.
Sep 6 09:56:08.444115 systemd-logind[1541]: Removed session 7.
Sep 6 09:56:11.766960 systemd[1]: Created slice kubepods-besteffort-pod32b43095_a8fb_4077_89de_7374525e946e.slice - libcontainer container kubepods-besteffort-pod32b43095_a8fb_4077_89de_7374525e946e.slice.
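The slice created above follows the kubelet's systemd cgroup naming: kubepods-<qos>-pod<uid>.slice, with dashes in the pod UID escaped to underscores (the UID itself appears in the volume-attach entries that follow). A small sketch reproducing the mapping seen here, not kubelet code:

package main

import (
	"fmt"
	"strings"
)

// sliceName mirrors the naming visible in the log: kubepods-<qos>-pod<uid>.slice,
// with "-" in the UID replaced by "_" for systemd.
func sliceName(qos, uid string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qos, strings.ReplaceAll(uid, "-", "_"))
}

func main() {
	// UID taken from the calico-typha pod entries below.
	fmt.Println(sliceName("besteffort", "32b43095-a8fb-4077-89de-7374525e946e"))
	// kubepods-besteffort-pod32b43095_a8fb_4077_89de_7374525e946e.slice
}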
Sep 6 09:56:11.831078 kubelet[2713]: I0906 09:56:11.831019 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/32b43095-a8fb-4077-89de-7374525e946e-typha-certs\") pod \"calico-typha-7f6bdd5c8-rvgnp\" (UID: \"32b43095-a8fb-4077-89de-7374525e946e\") " pod="calico-system/calico-typha-7f6bdd5c8-rvgnp" Sep 6 09:56:11.831078 kubelet[2713]: I0906 09:56:11.831063 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4rxd\" (UniqueName: \"kubernetes.io/projected/32b43095-a8fb-4077-89de-7374525e946e-kube-api-access-h4rxd\") pod \"calico-typha-7f6bdd5c8-rvgnp\" (UID: \"32b43095-a8fb-4077-89de-7374525e946e\") " pod="calico-system/calico-typha-7f6bdd5c8-rvgnp" Sep 6 09:56:11.831078 kubelet[2713]: I0906 09:56:11.831089 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32b43095-a8fb-4077-89de-7374525e946e-tigera-ca-bundle\") pod \"calico-typha-7f6bdd5c8-rvgnp\" (UID: \"32b43095-a8fb-4077-89de-7374525e946e\") " pod="calico-system/calico-typha-7f6bdd5c8-rvgnp" Sep 6 09:56:12.007115 systemd[1]: Created slice kubepods-besteffort-poda685fbca_1632_4f89_bbf6_b69a927c79df.slice - libcontainer container kubepods-besteffort-poda685fbca_1632_4f89_bbf6_b69a927c79df.slice. Sep 6 09:56:12.032271 kubelet[2713]: I0906 09:56:12.032130 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a685fbca-1632-4f89-bbf6-b69a927c79df-xtables-lock\") pod \"calico-node-bmmts\" (UID: \"a685fbca-1632-4f89-bbf6-b69a927c79df\") " pod="calico-system/calico-node-bmmts" Sep 6 09:56:12.032271 kubelet[2713]: I0906 09:56:12.032195 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a685fbca-1632-4f89-bbf6-b69a927c79df-lib-modules\") pod \"calico-node-bmmts\" (UID: \"a685fbca-1632-4f89-bbf6-b69a927c79df\") " pod="calico-system/calico-node-bmmts" Sep 6 09:56:12.032271 kubelet[2713]: I0906 09:56:12.032212 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a685fbca-1632-4f89-bbf6-b69a927c79df-tigera-ca-bundle\") pod \"calico-node-bmmts\" (UID: \"a685fbca-1632-4f89-bbf6-b69a927c79df\") " pod="calico-system/calico-node-bmmts" Sep 6 09:56:12.032271 kubelet[2713]: I0906 09:56:12.032230 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a685fbca-1632-4f89-bbf6-b69a927c79df-cni-log-dir\") pod \"calico-node-bmmts\" (UID: \"a685fbca-1632-4f89-bbf6-b69a927c79df\") " pod="calico-system/calico-node-bmmts" Sep 6 09:56:12.032568 kubelet[2713]: I0906 09:56:12.032245 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a685fbca-1632-4f89-bbf6-b69a927c79df-node-certs\") pod \"calico-node-bmmts\" (UID: \"a685fbca-1632-4f89-bbf6-b69a927c79df\") " pod="calico-system/calico-node-bmmts" Sep 6 09:56:12.032605 kubelet[2713]: I0906 09:56:12.032575 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: 
\"kubernetes.io/host-path/a685fbca-1632-4f89-bbf6-b69a927c79df-var-lib-calico\") pod \"calico-node-bmmts\" (UID: \"a685fbca-1632-4f89-bbf6-b69a927c79df\") " pod="calico-system/calico-node-bmmts" Sep 6 09:56:12.032605 kubelet[2713]: I0906 09:56:12.032590 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dq64\" (UniqueName: \"kubernetes.io/projected/a685fbca-1632-4f89-bbf6-b69a927c79df-kube-api-access-6dq64\") pod \"calico-node-bmmts\" (UID: \"a685fbca-1632-4f89-bbf6-b69a927c79df\") " pod="calico-system/calico-node-bmmts" Sep 6 09:56:12.032687 kubelet[2713]: I0906 09:56:12.032644 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a685fbca-1632-4f89-bbf6-b69a927c79df-cni-bin-dir\") pod \"calico-node-bmmts\" (UID: \"a685fbca-1632-4f89-bbf6-b69a927c79df\") " pod="calico-system/calico-node-bmmts" Sep 6 09:56:12.032735 kubelet[2713]: I0906 09:56:12.032712 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a685fbca-1632-4f89-bbf6-b69a927c79df-var-run-calico\") pod \"calico-node-bmmts\" (UID: \"a685fbca-1632-4f89-bbf6-b69a927c79df\") " pod="calico-system/calico-node-bmmts" Sep 6 09:56:12.032772 kubelet[2713]: I0906 09:56:12.032750 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a685fbca-1632-4f89-bbf6-b69a927c79df-policysync\") pod \"calico-node-bmmts\" (UID: \"a685fbca-1632-4f89-bbf6-b69a927c79df\") " pod="calico-system/calico-node-bmmts" Sep 6 09:56:12.032772 kubelet[2713]: I0906 09:56:12.032767 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a685fbca-1632-4f89-bbf6-b69a927c79df-cni-net-dir\") pod \"calico-node-bmmts\" (UID: \"a685fbca-1632-4f89-bbf6-b69a927c79df\") " pod="calico-system/calico-node-bmmts" Sep 6 09:56:12.032828 kubelet[2713]: I0906 09:56:12.032789 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a685fbca-1632-4f89-bbf6-b69a927c79df-flexvol-driver-host\") pod \"calico-node-bmmts\" (UID: \"a685fbca-1632-4f89-bbf6-b69a927c79df\") " pod="calico-system/calico-node-bmmts" Sep 6 09:56:12.072276 containerd[1557]: time="2025-09-06T09:56:12.072225951Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7f6bdd5c8-rvgnp,Uid:32b43095-a8fb-4077-89de-7374525e946e,Namespace:calico-system,Attempt:0,}" Sep 6 09:56:12.109775 containerd[1557]: time="2025-09-06T09:56:12.109703871Z" level=info msg="connecting to shim d6702b7b1008714577dec779ee9ee4dd0b40507dbd32c1a79f7a951050e53c94" address="unix:///run/containerd/s/9b3820006aec8107da1afb7f6ce337768b5cc0792de51ea6148ed2c1bced2a07" namespace=k8s.io protocol=ttrpc version=3 Sep 6 09:56:12.134778 kubelet[2713]: E0906 09:56:12.134628 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.134778 kubelet[2713]: W0906 09:56:12.134658 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.134778 
kubelet[2713]: E0906 09:56:12.134681 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.136412 kubelet[2713]: E0906 09:56:12.135002 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.136412 kubelet[2713]: W0906 09:56:12.135026 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.136412 kubelet[2713]: E0906 09:56:12.135045 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.136412 kubelet[2713]: E0906 09:56:12.135243 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.136412 kubelet[2713]: W0906 09:56:12.135250 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.136412 kubelet[2713]: E0906 09:56:12.135258 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.136654 kubelet[2713]: E0906 09:56:12.136626 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.136689 kubelet[2713]: W0906 09:56:12.136665 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.136689 kubelet[2713]: E0906 09:56:12.136675 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.136922 kubelet[2713]: E0906 09:56:12.136887 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.136922 kubelet[2713]: W0906 09:56:12.136910 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.136922 kubelet[2713]: E0906 09:56:12.136919 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.137133 kubelet[2713]: E0906 09:56:12.137114 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.137133 kubelet[2713]: W0906 09:56:12.137129 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.137189 kubelet[2713]: E0906 09:56:12.137138 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 09:56:12.137378 kubelet[2713]: E0906 09:56:12.137356 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.137378 kubelet[2713]: W0906 09:56:12.137369 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.137378 kubelet[2713]: E0906 09:56:12.137378 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.137628 kubelet[2713]: E0906 09:56:12.137608 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.137628 kubelet[2713]: W0906 09:56:12.137621 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.137689 kubelet[2713]: E0906 09:56:12.137650 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.137910 kubelet[2713]: E0906 09:56:12.137868 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.137910 kubelet[2713]: W0906 09:56:12.137903 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.137910 kubelet[2713]: E0906 09:56:12.137912 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.139459 kubelet[2713]: E0906 09:56:12.138521 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.139459 kubelet[2713]: W0906 09:56:12.138536 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.139459 kubelet[2713]: E0906 09:56:12.138545 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.139459 kubelet[2713]: E0906 09:56:12.138750 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.139459 kubelet[2713]: W0906 09:56:12.138757 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.139459 kubelet[2713]: E0906 09:56:12.138766 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 09:56:12.139459 kubelet[2713]: E0906 09:56:12.138949 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.139459 kubelet[2713]: W0906 09:56:12.138957 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.139459 kubelet[2713]: E0906 09:56:12.138965 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.139459 kubelet[2713]: E0906 09:56:12.139121 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.139761 kubelet[2713]: W0906 09:56:12.139128 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.139761 kubelet[2713]: E0906 09:56:12.139135 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.139761 kubelet[2713]: E0906 09:56:12.139313 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.139761 kubelet[2713]: W0906 09:56:12.139320 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.139761 kubelet[2713]: E0906 09:56:12.139328 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.139761 kubelet[2713]: E0906 09:56:12.139510 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.139761 kubelet[2713]: W0906 09:56:12.139517 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.139761 kubelet[2713]: E0906 09:56:12.139525 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.139761 kubelet[2713]: E0906 09:56:12.139664 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.139761 kubelet[2713]: W0906 09:56:12.139671 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.139549 systemd[1]: Started cri-containerd-d6702b7b1008714577dec779ee9ee4dd0b40507dbd32c1a79f7a951050e53c94.scope - libcontainer container d6702b7b1008714577dec779ee9ee4dd0b40507dbd32c1a79f7a951050e53c94. 
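The repeated driver-call.go / plugins.go messages above (which continue below) are the kubelet's FlexVolume prober invoking /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument init and finding no executable there yet; the empty stdout is what produces "unexpected end of JSON input". That binary is typically installed by calico-node's flexvol-driver init container (note the flexvol-driver-host host-path volume mounted above), so the noise should stop once calico-node-bmmts is running. A minimal sketch of the init handshake the prober expects, an illustrative stand-in rather than Calico's actual uds binary:

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the JSON shape a FlexVolume driver is expected to print.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) > 1 && os.Args[1] == "init" {
		// A non-empty JSON answer here is what the kubelet's prober is waiting for;
		// the empty output seen in the log is why it reports "unexpected end of JSON input".
		out, _ := json.Marshal(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
		fmt.Println(string(out))
		return
	}
	// This sketch only answers init; every other call is declared unsupported.
	out, _ := json.Marshal(driverStatus{Status: "Not supported"})
	fmt.Println(string(out))
}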
Sep 6 09:56:12.140076 kubelet[2713]: E0906 09:56:12.139679 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.140076 kubelet[2713]: E0906 09:56:12.139829 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.140076 kubelet[2713]: W0906 09:56:12.139837 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.140076 kubelet[2713]: E0906 09:56:12.139844 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.140076 kubelet[2713]: E0906 09:56:12.139974 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.140076 kubelet[2713]: W0906 09:56:12.139980 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.140076 kubelet[2713]: E0906 09:56:12.139988 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.140287 kubelet[2713]: E0906 09:56:12.140143 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.140287 kubelet[2713]: W0906 09:56:12.140151 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.140287 kubelet[2713]: E0906 09:56:12.140158 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.142763 kubelet[2713]: E0906 09:56:12.140430 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.142763 kubelet[2713]: W0906 09:56:12.140446 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.142763 kubelet[2713]: E0906 09:56:12.140455 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 09:56:12.142763 kubelet[2713]: E0906 09:56:12.140862 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.142763 kubelet[2713]: W0906 09:56:12.140871 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.142763 kubelet[2713]: E0906 09:56:12.140880 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.142763 kubelet[2713]: E0906 09:56:12.141115 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.142763 kubelet[2713]: W0906 09:56:12.141124 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.142763 kubelet[2713]: E0906 09:56:12.141133 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.142763 kubelet[2713]: E0906 09:56:12.141306 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.143014 kubelet[2713]: W0906 09:56:12.141314 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.143014 kubelet[2713]: E0906 09:56:12.141324 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.143014 kubelet[2713]: E0906 09:56:12.141913 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.143014 kubelet[2713]: W0906 09:56:12.141922 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.143014 kubelet[2713]: E0906 09:56:12.141930 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.143014 kubelet[2713]: E0906 09:56:12.142223 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.143014 kubelet[2713]: W0906 09:56:12.142377 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.143014 kubelet[2713]: E0906 09:56:12.142413 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 09:56:12.143014 kubelet[2713]: E0906 09:56:12.142829 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.143014 kubelet[2713]: W0906 09:56:12.142847 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.143239 kubelet[2713]: E0906 09:56:12.142856 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.143306 kubelet[2713]: E0906 09:56:12.143281 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.143306 kubelet[2713]: W0906 09:56:12.143298 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.143306 kubelet[2713]: E0906 09:56:12.143307 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.143531 kubelet[2713]: E0906 09:56:12.143510 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.143531 kubelet[2713]: W0906 09:56:12.143524 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.143595 kubelet[2713]: E0906 09:56:12.143534 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.143708 kubelet[2713]: E0906 09:56:12.143686 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.143755 kubelet[2713]: W0906 09:56:12.143732 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.143755 kubelet[2713]: E0906 09:56:12.143753 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.144245 kubelet[2713]: E0906 09:56:12.144119 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.144818 kubelet[2713]: W0906 09:56:12.144668 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.144818 kubelet[2713]: E0906 09:56:12.144686 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 09:56:12.146686 kubelet[2713]: E0906 09:56:12.145362 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.146686 kubelet[2713]: W0906 09:56:12.145377 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.146686 kubelet[2713]: E0906 09:56:12.145386 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.146686 kubelet[2713]: E0906 09:56:12.145821 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.146686 kubelet[2713]: W0906 09:56:12.145829 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.146686 kubelet[2713]: E0906 09:56:12.145839 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.146686 kubelet[2713]: E0906 09:56:12.146111 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.146686 kubelet[2713]: W0906 09:56:12.146119 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.146686 kubelet[2713]: E0906 09:56:12.146128 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.146686 kubelet[2713]: E0906 09:56:12.146494 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.146934 kubelet[2713]: W0906 09:56:12.146503 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.146934 kubelet[2713]: E0906 09:56:12.146513 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.148861 kubelet[2713]: E0906 09:56:12.148839 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.148861 kubelet[2713]: W0906 09:56:12.148853 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.148861 kubelet[2713]: E0906 09:56:12.148861 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 09:56:12.197974 kubelet[2713]: E0906 09:56:12.197899 2713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-l8pm4" podUID="63b0ea24-5c9f-4c70-aaef-43e73d0fb19a" Sep 6 09:56:12.198609 containerd[1557]: time="2025-09-06T09:56:12.198566321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7f6bdd5c8-rvgnp,Uid:32b43095-a8fb-4077-89de-7374525e946e,Namespace:calico-system,Attempt:0,} returns sandbox id \"d6702b7b1008714577dec779ee9ee4dd0b40507dbd32c1a79f7a951050e53c94\"" Sep 6 09:56:12.202207 containerd[1557]: time="2025-09-06T09:56:12.202140802Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 6 09:56:12.223867 kubelet[2713]: E0906 09:56:12.223827 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.223867 kubelet[2713]: W0906 09:56:12.223854 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.223867 kubelet[2713]: E0906 09:56:12.223876 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.224081 kubelet[2713]: E0906 09:56:12.224065 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.224081 kubelet[2713]: W0906 09:56:12.224077 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.224132 kubelet[2713]: E0906 09:56:12.224085 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.224283 kubelet[2713]: E0906 09:56:12.224265 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.224283 kubelet[2713]: W0906 09:56:12.224277 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.224333 kubelet[2713]: E0906 09:56:12.224285 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 09:56:12.224553 kubelet[2713]: E0906 09:56:12.224535 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.224553 kubelet[2713]: W0906 09:56:12.224547 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.224553 kubelet[2713]: E0906 09:56:12.224554 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.224808 kubelet[2713]: E0906 09:56:12.224782 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.224808 kubelet[2713]: W0906 09:56:12.224794 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.224808 kubelet[2713]: E0906 09:56:12.224802 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.224980 kubelet[2713]: E0906 09:56:12.224965 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.224980 kubelet[2713]: W0906 09:56:12.224975 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.225027 kubelet[2713]: E0906 09:56:12.224983 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.225153 kubelet[2713]: E0906 09:56:12.225137 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.225153 kubelet[2713]: W0906 09:56:12.225148 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.225217 kubelet[2713]: E0906 09:56:12.225156 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.225308 kubelet[2713]: E0906 09:56:12.225294 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.225308 kubelet[2713]: W0906 09:56:12.225304 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.225354 kubelet[2713]: E0906 09:56:12.225313 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 09:56:12.225500 kubelet[2713]: E0906 09:56:12.225475 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.225500 kubelet[2713]: W0906 09:56:12.225486 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.225500 kubelet[2713]: E0906 09:56:12.225493 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.225648 kubelet[2713]: E0906 09:56:12.225633 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.225648 kubelet[2713]: W0906 09:56:12.225643 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.225700 kubelet[2713]: E0906 09:56:12.225650 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.225811 kubelet[2713]: E0906 09:56:12.225795 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.225811 kubelet[2713]: W0906 09:56:12.225806 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.225884 kubelet[2713]: E0906 09:56:12.225813 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.225969 kubelet[2713]: E0906 09:56:12.225953 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.225969 kubelet[2713]: W0906 09:56:12.225963 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.225969 kubelet[2713]: E0906 09:56:12.225970 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.226144 kubelet[2713]: E0906 09:56:12.226129 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.226144 kubelet[2713]: W0906 09:56:12.226139 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.226197 kubelet[2713]: E0906 09:56:12.226147 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 09:56:12.226299 kubelet[2713]: E0906 09:56:12.226285 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.226299 kubelet[2713]: W0906 09:56:12.226296 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.226342 kubelet[2713]: E0906 09:56:12.226303 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.226487 kubelet[2713]: E0906 09:56:12.226472 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.226487 kubelet[2713]: W0906 09:56:12.226482 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.226541 kubelet[2713]: E0906 09:56:12.226490 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.226654 kubelet[2713]: E0906 09:56:12.226639 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.226654 kubelet[2713]: W0906 09:56:12.226649 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.226696 kubelet[2713]: E0906 09:56:12.226656 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.226853 kubelet[2713]: E0906 09:56:12.226837 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.226853 kubelet[2713]: W0906 09:56:12.226847 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.226853 kubelet[2713]: E0906 09:56:12.226854 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.227007 kubelet[2713]: E0906 09:56:12.226992 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.227007 kubelet[2713]: W0906 09:56:12.227002 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.227056 kubelet[2713]: E0906 09:56:12.227009 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 09:56:12.227173 kubelet[2713]: E0906 09:56:12.227158 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.227173 kubelet[2713]: W0906 09:56:12.227168 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.227221 kubelet[2713]: E0906 09:56:12.227177 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.227339 kubelet[2713]: E0906 09:56:12.227325 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.227339 kubelet[2713]: W0906 09:56:12.227335 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.227412 kubelet[2713]: E0906 09:56:12.227342 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.241698 kubelet[2713]: E0906 09:56:12.241675 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.241698 kubelet[2713]: W0906 09:56:12.241691 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.241698 kubelet[2713]: E0906 09:56:12.241704 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:12.241889 kubelet[2713]: I0906 09:56:12.241768 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/63b0ea24-5c9f-4c70-aaef-43e73d0fb19a-socket-dir\") pod \"csi-node-driver-l8pm4\" (UID: \"63b0ea24-5c9f-4c70-aaef-43e73d0fb19a\") " pod="calico-system/csi-node-driver-l8pm4" Sep 6 09:56:12.242041 kubelet[2713]: E0906 09:56:12.242023 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:12.242041 kubelet[2713]: W0906 09:56:12.242036 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:12.242091 kubelet[2713]: E0906 09:56:12.242068 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 6 09:56:12.242120 kubelet[2713]: I0906 09:56:12.242093 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/63b0ea24-5c9f-4c70-aaef-43e73d0fb19a-varrun\") pod \"csi-node-driver-l8pm4\" (UID: \"63b0ea24-5c9f-4c70-aaef-43e73d0fb19a\") " pod="calico-system/csi-node-driver-l8pm4"
Sep 6 09:56:12.242380 kubelet[2713]: E0906 09:56:12.242358 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 09:56:12.242430 kubelet[2713]: W0906 09:56:12.242383 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 09:56:12.242430 kubelet[2713]: E0906 09:56:12.242419 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the three kubelet messages above (driver-call.go:262, driver-call.go:149, plugins.go:703) repeat continuously from 09:56:12.242 to 09:56:12.360]
Sep 6 09:56:12.242480 kubelet[2713]: I0906 09:56:12.242442 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkxf8\" (UniqueName: \"kubernetes.io/projected/63b0ea24-5c9f-4c70-aaef-43e73d0fb19a-kube-api-access-tkxf8\") pod \"csi-node-driver-l8pm4\" (UID: \"63b0ea24-5c9f-4c70-aaef-43e73d0fb19a\") " pod="calico-system/csi-node-driver-l8pm4"
Sep 6 09:56:12.244221 kubelet[2713]: I0906 09:56:12.244164 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63b0ea24-5c9f-4c70-aaef-43e73d0fb19a-kubelet-dir\") pod \"csi-node-driver-l8pm4\" (UID: \"63b0ea24-5c9f-4c70-aaef-43e73d0fb19a\") " pod="calico-system/csi-node-driver-l8pm4"
Sep 6 09:56:12.244441 kubelet[2713]: I0906 09:56:12.244409 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/63b0ea24-5c9f-4c70-aaef-43e73d0fb19a-registration-dir\") pod \"csi-node-driver-l8pm4\" (UID: \"63b0ea24-5c9f-4c70-aaef-43e73d0fb19a\") " pod="calico-system/csi-node-driver-l8pm4"
Sep 6 09:56:12.310910 containerd[1557]: time="2025-09-06T09:56:12.310778634Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bmmts,Uid:a685fbca-1632-4f89-bbf6-b69a927c79df,Namespace:calico-system,Attempt:0,}"
Sep 6 09:56:12.333523 containerd[1557]: time="2025-09-06T09:56:12.333384934Z" level=info msg="connecting to shim 95365e07736046998a22d68e3ff81193e4b90aeeb5caebb289e7d494fa0a66ee" address="unix:///run/containerd/s/7658504cd4c1f33c743b678ade211adf3dd79a766c4f54f2c30b4fe2a960de31" namespace=k8s.io protocol=ttrpc version=3
Sep 6 09:56:12.359524 systemd[1]: Started cri-containerd-95365e07736046998a22d68e3ff81193e4b90aeeb5caebb289e7d494fa0a66ee.scope - libcontainer container 95365e07736046998a22d68e3ff81193e4b90aeeb5caebb289e7d494fa0a66ee.
Sep 6 09:56:12.390562 containerd[1557]: time="2025-09-06T09:56:12.390514113Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bmmts,Uid:a685fbca-1632-4f89-bbf6-b69a927c79df,Namespace:calico-system,Attempt:0,} returns sandbox id \"95365e07736046998a22d68e3ff81193e4b90aeeb5caebb289e7d494fa0a66ee\""
Sep 6 09:56:13.489629 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2805745598.mount: Deactivated successfully.
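The repeated driver-call.go and plugins.go messages above are the kubelet probing the FlexVolume plugin directory nodeagent~uds while its uds executable is not yet present (Calico's flexvol-driver container only starts at 09:56:16 further down): the probe cannot run the binary, captures empty output, and then fails to decode that empty output as JSON. The two error strings can be reproduced with a short, self-contained Go sketch; it is illustrative only (not kubelet code), and only the driver name and path are taken from the log:

```go
// Illustrative sketch of the two error strings in the kubelet messages above.
// Assumes "uds" is not on $PATH; it only demonstrates the error wording and
// does not touch /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/.
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

func main() {
	// A driver binary that cannot be found yields os/exec's ErrNotFound, whose
	// text is "executable file not found in $PATH" (driver-call.go:149).
	if _, err := exec.LookPath("uds"); err != nil {
		fmt.Println("FlexVolume: driver call failed:", err)
	}

	// With no driver to run, the captured output is empty, and decoding it as
	// JSON fails with "unexpected end of JSON input" (driver-call.go:262).
	var status map[string]interface{}
	if err := json.Unmarshal([]byte(""), &status); err != nil {
		fmt.Println("Failed to unmarshal output for command: init, error:", err)
	}
}
```

Both printed errors carry the same wording as the log lines, which is why every probe attempt produces the same triplet of messages.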
Sep 6 09:56:14.555419 kubelet[2713]: E0906 09:56:14.554219 2713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-l8pm4" podUID="63b0ea24-5c9f-4c70-aaef-43e73d0fb19a"
Sep 6 09:56:15.155677 containerd[1557]: time="2025-09-06T09:56:15.155616867Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 09:56:15.156516 containerd[1557]: time="2025-09-06T09:56:15.156492319Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 6 09:56:15.157816 containerd[1557]: time="2025-09-06T09:56:15.157785397Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 09:56:15.159732 containerd[1557]: time="2025-09-06T09:56:15.159700710Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 09:56:15.160429 containerd[1557]: time="2025-09-06T09:56:15.160236791Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.958062866s"
Sep 6 09:56:15.160429 containerd[1557]: time="2025-09-06T09:56:15.160277098Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 6 09:56:15.161288 containerd[1557]: time="2025-09-06T09:56:15.161245545Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 6 09:56:15.173892 containerd[1557]: time="2025-09-06T09:56:15.173844031Z" level=info msg="CreateContainer within sandbox \"d6702b7b1008714577dec779ee9ee4dd0b40507dbd32c1a79f7a951050e53c94\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 6 09:56:15.181953 containerd[1557]: time="2025-09-06T09:56:15.181907151Z" level=info msg="Container d1484e4364d6c7437523e2fc84f4ad716155c6de3430f1676897c7ae46c9ffab: CDI devices from CRI Config.CDIDevices: []"
Sep 6 09:56:15.190595 containerd[1557]: time="2025-09-06T09:56:15.190554243Z" level=info msg="CreateContainer within sandbox \"d6702b7b1008714577dec779ee9ee4dd0b40507dbd32c1a79f7a951050e53c94\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d1484e4364d6c7437523e2fc84f4ad716155c6de3430f1676897c7ae46c9ffab\""
Sep 6 09:56:15.191121 containerd[1557]: time="2025-09-06T09:56:15.191099542Z" level=info msg="StartContainer for \"d1484e4364d6c7437523e2fc84f4ad716155c6de3430f1676897c7ae46c9ffab\""
Sep 6 09:56:15.192209 containerd[1557]: time="2025-09-06T09:56:15.192165463Z" level=info msg="connecting to shim d1484e4364d6c7437523e2fc84f4ad716155c6de3430f1676897c7ae46c9ffab" address="unix:///run/containerd/s/9b3820006aec8107da1afb7f6ce337768b5cc0792de51ea6148ed2c1bced2a07" protocol=ttrpc version=3
Sep 6 09:56:15.218549 systemd[1]: Started cri-containerd-d1484e4364d6c7437523e2fc84f4ad716155c6de3430f1676897c7ae46c9ffab.scope - libcontainer container d1484e4364d6c7437523e2fc84f4ad716155c6de3430f1676897c7ae46c9ffab.
Sep 6 09:56:15.272052 containerd[1557]: time="2025-09-06T09:56:15.271997930Z" level=info msg="StartContainer for \"d1484e4364d6c7437523e2fc84f4ad716155c6de3430f1676897c7ae46c9ffab\" returns successfully"
Sep 6 09:56:15.642279 kubelet[2713]: I0906 09:56:15.642209 2713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7f6bdd5c8-rvgnp" podStartSLOduration=1.681924713 podStartE2EDuration="4.642194847s" podCreationTimestamp="2025-09-06 09:56:11 +0000 UTC" firstStartedPulling="2025-09-06 09:56:12.200819887 +0000 UTC m=+17.743572540" lastFinishedPulling="2025-09-06 09:56:15.161090021 +0000 UTC m=+20.703842674" observedRunningTime="2025-09-06 09:56:15.632053947 +0000 UTC m=+21.174806600" watchObservedRunningTime="2025-09-06 09:56:15.642194847 +0000 UTC m=+21.184947500"
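The "Observed pod startup duration" entry above is internally consistent: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration equals that E2E figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling), which also lines up with the 2.958062866s pull reported by containerd. A small Go sketch re-derives the numbers; the timestamps are copied from the entry, and the layout string is an assumption matching Go's default time formatting:

```go
// Re-derives the durations in the pod_startup_latency_tracker entry above
// from the timestamps that the entry itself carries.
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST" // assumed: Go's default time.Time format

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-09-06 09:56:11 +0000 UTC")             // podCreationTimestamp
	firstPull := mustParse("2025-09-06 09:56:12.200819887 +0000 UTC") // firstStartedPulling
	lastPull := mustParse("2025-09-06 09:56:15.161090021 +0000 UTC")  // lastFinishedPulling
	running := mustParse("2025-09-06 09:56:15.642194847 +0000 UTC")   // watchObservedRunningTime

	e2e := running.Sub(created)     // 4.642194847s = podStartE2EDuration
	pull := lastPull.Sub(firstPull) // ~2.96s spent pulling images
	slo := e2e - pull               // 1.681924713s = podStartSLOduration

	fmt.Println("E2E:", e2e, "pull:", pull, "SLO:", slo)
}
```

It prints an E2E duration of 4.642194847s, a pull window of roughly 2.96s, and an SLO duration of 1.681924713s, matching the figures in the entry.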
[the same three FlexVolume probe messages (driver-call.go:262, driver-call.go:149, plugins.go:703) repeat continuously from 09:56:15.648 to 09:56:15.679]
Sep 6 09:56:16.554139 kubelet[2713]: E0906 09:56:16.554098 2713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-l8pm4" podUID="63b0ea24-5c9f-4c70-aaef-43e73d0fb19a"
Sep 6 09:56:16.584005 containerd[1557]: time="2025-09-06T09:56:16.583940179Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 09:56:16.584735 containerd[1557]: time="2025-09-06T09:56:16.584702538Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660"
Sep 6 09:56:16.585884 containerd[1557]: time="2025-09-06T09:56:16.585835153Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 09:56:16.587705 containerd[1557]: time="2025-09-06T09:56:16.587647611Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 09:56:16.588222 containerd[1557]: time="2025-09-06T09:56:16.588175466Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.426900696s"
Sep 6 09:56:16.588222 containerd[1557]: time="2025-09-06T09:56:16.588206545Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\""
Sep 6 09:56:16.592490 containerd[1557]: time="2025-09-06T09:56:16.592440559Z" level=info msg="CreateContainer within sandbox \"95365e07736046998a22d68e3ff81193e4b90aeeb5caebb289e7d494fa0a66ee\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Sep 6 09:56:16.601510 containerd[1557]: time="2025-09-06T09:56:16.601464435Z" level=info msg="Container 48ed2c76173898a47a8171759d505f81dc7df33ad6e197ad938b73b46c1df8b2: CDI devices from CRI Config.CDIDevices: []"
Sep 6 09:56:16.609655 containerd[1557]: time="2025-09-06T09:56:16.609609094Z" level=info msg="CreateContainer within sandbox \"95365e07736046998a22d68e3ff81193e4b90aeeb5caebb289e7d494fa0a66ee\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"48ed2c76173898a47a8171759d505f81dc7df33ad6e197ad938b73b46c1df8b2\""
Sep 6 09:56:16.610053 containerd[1557]: time="2025-09-06T09:56:16.610025228Z" level=info msg="StartContainer for \"48ed2c76173898a47a8171759d505f81dc7df33ad6e197ad938b73b46c1df8b2\""
Sep 6 09:56:16.611763 containerd[1557]: time="2025-09-06T09:56:16.611733790Z" level=info msg="connecting to shim 48ed2c76173898a47a8171759d505f81dc7df33ad6e197ad938b73b46c1df8b2" address="unix:///run/containerd/s/7658504cd4c1f33c743b678ade211adf3dd79a766c4f54f2c30b4fe2a960de31" protocol=ttrpc version=3
Sep 6 09:56:16.643674 systemd[1]: Started cri-containerd-48ed2c76173898a47a8171759d505f81dc7df33ad6e197ad938b73b46c1df8b2.scope - libcontainer container 48ed2c76173898a47a8171759d505f81dc7df33ad6e197ad938b73b46c1df8b2.
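The flexvol-driver container created above comes from Calico's pod2daemon-flexvol image, which is what normally installs the host-side FlexVolume driver binary the kubelet has been failing to find. For reference, a FlexVolume driver answers the kubelet's init call with a small JSON status on stdout; the sketch below shows that contract in minimal form (the field names follow the FlexVolume convention, and nothing here is taken from Calico's actual uds driver):

```go
// Minimal sketch of the FlexVolume driver "init" contract: the kubelet runs
// <plugin-dir>/<vendor~driver>/<driver> init and expects a JSON status on stdout.
// A working driver binary in nodeagent~uds/ would answer like this instead of
// producing the empty output seen in the log.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func reply(s driverStatus) {
	out, _ := json.Marshal(s)
	fmt.Println(string(out))
}

func main() {
	if len(os.Args) > 1 && os.Args[1] == "init" {
		// "Success" plus a capabilities map is what the probe needs to register the plugin.
		reply(driverStatus{Status: "Success", Capabilities: map[string]bool{"attach": false}})
		return
	}
	// Calls this sketch does not implement are reported with the conventional "Not supported" status.
	reply(driverStatus{Status: "Not supported", Message: "call not implemented in this sketch"})
	os.Exit(1)
}
```

Once a driver that answers init this way is installed under nodeagent~uds/, the unmarshal and probe errors in the log should stop.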
[the same three FlexVolume probe messages repeat continuously starting at 09:56:16.658; only the final occurrence is shown below]
Sep 6 09:56:16.682217 kubelet[2713]: E0906 09:56:16.682202 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 6 09:56:16.682217 kubelet[2713]: W0906 09:56:16.682213 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 6 09:56:16.682264 kubelet[2713]: E0906 09:56:16.682225 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 09:56:16.682477 kubelet[2713]: E0906 09:56:16.682383 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:16.682477 kubelet[2713]: W0906 09:56:16.682442 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:16.682477 kubelet[2713]: E0906 09:56:16.682452 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:16.682729 kubelet[2713]: E0906 09:56:16.682704 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:16.682769 kubelet[2713]: W0906 09:56:16.682737 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:16.682769 kubelet[2713]: E0906 09:56:16.682746 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:16.682954 kubelet[2713]: E0906 09:56:16.682938 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:16.682954 kubelet[2713]: W0906 09:56:16.682950 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:16.683073 kubelet[2713]: E0906 09:56:16.682961 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:16.683239 kubelet[2713]: E0906 09:56:16.683182 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:16.683239 kubelet[2713]: W0906 09:56:16.683190 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:16.683239 kubelet[2713]: E0906 09:56:16.683198 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:16.683370 kubelet[2713]: E0906 09:56:16.683348 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:16.683370 kubelet[2713]: W0906 09:56:16.683357 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:16.683370 kubelet[2713]: E0906 09:56:16.683364 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 09:56:16.683564 kubelet[2713]: E0906 09:56:16.683538 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:16.683898 kubelet[2713]: W0906 09:56:16.683883 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:16.683929 kubelet[2713]: E0906 09:56:16.683898 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:16.684082 kubelet[2713]: E0906 09:56:16.684069 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:16.684082 kubelet[2713]: W0906 09:56:16.684079 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:16.684136 kubelet[2713]: E0906 09:56:16.684087 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:16.684456 kubelet[2713]: E0906 09:56:16.684438 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:16.684456 kubelet[2713]: W0906 09:56:16.684454 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:16.684512 kubelet[2713]: E0906 09:56:16.684466 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:16.684736 kubelet[2713]: E0906 09:56:16.684722 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:16.684773 kubelet[2713]: W0906 09:56:16.684735 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:16.684795 kubelet[2713]: E0906 09:56:16.684771 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:16.685015 kubelet[2713]: E0906 09:56:16.685000 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:16.685015 kubelet[2713]: W0906 09:56:16.685013 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:16.685083 kubelet[2713]: E0906 09:56:16.685024 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 6 09:56:16.685254 kubelet[2713]: E0906 09:56:16.685239 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:16.685254 kubelet[2713]: W0906 09:56:16.685252 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:16.685328 kubelet[2713]: E0906 09:56:16.685264 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:16.685551 kubelet[2713]: E0906 09:56:16.685534 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:16.685551 kubelet[2713]: W0906 09:56:16.685547 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:16.685616 kubelet[2713]: E0906 09:56:16.685558 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:16.686110 kubelet[2713]: E0906 09:56:16.686068 2713 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 6 09:56:16.686110 kubelet[2713]: W0906 09:56:16.686080 2713 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 6 09:56:16.686110 kubelet[2713]: E0906 09:56:16.686089 2713 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 6 09:56:16.708315 systemd[1]: cri-containerd-48ed2c76173898a47a8171759d505f81dc7df33ad6e197ad938b73b46c1df8b2.scope: Deactivated successfully. Sep 6 09:56:16.709931 containerd[1557]: time="2025-09-06T09:56:16.709898360Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48ed2c76173898a47a8171759d505f81dc7df33ad6e197ad938b73b46c1df8b2\" id:\"48ed2c76173898a47a8171759d505f81dc7df33ad6e197ad938b73b46c1df8b2\" pid:3498 exited_at:{seconds:1757152576 nanos:709489819}" Sep 6 09:56:16.717563 containerd[1557]: time="2025-09-06T09:56:16.717537746Z" level=info msg="received exit event container_id:\"48ed2c76173898a47a8171759d505f81dc7df33ad6e197ad938b73b46c1df8b2\" id:\"48ed2c76173898a47a8171759d505f81dc7df33ad6e197ad938b73b46c1df8b2\" pid:3498 exited_at:{seconds:1757152576 nanos:709489819}" Sep 6 09:56:16.726878 containerd[1557]: time="2025-09-06T09:56:16.726822734Z" level=info msg="StartContainer for \"48ed2c76173898a47a8171759d505f81dc7df33ad6e197ad938b73b46c1df8b2\" returns successfully" Sep 6 09:56:16.743511 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-48ed2c76173898a47a8171759d505f81dc7df33ad6e197ad938b73b46c1df8b2-rootfs.mount: Deactivated successfully. 
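The driver-call failures repeated above are all the same condition: the kubelet's FlexVolume prober found a plugin directory named nodeagent~uds under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ but no executable uds inside it, so every init call returns empty output that cannot be unmarshalled as JSON. As a hedged illustration only (not the real driver that belongs in that directory), a stub like the following would satisfy the prober's init handshake; removing the empty nodeagent~uds directory would quiet the messages just as well.

```go
// Minimal FlexVolume driver stub (illustrative sketch, not the actual
// nodeagent~uds driver). The kubelet invokes the binary at
// <plugin-dir>/nodeagent~uds/uds with a subcommand such as "init" and
// expects a single JSON object on stdout.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// result mirrors the JSON shape the FlexVolume prober parses.
type result struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func reply(r result, code int) {
	out, _ := json.Marshal(r)
	fmt.Println(string(out))
	os.Exit(code)
}

func main() {
	if len(os.Args) < 2 {
		reply(result{Status: "Failure", Message: "no command given"}, 1)
	}
	switch os.Args[1] {
	case "init":
		// attach=false tells the kubelet not to expect attach/detach
		// calls that this stub does not implement.
		reply(result{Status: "Success", Capabilities: map[string]bool{"attach": false}}, 0)
	default:
		reply(result{Status: "Not supported", Message: os.Args[1]}, 1)
	}
}
```

The prober only needs a parseable JSON object with a status field on stdout, which is why the empty output above fails with "unexpected end of JSON input" rather than a more specific error.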
Sep 6 09:56:17.632029 containerd[1557]: time="2025-09-06T09:56:17.631967770Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 6 09:56:18.553475 kubelet[2713]: E0906 09:56:18.553412 2713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-l8pm4" podUID="63b0ea24-5c9f-4c70-aaef-43e73d0fb19a" Sep 6 09:56:20.553595 kubelet[2713]: E0906 09:56:20.553533 2713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-l8pm4" podUID="63b0ea24-5c9f-4c70-aaef-43e73d0fb19a" Sep 6 09:56:21.290456 containerd[1557]: time="2025-09-06T09:56:21.290411614Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:21.291331 containerd[1557]: time="2025-09-06T09:56:21.291294326Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 6 09:56:21.292562 containerd[1557]: time="2025-09-06T09:56:21.292530473Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:21.294566 containerd[1557]: time="2025-09-06T09:56:21.294518797Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:21.295020 containerd[1557]: time="2025-09-06T09:56:21.294993531Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.662977089s" Sep 6 09:56:21.295074 containerd[1557]: time="2025-09-06T09:56:21.295019800Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 6 09:56:21.299666 containerd[1557]: time="2025-09-06T09:56:21.299605664Z" level=info msg="CreateContainer within sandbox \"95365e07736046998a22d68e3ff81193e4b90aeeb5caebb289e7d494fa0a66ee\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 6 09:56:21.309082 containerd[1557]: time="2025-09-06T09:56:21.309023736Z" level=info msg="Container 0e5c0c6523234646fd770ea3e1689086fb060c63d4ae6753dd8c77e9cba1b71c: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:56:21.321039 containerd[1557]: time="2025-09-06T09:56:21.320979796Z" level=info msg="CreateContainer within sandbox \"95365e07736046998a22d68e3ff81193e4b90aeeb5caebb289e7d494fa0a66ee\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"0e5c0c6523234646fd770ea3e1689086fb060c63d4ae6753dd8c77e9cba1b71c\"" Sep 6 09:56:21.321478 containerd[1557]: time="2025-09-06T09:56:21.321450803Z" level=info msg="StartContainer for \"0e5c0c6523234646fd770ea3e1689086fb060c63d4ae6753dd8c77e9cba1b71c\"" Sep 6 09:56:21.323073 
containerd[1557]: time="2025-09-06T09:56:21.323030618Z" level=info msg="connecting to shim 0e5c0c6523234646fd770ea3e1689086fb060c63d4ae6753dd8c77e9cba1b71c" address="unix:///run/containerd/s/7658504cd4c1f33c743b678ade211adf3dd79a766c4f54f2c30b4fe2a960de31" protocol=ttrpc version=3 Sep 6 09:56:21.351680 systemd[1]: Started cri-containerd-0e5c0c6523234646fd770ea3e1689086fb060c63d4ae6753dd8c77e9cba1b71c.scope - libcontainer container 0e5c0c6523234646fd770ea3e1689086fb060c63d4ae6753dd8c77e9cba1b71c. Sep 6 09:56:21.400823 containerd[1557]: time="2025-09-06T09:56:21.400772286Z" level=info msg="StartContainer for \"0e5c0c6523234646fd770ea3e1689086fb060c63d4ae6753dd8c77e9cba1b71c\" returns successfully" Sep 6 09:56:22.366268 systemd[1]: cri-containerd-0e5c0c6523234646fd770ea3e1689086fb060c63d4ae6753dd8c77e9cba1b71c.scope: Deactivated successfully. Sep 6 09:56:22.367108 systemd[1]: cri-containerd-0e5c0c6523234646fd770ea3e1689086fb060c63d4ae6753dd8c77e9cba1b71c.scope: Consumed 630ms CPU time, 179.8M memory peak, 4.6M read from disk, 171.3M written to disk. Sep 6 09:56:22.367802 containerd[1557]: time="2025-09-06T09:56:22.367766991Z" level=info msg="received exit event container_id:\"0e5c0c6523234646fd770ea3e1689086fb060c63d4ae6753dd8c77e9cba1b71c\" id:\"0e5c0c6523234646fd770ea3e1689086fb060c63d4ae6753dd8c77e9cba1b71c\" pid:3592 exited_at:{seconds:1757152582 nanos:367259186}" Sep 6 09:56:22.368239 containerd[1557]: time="2025-09-06T09:56:22.367844317Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0e5c0c6523234646fd770ea3e1689086fb060c63d4ae6753dd8c77e9cba1b71c\" id:\"0e5c0c6523234646fd770ea3e1689086fb060c63d4ae6753dd8c77e9cba1b71c\" pid:3592 exited_at:{seconds:1757152582 nanos:367259186}" Sep 6 09:56:22.390856 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0e5c0c6523234646fd770ea3e1689086fb060c63d4ae6753dd8c77e9cba1b71c-rootfs.mount: Deactivated successfully. Sep 6 09:56:22.422694 kubelet[2713]: I0906 09:56:22.422642 2713 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 6 09:56:22.588933 systemd[1]: Created slice kubepods-burstable-podc0286ad4_01b4_4d4f_9f29_776595a29e27.slice - libcontainer container kubepods-burstable-podc0286ad4_01b4_4d4f_9f29_776595a29e27.slice. Sep 6 09:56:22.598856 systemd[1]: Created slice kubepods-besteffort-pod8375b038_9619_4019_920e_7d98311eee19.slice - libcontainer container kubepods-besteffort-pod8375b038_9619_4019_920e_7d98311eee19.slice. Sep 6 09:56:22.608365 systemd[1]: Created slice kubepods-besteffort-pod51b3ff5b_aa04_4bb2_abbf_9a90272a9848.slice - libcontainer container kubepods-besteffort-pod51b3ff5b_aa04_4bb2_abbf_9a90272a9848.slice. Sep 6 09:56:22.614206 systemd[1]: Created slice kubepods-besteffort-pod994d4ee8_70cc_4102_9c4a_25ba50a55654.slice - libcontainer container kubepods-besteffort-pod994d4ee8_70cc_4102_9c4a_25ba50a55654.slice. Sep 6 09:56:22.621703 systemd[1]: Created slice kubepods-besteffort-pod63b0ea24_5c9f_4c70_aaef_43e73d0fb19a.slice - libcontainer container kubepods-besteffort-pod63b0ea24_5c9f_4c70_aaef_43e73d0fb19a.slice. 
Sep 6 09:56:22.622540 kubelet[2713]: I0906 09:56:22.622026 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/51b3ff5b-aa04-4bb2-abbf-9a90272a9848-goldmane-key-pair\") pod \"goldmane-54d579b49d-6pvdz\" (UID: \"51b3ff5b-aa04-4bb2-abbf-9a90272a9848\") " pod="calico-system/goldmane-54d579b49d-6pvdz" Sep 6 09:56:22.622540 kubelet[2713]: I0906 09:56:22.622058 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxdkj\" (UniqueName: \"kubernetes.io/projected/51b3ff5b-aa04-4bb2-abbf-9a90272a9848-kube-api-access-sxdkj\") pod \"goldmane-54d579b49d-6pvdz\" (UID: \"51b3ff5b-aa04-4bb2-abbf-9a90272a9848\") " pod="calico-system/goldmane-54d579b49d-6pvdz" Sep 6 09:56:22.622540 kubelet[2713]: I0906 09:56:22.622078 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/470610e9-9f1a-4a88-9c59-2bfe2488c508-calico-apiserver-certs\") pod \"calico-apiserver-7458d6cd87-b7tqr\" (UID: \"470610e9-9f1a-4a88-9c59-2bfe2488c508\") " pod="calico-apiserver/calico-apiserver-7458d6cd87-b7tqr" Sep 6 09:56:22.622540 kubelet[2713]: I0906 09:56:22.622096 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/2914104c-6ace-4f41-a69b-2a4aeaf020e8-calico-apiserver-certs\") pod \"calico-apiserver-7458d6cd87-dzfxs\" (UID: \"2914104c-6ace-4f41-a69b-2a4aeaf020e8\") " pod="calico-apiserver/calico-apiserver-7458d6cd87-dzfxs" Sep 6 09:56:22.622540 kubelet[2713]: I0906 09:56:22.622111 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51b3ff5b-aa04-4bb2-abbf-9a90272a9848-config\") pod \"goldmane-54d579b49d-6pvdz\" (UID: \"51b3ff5b-aa04-4bb2-abbf-9a90272a9848\") " pod="calico-system/goldmane-54d579b49d-6pvdz" Sep 6 09:56:22.622702 kubelet[2713]: I0906 09:56:22.622128 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzghj\" (UniqueName: \"kubernetes.io/projected/994d4ee8-70cc-4102-9c4a-25ba50a55654-kube-api-access-zzghj\") pod \"whisker-65df49d854-9cmsr\" (UID: \"994d4ee8-70cc-4102-9c4a-25ba50a55654\") " pod="calico-system/whisker-65df49d854-9cmsr" Sep 6 09:56:22.622702 kubelet[2713]: I0906 09:56:22.622142 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxz9l\" (UniqueName: \"kubernetes.io/projected/470610e9-9f1a-4a88-9c59-2bfe2488c508-kube-api-access-nxz9l\") pod \"calico-apiserver-7458d6cd87-b7tqr\" (UID: \"470610e9-9f1a-4a88-9c59-2bfe2488c508\") " pod="calico-apiserver/calico-apiserver-7458d6cd87-b7tqr" Sep 6 09:56:22.622702 kubelet[2713]: I0906 09:56:22.622159 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8375b038-9619-4019-920e-7d98311eee19-tigera-ca-bundle\") pod \"calico-kube-controllers-dcbc46f74-v8wfw\" (UID: \"8375b038-9619-4019-920e-7d98311eee19\") " pod="calico-system/calico-kube-controllers-dcbc46f74-v8wfw" Sep 6 09:56:22.622702 kubelet[2713]: I0906 09:56:22.622176 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v84n\" 
(UniqueName: \"kubernetes.io/projected/2914104c-6ace-4f41-a69b-2a4aeaf020e8-kube-api-access-6v84n\") pod \"calico-apiserver-7458d6cd87-dzfxs\" (UID: \"2914104c-6ace-4f41-a69b-2a4aeaf020e8\") " pod="calico-apiserver/calico-apiserver-7458d6cd87-dzfxs" Sep 6 09:56:22.622702 kubelet[2713]: I0906 09:56:22.622190 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51b3ff5b-aa04-4bb2-abbf-9a90272a9848-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-6pvdz\" (UID: \"51b3ff5b-aa04-4bb2-abbf-9a90272a9848\") " pod="calico-system/goldmane-54d579b49d-6pvdz" Sep 6 09:56:22.622820 kubelet[2713]: I0906 09:56:22.622210 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0286ad4-01b4-4d4f-9f29-776595a29e27-config-volume\") pod \"coredns-674b8bbfcf-s469z\" (UID: \"c0286ad4-01b4-4d4f-9f29-776595a29e27\") " pod="kube-system/coredns-674b8bbfcf-s469z" Sep 6 09:56:22.622820 kubelet[2713]: I0906 09:56:22.622226 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqfxx\" (UniqueName: \"kubernetes.io/projected/c0286ad4-01b4-4d4f-9f29-776595a29e27-kube-api-access-xqfxx\") pod \"coredns-674b8bbfcf-s469z\" (UID: \"c0286ad4-01b4-4d4f-9f29-776595a29e27\") " pod="kube-system/coredns-674b8bbfcf-s469z" Sep 6 09:56:22.622820 kubelet[2713]: I0906 09:56:22.622244 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/994d4ee8-70cc-4102-9c4a-25ba50a55654-whisker-backend-key-pair\") pod \"whisker-65df49d854-9cmsr\" (UID: \"994d4ee8-70cc-4102-9c4a-25ba50a55654\") " pod="calico-system/whisker-65df49d854-9cmsr" Sep 6 09:56:22.622820 kubelet[2713]: I0906 09:56:22.622259 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/994d4ee8-70cc-4102-9c4a-25ba50a55654-whisker-ca-bundle\") pod \"whisker-65df49d854-9cmsr\" (UID: \"994d4ee8-70cc-4102-9c4a-25ba50a55654\") " pod="calico-system/whisker-65df49d854-9cmsr" Sep 6 09:56:22.622820 kubelet[2713]: I0906 09:56:22.622277 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nmkd\" (UniqueName: \"kubernetes.io/projected/8375b038-9619-4019-920e-7d98311eee19-kube-api-access-5nmkd\") pod \"calico-kube-controllers-dcbc46f74-v8wfw\" (UID: \"8375b038-9619-4019-920e-7d98311eee19\") " pod="calico-system/calico-kube-controllers-dcbc46f74-v8wfw" Sep 6 09:56:22.624683 containerd[1557]: time="2025-09-06T09:56:22.624640887Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-l8pm4,Uid:63b0ea24-5c9f-4c70-aaef-43e73d0fb19a,Namespace:calico-system,Attempt:0,}" Sep 6 09:56:22.634716 systemd[1]: Created slice kubepods-besteffort-pod2914104c_6ace_4f41_a69b_2a4aeaf020e8.slice - libcontainer container kubepods-besteffort-pod2914104c_6ace_4f41_a69b_2a4aeaf020e8.slice. Sep 6 09:56:22.645843 systemd[1]: Created slice kubepods-besteffort-pod470610e9_9f1a_4a88_9c59_2bfe2488c508.slice - libcontainer container kubepods-besteffort-pod470610e9_9f1a_4a88_9c59_2bfe2488c508.slice. 
Sep 6 09:56:22.651246 systemd[1]: Created slice kubepods-burstable-pod150001d0_9071_4406_80e3_3b959c3604cc.slice - libcontainer container kubepods-burstable-pod150001d0_9071_4406_80e3_3b959c3604cc.slice. Sep 6 09:56:22.656366 containerd[1557]: time="2025-09-06T09:56:22.656313133Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 6 09:56:22.713253 containerd[1557]: time="2025-09-06T09:56:22.713160956Z" level=error msg="Failed to destroy network for sandbox \"25391dff307dac4c8d7cfed0f5b5c171cbcfe131b7d70bb7e9b921406efd3beb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:22.714615 containerd[1557]: time="2025-09-06T09:56:22.714560610Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-l8pm4,Uid:63b0ea24-5c9f-4c70-aaef-43e73d0fb19a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"25391dff307dac4c8d7cfed0f5b5c171cbcfe131b7d70bb7e9b921406efd3beb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:22.714917 kubelet[2713]: E0906 09:56:22.714856 2713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25391dff307dac4c8d7cfed0f5b5c171cbcfe131b7d70bb7e9b921406efd3beb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:22.714917 kubelet[2713]: E0906 09:56:22.714925 2713 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25391dff307dac4c8d7cfed0f5b5c171cbcfe131b7d70bb7e9b921406efd3beb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-l8pm4" Sep 6 09:56:22.715080 kubelet[2713]: E0906 09:56:22.714945 2713 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25391dff307dac4c8d7cfed0f5b5c171cbcfe131b7d70bb7e9b921406efd3beb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-l8pm4" Sep 6 09:56:22.715080 kubelet[2713]: E0906 09:56:22.714989 2713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-l8pm4_calico-system(63b0ea24-5c9f-4c70-aaef-43e73d0fb19a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-l8pm4_calico-system(63b0ea24-5c9f-4c70-aaef-43e73d0fb19a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"25391dff307dac4c8d7cfed0f5b5c171cbcfe131b7d70bb7e9b921406efd3beb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-l8pm4" podUID="63b0ea24-5c9f-4c70-aaef-43e73d0fb19a" Sep 6 09:56:22.715512 
systemd[1]: run-netns-cni\x2dbd64d958\x2def54\x2d236f\x2d36d6\x2d5c2854d97998.mount: Deactivated successfully. Sep 6 09:56:22.725053 kubelet[2713]: I0906 09:56:22.724997 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/150001d0-9071-4406-80e3-3b959c3604cc-config-volume\") pod \"coredns-674b8bbfcf-l9289\" (UID: \"150001d0-9071-4406-80e3-3b959c3604cc\") " pod="kube-system/coredns-674b8bbfcf-l9289" Sep 6 09:56:22.725053 kubelet[2713]: I0906 09:56:22.725040 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-582fv\" (UniqueName: \"kubernetes.io/projected/150001d0-9071-4406-80e3-3b959c3604cc-kube-api-access-582fv\") pod \"coredns-674b8bbfcf-l9289\" (UID: \"150001d0-9071-4406-80e3-3b959c3604cc\") " pod="kube-system/coredns-674b8bbfcf-l9289" Sep 6 09:56:22.896739 containerd[1557]: time="2025-09-06T09:56:22.896622760Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-s469z,Uid:c0286ad4-01b4-4d4f-9f29-776595a29e27,Namespace:kube-system,Attempt:0,}" Sep 6 09:56:22.904589 containerd[1557]: time="2025-09-06T09:56:22.904556123Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-dcbc46f74-v8wfw,Uid:8375b038-9619-4019-920e-7d98311eee19,Namespace:calico-system,Attempt:0,}" Sep 6 09:56:22.912056 containerd[1557]: time="2025-09-06T09:56:22.912026144Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-6pvdz,Uid:51b3ff5b-aa04-4bb2-abbf-9a90272a9848,Namespace:calico-system,Attempt:0,}" Sep 6 09:56:22.918177 containerd[1557]: time="2025-09-06T09:56:22.917685005Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-65df49d854-9cmsr,Uid:994d4ee8-70cc-4102-9c4a-25ba50a55654,Namespace:calico-system,Attempt:0,}" Sep 6 09:56:22.942461 containerd[1557]: time="2025-09-06T09:56:22.942401157Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7458d6cd87-dzfxs,Uid:2914104c-6ace-4f41-a69b-2a4aeaf020e8,Namespace:calico-apiserver,Attempt:0,}" Sep 6 09:56:22.952421 containerd[1557]: time="2025-09-06T09:56:22.952350545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7458d6cd87-b7tqr,Uid:470610e9-9f1a-4a88-9c59-2bfe2488c508,Namespace:calico-apiserver,Attempt:0,}" Sep 6 09:56:22.957743 containerd[1557]: time="2025-09-06T09:56:22.957694454Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-l9289,Uid:150001d0-9071-4406-80e3-3b959c3604cc,Namespace:kube-system,Attempt:0,}" Sep 6 09:56:22.980304 containerd[1557]: time="2025-09-06T09:56:22.980038442Z" level=error msg="Failed to destroy network for sandbox \"4a712168c3d05467d53d0fe2d77ec09bdc8f6fd616bba9c0db8dec719f5cddc8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:22.982994 containerd[1557]: time="2025-09-06T09:56:22.982960762Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-s469z,Uid:c0286ad4-01b4-4d4f-9f29-776595a29e27,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a712168c3d05467d53d0fe2d77ec09bdc8f6fd616bba9c0db8dec719f5cddc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Sep 6 09:56:22.985346 kubelet[2713]: E0906 09:56:22.985279 2713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a712168c3d05467d53d0fe2d77ec09bdc8f6fd616bba9c0db8dec719f5cddc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:22.985431 kubelet[2713]: E0906 09:56:22.985405 2713 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a712168c3d05467d53d0fe2d77ec09bdc8f6fd616bba9c0db8dec719f5cddc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-s469z" Sep 6 09:56:22.985467 kubelet[2713]: E0906 09:56:22.985431 2713 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a712168c3d05467d53d0fe2d77ec09bdc8f6fd616bba9c0db8dec719f5cddc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-s469z" Sep 6 09:56:22.985547 kubelet[2713]: E0906 09:56:22.985507 2713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-s469z_kube-system(c0286ad4-01b4-4d4f-9f29-776595a29e27)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-s469z_kube-system(c0286ad4-01b4-4d4f-9f29-776595a29e27)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4a712168c3d05467d53d0fe2d77ec09bdc8f6fd616bba9c0db8dec719f5cddc8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-s469z" podUID="c0286ad4-01b4-4d4f-9f29-776595a29e27" Sep 6 09:56:23.002444 containerd[1557]: time="2025-09-06T09:56:23.002358172Z" level=error msg="Failed to destroy network for sandbox \"dde3a5852841cbb33fe53c426a6aeeab7a0c3c07582e9915e4ea7cd226ad04a7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.004505 containerd[1557]: time="2025-09-06T09:56:23.004468123Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-dcbc46f74-v8wfw,Uid:8375b038-9619-4019-920e-7d98311eee19,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dde3a5852841cbb33fe53c426a6aeeab7a0c3c07582e9915e4ea7cd226ad04a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.004822 kubelet[2713]: E0906 09:56:23.004752 2713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dde3a5852841cbb33fe53c426a6aeeab7a0c3c07582e9915e4ea7cd226ad04a7\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.004822 kubelet[2713]: E0906 09:56:23.004837 2713 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dde3a5852841cbb33fe53c426a6aeeab7a0c3c07582e9915e4ea7cd226ad04a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-dcbc46f74-v8wfw" Sep 6 09:56:23.005009 kubelet[2713]: E0906 09:56:23.004861 2713 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dde3a5852841cbb33fe53c426a6aeeab7a0c3c07582e9915e4ea7cd226ad04a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-dcbc46f74-v8wfw" Sep 6 09:56:23.005009 kubelet[2713]: E0906 09:56:23.004916 2713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-dcbc46f74-v8wfw_calico-system(8375b038-9619-4019-920e-7d98311eee19)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-dcbc46f74-v8wfw_calico-system(8375b038-9619-4019-920e-7d98311eee19)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dde3a5852841cbb33fe53c426a6aeeab7a0c3c07582e9915e4ea7cd226ad04a7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-dcbc46f74-v8wfw" podUID="8375b038-9619-4019-920e-7d98311eee19" Sep 6 09:56:23.007721 containerd[1557]: time="2025-09-06T09:56:23.007549542Z" level=error msg="Failed to destroy network for sandbox \"d599e26c0a366d11cacf766d0ce8dd3f4298051d646b1f72c0ba741cfcc392ef\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.010423 containerd[1557]: time="2025-09-06T09:56:23.010351564Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-65df49d854-9cmsr,Uid:994d4ee8-70cc-4102-9c4a-25ba50a55654,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d599e26c0a366d11cacf766d0ce8dd3f4298051d646b1f72c0ba741cfcc392ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.010666 kubelet[2713]: E0906 09:56:23.010626 2713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d599e26c0a366d11cacf766d0ce8dd3f4298051d646b1f72c0ba741cfcc392ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.010728 kubelet[2713]: E0906 09:56:23.010695 2713 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"d599e26c0a366d11cacf766d0ce8dd3f4298051d646b1f72c0ba741cfcc392ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-65df49d854-9cmsr" Sep 6 09:56:23.010728 kubelet[2713]: E0906 09:56:23.010716 2713 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d599e26c0a366d11cacf766d0ce8dd3f4298051d646b1f72c0ba741cfcc392ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-65df49d854-9cmsr" Sep 6 09:56:23.010902 kubelet[2713]: E0906 09:56:23.010768 2713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-65df49d854-9cmsr_calico-system(994d4ee8-70cc-4102-9c4a-25ba50a55654)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-65df49d854-9cmsr_calico-system(994d4ee8-70cc-4102-9c4a-25ba50a55654)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d599e26c0a366d11cacf766d0ce8dd3f4298051d646b1f72c0ba741cfcc392ef\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-65df49d854-9cmsr" podUID="994d4ee8-70cc-4102-9c4a-25ba50a55654" Sep 6 09:56:23.021609 containerd[1557]: time="2025-09-06T09:56:23.021503320Z" level=error msg="Failed to destroy network for sandbox \"546b21bdc35048f8658f9101c13fcce23d7dcefe5fddf311f2f0641d093331a6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.029671 containerd[1557]: time="2025-09-06T09:56:23.029534563Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-6pvdz,Uid:51b3ff5b-aa04-4bb2-abbf-9a90272a9848,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"546b21bdc35048f8658f9101c13fcce23d7dcefe5fddf311f2f0641d093331a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.030130 kubelet[2713]: E0906 09:56:23.029996 2713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"546b21bdc35048f8658f9101c13fcce23d7dcefe5fddf311f2f0641d093331a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.030130 kubelet[2713]: E0906 09:56:23.030063 2713 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"546b21bdc35048f8658f9101c13fcce23d7dcefe5fddf311f2f0641d093331a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-6pvdz" Sep 6 
09:56:23.030130 kubelet[2713]: E0906 09:56:23.030089 2713 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"546b21bdc35048f8658f9101c13fcce23d7dcefe5fddf311f2f0641d093331a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-6pvdz" Sep 6 09:56:23.030321 kubelet[2713]: E0906 09:56:23.030285 2713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-6pvdz_calico-system(51b3ff5b-aa04-4bb2-abbf-9a90272a9848)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-6pvdz_calico-system(51b3ff5b-aa04-4bb2-abbf-9a90272a9848)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"546b21bdc35048f8658f9101c13fcce23d7dcefe5fddf311f2f0641d093331a6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-6pvdz" podUID="51b3ff5b-aa04-4bb2-abbf-9a90272a9848" Sep 6 09:56:23.037524 containerd[1557]: time="2025-09-06T09:56:23.037473673Z" level=error msg="Failed to destroy network for sandbox \"cedfb2e1483ddbe4976cf1d71635959c5773f02f20ff706db1a756e398fcf72c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.039665 containerd[1557]: time="2025-09-06T09:56:23.039570869Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7458d6cd87-b7tqr,Uid:470610e9-9f1a-4a88-9c59-2bfe2488c508,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cedfb2e1483ddbe4976cf1d71635959c5773f02f20ff706db1a756e398fcf72c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.039942 kubelet[2713]: E0906 09:56:23.039895 2713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cedfb2e1483ddbe4976cf1d71635959c5773f02f20ff706db1a756e398fcf72c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.039999 kubelet[2713]: E0906 09:56:23.039960 2713 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cedfb2e1483ddbe4976cf1d71635959c5773f02f20ff706db1a756e398fcf72c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7458d6cd87-b7tqr" Sep 6 09:56:23.039999 kubelet[2713]: E0906 09:56:23.039982 2713 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cedfb2e1483ddbe4976cf1d71635959c5773f02f20ff706db1a756e398fcf72c\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7458d6cd87-b7tqr" Sep 6 09:56:23.040147 kubelet[2713]: E0906 09:56:23.040049 2713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7458d6cd87-b7tqr_calico-apiserver(470610e9-9f1a-4a88-9c59-2bfe2488c508)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7458d6cd87-b7tqr_calico-apiserver(470610e9-9f1a-4a88-9c59-2bfe2488c508)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cedfb2e1483ddbe4976cf1d71635959c5773f02f20ff706db1a756e398fcf72c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7458d6cd87-b7tqr" podUID="470610e9-9f1a-4a88-9c59-2bfe2488c508" Sep 6 09:56:23.044982 containerd[1557]: time="2025-09-06T09:56:23.044920687Z" level=error msg="Failed to destroy network for sandbox \"8db65965f9a3486ed1fe118dcafbe9106ba237252c7568e229ed3b138c02d6bc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.046998 containerd[1557]: time="2025-09-06T09:56:23.046952040Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7458d6cd87-dzfxs,Uid:2914104c-6ace-4f41-a69b-2a4aeaf020e8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8db65965f9a3486ed1fe118dcafbe9106ba237252c7568e229ed3b138c02d6bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.047236 kubelet[2713]: E0906 09:56:23.047199 2713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8db65965f9a3486ed1fe118dcafbe9106ba237252c7568e229ed3b138c02d6bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.047301 kubelet[2713]: E0906 09:56:23.047259 2713 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8db65965f9a3486ed1fe118dcafbe9106ba237252c7568e229ed3b138c02d6bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7458d6cd87-dzfxs" Sep 6 09:56:23.047301 kubelet[2713]: E0906 09:56:23.047280 2713 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8db65965f9a3486ed1fe118dcafbe9106ba237252c7568e229ed3b138c02d6bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7458d6cd87-dzfxs" Sep 6 09:56:23.047360 kubelet[2713]: E0906 09:56:23.047336 2713 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7458d6cd87-dzfxs_calico-apiserver(2914104c-6ace-4f41-a69b-2a4aeaf020e8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7458d6cd87-dzfxs_calico-apiserver(2914104c-6ace-4f41-a69b-2a4aeaf020e8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8db65965f9a3486ed1fe118dcafbe9106ba237252c7568e229ed3b138c02d6bc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7458d6cd87-dzfxs" podUID="2914104c-6ace-4f41-a69b-2a4aeaf020e8" Sep 6 09:56:23.055752 containerd[1557]: time="2025-09-06T09:56:23.055704229Z" level=error msg="Failed to destroy network for sandbox \"70ca1fdb480dd64d03e34d8ab9af0496bfe74490cc915a32abeb3074a6b26a5b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.056987 containerd[1557]: time="2025-09-06T09:56:23.056926590Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-l9289,Uid:150001d0-9071-4406-80e3-3b959c3604cc,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"70ca1fdb480dd64d03e34d8ab9af0496bfe74490cc915a32abeb3074a6b26a5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.057225 kubelet[2713]: E0906 09:56:23.057180 2713 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70ca1fdb480dd64d03e34d8ab9af0496bfe74490cc915a32abeb3074a6b26a5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 6 09:56:23.057277 kubelet[2713]: E0906 09:56:23.057254 2713 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70ca1fdb480dd64d03e34d8ab9af0496bfe74490cc915a32abeb3074a6b26a5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-l9289" Sep 6 09:56:23.057305 kubelet[2713]: E0906 09:56:23.057286 2713 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70ca1fdb480dd64d03e34d8ab9af0496bfe74490cc915a32abeb3074a6b26a5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-l9289" Sep 6 09:56:23.057375 kubelet[2713]: E0906 09:56:23.057341 2713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-l9289_kube-system(150001d0-9071-4406-80e3-3b959c3604cc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-l9289_kube-system(150001d0-9071-4406-80e3-3b959c3604cc)\\\": rpc error: code = Unknown desc = failed 
to setup network for sandbox \\\"70ca1fdb480dd64d03e34d8ab9af0496bfe74490cc915a32abeb3074a6b26a5b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-l9289" podUID="150001d0-9071-4406-80e3-3b959c3604cc" Sep 6 09:56:28.510085 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3138189783.mount: Deactivated successfully. Sep 6 09:56:29.315270 containerd[1557]: time="2025-09-06T09:56:29.315215654Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:29.331050 containerd[1557]: time="2025-09-06T09:56:29.316411321Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 6 09:56:29.331050 containerd[1557]: time="2025-09-06T09:56:29.318534833Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:29.331198 containerd[1557]: time="2025-09-06T09:56:29.321487924Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 6.665114586s" Sep 6 09:56:29.331231 containerd[1557]: time="2025-09-06T09:56:29.331202774Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 6 09:56:29.331578 containerd[1557]: time="2025-09-06T09:56:29.331535740Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:29.356658 containerd[1557]: time="2025-09-06T09:56:29.356605772Z" level=info msg="CreateContainer within sandbox \"95365e07736046998a22d68e3ff81193e4b90aeeb5caebb289e7d494fa0a66ee\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 6 09:56:29.367314 containerd[1557]: time="2025-09-06T09:56:29.367265869Z" level=info msg="Container e0bc79b1078ddcc3a8049f88a650e0b4d2220a7a2b9dbeff8db0f1a3377c2607: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:56:29.379537 containerd[1557]: time="2025-09-06T09:56:29.379499113Z" level=info msg="CreateContainer within sandbox \"95365e07736046998a22d68e3ff81193e4b90aeeb5caebb289e7d494fa0a66ee\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"e0bc79b1078ddcc3a8049f88a650e0b4d2220a7a2b9dbeff8db0f1a3377c2607\"" Sep 6 09:56:29.380599 containerd[1557]: time="2025-09-06T09:56:29.379938518Z" level=info msg="StartContainer for \"e0bc79b1078ddcc3a8049f88a650e0b4d2220a7a2b9dbeff8db0f1a3377c2607\"" Sep 6 09:56:29.381737 containerd[1557]: time="2025-09-06T09:56:29.381698227Z" level=info msg="connecting to shim e0bc79b1078ddcc3a8049f88a650e0b4d2220a7a2b9dbeff8db0f1a3377c2607" address="unix:///run/containerd/s/7658504cd4c1f33c743b678ade211adf3dd79a766c4f54f2c30b4fe2a960de31" protocol=ttrpc version=3 Sep 6 09:56:29.408543 systemd[1]: Started cri-containerd-e0bc79b1078ddcc3a8049f88a650e0b4d2220a7a2b9dbeff8db0f1a3377c2607.scope - 
libcontainer container e0bc79b1078ddcc3a8049f88a650e0b4d2220a7a2b9dbeff8db0f1a3377c2607. Sep 6 09:56:29.456645 containerd[1557]: time="2025-09-06T09:56:29.456583363Z" level=info msg="StartContainer for \"e0bc79b1078ddcc3a8049f88a650e0b4d2220a7a2b9dbeff8db0f1a3377c2607\" returns successfully" Sep 6 09:56:29.556805 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 6 09:56:29.556956 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 6 09:56:29.721101 kubelet[2713]: I0906 09:56:29.720687 2713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-bmmts" podStartSLOduration=1.7800585309999999 podStartE2EDuration="18.720666901s" podCreationTimestamp="2025-09-06 09:56:11 +0000 UTC" firstStartedPulling="2025-09-06 09:56:12.391606646 +0000 UTC m=+17.934359289" lastFinishedPulling="2025-09-06 09:56:29.332215006 +0000 UTC m=+34.874967659" observedRunningTime="2025-09-06 09:56:29.718839116 +0000 UTC m=+35.261591779" watchObservedRunningTime="2025-09-06 09:56:29.720666901 +0000 UTC m=+35.263419554" Sep 6 09:56:29.777435 kubelet[2713]: I0906 09:56:29.776860 2713 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/994d4ee8-70cc-4102-9c4a-25ba50a55654-whisker-ca-bundle\") pod \"994d4ee8-70cc-4102-9c4a-25ba50a55654\" (UID: \"994d4ee8-70cc-4102-9c4a-25ba50a55654\") " Sep 6 09:56:29.777435 kubelet[2713]: I0906 09:56:29.776961 2713 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzghj\" (UniqueName: \"kubernetes.io/projected/994d4ee8-70cc-4102-9c4a-25ba50a55654-kube-api-access-zzghj\") pod \"994d4ee8-70cc-4102-9c4a-25ba50a55654\" (UID: \"994d4ee8-70cc-4102-9c4a-25ba50a55654\") " Sep 6 09:56:29.777673 kubelet[2713]: I0906 09:56:29.776983 2713 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/994d4ee8-70cc-4102-9c4a-25ba50a55654-whisker-backend-key-pair\") pod \"994d4ee8-70cc-4102-9c4a-25ba50a55654\" (UID: \"994d4ee8-70cc-4102-9c4a-25ba50a55654\") " Sep 6 09:56:29.777673 kubelet[2713]: I0906 09:56:29.777614 2713 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/994d4ee8-70cc-4102-9c4a-25ba50a55654-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "994d4ee8-70cc-4102-9c4a-25ba50a55654" (UID: "994d4ee8-70cc-4102-9c4a-25ba50a55654"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 6 09:56:29.789269 systemd[1]: var-lib-kubelet-pods-994d4ee8\x2d70cc\x2d4102\x2d9c4a\x2d25ba50a55654-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 6 09:56:29.789723 systemd[1]: var-lib-kubelet-pods-994d4ee8\x2d70cc\x2d4102\x2d9c4a\x2d25ba50a55654-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dzzghj.mount: Deactivated successfully. Sep 6 09:56:29.791904 kubelet[2713]: I0906 09:56:29.791801 2713 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/994d4ee8-70cc-4102-9c4a-25ba50a55654-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "994d4ee8-70cc-4102-9c4a-25ba50a55654" (UID: "994d4ee8-70cc-4102-9c4a-25ba50a55654"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 6 09:56:29.792291 kubelet[2713]: I0906 09:56:29.792191 2713 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/994d4ee8-70cc-4102-9c4a-25ba50a55654-kube-api-access-zzghj" (OuterVolumeSpecName: "kube-api-access-zzghj") pod "994d4ee8-70cc-4102-9c4a-25ba50a55654" (UID: "994d4ee8-70cc-4102-9c4a-25ba50a55654"). InnerVolumeSpecName "kube-api-access-zzghj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 6 09:56:29.871260 containerd[1557]: time="2025-09-06T09:56:29.871213763Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0bc79b1078ddcc3a8049f88a650e0b4d2220a7a2b9dbeff8db0f1a3377c2607\" id:\"f212d7b3636efb1c9d87a9a5c170172d7334bfa2b301ca3615c7fec95d593cbd\" pid:3966 exit_status:1 exited_at:{seconds:1757152589 nanos:870846883}" Sep 6 09:56:29.878047 kubelet[2713]: I0906 09:56:29.878004 2713 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/994d4ee8-70cc-4102-9c4a-25ba50a55654-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 6 09:56:29.878047 kubelet[2713]: I0906 09:56:29.878037 2713 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zzghj\" (UniqueName: \"kubernetes.io/projected/994d4ee8-70cc-4102-9c4a-25ba50a55654-kube-api-access-zzghj\") on node \"localhost\" DevicePath \"\"" Sep 6 09:56:29.878047 kubelet[2713]: I0906 09:56:29.878047 2713 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/994d4ee8-70cc-4102-9c4a-25ba50a55654-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 6 09:56:30.000953 systemd[1]: Removed slice kubepods-besteffort-pod994d4ee8_70cc_4102_9c4a_25ba50a55654.slice - libcontainer container kubepods-besteffort-pod994d4ee8_70cc_4102_9c4a_25ba50a55654.slice. Sep 6 09:56:30.089506 systemd[1]: Created slice kubepods-besteffort-pod9eac0b29_952f_49ac_bc23_766e59bf4fd0.slice - libcontainer container kubepods-besteffort-pod9eac0b29_952f_49ac_bc23_766e59bf4fd0.slice. 
Sep 6 09:56:30.180883 kubelet[2713]: I0906 09:56:30.180778 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9eac0b29-952f-49ac-bc23-766e59bf4fd0-whisker-backend-key-pair\") pod \"whisker-7b556f6fdf-thnfg\" (UID: \"9eac0b29-952f-49ac-bc23-766e59bf4fd0\") " pod="calico-system/whisker-7b556f6fdf-thnfg" Sep 6 09:56:30.180883 kubelet[2713]: I0906 09:56:30.180831 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w44jc\" (UniqueName: \"kubernetes.io/projected/9eac0b29-952f-49ac-bc23-766e59bf4fd0-kube-api-access-w44jc\") pod \"whisker-7b556f6fdf-thnfg\" (UID: \"9eac0b29-952f-49ac-bc23-766e59bf4fd0\") " pod="calico-system/whisker-7b556f6fdf-thnfg" Sep 6 09:56:30.180883 kubelet[2713]: I0906 09:56:30.180848 2713 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eac0b29-952f-49ac-bc23-766e59bf4fd0-whisker-ca-bundle\") pod \"whisker-7b556f6fdf-thnfg\" (UID: \"9eac0b29-952f-49ac-bc23-766e59bf4fd0\") " pod="calico-system/whisker-7b556f6fdf-thnfg" Sep 6 09:56:30.397116 containerd[1557]: time="2025-09-06T09:56:30.396992036Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7b556f6fdf-thnfg,Uid:9eac0b29-952f-49ac-bc23-766e59bf4fd0,Namespace:calico-system,Attempt:0,}" Sep 6 09:56:30.547708 systemd-networkd[1450]: cali313586360c7: Link UP Sep 6 09:56:30.548321 systemd-networkd[1450]: cali313586360c7: Gained carrier Sep 6 09:56:30.564145 kubelet[2713]: I0906 09:56:30.563324 2713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="994d4ee8-70cc-4102-9c4a-25ba50a55654" path="/var/lib/kubelet/pods/994d4ee8-70cc-4102-9c4a-25ba50a55654/volumes" Sep 6 09:56:30.565952 containerd[1557]: 2025-09-06 09:56:30.420 [INFO][3992] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 6 09:56:30.565952 containerd[1557]: 2025-09-06 09:56:30.436 [INFO][3992] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--7b556f6fdf--thnfg-eth0 whisker-7b556f6fdf- calico-system 9eac0b29-952f-49ac-bc23-766e59bf4fd0 926 0 2025-09-06 09:56:30 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7b556f6fdf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-7b556f6fdf-thnfg eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali313586360c7 [] [] }} ContainerID="d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" Namespace="calico-system" Pod="whisker-7b556f6fdf-thnfg" WorkloadEndpoint="localhost-k8s-whisker--7b556f6fdf--thnfg-" Sep 6 09:56:30.565952 containerd[1557]: 2025-09-06 09:56:30.436 [INFO][3992] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" Namespace="calico-system" Pod="whisker-7b556f6fdf-thnfg" WorkloadEndpoint="localhost-k8s-whisker--7b556f6fdf--thnfg-eth0" Sep 6 09:56:30.565952 containerd[1557]: 2025-09-06 09:56:30.496 [INFO][4006] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" HandleID="k8s-pod-network.d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" 
Workload="localhost-k8s-whisker--7b556f6fdf--thnfg-eth0" Sep 6 09:56:30.566191 containerd[1557]: 2025-09-06 09:56:30.497 [INFO][4006] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" HandleID="k8s-pod-network.d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" Workload="localhost-k8s-whisker--7b556f6fdf--thnfg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001357a0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-7b556f6fdf-thnfg", "timestamp":"2025-09-06 09:56:30.496905026 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 09:56:30.566191 containerd[1557]: 2025-09-06 09:56:30.499 [INFO][4006] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 09:56:30.566191 containerd[1557]: 2025-09-06 09:56:30.499 [INFO][4006] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 09:56:30.566191 containerd[1557]: 2025-09-06 09:56:30.499 [INFO][4006] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 6 09:56:30.566191 containerd[1557]: 2025-09-06 09:56:30.506 [INFO][4006] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" host="localhost" Sep 6 09:56:30.566191 containerd[1557]: 2025-09-06 09:56:30.516 [INFO][4006] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 6 09:56:30.566191 containerd[1557]: 2025-09-06 09:56:30.522 [INFO][4006] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 6 09:56:30.566191 containerd[1557]: 2025-09-06 09:56:30.524 [INFO][4006] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:30.566191 containerd[1557]: 2025-09-06 09:56:30.526 [INFO][4006] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:30.566191 containerd[1557]: 2025-09-06 09:56:30.526 [INFO][4006] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" host="localhost" Sep 6 09:56:30.566511 containerd[1557]: 2025-09-06 09:56:30.527 [INFO][4006] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd Sep 6 09:56:30.566511 containerd[1557]: 2025-09-06 09:56:30.531 [INFO][4006] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" host="localhost" Sep 6 09:56:30.566511 containerd[1557]: 2025-09-06 09:56:30.535 [INFO][4006] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" host="localhost" Sep 6 09:56:30.566511 containerd[1557]: 2025-09-06 09:56:30.536 [INFO][4006] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" host="localhost" Sep 6 09:56:30.566511 containerd[1557]: 2025-09-06 09:56:30.536 [INFO][4006] ipam/ipam_plugin.go 374: Released 
host-wide IPAM lock. Sep 6 09:56:30.566511 containerd[1557]: 2025-09-06 09:56:30.536 [INFO][4006] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" HandleID="k8s-pod-network.d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" Workload="localhost-k8s-whisker--7b556f6fdf--thnfg-eth0" Sep 6 09:56:30.566653 containerd[1557]: 2025-09-06 09:56:30.539 [INFO][3992] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" Namespace="calico-system" Pod="whisker-7b556f6fdf-thnfg" WorkloadEndpoint="localhost-k8s-whisker--7b556f6fdf--thnfg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--7b556f6fdf--thnfg-eth0", GenerateName:"whisker-7b556f6fdf-", Namespace:"calico-system", SelfLink:"", UID:"9eac0b29-952f-49ac-bc23-766e59bf4fd0", ResourceVersion:"926", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7b556f6fdf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-7b556f6fdf-thnfg", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali313586360c7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:30.566653 containerd[1557]: 2025-09-06 09:56:30.540 [INFO][3992] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" Namespace="calico-system" Pod="whisker-7b556f6fdf-thnfg" WorkloadEndpoint="localhost-k8s-whisker--7b556f6fdf--thnfg-eth0" Sep 6 09:56:30.566740 containerd[1557]: 2025-09-06 09:56:30.540 [INFO][3992] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali313586360c7 ContainerID="d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" Namespace="calico-system" Pod="whisker-7b556f6fdf-thnfg" WorkloadEndpoint="localhost-k8s-whisker--7b556f6fdf--thnfg-eth0" Sep 6 09:56:30.566740 containerd[1557]: 2025-09-06 09:56:30.548 [INFO][3992] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" Namespace="calico-system" Pod="whisker-7b556f6fdf-thnfg" WorkloadEndpoint="localhost-k8s-whisker--7b556f6fdf--thnfg-eth0" Sep 6 09:56:30.566781 containerd[1557]: 2025-09-06 09:56:30.550 [INFO][3992] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" Namespace="calico-system" Pod="whisker-7b556f6fdf-thnfg" WorkloadEndpoint="localhost-k8s-whisker--7b556f6fdf--thnfg-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--7b556f6fdf--thnfg-eth0", GenerateName:"whisker-7b556f6fdf-", Namespace:"calico-system", SelfLink:"", UID:"9eac0b29-952f-49ac-bc23-766e59bf4fd0", ResourceVersion:"926", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7b556f6fdf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd", Pod:"whisker-7b556f6fdf-thnfg", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali313586360c7", MAC:"02:0d:df:4d:fc:8d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:30.566835 containerd[1557]: 2025-09-06 09:56:30.561 [INFO][3992] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" Namespace="calico-system" Pod="whisker-7b556f6fdf-thnfg" WorkloadEndpoint="localhost-k8s-whisker--7b556f6fdf--thnfg-eth0" Sep 6 09:56:30.781850 containerd[1557]: time="2025-09-06T09:56:30.781781054Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0bc79b1078ddcc3a8049f88a650e0b4d2220a7a2b9dbeff8db0f1a3377c2607\" id:\"4b2af3d6e5d64e3ef5d4bb6b9619090f5768ef95d0cd2f977cf04cee2b493101\" pid:4034 exit_status:1 exited_at:{seconds:1757152590 nanos:781429142}" Sep 6 09:56:30.800353 containerd[1557]: time="2025-09-06T09:56:30.800293815Z" level=info msg="connecting to shim d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd" address="unix:///run/containerd/s/8a0ed4aa9959d2f3c6711f61497e3044b37b7f961201488622b5f1409e7807d0" namespace=k8s.io protocol=ttrpc version=3 Sep 6 09:56:30.827673 systemd[1]: Started cri-containerd-d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd.scope - libcontainer container d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd. 
Sep 6 09:56:30.840302 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 6 09:56:30.878924 containerd[1557]: time="2025-09-06T09:56:30.878877731Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7b556f6fdf-thnfg,Uid:9eac0b29-952f-49ac-bc23-766e59bf4fd0,Namespace:calico-system,Attempt:0,} returns sandbox id \"d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd\"" Sep 6 09:56:30.880954 containerd[1557]: time="2025-09-06T09:56:30.880899702Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 6 09:56:31.374289 systemd-networkd[1450]: vxlan.calico: Link UP Sep 6 09:56:31.374297 systemd-networkd[1450]: vxlan.calico: Gained carrier Sep 6 09:56:31.782037 containerd[1557]: time="2025-09-06T09:56:31.781855530Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0bc79b1078ddcc3a8049f88a650e0b4d2220a7a2b9dbeff8db0f1a3377c2607\" id:\"1d09808a7adbdabcfc6d76d9a99bbb1dd6bb829f4d3730b21ee94398712839ed\" pid:4308 exit_status:1 exited_at:{seconds:1757152591 nanos:781179610}" Sep 6 09:56:32.321798 containerd[1557]: time="2025-09-06T09:56:32.321719138Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:32.322448 containerd[1557]: time="2025-09-06T09:56:32.322379539Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 6 09:56:32.323681 containerd[1557]: time="2025-09-06T09:56:32.323625619Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:32.325689 containerd[1557]: time="2025-09-06T09:56:32.325658669Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:32.326244 containerd[1557]: time="2025-09-06T09:56:32.326215285Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.445276179s" Sep 6 09:56:32.326244 containerd[1557]: time="2025-09-06T09:56:32.326244430Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 6 09:56:32.331132 containerd[1557]: time="2025-09-06T09:56:32.331101015Z" level=info msg="CreateContainer within sandbox \"d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 6 09:56:32.339942 containerd[1557]: time="2025-09-06T09:56:32.339888184Z" level=info msg="Container 9a3ee5967c448f656e376a292f9a230fc7744be88ee546f5af62eb0bff816996: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:56:32.349276 containerd[1557]: time="2025-09-06T09:56:32.349217372Z" level=info msg="CreateContainer within sandbox \"d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id 
\"9a3ee5967c448f656e376a292f9a230fc7744be88ee546f5af62eb0bff816996\"" Sep 6 09:56:32.350423 containerd[1557]: time="2025-09-06T09:56:32.349834661Z" level=info msg="StartContainer for \"9a3ee5967c448f656e376a292f9a230fc7744be88ee546f5af62eb0bff816996\"" Sep 6 09:56:32.350973 containerd[1557]: time="2025-09-06T09:56:32.350950228Z" level=info msg="connecting to shim 9a3ee5967c448f656e376a292f9a230fc7744be88ee546f5af62eb0bff816996" address="unix:///run/containerd/s/8a0ed4aa9959d2f3c6711f61497e3044b37b7f961201488622b5f1409e7807d0" protocol=ttrpc version=3 Sep 6 09:56:32.385646 systemd[1]: Started cri-containerd-9a3ee5967c448f656e376a292f9a230fc7744be88ee546f5af62eb0bff816996.scope - libcontainer container 9a3ee5967c448f656e376a292f9a230fc7744be88ee546f5af62eb0bff816996. Sep 6 09:56:32.436434 containerd[1557]: time="2025-09-06T09:56:32.436363443Z" level=info msg="StartContainer for \"9a3ee5967c448f656e376a292f9a230fc7744be88ee546f5af62eb0bff816996\" returns successfully" Sep 6 09:56:32.440184 containerd[1557]: time="2025-09-06T09:56:32.440131522Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 6 09:56:32.494579 systemd-networkd[1450]: cali313586360c7: Gained IPv6LL Sep 6 09:56:33.070616 systemd-networkd[1450]: vxlan.calico: Gained IPv6LL Sep 6 09:56:33.554490 containerd[1557]: time="2025-09-06T09:56:33.554438555Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-l9289,Uid:150001d0-9071-4406-80e3-3b959c3604cc,Namespace:kube-system,Attempt:0,}" Sep 6 09:56:33.554975 containerd[1557]: time="2025-09-06T09:56:33.554536659Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7458d6cd87-b7tqr,Uid:470610e9-9f1a-4a88-9c59-2bfe2488c508,Namespace:calico-apiserver,Attempt:0,}" Sep 6 09:56:33.689349 systemd-networkd[1450]: calife683daa0dd: Link UP Sep 6 09:56:33.690268 systemd-networkd[1450]: calife683daa0dd: Gained carrier Sep 6 09:56:33.707183 containerd[1557]: 2025-09-06 09:56:33.595 [INFO][4365] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--l9289-eth0 coredns-674b8bbfcf- kube-system 150001d0-9071-4406-80e3-3b959c3604cc 859 0 2025-09-06 09:56:00 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-l9289 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calife683daa0dd [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" Namespace="kube-system" Pod="coredns-674b8bbfcf-l9289" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--l9289-" Sep 6 09:56:33.707183 containerd[1557]: 2025-09-06 09:56:33.596 [INFO][4365] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" Namespace="kube-system" Pod="coredns-674b8bbfcf-l9289" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--l9289-eth0" Sep 6 09:56:33.707183 containerd[1557]: 2025-09-06 09:56:33.627 [INFO][4392] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" HandleID="k8s-pod-network.1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" Workload="localhost-k8s-coredns--674b8bbfcf--l9289-eth0" Sep 6 09:56:33.707465 
containerd[1557]: 2025-09-06 09:56:33.627 [INFO][4392] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" HandleID="k8s-pod-network.1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" Workload="localhost-k8s-coredns--674b8bbfcf--l9289-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00019f3f0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-l9289", "timestamp":"2025-09-06 09:56:33.627787686 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 09:56:33.707465 containerd[1557]: 2025-09-06 09:56:33.628 [INFO][4392] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 09:56:33.707465 containerd[1557]: 2025-09-06 09:56:33.628 [INFO][4392] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 09:56:33.707465 containerd[1557]: 2025-09-06 09:56:33.628 [INFO][4392] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 6 09:56:33.707465 containerd[1557]: 2025-09-06 09:56:33.636 [INFO][4392] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" host="localhost" Sep 6 09:56:33.707465 containerd[1557]: 2025-09-06 09:56:33.641 [INFO][4392] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 6 09:56:33.707465 containerd[1557]: 2025-09-06 09:56:33.644 [INFO][4392] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 6 09:56:33.707465 containerd[1557]: 2025-09-06 09:56:33.646 [INFO][4392] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:33.707465 containerd[1557]: 2025-09-06 09:56:33.648 [INFO][4392] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:33.707465 containerd[1557]: 2025-09-06 09:56:33.648 [INFO][4392] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" host="localhost" Sep 6 09:56:33.708586 containerd[1557]: 2025-09-06 09:56:33.649 [INFO][4392] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572 Sep 6 09:56:33.708586 containerd[1557]: 2025-09-06 09:56:33.653 [INFO][4392] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" host="localhost" Sep 6 09:56:33.708586 containerd[1557]: 2025-09-06 09:56:33.661 [INFO][4392] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" host="localhost" Sep 6 09:56:33.708586 containerd[1557]: 2025-09-06 09:56:33.661 [INFO][4392] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" host="localhost" Sep 6 09:56:33.708586 containerd[1557]: 2025-09-06 09:56:33.661 [INFO][4392] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 6 09:56:33.708586 containerd[1557]: 2025-09-06 09:56:33.661 [INFO][4392] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" HandleID="k8s-pod-network.1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" Workload="localhost-k8s-coredns--674b8bbfcf--l9289-eth0" Sep 6 09:56:33.708722 containerd[1557]: 2025-09-06 09:56:33.675 [INFO][4365] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" Namespace="kube-system" Pod="coredns-674b8bbfcf-l9289" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--l9289-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--l9289-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"150001d0-9071-4406-80e3-3b959c3604cc", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-l9289", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calife683daa0dd", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:33.708800 containerd[1557]: 2025-09-06 09:56:33.676 [INFO][4365] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" Namespace="kube-system" Pod="coredns-674b8bbfcf-l9289" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--l9289-eth0" Sep 6 09:56:33.708800 containerd[1557]: 2025-09-06 09:56:33.677 [INFO][4365] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calife683daa0dd ContainerID="1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" Namespace="kube-system" Pod="coredns-674b8bbfcf-l9289" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--l9289-eth0" Sep 6 09:56:33.708800 containerd[1557]: 2025-09-06 09:56:33.690 [INFO][4365] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" Namespace="kube-system" Pod="coredns-674b8bbfcf-l9289" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--l9289-eth0" Sep 6 09:56:33.709638 
containerd[1557]: 2025-09-06 09:56:33.690 [INFO][4365] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" Namespace="kube-system" Pod="coredns-674b8bbfcf-l9289" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--l9289-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--l9289-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"150001d0-9071-4406-80e3-3b959c3604cc", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572", Pod:"coredns-674b8bbfcf-l9289", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calife683daa0dd", MAC:"ae:70:56:94:6b:00", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:33.709638 containerd[1557]: 2025-09-06 09:56:33.703 [INFO][4365] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" Namespace="kube-system" Pod="coredns-674b8bbfcf-l9289" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--l9289-eth0" Sep 6 09:56:33.763334 containerd[1557]: time="2025-09-06T09:56:33.762704007Z" level=info msg="connecting to shim 1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572" address="unix:///run/containerd/s/27c613876dd3c47d637c610e37c8c7ff51c77a49fab2cad4a1fe6fae8b1d12c1" namespace=k8s.io protocol=ttrpc version=3 Sep 6 09:56:33.771840 systemd[1]: Started sshd@7-10.0.0.36:22-10.0.0.1:37046.service - OpenSSH per-connection server daemon (10.0.0.1:37046). 
Sep 6 09:56:33.787935 systemd-networkd[1450]: cali2d99a414c10: Link UP Sep 6 09:56:33.789505 systemd-networkd[1450]: cali2d99a414c10: Gained carrier Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.599 [INFO][4376] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7458d6cd87--b7tqr-eth0 calico-apiserver-7458d6cd87- calico-apiserver 470610e9-9f1a-4a88-9c59-2bfe2488c508 860 0 2025-09-06 09:56:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7458d6cd87 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7458d6cd87-b7tqr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2d99a414c10 [] [] }} ContainerID="1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-b7tqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--b7tqr-" Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.599 [INFO][4376] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-b7tqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--b7tqr-eth0" Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.631 [INFO][4395] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" HandleID="k8s-pod-network.1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" Workload="localhost-k8s-calico--apiserver--7458d6cd87--b7tqr-eth0" Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.631 [INFO][4395] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" HandleID="k8s-pod-network.1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" Workload="localhost-k8s-calico--apiserver--7458d6cd87--b7tqr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002df600), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7458d6cd87-b7tqr", "timestamp":"2025-09-06 09:56:33.631028865 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.631 [INFO][4395] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.661 [INFO][4395] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.662 [INFO][4395] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.740 [INFO][4395] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" host="localhost" Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.747 [INFO][4395] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.759 [INFO][4395] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.761 [INFO][4395] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.764 [INFO][4395] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.764 [INFO][4395] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" host="localhost" Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.765 [INFO][4395] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7 Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.769 [INFO][4395] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" host="localhost" Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.778 [INFO][4395] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" host="localhost" Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.778 [INFO][4395] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" host="localhost" Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.778 [INFO][4395] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 6 09:56:33.808490 containerd[1557]: 2025-09-06 09:56:33.778 [INFO][4395] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" HandleID="k8s-pod-network.1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" Workload="localhost-k8s-calico--apiserver--7458d6cd87--b7tqr-eth0" Sep 6 09:56:33.809028 containerd[1557]: 2025-09-06 09:56:33.782 [INFO][4376] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-b7tqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--b7tqr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7458d6cd87--b7tqr-eth0", GenerateName:"calico-apiserver-7458d6cd87-", Namespace:"calico-apiserver", SelfLink:"", UID:"470610e9-9f1a-4a88-9c59-2bfe2488c508", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7458d6cd87", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7458d6cd87-b7tqr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2d99a414c10", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:33.809028 containerd[1557]: 2025-09-06 09:56:33.782 [INFO][4376] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-b7tqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--b7tqr-eth0" Sep 6 09:56:33.809028 containerd[1557]: 2025-09-06 09:56:33.782 [INFO][4376] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2d99a414c10 ContainerID="1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-b7tqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--b7tqr-eth0" Sep 6 09:56:33.809028 containerd[1557]: 2025-09-06 09:56:33.789 [INFO][4376] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-b7tqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--b7tqr-eth0" Sep 6 09:56:33.809028 containerd[1557]: 2025-09-06 09:56:33.789 [INFO][4376] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-b7tqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--b7tqr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7458d6cd87--b7tqr-eth0", GenerateName:"calico-apiserver-7458d6cd87-", Namespace:"calico-apiserver", SelfLink:"", UID:"470610e9-9f1a-4a88-9c59-2bfe2488c508", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7458d6cd87", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7", Pod:"calico-apiserver-7458d6cd87-b7tqr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2d99a414c10", MAC:"ba:af:d0:9e:8b:72", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:33.809028 containerd[1557]: 2025-09-06 09:56:33.802 [INFO][4376] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-b7tqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--b7tqr-eth0" Sep 6 09:56:33.824619 systemd[1]: Started cri-containerd-1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572.scope - libcontainer container 1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572. Sep 6 09:56:33.843272 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 6 09:56:33.847153 containerd[1557]: time="2025-09-06T09:56:33.847106433Z" level=info msg="connecting to shim 1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7" address="unix:///run/containerd/s/52caac076d4ca83fe1460474cea11e4c49aa87288bb9e2e87749a72623556b98" namespace=k8s.io protocol=ttrpc version=3 Sep 6 09:56:33.867456 sshd[4442]: Accepted publickey for core from 10.0.0.1 port 37046 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:56:33.868786 sshd-session[4442]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:56:33.873603 systemd[1]: Started cri-containerd-1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7.scope - libcontainer container 1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7. Sep 6 09:56:33.883066 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 6 09:56:33.883143 systemd-logind[1541]: New session 8 of user core. 
Sep 6 09:56:33.892575 containerd[1557]: time="2025-09-06T09:56:33.892523051Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-l9289,Uid:150001d0-9071-4406-80e3-3b959c3604cc,Namespace:kube-system,Attempt:0,} returns sandbox id \"1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572\"" Sep 6 09:56:33.895016 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 6 09:56:33.912641 containerd[1557]: time="2025-09-06T09:56:33.912586842Z" level=info msg="CreateContainer within sandbox \"1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 6 09:56:33.927186 containerd[1557]: time="2025-09-06T09:56:33.927095819Z" level=info msg="Container 04b17775b0606758afaed32bf8e84d3fd7279586a8afbc5b7e7a748754f970b5: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:56:33.930348 containerd[1557]: time="2025-09-06T09:56:33.930297944Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7458d6cd87-b7tqr,Uid:470610e9-9f1a-4a88-9c59-2bfe2488c508,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7\"" Sep 6 09:56:33.936540 containerd[1557]: time="2025-09-06T09:56:33.936489035Z" level=info msg="CreateContainer within sandbox \"1ad5c4a528d4a7c196ab0436850177fffca3c1f2a3d233001ca18228991fc572\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"04b17775b0606758afaed32bf8e84d3fd7279586a8afbc5b7e7a748754f970b5\"" Sep 6 09:56:33.937045 containerd[1557]: time="2025-09-06T09:56:33.937004824Z" level=info msg="StartContainer for \"04b17775b0606758afaed32bf8e84d3fd7279586a8afbc5b7e7a748754f970b5\"" Sep 6 09:56:33.938460 containerd[1557]: time="2025-09-06T09:56:33.938056970Z" level=info msg="connecting to shim 04b17775b0606758afaed32bf8e84d3fd7279586a8afbc5b7e7a748754f970b5" address="unix:///run/containerd/s/27c613876dd3c47d637c610e37c8c7ff51c77a49fab2cad4a1fe6fae8b1d12c1" protocol=ttrpc version=3 Sep 6 09:56:33.962563 systemd[1]: Started cri-containerd-04b17775b0606758afaed32bf8e84d3fd7279586a8afbc5b7e7a748754f970b5.scope - libcontainer container 04b17775b0606758afaed32bf8e84d3fd7279586a8afbc5b7e7a748754f970b5. Sep 6 09:56:34.005837 containerd[1557]: time="2025-09-06T09:56:34.005788438Z" level=info msg="StartContainer for \"04b17775b0606758afaed32bf8e84d3fd7279586a8afbc5b7e7a748754f970b5\" returns successfully" Sep 6 09:56:34.052711 sshd[4521]: Connection closed by 10.0.0.1 port 37046 Sep 6 09:56:34.053083 sshd-session[4442]: pam_unix(sshd:session): session closed for user core Sep 6 09:56:34.058659 systemd[1]: sshd@7-10.0.0.36:22-10.0.0.1:37046.service: Deactivated successfully. Sep 6 09:56:34.061189 systemd[1]: session-8.scope: Deactivated successfully. Sep 6 09:56:34.062511 systemd-logind[1541]: Session 8 logged out. Waiting for processes to exit. Sep 6 09:56:34.063998 systemd-logind[1541]: Removed session 8. 
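The coredns-674b8bbfcf-l9289 sandbox that repeatedly failed earlier with "stat /var/lib/calico/nodename: no such file or directory" is created successfully here, once calico-node-bmmts is running. As the error text itself hints, the CNI plugin needs the node name that the calico/node container writes under /var/lib/calico/ before it can set up pod networking; a minimal illustration of that precondition (an assumption-level sketch, not Calico's source):

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

func main() {
	// Until the calico/node container has started and bind-mounted
	// /var/lib/calico/, this file does not exist and CNI ADD calls fail
	// exactly as in the CreatePodSandbox errors earlier in this log.
	data, err := os.ReadFile("/var/lib/calico/nodename")
	if err != nil {
		fmt.Println("calico/node not ready:", err)
		return
	}
	fmt.Println("node name available to the CNI plugin:", strings.TrimSpace(string(data)))
}
```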
Sep 6 09:56:34.555578 containerd[1557]: time="2025-09-06T09:56:34.555465735Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7458d6cd87-dzfxs,Uid:2914104c-6ace-4f41-a69b-2a4aeaf020e8,Namespace:calico-apiserver,Attempt:0,}" Sep 6 09:56:34.676816 systemd-networkd[1450]: cali9b3766527a5: Link UP Sep 6 09:56:34.677495 systemd-networkd[1450]: cali9b3766527a5: Gained carrier Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.606 [INFO][4582] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7458d6cd87--dzfxs-eth0 calico-apiserver-7458d6cd87- calico-apiserver 2914104c-6ace-4f41-a69b-2a4aeaf020e8 858 0 2025-09-06 09:56:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7458d6cd87 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7458d6cd87-dzfxs eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali9b3766527a5 [] [] }} ContainerID="a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-dzfxs" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--dzfxs-" Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.606 [INFO][4582] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-dzfxs" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--dzfxs-eth0" Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.637 [INFO][4598] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" HandleID="k8s-pod-network.a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" Workload="localhost-k8s-calico--apiserver--7458d6cd87--dzfxs-eth0" Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.637 [INFO][4598] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" HandleID="k8s-pod-network.a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" Workload="localhost-k8s-calico--apiserver--7458d6cd87--dzfxs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f7b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7458d6cd87-dzfxs", "timestamp":"2025-09-06 09:56:34.637415215 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.637 [INFO][4598] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.637 [INFO][4598] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.637 [INFO][4598] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.645 [INFO][4598] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" host="localhost" Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.649 [INFO][4598] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.653 [INFO][4598] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.655 [INFO][4598] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.657 [INFO][4598] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.657 [INFO][4598] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" host="localhost" Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.658 [INFO][4598] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.662 [INFO][4598] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" host="localhost" Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.668 [INFO][4598] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" host="localhost" Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.668 [INFO][4598] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" host="localhost" Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.668 [INFO][4598] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
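The IPAM flow above takes the host-wide lock, confirms the affinity block 192.168.88.128/26 for host "localhost", and claims 192.168.88.132 from it. The following sketch reproduces only the address arithmetic behind those lines with Go's standard net/netip package; it is not Calico's IPAM code.

package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// Affinity block the IPAM plugin loads for host "localhost" in the entries above.
	block := netip.MustParsePrefix("192.168.88.128/26")

	// A /26 holds 2^(32-26) = 64 addresses; walk from the first to the last of the block.
	size := 1 << (32 - block.Bits())
	first := block.Masked().Addr()
	last := first
	for i := 0; i < size-1; i++ {
		last = last.Next()
	}

	claimed := netip.MustParseAddr("192.168.88.132") // assigned to calico-apiserver-7458d6cd87-dzfxs above
	fmt.Printf("block %s spans %s-%s (%d addresses); contains %s: %v\n",
		block, first, last, size, claimed, block.Contains(claimed))
}

The later assignments in this section (.133 through .136) come from the same block, which is why each subsequent CNI ADD only logs "Affinity is confirmed and block has been loaded" instead of claiming a new block.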
Sep 6 09:56:34.701366 containerd[1557]: 2025-09-06 09:56:34.668 [INFO][4598] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" HandleID="k8s-pod-network.a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" Workload="localhost-k8s-calico--apiserver--7458d6cd87--dzfxs-eth0" Sep 6 09:56:34.701950 containerd[1557]: 2025-09-06 09:56:34.673 [INFO][4582] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-dzfxs" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--dzfxs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7458d6cd87--dzfxs-eth0", GenerateName:"calico-apiserver-7458d6cd87-", Namespace:"calico-apiserver", SelfLink:"", UID:"2914104c-6ace-4f41-a69b-2a4aeaf020e8", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7458d6cd87", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7458d6cd87-dzfxs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9b3766527a5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:34.701950 containerd[1557]: 2025-09-06 09:56:34.673 [INFO][4582] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-dzfxs" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--dzfxs-eth0" Sep 6 09:56:34.701950 containerd[1557]: 2025-09-06 09:56:34.673 [INFO][4582] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9b3766527a5 ContainerID="a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-dzfxs" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--dzfxs-eth0" Sep 6 09:56:34.701950 containerd[1557]: 2025-09-06 09:56:34.678 [INFO][4582] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-dzfxs" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--dzfxs-eth0" Sep 6 09:56:34.701950 containerd[1557]: 2025-09-06 09:56:34.680 [INFO][4582] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-dzfxs" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--dzfxs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7458d6cd87--dzfxs-eth0", GenerateName:"calico-apiserver-7458d6cd87-", Namespace:"calico-apiserver", SelfLink:"", UID:"2914104c-6ace-4f41-a69b-2a4aeaf020e8", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7458d6cd87", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a", Pod:"calico-apiserver-7458d6cd87-dzfxs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9b3766527a5", MAC:"d6:af:31:47:29:eb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:34.701950 containerd[1557]: 2025-09-06 09:56:34.692 [INFO][4582] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" Namespace="calico-apiserver" Pod="calico-apiserver-7458d6cd87-dzfxs" WorkloadEndpoint="localhost-k8s-calico--apiserver--7458d6cd87--dzfxs-eth0" Sep 6 09:56:34.768425 kubelet[2713]: I0906 09:56:34.767216 2713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-l9289" podStartSLOduration=34.767197851 podStartE2EDuration="34.767197851s" podCreationTimestamp="2025-09-06 09:56:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-06 09:56:34.751781665 +0000 UTC m=+40.294534328" watchObservedRunningTime="2025-09-06 09:56:34.767197851 +0000 UTC m=+40.309950504" Sep 6 09:56:34.785751 containerd[1557]: time="2025-09-06T09:56:34.785418888Z" level=info msg="connecting to shim a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a" address="unix:///run/containerd/s/26fe0236f46d4bc426604e0b99a054ffadb6e961a6a83ba70f52ec2a668b7c12" namespace=k8s.io protocol=ttrpc version=3 Sep 6 09:56:34.821757 systemd[1]: Started cri-containerd-a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a.scope - libcontainer container a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a. 
Sep 6 09:56:34.841020 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 6 09:56:34.863580 systemd-networkd[1450]: cali2d99a414c10: Gained IPv6LL Sep 6 09:56:34.877186 containerd[1557]: time="2025-09-06T09:56:34.877146734Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7458d6cd87-dzfxs,Uid:2914104c-6ace-4f41-a69b-2a4aeaf020e8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a\"" Sep 6 09:56:34.953970 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1011282817.mount: Deactivated successfully. Sep 6 09:56:34.974824 containerd[1557]: time="2025-09-06T09:56:34.974761358Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:34.975597 containerd[1557]: time="2025-09-06T09:56:34.975538649Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 6 09:56:34.976877 containerd[1557]: time="2025-09-06T09:56:34.976845844Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:34.979403 containerd[1557]: time="2025-09-06T09:56:34.979316375Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:34.979876 containerd[1557]: time="2025-09-06T09:56:34.979832103Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.539650577s" Sep 6 09:56:34.979918 containerd[1557]: time="2025-09-06T09:56:34.979878560Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 6 09:56:34.982076 containerd[1557]: time="2025-09-06T09:56:34.982052314Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 6 09:56:34.985578 containerd[1557]: time="2025-09-06T09:56:34.985533873Z" level=info msg="CreateContainer within sandbox \"d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 6 09:56:34.993875 containerd[1557]: time="2025-09-06T09:56:34.993830938Z" level=info msg="Container cb8329129f1e181309f4fdb53848c45dcbf9133dca6c95bba2f805088f12722c: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:56:35.004442 containerd[1557]: time="2025-09-06T09:56:35.004381816Z" level=info msg="CreateContainer within sandbox \"d726887d4d7524f000331f7c0b1fc2d441a82f17428432e1355ca609db33a9dd\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"cb8329129f1e181309f4fdb53848c45dcbf9133dca6c95bba2f805088f12722c\"" Sep 6 09:56:35.004979 containerd[1557]: time="2025-09-06T09:56:35.004927130Z" level=info msg="StartContainer for \"cb8329129f1e181309f4fdb53848c45dcbf9133dca6c95bba2f805088f12722c\"" Sep 6 
09:56:35.006222 containerd[1557]: time="2025-09-06T09:56:35.006196143Z" level=info msg="connecting to shim cb8329129f1e181309f4fdb53848c45dcbf9133dca6c95bba2f805088f12722c" address="unix:///run/containerd/s/8a0ed4aa9959d2f3c6711f61497e3044b37b7f961201488622b5f1409e7807d0" protocol=ttrpc version=3 Sep 6 09:56:35.031614 systemd[1]: Started cri-containerd-cb8329129f1e181309f4fdb53848c45dcbf9133dca6c95bba2f805088f12722c.scope - libcontainer container cb8329129f1e181309f4fdb53848c45dcbf9133dca6c95bba2f805088f12722c. Sep 6 09:56:35.085974 containerd[1557]: time="2025-09-06T09:56:35.085801723Z" level=info msg="StartContainer for \"cb8329129f1e181309f4fdb53848c45dcbf9133dca6c95bba2f805088f12722c\" returns successfully" Sep 6 09:56:35.118655 systemd-networkd[1450]: calife683daa0dd: Gained IPv6LL Sep 6 09:56:36.334667 systemd-networkd[1450]: cali9b3766527a5: Gained IPv6LL Sep 6 09:56:36.554000 containerd[1557]: time="2025-09-06T09:56:36.553944855Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-l8pm4,Uid:63b0ea24-5c9f-4c70-aaef-43e73d0fb19a,Namespace:calico-system,Attempt:0,}" Sep 6 09:56:36.554510 containerd[1557]: time="2025-09-06T09:56:36.553995229Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-s469z,Uid:c0286ad4-01b4-4d4f-9f29-776595a29e27,Namespace:kube-system,Attempt:0,}" Sep 6 09:56:36.683139 systemd-networkd[1450]: cali56e05868a40: Link UP Sep 6 09:56:36.684036 systemd-networkd[1450]: cali56e05868a40: Gained carrier Sep 6 09:56:36.697412 kubelet[2713]: I0906 09:56:36.697335 2713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7b556f6fdf-thnfg" podStartSLOduration=2.597014444 podStartE2EDuration="6.697317631s" podCreationTimestamp="2025-09-06 09:56:30 +0000 UTC" firstStartedPulling="2025-09-06 09:56:30.880660091 +0000 UTC m=+36.423412744" lastFinishedPulling="2025-09-06 09:56:34.980963278 +0000 UTC m=+40.523715931" observedRunningTime="2025-09-06 09:56:35.730071904 +0000 UTC m=+41.272824577" watchObservedRunningTime="2025-09-06 09:56:36.697317631 +0000 UTC m=+42.240070284" Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.596 [INFO][4710] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--s469z-eth0 coredns-674b8bbfcf- kube-system c0286ad4-01b4-4d4f-9f29-776595a29e27 855 0 2025-09-06 09:56:00 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-s469z eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali56e05868a40 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" Namespace="kube-system" Pod="coredns-674b8bbfcf-s469z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--s469z-" Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.596 [INFO][4710] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" Namespace="kube-system" Pod="coredns-674b8bbfcf-s469z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--s469z-eth0" Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.628 [INFO][4742] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" HandleID="k8s-pod-network.bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" Workload="localhost-k8s-coredns--674b8bbfcf--s469z-eth0" Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.628 [INFO][4742] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" HandleID="k8s-pod-network.bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" Workload="localhost-k8s-coredns--674b8bbfcf--s469z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c6fe0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-s469z", "timestamp":"2025-09-06 09:56:36.628082161 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.628 [INFO][4742] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.628 [INFO][4742] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.628 [INFO][4742] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.637 [INFO][4742] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" host="localhost" Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.645 [INFO][4742] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.651 [INFO][4742] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.652 [INFO][4742] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.654 [INFO][4742] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.654 [INFO][4742] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" host="localhost" Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.656 [INFO][4742] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09 Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.661 [INFO][4742] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" host="localhost" Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.667 [INFO][4742] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" host="localhost" Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.668 [INFO][4742] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] 
handle="k8s-pod-network.bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" host="localhost" Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.668 [INFO][4742] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 09:56:36.703414 containerd[1557]: 2025-09-06 09:56:36.668 [INFO][4742] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" HandleID="k8s-pod-network.bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" Workload="localhost-k8s-coredns--674b8bbfcf--s469z-eth0" Sep 6 09:56:36.703948 containerd[1557]: 2025-09-06 09:56:36.674 [INFO][4710] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" Namespace="kube-system" Pod="coredns-674b8bbfcf-s469z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--s469z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--s469z-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"c0286ad4-01b4-4d4f-9f29-776595a29e27", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-s469z", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali56e05868a40", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:36.703948 containerd[1557]: 2025-09-06 09:56:36.674 [INFO][4710] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" Namespace="kube-system" Pod="coredns-674b8bbfcf-s469z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--s469z-eth0" Sep 6 09:56:36.703948 containerd[1557]: 2025-09-06 09:56:36.674 [INFO][4710] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali56e05868a40 ContainerID="bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" Namespace="kube-system" Pod="coredns-674b8bbfcf-s469z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--s469z-eth0" Sep 6 09:56:36.703948 containerd[1557]: 2025-09-06 09:56:36.683 [INFO][4710] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" Namespace="kube-system" Pod="coredns-674b8bbfcf-s469z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--s469z-eth0" Sep 6 09:56:36.703948 containerd[1557]: 2025-09-06 09:56:36.688 [INFO][4710] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" Namespace="kube-system" Pod="coredns-674b8bbfcf-s469z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--s469z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--s469z-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"c0286ad4-01b4-4d4f-9f29-776595a29e27", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09", Pod:"coredns-674b8bbfcf-s469z", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali56e05868a40", MAC:"7a:fd:1d:c6:6d:8e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:36.703948 containerd[1557]: 2025-09-06 09:56:36.698 [INFO][4710] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" Namespace="kube-system" Pod="coredns-674b8bbfcf-s469z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--s469z-eth0" Sep 6 09:56:36.862244 systemd-networkd[1450]: calibeab4a3689b: Link UP Sep 6 09:56:36.863713 systemd-networkd[1450]: calibeab4a3689b: Gained carrier Sep 6 09:56:36.865841 containerd[1557]: time="2025-09-06T09:56:36.864904140Z" level=info msg="connecting to shim bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09" address="unix:///run/containerd/s/7020b20dc9e5fa3af8ae6897c84b2a4b47bb1e4329cedd6c727766bdbc4cca8b" namespace=k8s.io protocol=ttrpc version=3 Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.591 [INFO][4707] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--l8pm4-eth0 csi-node-driver- calico-system 63b0ea24-5c9f-4c70-aaef-43e73d0fb19a 734 0 2025-09-06 09:56:12 +0000 UTC 
map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-l8pm4 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calibeab4a3689b [] [] }} ContainerID="8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" Namespace="calico-system" Pod="csi-node-driver-l8pm4" WorkloadEndpoint="localhost-k8s-csi--node--driver--l8pm4-" Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.592 [INFO][4707] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" Namespace="calico-system" Pod="csi-node-driver-l8pm4" WorkloadEndpoint="localhost-k8s-csi--node--driver--l8pm4-eth0" Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.636 [INFO][4735] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" HandleID="k8s-pod-network.8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" Workload="localhost-k8s-csi--node--driver--l8pm4-eth0" Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.636 [INFO][4735] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" HandleID="k8s-pod-network.8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" Workload="localhost-k8s-csi--node--driver--l8pm4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00035f690), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-l8pm4", "timestamp":"2025-09-06 09:56:36.636702639 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.637 [INFO][4735] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.668 [INFO][4735] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.668 [INFO][4735] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.738 [INFO][4735] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" host="localhost" Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.746 [INFO][4735] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.751 [INFO][4735] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.753 [INFO][4735] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.755 [INFO][4735] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.755 [INFO][4735] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" host="localhost" Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.756 [INFO][4735] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27 Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.842 [INFO][4735] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" host="localhost" Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.849 [INFO][4735] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" host="localhost" Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.849 [INFO][4735] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" host="localhost" Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.849 [INFO][4735] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 6 09:56:36.884724 containerd[1557]: 2025-09-06 09:56:36.849 [INFO][4735] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" HandleID="k8s-pod-network.8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" Workload="localhost-k8s-csi--node--driver--l8pm4-eth0" Sep 6 09:56:36.886237 containerd[1557]: 2025-09-06 09:56:36.853 [INFO][4707] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" Namespace="calico-system" Pod="csi-node-driver-l8pm4" WorkloadEndpoint="localhost-k8s-csi--node--driver--l8pm4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--l8pm4-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"63b0ea24-5c9f-4c70-aaef-43e73d0fb19a", ResourceVersion:"734", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-l8pm4", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calibeab4a3689b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:36.886237 containerd[1557]: 2025-09-06 09:56:36.853 [INFO][4707] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" Namespace="calico-system" Pod="csi-node-driver-l8pm4" WorkloadEndpoint="localhost-k8s-csi--node--driver--l8pm4-eth0" Sep 6 09:56:36.886237 containerd[1557]: 2025-09-06 09:56:36.853 [INFO][4707] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibeab4a3689b ContainerID="8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" Namespace="calico-system" Pod="csi-node-driver-l8pm4" WorkloadEndpoint="localhost-k8s-csi--node--driver--l8pm4-eth0" Sep 6 09:56:36.886237 containerd[1557]: 2025-09-06 09:56:36.864 [INFO][4707] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" Namespace="calico-system" Pod="csi-node-driver-l8pm4" WorkloadEndpoint="localhost-k8s-csi--node--driver--l8pm4-eth0" Sep 6 09:56:36.886237 containerd[1557]: 2025-09-06 09:56:36.864 [INFO][4707] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" Namespace="calico-system" Pod="csi-node-driver-l8pm4" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--l8pm4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--l8pm4-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"63b0ea24-5c9f-4c70-aaef-43e73d0fb19a", ResourceVersion:"734", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27", Pod:"csi-node-driver-l8pm4", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calibeab4a3689b", MAC:"da:2a:62:b6:54:ad", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:36.886237 containerd[1557]: 2025-09-06 09:56:36.878 [INFO][4707] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" Namespace="calico-system" Pod="csi-node-driver-l8pm4" WorkloadEndpoint="localhost-k8s-csi--node--driver--l8pm4-eth0" Sep 6 09:56:36.920550 systemd[1]: Started cri-containerd-bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09.scope - libcontainer container bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09. Sep 6 09:56:36.934901 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 6 09:56:37.021469 containerd[1557]: time="2025-09-06T09:56:37.021385540Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-s469z,Uid:c0286ad4-01b4-4d4f-9f29-776595a29e27,Namespace:kube-system,Attempt:0,} returns sandbox id \"bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09\"" Sep 6 09:56:37.028707 containerd[1557]: time="2025-09-06T09:56:37.028657595Z" level=info msg="CreateContainer within sandbox \"bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 6 09:56:37.064426 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1564572646.mount: Deactivated successfully. 
Sep 6 09:56:37.064850 containerd[1557]: time="2025-09-06T09:56:37.064798809Z" level=info msg="Container 1155b81366d91adf13e0d0e25736943a346e9c5bc266ca717361bc13eddfcd71: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:56:37.066063 containerd[1557]: time="2025-09-06T09:56:37.065616554Z" level=info msg="connecting to shim 8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27" address="unix:///run/containerd/s/4c3a3fb9e6f63eb3ba631a00b5a855a9df83a0f5eec4f4de74958a9876803df2" namespace=k8s.io protocol=ttrpc version=3 Sep 6 09:56:37.072263 containerd[1557]: time="2025-09-06T09:56:37.072228089Z" level=info msg="CreateContainer within sandbox \"bf83d9921c050c3c38152550743fb5207d6def7a8b033165a098c35e89afeb09\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1155b81366d91adf13e0d0e25736943a346e9c5bc266ca717361bc13eddfcd71\"" Sep 6 09:56:37.073595 containerd[1557]: time="2025-09-06T09:56:37.073539071Z" level=info msg="StartContainer for \"1155b81366d91adf13e0d0e25736943a346e9c5bc266ca717361bc13eddfcd71\"" Sep 6 09:56:37.075778 containerd[1557]: time="2025-09-06T09:56:37.075731939Z" level=info msg="connecting to shim 1155b81366d91adf13e0d0e25736943a346e9c5bc266ca717361bc13eddfcd71" address="unix:///run/containerd/s/7020b20dc9e5fa3af8ae6897c84b2a4b47bb1e4329cedd6c727766bdbc4cca8b" protocol=ttrpc version=3 Sep 6 09:56:37.094547 systemd[1]: Started cri-containerd-8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27.scope - libcontainer container 8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27. Sep 6 09:56:37.110564 systemd[1]: Started cri-containerd-1155b81366d91adf13e0d0e25736943a346e9c5bc266ca717361bc13eddfcd71.scope - libcontainer container 1155b81366d91adf13e0d0e25736943a346e9c5bc266ca717361bc13eddfcd71. 
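Each "connecting to shim" entry above carries the shim endpoint as a unix:// URL together with protocol=ttrpc version=3. The sketch below only splits that address with the Go standard library; the ttrpc handshake containerd then performs over the socket is not reproduced here.

package main

import (
	"fmt"
	"net/url"
)

func main() {
	// Address copied from the "connecting to shim" entry for the csi-node-driver sandbox above.
	addr := "unix:///run/containerd/s/4c3a3fb9e6f63eb3ba631a00b5a855a9df83a0f5eec4f4de74958a9876803df2"
	u, err := url.Parse(addr)
	if err != nil {
		panic(err)
	}
	// The scheme says to dial a unix socket; the path is the socket file under
	// /run/containerd/s/ that the per-container shim listens on.
	fmt.Println("scheme:", u.Scheme)
	fmt.Println("socket path:", u.Path)
}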
Sep 6 09:56:37.119875 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 6 09:56:37.157109 containerd[1557]: time="2025-09-06T09:56:37.156974673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-l8pm4,Uid:63b0ea24-5c9f-4c70-aaef-43e73d0fb19a,Namespace:calico-system,Attempt:0,} returns sandbox id \"8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27\"" Sep 6 09:56:37.157344 containerd[1557]: time="2025-09-06T09:56:37.157222288Z" level=info msg="StartContainer for \"1155b81366d91adf13e0d0e25736943a346e9c5bc266ca717361bc13eddfcd71\" returns successfully" Sep 6 09:56:37.554606 containerd[1557]: time="2025-09-06T09:56:37.554302019Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-dcbc46f74-v8wfw,Uid:8375b038-9619-4019-920e-7d98311eee19,Namespace:calico-system,Attempt:0,}" Sep 6 09:56:37.555092 containerd[1557]: time="2025-09-06T09:56:37.554805644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-6pvdz,Uid:51b3ff5b-aa04-4bb2-abbf-9a90272a9848,Namespace:calico-system,Attempt:0,}" Sep 6 09:56:37.650525 containerd[1557]: time="2025-09-06T09:56:37.649857730Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:37.651765 containerd[1557]: time="2025-09-06T09:56:37.651731869Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 6 09:56:37.653067 containerd[1557]: time="2025-09-06T09:56:37.652998958Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:37.655258 containerd[1557]: time="2025-09-06T09:56:37.655222954Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:37.656622 containerd[1557]: time="2025-09-06T09:56:37.656597275Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 2.674516278s" Sep 6 09:56:37.656707 containerd[1557]: time="2025-09-06T09:56:37.656693796Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 6 09:56:37.658414 containerd[1557]: time="2025-09-06T09:56:37.658371066Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 6 09:56:37.664317 containerd[1557]: time="2025-09-06T09:56:37.664265113Z" level=info msg="CreateContainer within sandbox \"1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 6 09:56:37.674915 containerd[1557]: time="2025-09-06T09:56:37.674056008Z" level=info msg="Container 2083017b64b1aa72e25ded0caf8558c333c186628508f383a6f42f89801597e4: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:56:37.684528 containerd[1557]: time="2025-09-06T09:56:37.684460235Z" level=info 
msg="CreateContainer within sandbox \"1882cf149040ddc65eb2503ba46be2dc77ed0166933527c1453f8047368bddf7\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"2083017b64b1aa72e25ded0caf8558c333c186628508f383a6f42f89801597e4\"" Sep 6 09:56:37.685142 containerd[1557]: time="2025-09-06T09:56:37.685099055Z" level=info msg="StartContainer for \"2083017b64b1aa72e25ded0caf8558c333c186628508f383a6f42f89801597e4\"" Sep 6 09:56:37.686160 containerd[1557]: time="2025-09-06T09:56:37.686133377Z" level=info msg="connecting to shim 2083017b64b1aa72e25ded0caf8558c333c186628508f383a6f42f89801597e4" address="unix:///run/containerd/s/52caac076d4ca83fe1460474cea11e4c49aa87288bb9e2e87749a72623556b98" protocol=ttrpc version=3 Sep 6 09:56:37.708409 systemd-networkd[1450]: cali451cd17c984: Link UP Sep 6 09:56:37.709350 systemd-networkd[1450]: cali451cd17c984: Gained carrier Sep 6 09:56:37.710581 systemd[1]: Started cri-containerd-2083017b64b1aa72e25ded0caf8558c333c186628508f383a6f42f89801597e4.scope - libcontainer container 2083017b64b1aa72e25ded0caf8558c333c186628508f383a6f42f89801597e4. Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.625 [INFO][4919] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--6pvdz-eth0 goldmane-54d579b49d- calico-system 51b3ff5b-aa04-4bb2-abbf-9a90272a9848 857 0 2025-09-06 09:56:11 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-6pvdz eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali451cd17c984 [] [] }} ContainerID="c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" Namespace="calico-system" Pod="goldmane-54d579b49d-6pvdz" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--6pvdz-" Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.625 [INFO][4919] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" Namespace="calico-system" Pod="goldmane-54d579b49d-6pvdz" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--6pvdz-eth0" Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.658 [INFO][4943] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" HandleID="k8s-pod-network.c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" Workload="localhost-k8s-goldmane--54d579b49d--6pvdz-eth0" Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.658 [INFO][4943] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" HandleID="k8s-pod-network.c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" Workload="localhost-k8s-goldmane--54d579b49d--6pvdz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c7070), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-6pvdz", "timestamp":"2025-09-06 09:56:37.658182882 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 09:56:37.725991 
containerd[1557]: 2025-09-06 09:56:37.658 [INFO][4943] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.658 [INFO][4943] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.659 [INFO][4943] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.667 [INFO][4943] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" host="localhost" Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.671 [INFO][4943] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.678 [INFO][4943] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.680 [INFO][4943] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.683 [INFO][4943] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.683 [INFO][4943] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" host="localhost" Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.686 [INFO][4943] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1 Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.691 [INFO][4943] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" host="localhost" Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.696 [INFO][4943] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" host="localhost" Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.696 [INFO][4943] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" host="localhost" Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.696 [INFO][4943] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 6 09:56:37.725991 containerd[1557]: 2025-09-06 09:56:37.696 [INFO][4943] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" HandleID="k8s-pod-network.c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" Workload="localhost-k8s-goldmane--54d579b49d--6pvdz-eth0" Sep 6 09:56:37.727285 containerd[1557]: 2025-09-06 09:56:37.704 [INFO][4919] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" Namespace="calico-system" Pod="goldmane-54d579b49d-6pvdz" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--6pvdz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--6pvdz-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"51b3ff5b-aa04-4bb2-abbf-9a90272a9848", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-6pvdz", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali451cd17c984", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:37.727285 containerd[1557]: 2025-09-06 09:56:37.704 [INFO][4919] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" Namespace="calico-system" Pod="goldmane-54d579b49d-6pvdz" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--6pvdz-eth0" Sep 6 09:56:37.727285 containerd[1557]: 2025-09-06 09:56:37.704 [INFO][4919] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali451cd17c984 ContainerID="c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" Namespace="calico-system" Pod="goldmane-54d579b49d-6pvdz" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--6pvdz-eth0" Sep 6 09:56:37.727285 containerd[1557]: 2025-09-06 09:56:37.709 [INFO][4919] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" Namespace="calico-system" Pod="goldmane-54d579b49d-6pvdz" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--6pvdz-eth0" Sep 6 09:56:37.727285 containerd[1557]: 2025-09-06 09:56:37.710 [INFO][4919] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" Namespace="calico-system" Pod="goldmane-54d579b49d-6pvdz" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--6pvdz-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--6pvdz-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"51b3ff5b-aa04-4bb2-abbf-9a90272a9848", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1", Pod:"goldmane-54d579b49d-6pvdz", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali451cd17c984", MAC:"06:ee:3f:99:db:94", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:37.727285 containerd[1557]: 2025-09-06 09:56:37.722 [INFO][4919] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" Namespace="calico-system" Pod="goldmane-54d579b49d-6pvdz" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--6pvdz-eth0" Sep 6 09:56:37.750428 kubelet[2713]: I0906 09:56:37.750303 2713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-s469z" podStartSLOduration=37.75028056 podStartE2EDuration="37.75028056s" podCreationTimestamp="2025-09-06 09:56:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-06 09:56:37.745641208 +0000 UTC m=+43.288393861" watchObservedRunningTime="2025-09-06 09:56:37.75028056 +0000 UTC m=+43.293033213" Sep 6 09:56:37.775165 containerd[1557]: time="2025-09-06T09:56:37.775103201Z" level=info msg="connecting to shim c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1" address="unix:///run/containerd/s/0df0fe8c1935072d2a3c0d2fdd7a88e40ec79b3a66b6fc0cf3c09fbe8a9663fc" namespace=k8s.io protocol=ttrpc version=3 Sep 6 09:56:37.805091 containerd[1557]: time="2025-09-06T09:56:37.804818579Z" level=info msg="StartContainer for \"2083017b64b1aa72e25ded0caf8558c333c186628508f383a6f42f89801597e4\" returns successfully" Sep 6 09:56:37.810544 systemd[1]: Started cri-containerd-c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1.scope - libcontainer container c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1. 
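The kubelet pod_startup_latency_tracker entries in this section (coredns-674b8bbfcf-s469z just above, whisker-7b556f6fdf-thnfg earlier) report two durations. Working from the logged numbers, podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). The Go check below uses the whisker timestamps as they appear in the log; the relationship is inferred from these values, not quoted from kubelet source.

package main

import (
	"fmt"
	"time"
)

// Layout matching the kubelet timestamps above; Go accepts the fractional
// seconds in the input even though the layout omits them.
const layout = "2006-01-02 15:04:05 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values copied from the whisker-7b556f6fdf-thnfg entry earlier in this section.
	created := mustParse("2025-09-06 09:56:30 +0000 UTC")
	pullStart := mustParse("2025-09-06 09:56:30.880660091 +0000 UTC")
	pullEnd := mustParse("2025-09-06 09:56:34.980963278 +0000 UTC")
	running := mustParse("2025-09-06 09:56:36.697317631 +0000 UTC")

	e2e := running.Sub(created)         // 6.697317631s, the logged podStartE2EDuration
	slo := e2e - pullEnd.Sub(pullStart) // 2.597014444s, the logged podStartSLOduration
	fmt.Println("E2E:", e2e, "SLO:", slo)
}

For the two coredns pods the pull timestamps are the zero value (0001-01-01), so their SLO and E2E durations coincide, as the entries show.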
Sep 6 09:56:37.826340 systemd-networkd[1450]: cali1e7ea829acf: Link UP Sep 6 09:56:37.827282 systemd-networkd[1450]: cali1e7ea829acf: Gained carrier Sep 6 09:56:37.837531 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.625 [INFO][4908] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--dcbc46f74--v8wfw-eth0 calico-kube-controllers-dcbc46f74- calico-system 8375b038-9619-4019-920e-7d98311eee19 856 0 2025-09-06 09:56:12 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:dcbc46f74 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-dcbc46f74-v8wfw eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali1e7ea829acf [] [] }} ContainerID="a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" Namespace="calico-system" Pod="calico-kube-controllers-dcbc46f74-v8wfw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--dcbc46f74--v8wfw-" Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.625 [INFO][4908] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" Namespace="calico-system" Pod="calico-kube-controllers-dcbc46f74-v8wfw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--dcbc46f74--v8wfw-eth0" Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.658 [INFO][4941] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" HandleID="k8s-pod-network.a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" Workload="localhost-k8s-calico--kube--controllers--dcbc46f74--v8wfw-eth0" Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.660 [INFO][4941] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" HandleID="k8s-pod-network.a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" Workload="localhost-k8s-calico--kube--controllers--dcbc46f74--v8wfw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003df010), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-dcbc46f74-v8wfw", "timestamp":"2025-09-06 09:56:37.65776082 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.660 [INFO][4941] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.696 [INFO][4941] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.697 [INFO][4941] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.769 [INFO][4941] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" host="localhost" Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.783 [INFO][4941] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.788 [INFO][4941] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.790 [INFO][4941] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.795 [INFO][4941] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.795 [INFO][4941] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" host="localhost" Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.798 [INFO][4941] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.804 [INFO][4941] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" host="localhost" Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.815 [INFO][4941] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" host="localhost" Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.815 [INFO][4941] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" host="localhost" Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.815 [INFO][4941] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 6 09:56:37.850220 containerd[1557]: 2025-09-06 09:56:37.815 [INFO][4941] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" HandleID="k8s-pod-network.a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" Workload="localhost-k8s-calico--kube--controllers--dcbc46f74--v8wfw-eth0" Sep 6 09:56:37.851219 containerd[1557]: 2025-09-06 09:56:37.819 [INFO][4908] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" Namespace="calico-system" Pod="calico-kube-controllers-dcbc46f74-v8wfw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--dcbc46f74--v8wfw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--dcbc46f74--v8wfw-eth0", GenerateName:"calico-kube-controllers-dcbc46f74-", Namespace:"calico-system", SelfLink:"", UID:"8375b038-9619-4019-920e-7d98311eee19", ResourceVersion:"856", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"dcbc46f74", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-dcbc46f74-v8wfw", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1e7ea829acf", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:37.851219 containerd[1557]: 2025-09-06 09:56:37.819 [INFO][4908] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" Namespace="calico-system" Pod="calico-kube-controllers-dcbc46f74-v8wfw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--dcbc46f74--v8wfw-eth0" Sep 6 09:56:37.851219 containerd[1557]: 2025-09-06 09:56:37.819 [INFO][4908] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1e7ea829acf ContainerID="a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" Namespace="calico-system" Pod="calico-kube-controllers-dcbc46f74-v8wfw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--dcbc46f74--v8wfw-eth0" Sep 6 09:56:37.851219 containerd[1557]: 2025-09-06 09:56:37.828 [INFO][4908] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" Namespace="calico-system" Pod="calico-kube-controllers-dcbc46f74-v8wfw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--dcbc46f74--v8wfw-eth0" Sep 6 09:56:37.851219 containerd[1557]: 2025-09-06 09:56:37.828 [INFO][4908] cni-plugin/k8s.go 446: Added Mac, interface name, and 
active container ID to endpoint ContainerID="a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" Namespace="calico-system" Pod="calico-kube-controllers-dcbc46f74-v8wfw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--dcbc46f74--v8wfw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--dcbc46f74--v8wfw-eth0", GenerateName:"calico-kube-controllers-dcbc46f74-", Namespace:"calico-system", SelfLink:"", UID:"8375b038-9619-4019-920e-7d98311eee19", ResourceVersion:"856", Generation:0, CreationTimestamp:time.Date(2025, time.September, 6, 9, 56, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"dcbc46f74", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f", Pod:"calico-kube-controllers-dcbc46f74-v8wfw", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1e7ea829acf", MAC:"7a:e6:a9:b0:58:d5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 09:56:37.851219 containerd[1557]: 2025-09-06 09:56:37.844 [INFO][4908] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" Namespace="calico-system" Pod="calico-kube-controllers-dcbc46f74-v8wfw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--dcbc46f74--v8wfw-eth0" Sep 6 09:56:37.897093 containerd[1557]: time="2025-09-06T09:56:37.897040760Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-6pvdz,Uid:51b3ff5b-aa04-4bb2-abbf-9a90272a9848,Namespace:calico-system,Attempt:0,} returns sandbox id \"c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1\"" Sep 6 09:56:37.897341 containerd[1557]: time="2025-09-06T09:56:37.897295269Z" level=info msg="connecting to shim a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f" address="unix:///run/containerd/s/e17586a5d497872589184c2163bdff78ce62f824d713bd7bd93e86167218f272" namespace=k8s.io protocol=ttrpc version=3 Sep 6 09:56:37.935503 systemd[1]: Started cri-containerd-a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f.scope - libcontainer container a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f. 
Sep 6 09:56:37.959091 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 6 09:56:38.012838 containerd[1557]: time="2025-09-06T09:56:38.012767986Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-dcbc46f74-v8wfw,Uid:8375b038-9619-4019-920e-7d98311eee19,Namespace:calico-system,Attempt:0,} returns sandbox id \"a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f\"" Sep 6 09:56:38.083269 containerd[1557]: time="2025-09-06T09:56:38.083058703Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:38.084224 containerd[1557]: time="2025-09-06T09:56:38.084188995Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 6 09:56:38.085820 containerd[1557]: time="2025-09-06T09:56:38.085784070Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 427.285706ms" Sep 6 09:56:38.085820 containerd[1557]: time="2025-09-06T09:56:38.085816711Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 6 09:56:38.086964 containerd[1557]: time="2025-09-06T09:56:38.086931504Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 6 09:56:38.091299 containerd[1557]: time="2025-09-06T09:56:38.091246316Z" level=info msg="CreateContainer within sandbox \"a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 6 09:56:38.102413 containerd[1557]: time="2025-09-06T09:56:38.101985389Z" level=info msg="Container c0fb75c7379f961e0786c170e52b03b4e3a8778090db23d67c591bf1933fb438: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:56:38.111063 containerd[1557]: time="2025-09-06T09:56:38.111003833Z" level=info msg="CreateContainer within sandbox \"a527e3f61a7eae67240aed4063a8635a14386ac7e10db714aceda7133f32b27a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c0fb75c7379f961e0786c170e52b03b4e3a8778090db23d67c591bf1933fb438\"" Sep 6 09:56:38.111665 containerd[1557]: time="2025-09-06T09:56:38.111605482Z" level=info msg="StartContainer for \"c0fb75c7379f961e0786c170e52b03b4e3a8778090db23d67c591bf1933fb438\"" Sep 6 09:56:38.113049 containerd[1557]: time="2025-09-06T09:56:38.113022082Z" level=info msg="connecting to shim c0fb75c7379f961e0786c170e52b03b4e3a8778090db23d67c591bf1933fb438" address="unix:///run/containerd/s/26fe0236f46d4bc426604e0b99a054ffadb6e961a6a83ba70f52ec2a668b7c12" protocol=ttrpc version=3 Sep 6 09:56:38.138539 systemd[1]: Started cri-containerd-c0fb75c7379f961e0786c170e52b03b4e3a8778090db23d67c591bf1933fb438.scope - libcontainer container c0fb75c7379f961e0786c170e52b03b4e3a8778090db23d67c591bf1933fb438. 
Sep 6 09:56:38.191611 systemd-networkd[1450]: calibeab4a3689b: Gained IPv6LL Sep 6 09:56:38.203052 containerd[1557]: time="2025-09-06T09:56:38.203015272Z" level=info msg="StartContainer for \"c0fb75c7379f961e0786c170e52b03b4e3a8778090db23d67c591bf1933fb438\" returns successfully" Sep 6 09:56:38.702978 systemd-networkd[1450]: cali56e05868a40: Gained IPv6LL Sep 6 09:56:38.762984 kubelet[2713]: I0906 09:56:38.762863 2713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7458d6cd87-b7tqr" podStartSLOduration=26.036678011 podStartE2EDuration="29.762846124s" podCreationTimestamp="2025-09-06 09:56:09 +0000 UTC" firstStartedPulling="2025-09-06 09:56:33.932014599 +0000 UTC m=+39.474767252" lastFinishedPulling="2025-09-06 09:56:37.658182712 +0000 UTC m=+43.200935365" observedRunningTime="2025-09-06 09:56:38.760733327 +0000 UTC m=+44.303485981" watchObservedRunningTime="2025-09-06 09:56:38.762846124 +0000 UTC m=+44.305598777" Sep 6 09:56:38.788657 kubelet[2713]: I0906 09:56:38.788537 2713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7458d6cd87-dzfxs" podStartSLOduration=26.581037073 podStartE2EDuration="29.788500413s" podCreationTimestamp="2025-09-06 09:56:09 +0000 UTC" firstStartedPulling="2025-09-06 09:56:34.879326198 +0000 UTC m=+40.422078861" lastFinishedPulling="2025-09-06 09:56:38.086789548 +0000 UTC m=+43.629542201" observedRunningTime="2025-09-06 09:56:38.785903426 +0000 UTC m=+44.328656079" watchObservedRunningTime="2025-09-06 09:56:38.788500413 +0000 UTC m=+44.331253066" Sep 6 09:56:39.067756 systemd[1]: Started sshd@8-10.0.0.36:22-10.0.0.1:37062.service - OpenSSH per-connection server daemon (10.0.0.1:37062). Sep 6 09:56:39.086549 systemd-networkd[1450]: cali451cd17c984: Gained IPv6LL Sep 6 09:56:39.137062 sshd[5151]: Accepted publickey for core from 10.0.0.1 port 37062 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:56:39.139044 sshd-session[5151]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:56:39.145932 systemd-logind[1541]: New session 9 of user core. Sep 6 09:56:39.152538 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 6 09:56:39.307815 sshd[5154]: Connection closed by 10.0.0.1 port 37062 Sep 6 09:56:39.308193 sshd-session[5151]: pam_unix(sshd:session): session closed for user core Sep 6 09:56:39.313182 systemd[1]: sshd@8-10.0.0.36:22-10.0.0.1:37062.service: Deactivated successfully. Sep 6 09:56:39.318110 systemd[1]: session-9.scope: Deactivated successfully. Sep 6 09:56:39.322319 systemd-logind[1541]: Session 9 logged out. Waiting for processes to exit. Sep 6 09:56:39.326245 systemd-logind[1541]: Removed session 9. 
Sep 6 09:56:39.470584 systemd-networkd[1450]: cali1e7ea829acf: Gained IPv6LL Sep 6 09:56:39.744055 kubelet[2713]: I0906 09:56:39.744013 2713 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 6 09:56:40.494786 containerd[1557]: time="2025-09-06T09:56:40.494708973Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:40.495559 containerd[1557]: time="2025-09-06T09:56:40.495515518Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 6 09:56:40.496940 containerd[1557]: time="2025-09-06T09:56:40.496864190Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:40.498824 containerd[1557]: time="2025-09-06T09:56:40.498785817Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:40.499309 containerd[1557]: time="2025-09-06T09:56:40.499272530Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 2.412303746s" Sep 6 09:56:40.499309 containerd[1557]: time="2025-09-06T09:56:40.499304330Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 6 09:56:40.500273 containerd[1557]: time="2025-09-06T09:56:40.500244656Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 6 09:56:40.504214 containerd[1557]: time="2025-09-06T09:56:40.504165236Z" level=info msg="CreateContainer within sandbox \"8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 6 09:56:40.515675 containerd[1557]: time="2025-09-06T09:56:40.515611764Z" level=info msg="Container 11b86a187b5de09296c945daae1ea629f1c4832a9c7e1ca3b64103505da55f30: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:56:40.535938 containerd[1557]: time="2025-09-06T09:56:40.535879522Z" level=info msg="CreateContainer within sandbox \"8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"11b86a187b5de09296c945daae1ea629f1c4832a9c7e1ca3b64103505da55f30\"" Sep 6 09:56:40.536783 containerd[1557]: time="2025-09-06T09:56:40.536698019Z" level=info msg="StartContainer for \"11b86a187b5de09296c945daae1ea629f1c4832a9c7e1ca3b64103505da55f30\"" Sep 6 09:56:40.539430 containerd[1557]: time="2025-09-06T09:56:40.538483160Z" level=info msg="connecting to shim 11b86a187b5de09296c945daae1ea629f1c4832a9c7e1ca3b64103505da55f30" address="unix:///run/containerd/s/4c3a3fb9e6f63eb3ba631a00b5a855a9df83a0f5eec4f4de74958a9876803df2" protocol=ttrpc version=3 Sep 6 09:56:40.570536 systemd[1]: Started cri-containerd-11b86a187b5de09296c945daae1ea629f1c4832a9c7e1ca3b64103505da55f30.scope - libcontainer container 11b86a187b5de09296c945daae1ea629f1c4832a9c7e1ca3b64103505da55f30. 
Sep 6 09:56:40.619178 containerd[1557]: time="2025-09-06T09:56:40.618575625Z" level=info msg="StartContainer for \"11b86a187b5de09296c945daae1ea629f1c4832a9c7e1ca3b64103505da55f30\" returns successfully" Sep 6 09:56:42.346176 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2332381073.mount: Deactivated successfully. Sep 6 09:56:42.746509 containerd[1557]: time="2025-09-06T09:56:42.746450714Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:42.747450 containerd[1557]: time="2025-09-06T09:56:42.747380930Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 6 09:56:42.748771 containerd[1557]: time="2025-09-06T09:56:42.748720575Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:42.750722 containerd[1557]: time="2025-09-06T09:56:42.750685994Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:42.751336 containerd[1557]: time="2025-09-06T09:56:42.751288585Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 2.25101279s" Sep 6 09:56:42.751378 containerd[1557]: time="2025-09-06T09:56:42.751335994Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 6 09:56:42.753103 containerd[1557]: time="2025-09-06T09:56:42.753064528Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 6 09:56:42.761671 containerd[1557]: time="2025-09-06T09:56:42.761634244Z" level=info msg="CreateContainer within sandbox \"c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 6 09:56:42.770623 containerd[1557]: time="2025-09-06T09:56:42.770585295Z" level=info msg="Container d01300f3a388e17ae2445989de7682b789c8f574897dfd8a9b6645e35fa7f98d: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:56:42.784543 containerd[1557]: time="2025-09-06T09:56:42.784497748Z" level=info msg="CreateContainer within sandbox \"c9e67935b535c2539b2e4727935411d446ea6b71b0e21621e01bdcbcaf864cb1\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"d01300f3a388e17ae2445989de7682b789c8f574897dfd8a9b6645e35fa7f98d\"" Sep 6 09:56:42.785110 containerd[1557]: time="2025-09-06T09:56:42.785083959Z" level=info msg="StartContainer for \"d01300f3a388e17ae2445989de7682b789c8f574897dfd8a9b6645e35fa7f98d\"" Sep 6 09:56:42.786156 containerd[1557]: time="2025-09-06T09:56:42.786113922Z" level=info msg="connecting to shim d01300f3a388e17ae2445989de7682b789c8f574897dfd8a9b6645e35fa7f98d" address="unix:///run/containerd/s/0df0fe8c1935072d2a3c0d2fdd7a88e40ec79b3a66b6fc0cf3c09fbe8a9663fc" protocol=ttrpc version=3 Sep 6 09:56:42.821548 systemd[1]: Started 
cri-containerd-d01300f3a388e17ae2445989de7682b789c8f574897dfd8a9b6645e35fa7f98d.scope - libcontainer container d01300f3a388e17ae2445989de7682b789c8f574897dfd8a9b6645e35fa7f98d. Sep 6 09:56:42.871005 containerd[1557]: time="2025-09-06T09:56:42.870962553Z" level=info msg="StartContainer for \"d01300f3a388e17ae2445989de7682b789c8f574897dfd8a9b6645e35fa7f98d\" returns successfully" Sep 6 09:56:43.770026 kubelet[2713]: I0906 09:56:43.769890 2713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-6pvdz" podStartSLOduration=27.917403527 podStartE2EDuration="32.769871458s" podCreationTimestamp="2025-09-06 09:56:11 +0000 UTC" firstStartedPulling="2025-09-06 09:56:37.900468587 +0000 UTC m=+43.443221240" lastFinishedPulling="2025-09-06 09:56:42.752936518 +0000 UTC m=+48.295689171" observedRunningTime="2025-09-06 09:56:43.768340915 +0000 UTC m=+49.311093568" watchObservedRunningTime="2025-09-06 09:56:43.769871458 +0000 UTC m=+49.312624121" Sep 6 09:56:43.845971 containerd[1557]: time="2025-09-06T09:56:43.845909641Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d01300f3a388e17ae2445989de7682b789c8f574897dfd8a9b6645e35fa7f98d\" id:\"80130f06856293525d8f66c5d6a524abd3f21c8880d9efb8417c5a125422c6d1\" pid:5269 exit_status:1 exited_at:{seconds:1757152603 nanos:845411486}" Sep 6 09:56:44.322741 systemd[1]: Started sshd@9-10.0.0.36:22-10.0.0.1:39108.service - OpenSSH per-connection server daemon (10.0.0.1:39108). Sep 6 09:56:44.401944 sshd[5283]: Accepted publickey for core from 10.0.0.1 port 39108 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:56:44.403832 sshd-session[5283]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:56:44.411311 systemd-logind[1541]: New session 10 of user core. Sep 6 09:56:44.421545 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 6 09:56:44.563948 sshd[5286]: Connection closed by 10.0.0.1 port 39108 Sep 6 09:56:44.565621 sshd-session[5283]: pam_unix(sshd:session): session closed for user core Sep 6 09:56:44.580310 systemd[1]: sshd@9-10.0.0.36:22-10.0.0.1:39108.service: Deactivated successfully. Sep 6 09:56:44.583495 systemd[1]: session-10.scope: Deactivated successfully. Sep 6 09:56:44.584777 systemd-logind[1541]: Session 10 logged out. Waiting for processes to exit. Sep 6 09:56:44.588746 systemd[1]: Started sshd@10-10.0.0.36:22-10.0.0.1:39112.service - OpenSSH per-connection server daemon (10.0.0.1:39112). Sep 6 09:56:44.589714 systemd-logind[1541]: Removed session 10. Sep 6 09:56:44.639606 sshd[5301]: Accepted publickey for core from 10.0.0.1 port 39112 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:56:44.641682 sshd-session[5301]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:56:44.646904 systemd-logind[1541]: New session 11 of user core. Sep 6 09:56:44.659642 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 6 09:56:44.845568 sshd[5304]: Connection closed by 10.0.0.1 port 39112 Sep 6 09:56:44.846523 sshd-session[5301]: pam_unix(sshd:session): session closed for user core Sep 6 09:56:44.859795 systemd[1]: sshd@10-10.0.0.36:22-10.0.0.1:39112.service: Deactivated successfully. Sep 6 09:56:44.864239 systemd[1]: session-11.scope: Deactivated successfully. Sep 6 09:56:44.868088 systemd-logind[1541]: Session 11 logged out. Waiting for processes to exit. Sep 6 09:56:44.871546 systemd-logind[1541]: Removed session 11. 
Sep 6 09:56:44.873096 systemd[1]: Started sshd@11-10.0.0.36:22-10.0.0.1:39114.service - OpenSSH per-connection server daemon (10.0.0.1:39114). Sep 6 09:56:44.895961 containerd[1557]: time="2025-09-06T09:56:44.895905777Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d01300f3a388e17ae2445989de7682b789c8f574897dfd8a9b6645e35fa7f98d\" id:\"fca09d42523d5f58d71429a558b2d17fb47940b8f546144132a12222c025bac8\" pid:5323 exit_status:1 exited_at:{seconds:1757152604 nanos:895575538}" Sep 6 09:56:45.112220 sshd[5338]: Accepted publickey for core from 10.0.0.1 port 39114 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:56:45.114902 sshd-session[5338]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:56:45.120846 systemd-logind[1541]: New session 12 of user core. Sep 6 09:56:45.126593 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 6 09:56:45.278043 sshd[5345]: Connection closed by 10.0.0.1 port 39114 Sep 6 09:56:45.278521 sshd-session[5338]: pam_unix(sshd:session): session closed for user core Sep 6 09:56:45.289573 systemd[1]: sshd@11-10.0.0.36:22-10.0.0.1:39114.service: Deactivated successfully. Sep 6 09:56:45.292911 systemd[1]: session-12.scope: Deactivated successfully. Sep 6 09:56:45.294553 systemd-logind[1541]: Session 12 logged out. Waiting for processes to exit. Sep 6 09:56:45.296913 systemd-logind[1541]: Removed session 12. Sep 6 09:56:46.035264 containerd[1557]: time="2025-09-06T09:56:46.035203054Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:46.036079 containerd[1557]: time="2025-09-06T09:56:46.036025407Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 6 09:56:46.037185 containerd[1557]: time="2025-09-06T09:56:46.037151280Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:46.039490 containerd[1557]: time="2025-09-06T09:56:46.039452128Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:46.040009 containerd[1557]: time="2025-09-06T09:56:46.039965631Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.286873902s" Sep 6 09:56:46.040009 containerd[1557]: time="2025-09-06T09:56:46.040003913Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 6 09:56:46.041974 containerd[1557]: time="2025-09-06T09:56:46.041938764Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 6 09:56:46.054628 containerd[1557]: time="2025-09-06T09:56:46.054593121Z" level=info msg="CreateContainer within sandbox \"a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f\" for container 
&ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 6 09:56:46.071965 containerd[1557]: time="2025-09-06T09:56:46.071912993Z" level=info msg="Container 94a282de7c40fb078d4c63e4fc6ec8e7a2e55894842baa8be582760e670bd2e8: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:56:46.081597 containerd[1557]: time="2025-09-06T09:56:46.081547612Z" level=info msg="CreateContainer within sandbox \"a207fc7df523562f9371ba00c7ab4c2d60c85d8434df5ece9c91dc730b491b6f\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"94a282de7c40fb078d4c63e4fc6ec8e7a2e55894842baa8be582760e670bd2e8\"" Sep 6 09:56:46.082197 containerd[1557]: time="2025-09-06T09:56:46.082170291Z" level=info msg="StartContainer for \"94a282de7c40fb078d4c63e4fc6ec8e7a2e55894842baa8be582760e670bd2e8\"" Sep 6 09:56:46.083287 containerd[1557]: time="2025-09-06T09:56:46.083243594Z" level=info msg="connecting to shim 94a282de7c40fb078d4c63e4fc6ec8e7a2e55894842baa8be582760e670bd2e8" address="unix:///run/containerd/s/e17586a5d497872589184c2163bdff78ce62f824d713bd7bd93e86167218f272" protocol=ttrpc version=3 Sep 6 09:56:46.108562 systemd[1]: Started cri-containerd-94a282de7c40fb078d4c63e4fc6ec8e7a2e55894842baa8be582760e670bd2e8.scope - libcontainer container 94a282de7c40fb078d4c63e4fc6ec8e7a2e55894842baa8be582760e670bd2e8. Sep 6 09:56:46.162637 containerd[1557]: time="2025-09-06T09:56:46.162582628Z" level=info msg="StartContainer for \"94a282de7c40fb078d4c63e4fc6ec8e7a2e55894842baa8be582760e670bd2e8\" returns successfully" Sep 6 09:56:46.774419 kubelet[2713]: I0906 09:56:46.774334 2713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-dcbc46f74-v8wfw" podStartSLOduration=26.747429454 podStartE2EDuration="34.774318558s" podCreationTimestamp="2025-09-06 09:56:12 +0000 UTC" firstStartedPulling="2025-09-06 09:56:38.014254117 +0000 UTC m=+43.557006770" lastFinishedPulling="2025-09-06 09:56:46.041143221 +0000 UTC m=+51.583895874" observedRunningTime="2025-09-06 09:56:46.773373564 +0000 UTC m=+52.316126227" watchObservedRunningTime="2025-09-06 09:56:46.774318558 +0000 UTC m=+52.317071201" Sep 6 09:56:47.730565 containerd[1557]: time="2025-09-06T09:56:47.730495966Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:47.731145 containerd[1557]: time="2025-09-06T09:56:47.731112123Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 6 09:56:47.732450 containerd[1557]: time="2025-09-06T09:56:47.732425127Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:47.734426 containerd[1557]: time="2025-09-06T09:56:47.734367101Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 09:56:47.735212 containerd[1557]: time="2025-09-06T09:56:47.735162804Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.693188613s" Sep 6 09:56:47.735212 containerd[1557]: time="2025-09-06T09:56:47.735208109Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 6 09:56:47.739715 containerd[1557]: time="2025-09-06T09:56:47.739655373Z" level=info msg="CreateContainer within sandbox \"8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 6 09:56:47.749211 containerd[1557]: time="2025-09-06T09:56:47.749171370Z" level=info msg="Container a6fb1e4a1635d34e31cf814bce67cdfd595a17c6f0f529d56ae1941960cde627: CDI devices from CRI Config.CDIDevices: []" Sep 6 09:56:47.759746 containerd[1557]: time="2025-09-06T09:56:47.759712459Z" level=info msg="CreateContainer within sandbox \"8b2ec5ee0833ec576f6e33e4b7a4c504cdeb7b56318011dba0fbf5462f20df27\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"a6fb1e4a1635d34e31cf814bce67cdfd595a17c6f0f529d56ae1941960cde627\"" Sep 6 09:56:47.760262 containerd[1557]: time="2025-09-06T09:56:47.760240220Z" level=info msg="StartContainer for \"a6fb1e4a1635d34e31cf814bce67cdfd595a17c6f0f529d56ae1941960cde627\"" Sep 6 09:56:47.761733 containerd[1557]: time="2025-09-06T09:56:47.761709437Z" level=info msg="connecting to shim a6fb1e4a1635d34e31cf814bce67cdfd595a17c6f0f529d56ae1941960cde627" address="unix:///run/containerd/s/4c3a3fb9e6f63eb3ba631a00b5a855a9df83a0f5eec4f4de74958a9876803df2" protocol=ttrpc version=3 Sep 6 09:56:47.773131 kubelet[2713]: I0906 09:56:47.773097 2713 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 6 09:56:47.791520 systemd[1]: Started cri-containerd-a6fb1e4a1635d34e31cf814bce67cdfd595a17c6f0f529d56ae1941960cde627.scope - libcontainer container a6fb1e4a1635d34e31cf814bce67cdfd595a17c6f0f529d56ae1941960cde627. 
Sep 6 09:56:47.957352 containerd[1557]: time="2025-09-06T09:56:47.957296326Z" level=info msg="StartContainer for \"a6fb1e4a1635d34e31cf814bce67cdfd595a17c6f0f529d56ae1941960cde627\" returns successfully" Sep 6 09:56:48.275586 containerd[1557]: time="2025-09-06T09:56:48.275538265Z" level=info msg="TaskExit event in podsandbox handler container_id:\"94a282de7c40fb078d4c63e4fc6ec8e7a2e55894842baa8be582760e670bd2e8\" id:\"19f708d42ea4c12a2dd39fc4485fb569f31690b8b011e584906d6ca15e04ff28\" pid:5455 exited_at:{seconds:1757152608 nanos:275233092}" Sep 6 09:56:48.330516 containerd[1557]: time="2025-09-06T09:56:48.330388403Z" level=info msg="TaskExit event in podsandbox handler container_id:\"94a282de7c40fb078d4c63e4fc6ec8e7a2e55894842baa8be582760e670bd2e8\" id:\"28d8379d6f0ffb4333abcd019c7341ac26fd14c0379a63746ff81cc8db233095\" pid:5480 exited_at:{seconds:1757152608 nanos:330062873}" Sep 6 09:56:48.609230 kubelet[2713]: I0906 09:56:48.609107 2713 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 6 09:56:48.610460 kubelet[2713]: I0906 09:56:48.610442 2713 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 6 09:56:48.794427 kubelet[2713]: I0906 09:56:48.794229 2713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-l8pm4" podStartSLOduration=26.21816917 podStartE2EDuration="36.794212092s" podCreationTimestamp="2025-09-06 09:56:12 +0000 UTC" firstStartedPulling="2025-09-06 09:56:37.159747029 +0000 UTC m=+42.702499682" lastFinishedPulling="2025-09-06 09:56:47.735789951 +0000 UTC m=+53.278542604" observedRunningTime="2025-09-06 09:56:48.790543819 +0000 UTC m=+54.333296472" watchObservedRunningTime="2025-09-06 09:56:48.794212092 +0000 UTC m=+54.336964745" Sep 6 09:56:50.290546 systemd[1]: Started sshd@12-10.0.0.36:22-10.0.0.1:38280.service - OpenSSH per-connection server daemon (10.0.0.1:38280). Sep 6 09:56:50.370632 sshd[5491]: Accepted publickey for core from 10.0.0.1 port 38280 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:56:50.372540 sshd-session[5491]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:56:50.377703 systemd-logind[1541]: New session 13 of user core. Sep 6 09:56:50.388542 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 6 09:56:50.544350 sshd[5494]: Connection closed by 10.0.0.1 port 38280 Sep 6 09:56:50.545685 sshd-session[5491]: pam_unix(sshd:session): session closed for user core Sep 6 09:56:50.551626 systemd[1]: sshd@12-10.0.0.36:22-10.0.0.1:38280.service: Deactivated successfully. Sep 6 09:56:50.553791 systemd[1]: session-13.scope: Deactivated successfully. Sep 6 09:56:50.554645 systemd-logind[1541]: Session 13 logged out. Waiting for processes to exit. Sep 6 09:56:50.556232 systemd-logind[1541]: Removed session 13. Sep 6 09:56:55.559790 systemd[1]: Started sshd@13-10.0.0.36:22-10.0.0.1:38286.service - OpenSSH per-connection server daemon (10.0.0.1:38286). Sep 6 09:56:55.618263 sshd[5519]: Accepted publickey for core from 10.0.0.1 port 38286 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:56:55.619892 sshd-session[5519]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:56:55.624227 systemd-logind[1541]: New session 14 of user core. 
Sep 6 09:56:55.637529 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 6 09:56:55.754464 sshd[5522]: Connection closed by 10.0.0.1 port 38286 Sep 6 09:56:55.754940 sshd-session[5519]: pam_unix(sshd:session): session closed for user core Sep 6 09:56:55.759628 systemd[1]: sshd@13-10.0.0.36:22-10.0.0.1:38286.service: Deactivated successfully. Sep 6 09:56:55.762355 systemd[1]: session-14.scope: Deactivated successfully. Sep 6 09:56:55.763257 systemd-logind[1541]: Session 14 logged out. Waiting for processes to exit. Sep 6 09:56:55.764910 systemd-logind[1541]: Removed session 14. Sep 6 09:57:00.767678 systemd[1]: Started sshd@14-10.0.0.36:22-10.0.0.1:59706.service - OpenSSH per-connection server daemon (10.0.0.1:59706). Sep 6 09:57:00.825531 sshd[5538]: Accepted publickey for core from 10.0.0.1 port 59706 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:57:00.827065 sshd-session[5538]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:57:00.831786 systemd-logind[1541]: New session 15 of user core. Sep 6 09:57:00.842544 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 6 09:57:00.961135 sshd[5541]: Connection closed by 10.0.0.1 port 59706 Sep 6 09:57:00.961663 sshd-session[5538]: pam_unix(sshd:session): session closed for user core Sep 6 09:57:00.974049 systemd[1]: sshd@14-10.0.0.36:22-10.0.0.1:59706.service: Deactivated successfully. Sep 6 09:57:00.976782 systemd[1]: session-15.scope: Deactivated successfully. Sep 6 09:57:00.977951 systemd-logind[1541]: Session 15 logged out. Waiting for processes to exit. Sep 6 09:57:00.981158 systemd-logind[1541]: Removed session 15. Sep 6 09:57:00.982776 systemd[1]: Started sshd@15-10.0.0.36:22-10.0.0.1:59722.service - OpenSSH per-connection server daemon (10.0.0.1:59722). Sep 6 09:57:01.042387 sshd[5554]: Accepted publickey for core from 10.0.0.1 port 59722 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:57:01.044053 sshd-session[5554]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:57:01.048853 systemd-logind[1541]: New session 16 of user core. Sep 6 09:57:01.056564 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 6 09:57:01.429916 sshd[5557]: Connection closed by 10.0.0.1 port 59722 Sep 6 09:57:01.430358 sshd-session[5554]: pam_unix(sshd:session): session closed for user core Sep 6 09:57:01.441152 systemd[1]: sshd@15-10.0.0.36:22-10.0.0.1:59722.service: Deactivated successfully. Sep 6 09:57:01.443341 systemd[1]: session-16.scope: Deactivated successfully. Sep 6 09:57:01.444322 systemd-logind[1541]: Session 16 logged out. Waiting for processes to exit. Sep 6 09:57:01.446941 systemd[1]: Started sshd@16-10.0.0.36:22-10.0.0.1:59730.service - OpenSSH per-connection server daemon (10.0.0.1:59730). Sep 6 09:57:01.447632 systemd-logind[1541]: Removed session 16. Sep 6 09:57:01.514904 sshd[5569]: Accepted publickey for core from 10.0.0.1 port 59730 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:57:01.516172 sshd-session[5569]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:57:01.520607 systemd-logind[1541]: New session 17 of user core. Sep 6 09:57:01.530517 systemd[1]: Started session-17.scope - Session 17 of User core. 
Sep 6 09:57:01.795725 containerd[1557]: time="2025-09-06T09:57:01.795672747Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0bc79b1078ddcc3a8049f88a650e0b4d2220a7a2b9dbeff8db0f1a3377c2607\" id:\"5354d071defc9e4ade5bcba5bb4c160f3abc5a8ac278a6723266db2f791b1110\" pid:5593 exited_at:{seconds:1757152621 nanos:795215626}" Sep 6 09:57:02.414947 sshd[5572]: Connection closed by 10.0.0.1 port 59730 Sep 6 09:57:02.415434 sshd-session[5569]: pam_unix(sshd:session): session closed for user core Sep 6 09:57:02.427011 systemd[1]: sshd@16-10.0.0.36:22-10.0.0.1:59730.service: Deactivated successfully. Sep 6 09:57:02.430600 systemd[1]: session-17.scope: Deactivated successfully. Sep 6 09:57:02.431759 systemd-logind[1541]: Session 17 logged out. Waiting for processes to exit. Sep 6 09:57:02.439834 systemd[1]: Started sshd@17-10.0.0.36:22-10.0.0.1:59746.service - OpenSSH per-connection server daemon (10.0.0.1:59746). Sep 6 09:57:02.440987 systemd-logind[1541]: Removed session 17. Sep 6 09:57:02.492711 sshd[5617]: Accepted publickey for core from 10.0.0.1 port 59746 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:57:02.494255 sshd-session[5617]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:57:02.498974 systemd-logind[1541]: New session 18 of user core. Sep 6 09:57:02.507648 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 6 09:57:02.786909 sshd[5620]: Connection closed by 10.0.0.1 port 59746 Sep 6 09:57:02.787378 sshd-session[5617]: pam_unix(sshd:session): session closed for user core Sep 6 09:57:02.799374 systemd[1]: sshd@17-10.0.0.36:22-10.0.0.1:59746.service: Deactivated successfully. Sep 6 09:57:02.802305 systemd[1]: session-18.scope: Deactivated successfully. Sep 6 09:57:02.803433 systemd-logind[1541]: Session 18 logged out. Waiting for processes to exit. Sep 6 09:57:02.808887 systemd[1]: Started sshd@18-10.0.0.36:22-10.0.0.1:59750.service - OpenSSH per-connection server daemon (10.0.0.1:59750). Sep 6 09:57:02.809621 systemd-logind[1541]: Removed session 18. Sep 6 09:57:02.855803 sshd[5631]: Accepted publickey for core from 10.0.0.1 port 59750 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:57:02.857705 sshd-session[5631]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:57:02.862308 systemd-logind[1541]: New session 19 of user core. Sep 6 09:57:02.870588 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 6 09:57:02.978927 sshd[5634]: Connection closed by 10.0.0.1 port 59750 Sep 6 09:57:02.979239 sshd-session[5631]: pam_unix(sshd:session): session closed for user core Sep 6 09:57:02.983728 systemd[1]: sshd@18-10.0.0.36:22-10.0.0.1:59750.service: Deactivated successfully. Sep 6 09:57:02.986318 systemd[1]: session-19.scope: Deactivated successfully. Sep 6 09:57:02.987208 systemd-logind[1541]: Session 19 logged out. Waiting for processes to exit. Sep 6 09:57:02.988516 systemd-logind[1541]: Removed session 19. Sep 6 09:57:05.397624 kubelet[2713]: I0906 09:57:05.397543 2713 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 6 09:57:07.997324 systemd[1]: Started sshd@19-10.0.0.36:22-10.0.0.1:59764.service - OpenSSH per-connection server daemon (10.0.0.1:59764). 
Sep 6 09:57:08.057869 sshd[5651]: Accepted publickey for core from 10.0.0.1 port 59764 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:57:08.059579 sshd-session[5651]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:57:08.064019 systemd-logind[1541]: New session 20 of user core. Sep 6 09:57:08.071556 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 6 09:57:08.190274 sshd[5654]: Connection closed by 10.0.0.1 port 59764 Sep 6 09:57:08.190642 sshd-session[5651]: pam_unix(sshd:session): session closed for user core Sep 6 09:57:08.195908 systemd[1]: sshd@19-10.0.0.36:22-10.0.0.1:59764.service: Deactivated successfully. Sep 6 09:57:08.198322 systemd[1]: session-20.scope: Deactivated successfully. Sep 6 09:57:08.199061 systemd-logind[1541]: Session 20 logged out. Waiting for processes to exit. Sep 6 09:57:08.200275 systemd-logind[1541]: Removed session 20. Sep 6 09:57:13.206940 systemd[1]: Started sshd@20-10.0.0.36:22-10.0.0.1:46202.service - OpenSSH per-connection server daemon (10.0.0.1:46202). Sep 6 09:57:13.286795 sshd[5675]: Accepted publickey for core from 10.0.0.1 port 46202 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:57:13.288518 sshd-session[5675]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:57:13.294319 systemd-logind[1541]: New session 21 of user core. Sep 6 09:57:13.301679 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 6 09:57:13.505477 sshd[5678]: Connection closed by 10.0.0.1 port 46202 Sep 6 09:57:13.505937 sshd-session[5675]: pam_unix(sshd:session): session closed for user core Sep 6 09:57:13.511514 systemd[1]: sshd@20-10.0.0.36:22-10.0.0.1:46202.service: Deactivated successfully. Sep 6 09:57:13.514367 systemd[1]: session-21.scope: Deactivated successfully. Sep 6 09:57:13.515426 systemd-logind[1541]: Session 21 logged out. Waiting for processes to exit. Sep 6 09:57:13.517409 systemd-logind[1541]: Removed session 21. Sep 6 09:57:14.920115 containerd[1557]: time="2025-09-06T09:57:14.920045612Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d01300f3a388e17ae2445989de7682b789c8f574897dfd8a9b6645e35fa7f98d\" id:\"f6f4306474255e99dcd2ed9048d6f89e4c7946becd87898901ac01fa70f4a155\" pid:5702 exited_at:{seconds:1757152634 nanos:864544739}" Sep 6 09:57:18.340357 containerd[1557]: time="2025-09-06T09:57:18.340301167Z" level=info msg="TaskExit event in podsandbox handler container_id:\"94a282de7c40fb078d4c63e4fc6ec8e7a2e55894842baa8be582760e670bd2e8\" id:\"e49115f8f7cd626ed1766903dd4609a992c06f5a5aaa2187d013087acab27685\" pid:5730 exited_at:{seconds:1757152638 nanos:339782407}" Sep 6 09:57:18.523226 systemd[1]: Started sshd@21-10.0.0.36:22-10.0.0.1:46214.service - OpenSSH per-connection server daemon (10.0.0.1:46214). Sep 6 09:57:18.576274 sshd[5741]: Accepted publickey for core from 10.0.0.1 port 46214 ssh2: RSA SHA256:RP0lRXGXzH090yOxzY7O6pG5YAyAkwYHkc0+TZI0kSM Sep 6 09:57:18.577848 sshd-session[5741]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 6 09:57:18.582927 systemd-logind[1541]: New session 22 of user core. Sep 6 09:57:18.596756 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 6 09:57:18.709727 sshd[5744]: Connection closed by 10.0.0.1 port 46214 Sep 6 09:57:18.710110 sshd-session[5741]: pam_unix(sshd:session): session closed for user core Sep 6 09:57:18.714919 systemd[1]: sshd@21-10.0.0.36:22-10.0.0.1:46214.service: Deactivated successfully. 
Sep 6 09:57:18.717156 systemd[1]: session-22.scope: Deactivated successfully. Sep 6 09:57:18.717928 systemd-logind[1541]: Session 22 logged out. Waiting for processes to exit. Sep 6 09:57:18.719184 systemd-logind[1541]: Removed session 22.