Sep 4 15:40:57.811165 kernel: Linux version 6.12.44-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Sep 4 13:44:59 -00 2025
Sep 4 15:40:57.811189 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=36c924095f449c8931a6685ec70d72df97f8ad57d1c78208ae0ead8cae8f5127
Sep 4 15:40:57.811198 kernel: BIOS-provided physical RAM map:
Sep 4 15:40:57.811205 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 4 15:40:57.811211 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 4 15:40:57.811218 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 4 15:40:57.811225 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000009cfdbfff] usable
Sep 4 15:40:57.811232 kernel: BIOS-e820: [mem 0x000000009cfdc000-0x000000009cffffff] reserved
Sep 4 15:40:57.811256 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Sep 4 15:40:57.811264 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Sep 4 15:40:57.811272 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 4 15:40:57.811278 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 4 15:40:57.811284 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Sep 4 15:40:57.811291 kernel: NX (Execute Disable) protection: active
Sep 4 15:40:57.811301 kernel: APIC: Static calls initialized
Sep 4 15:40:57.811309 kernel: SMBIOS 2.8 present.
Sep 4 15:40:57.811319 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 1.16.2-debian-1.16.2-1 04/01/2014
Sep 4 15:40:57.811326 kernel: DMI: Memory slots populated: 1/1
Sep 4 15:40:57.811333 kernel: Hypervisor detected: KVM
Sep 4 15:40:57.811340 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 4 15:40:57.811347 kernel: kvm-clock: using sched offset of 4159484402 cycles
Sep 4 15:40:57.811355 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 4 15:40:57.811362 kernel: tsc: Detected 2794.750 MHz processor
Sep 4 15:40:57.811372 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 4 15:40:57.811380 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 4 15:40:57.811387 kernel: last_pfn = 0x9cfdc max_arch_pfn = 0x400000000
Sep 4 15:40:57.811394 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 4 15:40:57.811402 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 4 15:40:57.811409 kernel: Using GB pages for direct mapping
Sep 4 15:40:57.811416 kernel: ACPI: Early table checksum verification disabled
Sep 4 15:40:57.811423 kernel: ACPI: RSDP 0x00000000000F59D0 000014 (v00 BOCHS )
Sep 4 15:40:57.811430 kernel: ACPI: RSDT 0x000000009CFE241A 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 15:40:57.811440 kernel: ACPI: FACP 0x000000009CFE21FA 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 15:40:57.811447 kernel: ACPI: DSDT 0x000000009CFE0040 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 15:40:57.811455 kernel: ACPI: FACS 0x000000009CFE0000 000040
Sep 4 15:40:57.811462 kernel: ACPI: APIC 0x000000009CFE22EE 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 15:40:57.811469 kernel: ACPI: HPET 0x000000009CFE237E 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 15:40:57.811476 kernel: ACPI: MCFG 0x000000009CFE23B6 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 15:40:57.811483 kernel: ACPI: WAET 0x000000009CFE23F2 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 15:40:57.811491 kernel: ACPI: Reserving FACP table memory at [mem 0x9cfe21fa-0x9cfe22ed]
Sep 4 15:40:57.811503 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cfe0040-0x9cfe21f9]
Sep 4 15:40:57.811511 kernel: ACPI: Reserving FACS table memory at [mem 0x9cfe0000-0x9cfe003f]
Sep 4 15:40:57.811518 kernel: ACPI: Reserving APIC table memory at [mem 0x9cfe22ee-0x9cfe237d]
Sep 4 15:40:57.811525 kernel: ACPI: Reserving HPET table memory at [mem 0x9cfe237e-0x9cfe23b5]
Sep 4 15:40:57.811533 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cfe23b6-0x9cfe23f1]
Sep 4 15:40:57.811540 kernel: ACPI: Reserving WAET table memory at [mem 0x9cfe23f2-0x9cfe2419]
Sep 4 15:40:57.811550 kernel: No NUMA configuration found
Sep 4 15:40:57.811558 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cfdbfff]
Sep 4 15:40:57.811565 kernel: NODE_DATA(0) allocated [mem 0x9cfd4dc0-0x9cfdbfff]
Sep 4 15:40:57.811573 kernel: Zone ranges:
Sep 4 15:40:57.811580 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 4 15:40:57.811588 kernel: DMA32 [mem 0x0000000001000000-0x000000009cfdbfff]
Sep 4 15:40:57.811595 kernel: Normal empty
Sep 4 15:40:57.811602 kernel: Device empty
Sep 4 15:40:57.811610 kernel: Movable zone start for each node
Sep 4 15:40:57.811617 kernel: Early memory node ranges
Sep 4 15:40:57.811627 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Sep 4 15:40:57.811634 kernel: node 0: [mem 0x0000000000100000-0x000000009cfdbfff]
Sep 4 15:40:57.811641 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cfdbfff]
Sep 4 15:40:57.811649 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 4 15:40:57.811656 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 4 15:40:57.811663 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Sep 4 15:40:57.811671 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 4 15:40:57.811706 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 4 15:40:57.811714 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 4 15:40:57.811724 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 4 15:40:57.811732 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 4 15:40:57.811743 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 4 15:40:57.811750 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 4 15:40:57.811758 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 4 15:40:57.811765 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 4 15:40:57.811773 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 4 15:40:57.811789 kernel: TSC deadline timer available
Sep 4 15:40:57.811798 kernel: CPU topo: Max. logical packages: 1
Sep 4 15:40:57.811827 kernel: CPU topo: Max. logical dies: 1
Sep 4 15:40:57.811835 kernel: CPU topo: Max. dies per package: 1
Sep 4 15:40:57.811842 kernel: CPU topo: Max. threads per core: 1
Sep 4 15:40:57.811849 kernel: CPU topo: Num. cores per package: 4
Sep 4 15:40:57.811857 kernel: CPU topo: Num. threads per package: 4
Sep 4 15:40:57.811864 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
Sep 4 15:40:57.811872 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 4 15:40:57.811879 kernel: kvm-guest: KVM setup pv remote TLB flush
Sep 4 15:40:57.811892 kernel: kvm-guest: setup PV sched yield
Sep 4 15:40:57.811900 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Sep 4 15:40:57.811911 kernel: Booting paravirtualized kernel on KVM
Sep 4 15:40:57.811918 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 4 15:40:57.811926 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Sep 4 15:40:57.811934 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288
Sep 4 15:40:57.811941 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152
Sep 4 15:40:57.811948 kernel: pcpu-alloc: [0] 0 1 2 3
Sep 4 15:40:57.811956 kernel: kvm-guest: PV spinlocks enabled
Sep 4 15:40:57.811963 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Sep 4 15:40:57.811972 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=36c924095f449c8931a6685ec70d72df97f8ad57d1c78208ae0ead8cae8f5127
Sep 4 15:40:57.811983 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 4 15:40:57.811991 kernel: random: crng init done
Sep 4 15:40:57.811998 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 4 15:40:57.812006 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 4 15:40:57.812013 kernel: Fallback order for Node 0: 0
Sep 4 15:40:57.812021 kernel: Built 1 zonelists, mobility grouping on. Total pages: 642938
Sep 4 15:40:57.812028 kernel: Policy zone: DMA32
Sep 4 15:40:57.812036 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 4 15:40:57.812046 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Sep 4 15:40:57.812054 kernel: ftrace: allocating 40102 entries in 157 pages
Sep 4 15:40:57.812061 kernel: ftrace: allocated 157 pages with 5 groups
Sep 4 15:40:57.812068 kernel: Dynamic Preempt: voluntary
Sep 4 15:40:57.812076 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 4 15:40:57.812084 kernel: rcu: RCU event tracing is enabled.
Sep 4 15:40:57.812092 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Sep 4 15:40:57.812099 kernel: Trampoline variant of Tasks RCU enabled.
Sep 4 15:40:57.812110 kernel: Rude variant of Tasks RCU enabled.
Sep 4 15:40:57.812120 kernel: Tracing variant of Tasks RCU enabled.
Sep 4 15:40:57.812128 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 4 15:40:57.812135 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Sep 4 15:40:57.812143 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 4 15:40:57.812151 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 4 15:40:57.812159 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 4 15:40:57.812166 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
Sep 4 15:40:57.812174 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 4 15:40:57.812193 kernel: Console: colour VGA+ 80x25
Sep 4 15:40:57.812200 kernel: printk: legacy console [ttyS0] enabled
Sep 4 15:40:57.812208 kernel: ACPI: Core revision 20240827
Sep 4 15:40:57.812216 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Sep 4 15:40:57.812226 kernel: APIC: Switch to symmetric I/O mode setup
Sep 4 15:40:57.812241 kernel: x2apic enabled
Sep 4 15:40:57.812249 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 4 15:40:57.812259 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Sep 4 15:40:57.812268 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Sep 4 15:40:57.812278 kernel: kvm-guest: setup PV IPIs
Sep 4 15:40:57.812286 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 4 15:40:57.812293 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848e100549, max_idle_ns: 440795215505 ns
Sep 4 15:40:57.812301 kernel: Calibrating delay loop (skipped) preset value.. 5589.50 BogoMIPS (lpj=2794750)
Sep 4 15:40:57.812309 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 4 15:40:57.812317 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 4 15:40:57.812324 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 4 15:40:57.812332 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 4 15:40:57.812342 kernel: Spectre V2 : Mitigation: Retpolines
Sep 4 15:40:57.812350 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 4 15:40:57.812358 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 4 15:40:57.812366 kernel: active return thunk: retbleed_return_thunk
Sep 4 15:40:57.812374 kernel: RETBleed: Mitigation: untrained return thunk
Sep 4 15:40:57.812382 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 4 15:40:57.812390 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 4 15:40:57.812398 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Sep 4 15:40:57.812406 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Sep 4 15:40:57.812416 kernel: active return thunk: srso_return_thunk
Sep 4 15:40:57.812424 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Sep 4 15:40:57.812432 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 4 15:40:57.812440 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 4 15:40:57.812447 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 4 15:40:57.812455 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 4 15:40:57.812463 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 4 15:40:57.812470 kernel: Freeing SMP alternatives memory: 32K
Sep 4 15:40:57.812478 kernel: pid_max: default: 32768 minimum: 301
Sep 4 15:40:57.812488 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 4 15:40:57.812496 kernel: landlock: Up and running.
Sep 4 15:40:57.812504 kernel: SELinux: Initializing.
Sep 4 15:40:57.812514 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 4 15:40:57.812522 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 4 15:40:57.812530 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 4 15:40:57.812538 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 4 15:40:57.812546 kernel: ... version:                0
Sep 4 15:40:57.812556 kernel: ... bit width:              48
Sep 4 15:40:57.812564 kernel: ... generic registers:      6
Sep 4 15:40:57.812571 kernel: ... value mask:             0000ffffffffffff
Sep 4 15:40:57.812579 kernel: ... max period:             00007fffffffffff
Sep 4 15:40:57.812587 kernel: ... fixed-purpose events:   0
Sep 4 15:40:57.812594 kernel: ... event mask:             000000000000003f
Sep 4 15:40:57.812602 kernel: signal: max sigframe size: 1776
Sep 4 15:40:57.812609 kernel: rcu: Hierarchical SRCU implementation.
Sep 4 15:40:57.812617 kernel: rcu: Max phase no-delay instances is 400.
Sep 4 15:40:57.812625 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 4 15:40:57.812635 kernel: smp: Bringing up secondary CPUs ...
Sep 4 15:40:57.812643 kernel: smpboot: x86: Booting SMP configuration:
Sep 4 15:40:57.812651 kernel: .... node #0, CPUs: #1 #2 #3
Sep 4 15:40:57.812658 kernel: smp: Brought up 1 node, 4 CPUs
Sep 4 15:40:57.812666 kernel: smpboot: Total of 4 processors activated (22358.00 BogoMIPS)
Sep 4 15:40:57.812674 kernel: Memory: 2428920K/2571752K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54076K init, 2892K bss, 136904K reserved, 0K cma-reserved)
Sep 4 15:40:57.812719 kernel: devtmpfs: initialized
Sep 4 15:40:57.812728 kernel: x86/mm: Memory block size: 128MB
Sep 4 15:40:57.812736 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 4 15:40:57.812747 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Sep 4 15:40:57.812755 kernel: pinctrl core: initialized pinctrl subsystem
Sep 4 15:40:57.812762 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 4 15:40:57.812775 kernel: audit: initializing netlink subsys (disabled)
Sep 4 15:40:57.812783 kernel: audit: type=2000 audit(1757000455.132:1): state=initialized audit_enabled=0 res=1
Sep 4 15:40:57.812791 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 4 15:40:57.812798 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 4 15:40:57.812806 kernel: cpuidle: using governor menu
Sep 4 15:40:57.812814 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 4 15:40:57.812825 kernel: dca service started, version 1.12.1
Sep 4 15:40:57.812832 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Sep 4 15:40:57.812840 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Sep 4 15:40:57.812848 kernel: PCI: Using configuration type 1 for base access
Sep 4 15:40:57.812856 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 4 15:40:57.812864 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 4 15:40:57.812872 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 4 15:40:57.812879 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 4 15:40:57.812890 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 4 15:40:57.812897 kernel: ACPI: Added _OSI(Module Device)
Sep 4 15:40:57.812905 kernel: ACPI: Added _OSI(Processor Device)
Sep 4 15:40:57.812913 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 4 15:40:57.812920 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 4 15:40:57.812928 kernel: ACPI: Interpreter enabled
Sep 4 15:40:57.812935 kernel: ACPI: PM: (supports S0 S3 S5)
Sep 4 15:40:57.812943 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 4 15:40:57.812951 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 4 15:40:57.812959 kernel: PCI: Using E820 reservations for host bridge windows
Sep 4 15:40:57.812969 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 4 15:40:57.812977 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 4 15:40:57.813180 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 4 15:40:57.813317 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Sep 4 15:40:57.813440 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Sep 4 15:40:57.813451 kernel: PCI host bridge to bus 0000:00
Sep 4 15:40:57.813588 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 4 15:40:57.813835 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 4 15:40:57.813946 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 4 15:40:57.814055 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xafffffff window]
Sep 4 15:40:57.814164 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 4 15:40:57.814287 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Sep 4 15:40:57.814398 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 4 15:40:57.814582 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Sep 4 15:40:57.814738 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint
Sep 4 15:40:57.814860 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfd000000-0xfdffffff pref]
Sep 4 15:40:57.814980 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfebd0000-0xfebd0fff]
Sep 4 15:40:57.815099 kernel: pci 0000:00:01.0: ROM [mem 0xfebc0000-0xfebcffff pref]
Sep 4 15:40:57.815218 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 4 15:40:57.815365 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 4 15:40:57.815495 kernel: pci 0000:00:02.0: BAR 0 [io 0xc0c0-0xc0df]
Sep 4 15:40:57.815616 kernel: pci 0000:00:02.0: BAR 1 [mem 0xfebd1000-0xfebd1fff]
Sep 4 15:40:57.815756 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfe000000-0xfe003fff 64bit pref]
Sep 4 15:40:57.815894 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Sep 4 15:40:57.816015 kernel: pci 0000:00:03.0: BAR 0 [io 0xc000-0xc07f]
Sep 4 15:40:57.816135 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfebd2000-0xfebd2fff]
Sep 4 15:40:57.816262 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe004000-0xfe007fff 64bit pref]
Sep 4 15:40:57.816411 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 4 15:40:57.816533 kernel: pci 0000:00:04.0: BAR 0 [io 0xc0e0-0xc0ff]
Sep 4 15:40:57.816652 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfebd3000-0xfebd3fff]
Sep 4 15:40:57.816810 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe008000-0xfe00bfff 64bit pref]
Sep 4 15:40:57.816932 kernel: pci 0000:00:04.0: ROM [mem 0xfeb80000-0xfebbffff pref]
Sep 4 15:40:57.817067 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Sep 4 15:40:57.817194 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 4 15:40:57.817351 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Sep 4 15:40:57.817480 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc100-0xc11f]
Sep 4 15:40:57.817599 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfebd4000-0xfebd4fff]
Sep 4 15:40:57.817757 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Sep 4 15:40:57.817880 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Sep 4 15:40:57.817890 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 4 15:40:57.817903 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 4 15:40:57.817911 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 4 15:40:57.817919 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 4 15:40:57.817927 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 4 15:40:57.817935 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 4 15:40:57.817943 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 4 15:40:57.817951 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 4 15:40:57.817958 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 4 15:40:57.817966 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 4 15:40:57.817977 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 4 15:40:57.817985 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 4 15:40:57.817993 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 4 15:40:57.818000 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 4 15:40:57.818008 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 4 15:40:57.818016 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 4 15:40:57.818024 kernel: iommu: Default domain type: Translated
Sep 4 15:40:57.818031 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 4 15:40:57.818040 kernel: PCI: Using ACPI for IRQ routing
Sep 4 15:40:57.818050 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 4 15:40:57.818057 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 4 15:40:57.818065 kernel: e820: reserve RAM buffer [mem 0x9cfdc000-0x9fffffff]
Sep 4 15:40:57.818184 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 4 15:40:57.818315 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 4 15:40:57.818434 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 4 15:40:57.818445 kernel: vgaarb: loaded
Sep 4 15:40:57.818453 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Sep 4 15:40:57.818464 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Sep 4 15:40:57.818472 kernel: clocksource: Switched to clocksource kvm-clock
Sep 4 15:40:57.818480 kernel: VFS: Disk quotas dquot_6.6.0
Sep 4 15:40:57.818488 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 4 15:40:57.818496 kernel: pnp: PnP ACPI init
Sep 4 15:40:57.818707 kernel: system 00:05: [mem 0xb0000000-0xbfffffff window] has been reserved
Sep 4 15:40:57.818720 kernel: pnp: PnP ACPI: found 6 devices
Sep 4 15:40:57.818728 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 4 15:40:57.818740 kernel: NET: Registered PF_INET protocol family
Sep 4 15:40:57.818747 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 4 15:40:57.818756 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 4 15:40:57.818763 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 4 15:40:57.818771 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 4 15:40:57.818779 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 4 15:40:57.818787 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 4 15:40:57.818795 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 4 15:40:57.818803 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 4 15:40:57.818813 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 4 15:40:57.818821 kernel: NET: Registered PF_XDP protocol family
Sep 4 15:40:57.818933 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 4 15:40:57.819044 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 4 15:40:57.819160 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 4 15:40:57.819283 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xafffffff window]
Sep 4 15:40:57.819394 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Sep 4 15:40:57.819503 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Sep 4 15:40:57.819517 kernel: PCI: CLS 0 bytes, default 64
Sep 4 15:40:57.819525 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848e100549, max_idle_ns: 440795215505 ns
Sep 4 15:40:57.819533 kernel: Initialise system trusted keyrings
Sep 4 15:40:57.819541 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 4 15:40:57.819549 kernel: Key type asymmetric registered
Sep 4 15:40:57.819557 kernel: Asymmetric key parser 'x509' registered
Sep 4 15:40:57.819565 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 4 15:40:57.819573 kernel: io scheduler mq-deadline registered
Sep 4 15:40:57.819581 kernel: io scheduler kyber registered
Sep 4 15:40:57.819589 kernel: io scheduler bfq registered
Sep 4 15:40:57.819599 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 4 15:40:57.819607 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 4 15:40:57.819615 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Sep 4 15:40:57.819623 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Sep 4 15:40:57.819631 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 4 15:40:57.819639 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 4 15:40:57.819647 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 4 15:40:57.819655 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 4 15:40:57.819663 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 4 15:40:57.819820 kernel: rtc_cmos 00:04: RTC can wake from S4
Sep 4 15:40:57.819833 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Sep 4 15:40:57.819947 kernel: rtc_cmos 00:04: registered as rtc0
Sep 4 15:40:57.820061 kernel: rtc_cmos 00:04: setting system clock to 2025-09-04T15:40:57 UTC (1757000457)
Sep 4 15:40:57.820174 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram, hpet irqs
Sep 4 15:40:57.820184 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Sep 4 15:40:57.820192 kernel: NET: Registered PF_INET6 protocol family
Sep 4 15:40:57.820204 kernel: Segment Routing with IPv6
Sep 4 15:40:57.820212 kernel: In-situ OAM (IOAM) with IPv6
Sep 4 15:40:57.820220 kernel: NET: Registered PF_PACKET protocol family
Sep 4 15:40:57.820228 kernel: Key type dns_resolver registered
Sep 4 15:40:57.820243 kernel: IPI shorthand broadcast: enabled
Sep 4 15:40:57.820251 kernel: sched_clock: Marking stable (2963001822, 108807394)->(3091614626, -19805410)
Sep 4 15:40:57.820259 kernel: registered taskstats version 1
Sep 4 15:40:57.820267 kernel: Loading compiled-in X.509 certificates
Sep 4 15:40:57.820275 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.44-flatcar: 1106dff6b31a2cb943a47c73d0d8dff07e2a7490'
Sep 4 15:40:57.820286 kernel: Demotion targets for Node 0: null
Sep 4 15:40:57.820294 kernel: Key type .fscrypt registered
Sep 4 15:40:57.820302 kernel: Key type fscrypt-provisioning registered
Sep 4 15:40:57.820310 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 4 15:40:57.820317 kernel: ima: Allocated hash algorithm: sha1
Sep 4 15:40:57.820325 kernel: ima: No architecture policies found
Sep 4 15:40:57.820333 kernel: clk: Disabling unused clocks
Sep 4 15:40:57.820341 kernel: Warning: unable to open an initial console.
Sep 4 15:40:57.820349 kernel: Freeing unused kernel image (initmem) memory: 54076K
Sep 4 15:40:57.820359 kernel: Write protecting the kernel read-only data: 24576k
Sep 4 15:40:57.820367 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K
Sep 4 15:40:57.820375 kernel: Run /init as init process
Sep 4 15:40:57.820382 kernel: with arguments:
Sep 4 15:40:57.820390 kernel: /init
Sep 4 15:40:57.820398 kernel: with environment:
Sep 4 15:40:57.820405 kernel: HOME=/
Sep 4 15:40:57.820413 kernel: TERM=linux
Sep 4 15:40:57.820420 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 4 15:40:57.820432 systemd[1]: Successfully made /usr/ read-only.
Sep 4 15:40:57.820453 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 4 15:40:57.820465 systemd[1]: Detected virtualization kvm.
Sep 4 15:40:57.820473 systemd[1]: Detected architecture x86-64.
Sep 4 15:40:57.820481 systemd[1]: Running in initrd.
Sep 4 15:40:57.820492 systemd[1]: No hostname configured, using default hostname.
Sep 4 15:40:57.820500 systemd[1]: Hostname set to .
Sep 4 15:40:57.820509 systemd[1]: Initializing machine ID from VM UUID.
Sep 4 15:40:57.820517 systemd[1]: Queued start job for default target initrd.target.
Sep 4 15:40:57.820526 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 4 15:40:57.820535 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 4 15:40:57.820544 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 4 15:40:57.820553 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 4 15:40:57.820564 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 4 15:40:57.820573 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 4 15:40:57.820583 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 4 15:40:57.820591 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 4 15:40:57.820600 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 4 15:40:57.820609 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 4 15:40:57.820617 systemd[1]: Reached target paths.target - Path Units.
Sep 4 15:40:57.820628 systemd[1]: Reached target slices.target - Slice Units.
Sep 4 15:40:57.820636 systemd[1]: Reached target swap.target - Swaps.
Sep 4 15:40:57.820645 systemd[1]: Reached target timers.target - Timer Units.
Sep 4 15:40:57.820654 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 4 15:40:57.820662 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 4 15:40:57.820671 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 4 15:40:57.820694 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 4 15:40:57.820703 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 4 15:40:57.820715 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 4 15:40:57.820725 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 4 15:40:57.820734 systemd[1]: Reached target sockets.target - Socket Units.
Sep 4 15:40:57.820742 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 4 15:40:57.820751 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 4 15:40:57.820762 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 4 15:40:57.820774 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 4 15:40:57.820782 systemd[1]: Starting systemd-fsck-usr.service...
Sep 4 15:40:57.820791 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 4 15:40:57.820799 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 4 15:40:57.820808 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 4 15:40:57.820817 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 4 15:40:57.820828 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 4 15:40:57.820837 systemd[1]: Finished systemd-fsck-usr.service.
Sep 4 15:40:57.820866 systemd-journald[220]: Collecting audit messages is disabled.
Sep 4 15:40:57.820889 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 4 15:40:57.820899 systemd-journald[220]: Journal started
Sep 4 15:40:57.820917 systemd-journald[220]: Runtime Journal (/run/log/journal/f8e5019f17df401897887788581dcbf1) is 6M, max 48.6M, 42.5M free.
Sep 4 15:40:57.810658 systemd-modules-load[221]: Inserted module 'overlay'
Sep 4 15:40:57.857937 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 4 15:40:57.857968 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 4 15:40:57.857985 kernel: Bridge firewalling registered
Sep 4 15:40:57.838766 systemd-modules-load[221]: Inserted module 'br_netfilter'
Sep 4 15:40:57.859472 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 4 15:40:57.861808 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 4 15:40:57.864161 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 4 15:40:57.870266 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 4 15:40:57.874067 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 4 15:40:57.875093 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 4 15:40:57.885537 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 4 15:40:57.892217 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 4 15:40:57.895176 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 4 15:40:57.897523 systemd-tmpfiles[246]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 4 15:40:57.900871 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 4 15:40:57.903043 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 4 15:40:57.904562 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 4 15:40:57.913408 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 4 15:40:57.927341 dracut-cmdline[260]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=36c924095f449c8931a6685ec70d72df97f8ad57d1c78208ae0ead8cae8f5127
Sep 4 15:40:57.952985 systemd-resolved[262]: Positive Trust Anchors:
Sep 4 15:40:57.953005 systemd-resolved[262]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 4 15:40:57.953035 systemd-resolved[262]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 4 15:40:57.955569 systemd-resolved[262]: Defaulting to hostname 'linux'.
Sep 4 15:40:57.961241 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 4 15:40:57.962407 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 4 15:40:58.035718 kernel: SCSI subsystem initialized
Sep 4 15:40:58.044708 kernel: Loading iSCSI transport class v2.0-870.
Sep 4 15:40:58.055717 kernel: iscsi: registered transport (tcp)
Sep 4 15:40:58.078709 kernel: iscsi: registered transport (qla4xxx)
Sep 4 15:40:58.078735 kernel: QLogic iSCSI HBA Driver
Sep 4 15:40:58.099699 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 4 15:40:58.121493 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 4 15:40:58.123776 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 4 15:40:58.180468 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 4 15:40:58.182516 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 4 15:40:58.240710 kernel: raid6: avx2x4 gen() 30208 MB/s
Sep 4 15:40:58.257704 kernel: raid6: avx2x2 gen() 30719 MB/s
Sep 4 15:40:58.274819 kernel: raid6: avx2x1 gen() 24728 MB/s
Sep 4 15:40:58.274850 kernel: raid6: using algorithm avx2x2 gen() 30719 MB/s
Sep 4 15:40:58.292754 kernel: raid6: .... xor() 18940 MB/s, rmw enabled
Sep 4 15:40:58.292787 kernel: raid6: using avx2x2 recovery algorithm
Sep 4 15:40:58.312707 kernel: xor: automatically using best checksumming function avx
Sep 4 15:40:58.477731 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 4 15:40:58.487090 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 4 15:40:58.489263 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 4 15:40:58.523133 systemd-udevd[472]: Using default interface naming scheme 'v255'.
Sep 4 15:40:58.528932 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 4 15:40:58.532604 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 4 15:40:58.566057 dracut-pre-trigger[482]: rd.md=0: removing MD RAID activation
Sep 4 15:40:58.598351 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 4 15:40:58.600819 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 4 15:40:58.687151 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 4 15:40:58.688716 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 4 15:40:58.757726 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
Sep 4 15:40:58.759706 kernel: cryptd: max_cpu_qlen set to 1000
Sep 4 15:40:58.766766 kernel: AES CTR mode by8 optimization enabled
Sep 4 15:40:58.772060 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
Sep 4 15:40:58.776720 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 4 15:40:58.776744 kernel: GPT:9289727 != 19775487
Sep 4 15:40:58.776756 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 4 15:40:58.776766 kernel: GPT:9289727 != 19775487
Sep 4 15:40:58.776776 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 4 15:40:58.776786 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 4 15:40:58.799883 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 4 15:40:58.800742 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 4 15:40:58.806598 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 4 15:40:58.811536 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 4 15:40:58.812390 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 4 15:40:58.828705 kernel: libata version 3.00 loaded.
Sep 4 15:40:58.840305 kernel: ahci 0000:00:1f.2: version 3.0
Sep 4 15:40:58.840516 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Sep 4 15:40:58.844044 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Sep 4 15:40:58.845834 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
Sep 4 15:40:58.845850 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Sep 4 15:40:58.846048 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Sep 4 15:40:58.844016 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Sep 4 15:40:58.850709 kernel: scsi host0: ahci
Sep 4 15:40:58.850923 kernel: scsi host1: ahci
Sep 4 15:40:58.851079 kernel: scsi host2: ahci
Sep 4 15:40:58.851234 kernel: scsi host3: ahci
Sep 4 15:40:58.854701 kernel: scsi host4: ahci
Sep 4 15:40:58.855015 kernel: scsi host5: ahci
Sep 4 15:40:58.855169 kernel: ata1: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4100 irq 34 lpm-pol 1
Sep 4 15:40:58.855181 kernel: ata2: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4180 irq 34 lpm-pol 1
Sep 4 15:40:58.855191 kernel: ata3: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4200 irq 34 lpm-pol 1
Sep 4 15:40:58.855202 kernel: ata4: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4280 irq 34 lpm-pol 1
Sep 4 15:40:58.855221 kernel: ata5: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4300 irq 34 lpm-pol 1
Sep 4 15:40:58.855232 kernel: ata6: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4380 irq 34 lpm-pol 1
Sep 4 15:40:58.866336 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Sep 4 15:40:58.896997 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 4 15:40:58.897527 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 4 15:40:58.909186 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Sep 4 15:40:58.910434 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Sep 4 15:40:58.913663 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 4 15:40:58.947989 disk-uuid[634]: Primary Header is updated.
Sep 4 15:40:58.947989 disk-uuid[634]: Secondary Entries is updated.
Sep 4 15:40:58.947989 disk-uuid[634]: Secondary Header is updated.
Sep 4 15:40:58.951706 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 4 15:40:59.160780 kernel: ata1: SATA link down (SStatus 0 SControl 300)
Sep 4 15:40:59.160868 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Sep 4 15:40:59.160881 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Sep 4 15:40:59.161710 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Sep 4 15:40:59.162708 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Sep 4 15:40:59.163715 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Sep 4 15:40:59.164718 kernel: ata3.00: LPM support broken, forcing max_power
Sep 4 15:40:59.164733 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Sep 4 15:40:59.164984 kernel: ata3.00: applying bridge limits
Sep 4 15:40:59.166118 kernel: ata3.00: LPM support broken, forcing max_power
Sep 4 15:40:59.166131 kernel: ata3.00: configured for UDMA/100
Sep 4 15:40:59.168702 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Sep 4 15:40:59.211739 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Sep 4 15:40:59.212064 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 4 15:40:59.229723 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
Sep 4 15:40:59.669177 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 4 15:40:59.672117 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 4 15:40:59.674747 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 4 15:40:59.676946 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 4 15:40:59.680013 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 4 15:40:59.709208 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 4 15:40:59.959709 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 4 15:40:59.960120 disk-uuid[635]: The operation has completed successfully.
Sep 4 15:40:59.990866 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 4 15:40:59.990990 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 4 15:41:00.022647 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 4 15:41:00.047258 sh[663]: Success
Sep 4 15:41:00.065744 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 4 15:41:00.065833 kernel: device-mapper: uevent: version 1.0.3
Sep 4 15:41:00.065847 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 4 15:41:00.077725 kernel: device-mapper: verity: sha256 using shash "sha256-ni"
Sep 4 15:41:00.112474 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 4 15:41:00.115927 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 4 15:41:00.131157 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 4 15:41:00.139399 kernel: BTRFS: device fsid 03d586f6-54f4-4e78-a040-c693154b15e4 devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (675)
Sep 4 15:41:00.139435 kernel: BTRFS info (device dm-0): first mount of filesystem 03d586f6-54f4-4e78-a040-c693154b15e4
Sep 4 15:41:00.139447 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Sep 4 15:41:00.144865 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 4 15:41:00.144896 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 4 15:41:00.146328 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 4 15:41:00.147730 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 4 15:41:00.149099 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 4 15:41:00.149967 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 4 15:41:00.151644 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 4 15:41:00.181709 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (708)
Sep 4 15:41:00.181746 kernel: BTRFS info (device vda6): first mount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4
Sep 4 15:41:00.183189 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 4 15:41:00.186298 kernel: BTRFS info (device vda6): turning on async discard
Sep 4 15:41:00.186321 kernel: BTRFS info (device vda6): enabling free space tree
Sep 4 15:41:00.191699 kernel: BTRFS info (device vda6): last unmount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4
Sep 4 15:41:00.192231 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 4 15:41:00.195582 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 4 15:41:00.329011 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 4 15:41:00.334868 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 4 15:41:00.341764 ignition[753]: Ignition 2.22.0
Sep 4 15:41:00.341778 ignition[753]: Stage: fetch-offline
Sep 4 15:41:00.341850 ignition[753]: no configs at "/usr/lib/ignition/base.d"
Sep 4 15:41:00.341861 ignition[753]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 4 15:41:00.341943 ignition[753]: parsed url from cmdline: ""
Sep 4 15:41:00.341948 ignition[753]: no config URL provided
Sep 4 15:41:00.341953 ignition[753]: reading system config file "/usr/lib/ignition/user.ign"
Sep 4 15:41:00.341961 ignition[753]: no config at "/usr/lib/ignition/user.ign"
Sep 4 15:41:00.341985 ignition[753]: op(1): [started] loading QEMU firmware config module
Sep 4 15:41:00.341990 ignition[753]: op(1): executing: "modprobe" "qemu_fw_cfg"
Sep 4 15:41:00.351060 ignition[753]: op(1): [finished] loading QEMU firmware config module
Sep 4 15:41:00.378316 systemd-networkd[851]: lo: Link UP
Sep 4 15:41:00.378329 systemd-networkd[851]: lo: Gained carrier
Sep 4 15:41:00.381509 systemd-networkd[851]: Enumeration completed
Sep 4 15:41:00.382462 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 4 15:41:00.385157 systemd-networkd[851]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 4 15:41:00.385166 systemd-networkd[851]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 4 15:41:00.388968 systemd[1]: Reached target network.target - Network.
Sep 4 15:41:00.391538 systemd-networkd[851]: eth0: Link UP
Sep 4 15:41:00.392581 systemd-networkd[851]: eth0: Gained carrier
Sep 4 15:41:00.392591 systemd-networkd[851]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 4 15:41:00.402405 ignition[753]: parsing config with SHA512: 639319dd9d160573de8259154c7a4205eeeb104d45777337713703d2808c64befd95a22ae985d5320c091503a3ed5c38dd36acd7cb7f1124e9befc6533cbf480
Sep 4 15:41:00.407427 unknown[753]: fetched base config from "system"
Sep 4 15:41:00.407442 unknown[753]: fetched user config from "qemu"
Sep 4 15:41:00.407877 ignition[753]: fetch-offline: fetch-offline passed
Sep 4 15:41:00.407945 ignition[753]: Ignition finished successfully
Sep 4 15:41:00.409734 systemd-networkd[851]: eth0: DHCPv4 address 10.0.0.9/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 4 15:41:00.411066 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 4 15:41:00.416323 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Sep 4 15:41:00.417429 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 4 15:41:00.461905 ignition[858]: Ignition 2.22.0
Sep 4 15:41:00.461918 ignition[858]: Stage: kargs
Sep 4 15:41:00.462050 ignition[858]: no configs at "/usr/lib/ignition/base.d"
Sep 4 15:41:00.462061 ignition[858]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 4 15:41:00.462847 ignition[858]: kargs: kargs passed
Sep 4 15:41:00.462900 ignition[858]: Ignition finished successfully
Sep 4 15:41:00.470586 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 4 15:41:00.473861 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 4 15:41:00.560115 ignition[866]: Ignition 2.22.0
Sep 4 15:41:00.560129 ignition[866]: Stage: disks
Sep 4 15:41:00.560303 ignition[866]: no configs at "/usr/lib/ignition/base.d"
Sep 4 15:41:00.560314 ignition[866]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 4 15:41:00.561045 ignition[866]: disks: disks passed
Sep 4 15:41:00.561089 ignition[866]: Ignition finished successfully
Sep 4 15:41:00.571056 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 4 15:41:00.573461 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 4 15:41:00.573546 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 4 15:41:00.575617 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 4 15:41:00.577856 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 4 15:41:00.581593 systemd[1]: Reached target basic.target - Basic System.
Sep 4 15:41:00.583799 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 4 15:41:00.617519 systemd-fsck[876]: ROOT: clean, 15/553520 files, 52789/553472 blocks
Sep 4 15:41:00.775615 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 4 15:41:00.776939 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 4 15:41:00.894717 kernel: EXT4-fs (vda9): mounted filesystem b9579306-9cef-42ea-893b-17169f1ea8af r/w with ordered data mode. Quota mode: none.
Sep 4 15:41:00.895976 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 4 15:41:00.898741 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 4 15:41:00.902966 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 4 15:41:00.906215 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 4 15:41:00.908787 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 4 15:41:00.908849 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 4 15:41:00.911074 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 4 15:41:00.918815 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 4 15:41:00.922993 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (885)
Sep 4 15:41:00.923023 kernel: BTRFS info (device vda6): first mount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4
Sep 4 15:41:00.923034 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 4 15:41:00.924603 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 4 15:41:00.927359 kernel: BTRFS info (device vda6): turning on async discard
Sep 4 15:41:00.927389 kernel: BTRFS info (device vda6): enabling free space tree
Sep 4 15:41:00.928952 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 4 15:41:00.997007 initrd-setup-root[909]: cut: /sysroot/etc/passwd: No such file or directory
Sep 4 15:41:01.002391 initrd-setup-root[916]: cut: /sysroot/etc/group: No such file or directory
Sep 4 15:41:01.007983 initrd-setup-root[923]: cut: /sysroot/etc/shadow: No such file or directory
Sep 4 15:41:01.013352 initrd-setup-root[930]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 4 15:41:01.121415 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 4 15:41:01.123959 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 4 15:41:01.126170 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 4 15:41:01.142850 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 4 15:41:01.144592 kernel: BTRFS info (device vda6): last unmount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4
Sep 4 15:41:01.157866 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 4 15:41:01.185736 ignition[999]: INFO : Ignition 2.22.0
Sep 4 15:41:01.185736 ignition[999]: INFO : Stage: mount
Sep 4 15:41:01.187810 ignition[999]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 4 15:41:01.187810 ignition[999]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 4 15:41:01.190429 ignition[999]: INFO : mount: mount passed
Sep 4 15:41:01.191251 ignition[999]: INFO : Ignition finished successfully
Sep 4 15:41:01.194401 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 4 15:41:01.197078 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 4 15:41:01.223771 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 4 15:41:01.252368 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1011)
Sep 4 15:41:01.252415 kernel: BTRFS info (device vda6): first mount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4
Sep 4 15:41:01.252427 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 4 15:41:01.256271 kernel: BTRFS info (device vda6): turning on async discard
Sep 4 15:41:01.256297 kernel: BTRFS info (device vda6): enabling free space tree
Sep 4 15:41:01.257995 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 4 15:41:01.304924 ignition[1028]: INFO : Ignition 2.22.0
Sep 4 15:41:01.304924 ignition[1028]: INFO : Stage: files
Sep 4 15:41:01.306926 ignition[1028]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 4 15:41:01.306926 ignition[1028]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 4 15:41:01.306926 ignition[1028]: DEBUG : files: compiled without relabeling support, skipping
Sep 4 15:41:01.306926 ignition[1028]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 4 15:41:01.306926 ignition[1028]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 4 15:41:01.313897 ignition[1028]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 4 15:41:01.313897 ignition[1028]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 4 15:41:01.313897 ignition[1028]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 4 15:41:01.313897 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Sep 4 15:41:01.313897 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
Sep 4 15:41:01.309501 unknown[1028]: wrote ssh authorized keys file for user: core
Sep 4 15:41:01.417245 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 4 15:41:01.521825 systemd-networkd[851]: eth0: Gained IPv6LL
Sep 4 15:41:02.049807 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Sep 4 15:41:02.049807 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 4 15:41:02.054309 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 4 15:41:02.054309 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 4 15:41:02.054309 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 4 15:41:02.054309 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 4 15:41:02.054309 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 4 15:41:02.054309 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 4 15:41:02.054309 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 4 15:41:02.068298 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 4 15:41:02.068298 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 4 15:41:02.068298 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 4 15:41:02.068298 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 4 15:41:02.068298 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 4 15:41:02.068298 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1
Sep 4 15:41:02.759883 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 4 15:41:05.205549 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 4 15:41:05.205549 ignition[1028]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 4 15:41:05.210080 ignition[1028]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 4 15:41:05.212674 ignition[1028]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 4 15:41:05.212674 ignition[1028]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 4 15:41:05.212674 ignition[1028]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 4 15:41:05.217302 ignition[1028]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 4 15:41:05.217302 ignition[1028]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 4 15:41:05.217302 ignition[1028]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 4 15:41:05.217302 ignition[1028]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Sep 4 15:41:05.236284 ignition[1028]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Sep 4 15:41:05.241200 ignition[1028]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Sep 4 15:41:05.242833 ignition[1028]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 4 15:41:05.242833 ignition[1028]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Sep 4 15:41:05.242833 ignition[1028]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Sep 4 15:41:05.242833 ignition[1028]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 4 15:41:05.242833 ignition[1028]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 4 15:41:05.242833 ignition[1028]: INFO : files: files passed
Sep 4 15:41:05.242833 ignition[1028]: INFO : Ignition finished successfully
Sep 4 15:41:05.245802 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 4 15:41:05.250554 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 4 15:41:05.265509 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 4 15:41:05.270154 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 4 15:41:05.270312 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 4 15:41:05.276712 initrd-setup-root-after-ignition[1057]: grep: /sysroot/oem/oem-release: No such file or directory
Sep 4 15:41:05.280494 initrd-setup-root-after-ignition[1059]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 4 15:41:05.282230 initrd-setup-root-after-ignition[1059]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 4 15:41:05.283800 initrd-setup-root-after-ignition[1063]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 4 15:41:05.286313 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 4 15:41:05.286576 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 4 15:41:05.290990 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 4 15:41:05.342991 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 4 15:41:05.343223 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 4 15:41:05.347690 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 4 15:41:05.349988 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 4 15:41:05.350149 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 4 15:41:05.354014 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 4 15:41:05.379956 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 4 15:41:05.384118 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 4 15:41:05.418629 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 4 15:41:05.420189 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 4 15:41:05.422816 systemd[1]: Stopped target timers.target - Timer Units.
Sep 4 15:41:05.424130 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 4 15:41:05.424249 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 4 15:41:05.429565 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 4 15:41:05.429711 systemd[1]: Stopped target basic.target - Basic System.
Sep 4 15:41:05.432833 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 4 15:41:05.433812 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 4 15:41:05.436028 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 4 15:41:05.438122 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 4 15:41:05.438435 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 4 15:41:05.438920 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 4 15:41:05.444879 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 4 15:41:05.446023 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 4 15:41:05.446340 systemd[1]: Stopped target swap.target - Swaps.
Sep 4 15:41:05.449535 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 4 15:41:05.449640 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 4 15:41:05.451389 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 4 15:41:05.451742 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 4 15:41:05.452009 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 4 15:41:05.452179 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 4 15:41:05.458765 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 4 15:41:05.458868 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 4 15:41:05.461787 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 4 15:41:05.461901 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 4 15:41:05.462834 systemd[1]: Stopped target paths.target - Path Units.
Sep 4 15:41:05.464727 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 4 15:41:05.469746 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 4 15:41:05.469881 systemd[1]: Stopped target slices.target - Slice Units.
Sep 4 15:41:05.472344 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 4 15:41:05.472658 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 4 15:41:05.472764 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 4 15:41:05.475531 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 4 15:41:05.475612 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 4 15:41:05.477157 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 4 15:41:05.477260 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 4 15:41:05.478890 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 4 15:41:05.478993 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 4 15:41:05.483653 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 4 15:41:05.484713 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 4 15:41:05.484823 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 4 15:41:05.488174 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 4 15:41:05.489096 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 4 15:41:05.489208 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 4 15:41:05.491332 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 4 15:41:05.491439 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 4 15:41:05.511620 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 4 15:41:05.511812 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 4 15:41:05.546700 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 4 15:41:05.568450 ignition[1083]: INFO : Ignition 2.22.0
Sep 4 15:41:05.568450 ignition[1083]: INFO : Stage: umount
Sep 4 15:41:05.571033 ignition[1083]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 4 15:41:05.571033 ignition[1083]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 4 15:41:05.571033 ignition[1083]: INFO : umount: umount passed
Sep 4 15:41:05.571033 ignition[1083]: INFO : Ignition finished successfully
Sep 4 15:41:05.573066 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 4 15:41:05.573194 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 4 15:41:05.574007 systemd[1]: Stopped target network.target - Network.
Sep 4 15:41:05.576858 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 4 15:41:05.576955 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 4 15:41:05.577992 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 4 15:41:05.578068 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 4 15:41:05.580627 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 4 15:41:05.580706 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 4 15:41:05.581493 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 4 15:41:05.581539 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 4 15:41:05.584654 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 4 15:41:05.585598 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 4 15:41:05.595219 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 4 15:41:05.595392 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 4 15:41:05.600105 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 4 15:41:05.600479 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 4 15:41:05.600526 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 4 15:41:05.603542 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 4 15:41:05.604823 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 4 15:41:05.604964 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 4 15:41:05.609087 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 4 15:41:05.609572 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 4 15:41:05.611008 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 4 15:41:05.611066 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 4 15:41:05.613758 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 4 15:41:05.616696 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 4 15:41:05.616752 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 4 15:41:05.617718 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 4 15:41:05.617762 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 4 15:41:05.621880 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 4 15:41:05.621926 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 4 15:41:05.622809 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 4 15:41:05.625549 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 4 15:41:05.643499 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 4 15:41:05.643631 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 4 15:41:05.665528 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 4 15:41:05.665777 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 4 15:41:05.666812 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 4 15:41:05.666878 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 4 15:41:05.669213 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 4 15:41:05.669253 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 4 15:41:05.671414 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 4 15:41:05.671467 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 4 15:41:05.672930 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 4 15:41:05.672985 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 4 15:41:05.674468 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 4 15:41:05.674523 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 4 15:41:05.684187 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 4 15:41:05.684251 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 4 15:41:05.684305 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 4 15:41:05.688478 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 4 15:41:05.688528 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 4 15:41:05.692075 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 4 15:41:05.692177 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 4 15:41:05.715890 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 4 15:41:05.716018 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 4 15:41:05.764020 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 4 15:41:05.764163 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 4 15:41:05.766661 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 4 15:41:05.767835 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 4 15:41:05.767888 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 4 15:41:05.770615 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 4 15:41:05.797030 systemd[1]: Switching root.
Sep 4 15:41:05.826493 systemd-journald[220]: Journal stopped
Sep 4 15:41:07.244292 systemd-journald[220]: Received SIGTERM from PID 1 (systemd).
Sep 4 15:41:07.244384 kernel: SELinux: policy capability network_peer_controls=1
Sep 4 15:41:07.244410 kernel: SELinux: policy capability open_perms=1
Sep 4 15:41:07.244426 kernel: SELinux: policy capability extended_socket_class=1
Sep 4 15:41:07.244446 kernel: SELinux: policy capability always_check_network=0
Sep 4 15:41:07.244471 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 4 15:41:07.244489 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 4 15:41:07.244503 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 4 15:41:07.244791 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 4 15:41:07.244804 kernel: SELinux: policy capability userspace_initial_context=0
Sep 4 15:41:07.244829 kernel: audit: type=1403 audit(1757000466.433:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 4 15:41:07.244847 systemd[1]: Successfully loaded SELinux policy in 60.574ms.
Sep 4 15:41:07.244878 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 8.873ms.
Sep 4 15:41:07.244891 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 4 15:41:07.244903 systemd[1]: Detected virtualization kvm.
Sep 4 15:41:07.244928 systemd[1]: Detected architecture x86-64.
Sep 4 15:41:07.244943 systemd[1]: Detected first boot.
Sep 4 15:41:07.244961 systemd[1]: Initializing machine ID from VM UUID.
Sep 4 15:41:07.244983 zram_generator::config[1133]: No configuration found.
Sep 4 15:41:07.245001 kernel: Guest personality initialized and is inactive
Sep 4 15:41:07.245020 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Sep 4 15:41:07.245036 kernel: Initialized host personality
Sep 4 15:41:07.245055 kernel: NET: Registered PF_VSOCK protocol family
Sep 4 15:41:07.245082 systemd[1]: Populated /etc with preset unit settings.
Sep 4 15:41:07.245104 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 4 15:41:07.245119 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 4 15:41:07.245130 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 4 15:41:07.245146 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 4 15:41:07.245158 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 4 15:41:07.245180 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 4 15:41:07.245192 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 4 15:41:07.245204 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 4 15:41:07.245216 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 4 15:41:07.245229 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 4 15:41:07.245241 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 4 15:41:07.245252 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 4 15:41:07.245267 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 4 15:41:07.245279 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 4 15:41:07.245291 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 4 15:41:07.245303 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 4 15:41:07.245316 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 4 15:41:07.245328 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 4 15:41:07.245340 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 4 15:41:07.245354 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 4 15:41:07.245367 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 4 15:41:07.245379 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 4 15:41:07.245391 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 4 15:41:07.245403 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 4 15:41:07.245416 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 4 15:41:07.245428 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 4 15:41:07.245446 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 4 15:41:07.245459 systemd[1]: Reached target slices.target - Slice Units.
Sep 4 15:41:07.245474 systemd[1]: Reached target swap.target - Swaps.
Sep 4 15:41:07.245499 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 4 15:41:07.245527 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 4 15:41:07.245543 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 4 15:41:07.245559 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 4 15:41:07.245582 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 4 15:41:07.245609 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 4 15:41:07.245637 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 4 15:41:07.245650 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 4 15:41:07.245662 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 4 15:41:07.245692 systemd[1]: Mounting media.mount - External Media Directory...
Sep 4 15:41:07.245715 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 15:41:07.245728 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 4 15:41:07.245740 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 4 15:41:07.245751 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 4 15:41:07.245764 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 4 15:41:07.245783 systemd[1]: Reached target machines.target - Containers.
Sep 4 15:41:07.245795 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 4 15:41:07.245811 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 4 15:41:07.245843 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 4 15:41:07.245865 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 4 15:41:07.245878 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 4 15:41:07.245890 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 4 15:41:07.245903 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 4 15:41:07.245914 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 4 15:41:07.245927 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 4 15:41:07.245939 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 4 15:41:07.245958 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 4 15:41:07.245992 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 4 15:41:07.246019 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 4 15:41:07.246040 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 4 15:41:07.246053 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 4 15:41:07.246068 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 4 15:41:07.246085 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 4 15:41:07.246099 kernel: fuse: init (API version 7.41)
Sep 4 15:41:07.246114 kernel: loop: module loaded
Sep 4 15:41:07.246126 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 4 15:41:07.246138 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 4 15:41:07.246150 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 4 15:41:07.246161 kernel: ACPI: bus type drm_connector registered
Sep 4 15:41:07.246174 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 4 15:41:07.246199 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 4 15:41:07.246212 systemd[1]: Stopped verity-setup.service.
Sep 4 15:41:07.246225 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 15:41:07.246237 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 4 15:41:07.246249 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 4 15:41:07.246288 systemd-journald[1201]: Collecting audit messages is disabled.
Sep 4 15:41:07.246313 systemd[1]: Mounted media.mount - External Media Directory.
Sep 4 15:41:07.246326 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 4 15:41:07.246339 systemd-journald[1201]: Journal started
Sep 4 15:41:07.246361 systemd-journald[1201]: Runtime Journal (/run/log/journal/f8e5019f17df401897887788581dcbf1) is 6M, max 48.6M, 42.5M free.
Sep 4 15:41:06.984659 systemd[1]: Queued start job for default target multi-user.target.
Sep 4 15:41:07.005896 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Sep 4 15:41:07.006421 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 4 15:41:07.249755 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 4 15:41:07.250570 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 4 15:41:07.251836 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 4 15:41:07.253157 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 4 15:41:07.254628 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 4 15:41:07.256155 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 4 15:41:07.256396 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 4 15:41:07.257887 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 4 15:41:07.258137 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 4 15:41:07.259618 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 4 15:41:07.259910 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 4 15:41:07.261268 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 4 15:41:07.261483 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 4 15:41:07.263085 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 4 15:41:07.263308 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 4 15:41:07.264665 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 4 15:41:07.264906 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 4 15:41:07.266347 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 4 15:41:07.267802 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 4 15:41:07.269394 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 4 15:41:07.271119 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 4 15:41:07.285181 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 4 15:41:07.287758 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 4 15:41:07.290030 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 4 15:41:07.291116 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 4 15:41:07.291145 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 4 15:41:07.293168 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 4 15:41:07.300803 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 4 15:41:07.302883 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 4 15:41:07.304791 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 4 15:41:07.307878 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 4 15:41:07.309099 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 4 15:41:07.310253 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 4 15:41:07.311849 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 4 15:41:07.312835 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 4 15:41:07.316800 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 4 15:41:07.325454 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 4 15:41:07.328287 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 4 15:41:07.331176 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 4 15:41:07.334958 systemd-journald[1201]: Time spent on flushing to /var/log/journal/f8e5019f17df401897887788581dcbf1 is 16.917ms for 981 entries.
Sep 4 15:41:07.334958 systemd-journald[1201]: System Journal (/var/log/journal/f8e5019f17df401897887788581dcbf1) is 8M, max 195.6M, 187.6M free.
Sep 4 15:41:07.441118 systemd-journald[1201]: Received client request to flush runtime journal.
Sep 4 15:41:07.441223 kernel: loop0: detected capacity change from 0 to 110984
Sep 4 15:41:07.349859 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 4 15:41:07.351315 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 4 15:41:07.356813 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 4 15:41:07.382712 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 4 15:41:07.445613 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 4 15:41:07.445788 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 4 15:41:07.448523 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 4 15:41:07.455128 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 4 15:41:07.460821 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 4 15:41:07.462876 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 4 15:41:07.472716 kernel: loop1: detected capacity change from 0 to 128016
Sep 4 15:41:07.496162 systemd-tmpfiles[1265]: ACLs are not supported, ignoring.
Sep 4 15:41:07.496186 systemd-tmpfiles[1265]: ACLs are not supported, ignoring.
Sep 4 15:41:07.500703 kernel: loop2: detected capacity change from 0 to 224512
Sep 4 15:41:07.501173 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 4 15:41:07.546718 kernel: loop3: detected capacity change from 0 to 110984
Sep 4 15:41:07.561707 kernel: loop4: detected capacity change from 0 to 128016
Sep 4 15:41:07.573715 kernel: loop5: detected capacity change from 0 to 224512
Sep 4 15:41:07.586571 (sd-merge)[1271]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
Sep 4 15:41:07.587664 (sd-merge)[1271]: Merged extensions into '/usr'.
Sep 4 15:41:07.662004 systemd[1]: Reload requested from client PID 1249 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 4 15:41:07.662146 systemd[1]: Reloading...
Sep 4 15:41:07.738969 zram_generator::config[1293]: No configuration found.
Sep 4 15:41:07.903417 ldconfig[1244]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 4 15:41:07.957583 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 4 15:41:07.957736 systemd[1]: Reloading finished in 295 ms.
Sep 4 15:41:08.001157 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 4 15:41:08.002999 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 4 15:41:08.025429 systemd[1]: Starting ensure-sysext.service...
Sep 4 15:41:08.027568 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 4 15:41:08.049717 systemd[1]: Reload requested from client PID 1334 ('systemctl') (unit ensure-sysext.service)...
Sep 4 15:41:08.049732 systemd[1]: Reloading...
Sep 4 15:41:08.055530 systemd-tmpfiles[1335]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 4 15:41:08.055569 systemd-tmpfiles[1335]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 4 15:41:08.056524 systemd-tmpfiles[1335]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 4 15:41:08.056942 systemd-tmpfiles[1335]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 4 15:41:08.057920 systemd-tmpfiles[1335]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 4 15:41:08.058285 systemd-tmpfiles[1335]: ACLs are not supported, ignoring.
Sep 4 15:41:08.058419 systemd-tmpfiles[1335]: ACLs are not supported, ignoring.
Sep 4 15:41:08.063134 systemd-tmpfiles[1335]: Detected autofs mount point /boot during canonicalization of boot.
Sep 4 15:41:08.063212 systemd-tmpfiles[1335]: Skipping /boot
Sep 4 15:41:08.073585 systemd-tmpfiles[1335]: Detected autofs mount point /boot during canonicalization of boot.
Sep 4 15:41:08.073668 systemd-tmpfiles[1335]: Skipping /boot
Sep 4 15:41:08.126717 zram_generator::config[1367]: No configuration found.
Sep 4 15:41:08.306416 systemd[1]: Reloading finished in 256 ms.
Sep 4 15:41:08.326833 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 4 15:41:08.348230 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 4 15:41:08.356658 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 4 15:41:08.359200 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 4 15:41:08.361567 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 4 15:41:08.368284 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 4 15:41:08.370929 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 4 15:41:08.374987 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 4 15:41:08.379393 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 15:41:08.379566 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 4 15:41:08.389755 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 4 15:41:08.392907 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 4 15:41:08.396151 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 4 15:41:08.397378 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 4 15:41:08.397566 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 4 15:41:08.405050 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 4 15:41:08.406076 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 15:41:08.408264 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 4 15:41:08.410521 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 4 15:41:08.410816 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 4 15:41:08.412801 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 4 15:41:08.413038 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 4 15:41:08.414853 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 4 15:41:08.415132 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 4 15:41:08.420600 systemd-udevd[1406]: Using default interface naming scheme 'v255'.
Sep 4 15:41:08.424913 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 4 15:41:08.431711 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 15:41:08.432096 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 4 15:41:08.435257 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 4 15:41:08.437571 augenrules[1437]: No rules
Sep 4 15:41:08.438853 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 4 15:41:08.442761 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 4 15:41:08.442924 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 4 15:41:08.443041 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 4 15:41:08.451048 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 4 15:41:08.452091 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 15:41:08.453612 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 4 15:41:08.453898 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 4 15:41:08.456247 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 4 15:41:08.458362 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 4 15:41:08.458595 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 4 15:41:08.460074 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 4 15:41:08.461892 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 4 15:41:08.462133 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 4 15:41:08.463862 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 4 15:41:08.464086 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 4 15:41:08.471670 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 4 15:41:08.476030 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 4 15:41:08.489425 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 15:41:08.493807 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 4 15:41:08.494942 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 4 15:41:08.496918 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 4 15:41:08.499716 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 4 15:41:08.505556 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 4 15:41:08.507762 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 4 15:41:08.508940 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 4 15:41:08.508994 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 4 15:41:08.510983 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 4 15:41:08.512054 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 4 15:41:08.512084 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 15:41:08.521545 systemd[1]: Finished ensure-sysext.service.
Sep 4 15:41:08.522947 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 4 15:41:08.523181 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 4 15:41:08.526173 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 4 15:41:08.526439 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 4 15:41:08.529247 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 4 15:41:08.530120 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 4 15:41:08.531896 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 4 15:41:08.532123 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 4 15:41:08.540993 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 4 15:41:08.541111 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 4 15:41:08.543718 augenrules[1480]: /sbin/augenrules: No change
Sep 4 15:41:08.544533 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 4 15:41:08.557934 augenrules[1511]: No rules
Sep 4 15:41:08.591336 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 4 15:41:08.591625 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 4 15:41:08.617502 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Sep 4 15:41:08.628606 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 4 15:41:08.631211 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 4 15:41:08.653647 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 4 15:41:08.665711 kernel: mousedev: PS/2 mouse device common for all mice
Sep 4 15:41:08.674842 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
Sep 4 15:41:08.684720 kernel: ACPI: button: Power Button [PWRF]
Sep 4 15:41:08.719775 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Sep 4 15:41:08.720078 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Sep 4 15:41:08.735817 systemd-resolved[1404]: Positive Trust Anchors:
Sep 4 15:41:08.737009 systemd-networkd[1489]: lo: Link UP
Sep 4 15:41:08.738935 systemd-resolved[1404]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 4 15:41:08.738979 systemd-resolved[1404]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 4 15:41:08.739203 systemd-networkd[1489]: lo: Gained carrier
Sep 4 15:41:08.741071 systemd-networkd[1489]: Enumeration completed
Sep 4 15:41:08.741174 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 4 15:41:08.745950 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 4 15:41:08.750833 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 4 15:41:08.752090 systemd-networkd[1489]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 4 15:41:08.752095 systemd-networkd[1489]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 4 15:41:08.752796 systemd-networkd[1489]: eth0: Link UP
Sep 4 15:41:08.753051 systemd-networkd[1489]: eth0: Gained carrier
Sep 4 15:41:08.753117 systemd-networkd[1489]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 4 15:41:08.760949 systemd-resolved[1404]: Defaulting to hostname 'linux'.
Sep 4 15:41:08.763322 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 4 15:41:08.764503 systemd[1]: Reached target network.target - Network.
Sep 4 15:41:08.765417 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 4 15:41:08.766750 systemd-networkd[1489]: eth0: DHCPv4 address 10.0.0.9/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 4 15:41:08.770354 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 4 15:41:08.772108 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 4 15:41:08.773332 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 4 15:41:09.378968 systemd-timesyncd[1505]: Contacted time server 10.0.0.1:123 (10.0.0.1).
Sep 4 15:41:09.379008 systemd-timesyncd[1505]: Initial clock synchronization to Thu 2025-09-04 15:41:09.378875 UTC.
Sep 4 15:41:09.379058 systemd-resolved[1404]: Clock change detected. Flushing caches.
Sep 4 15:41:09.380090 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 4 15:41:09.381393 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Sep 4 15:41:09.382531 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 4 15:41:09.384272 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 4 15:41:09.384305 systemd[1]: Reached target paths.target - Path Units.
Sep 4 15:41:09.385297 systemd[1]: Reached target time-set.target - System Time Set.
Sep 4 15:41:09.386492 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 4 15:41:09.387661 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 4 15:41:09.388884 systemd[1]: Reached target timers.target - Timer Units.
Sep 4 15:41:09.392059 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 4 15:41:09.394865 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 4 15:41:09.399727 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 4 15:41:09.401432 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 4 15:41:09.403042 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 4 15:41:09.439331 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 4 15:41:09.441276 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 4 15:41:09.444010 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 4 15:41:09.445696 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 4 15:41:09.458376 systemd[1]: Reached target sockets.target - Socket Units.
Sep 4 15:41:09.459893 systemd[1]: Reached target basic.target - Basic System.
Sep 4 15:41:09.461113 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 4 15:41:09.461230 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 4 15:41:09.467501 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 4 15:41:09.472481 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 4 15:41:09.480544 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 4 15:41:09.515802 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 4 15:41:09.520179 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 4 15:41:09.521276 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 4 15:41:09.522481 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Sep 4 15:41:09.526435 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 4 15:41:09.532420 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 4 15:41:09.534175 google_oslogin_nss_cache[1559]: oslogin_cache_refresh[1559]: Refreshing passwd entry cache
Sep 4 15:41:09.534449 oslogin_cache_refresh[1559]: Refreshing passwd entry cache
Sep 4 15:41:09.537035 jq[1557]: false
Sep 4 15:41:09.537675 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 4 15:41:09.541474 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 4 15:41:09.545077 google_oslogin_nss_cache[1559]: oslogin_cache_refresh[1559]: Failure getting users, quitting
Sep 4 15:41:09.546254 oslogin_cache_refresh[1559]: Failure getting users, quitting
Sep 4 15:41:09.546431 google_oslogin_nss_cache[1559]: oslogin_cache_refresh[1559]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 4 15:41:09.546503 oslogin_cache_refresh[1559]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 4 15:41:09.546707 google_oslogin_nss_cache[1559]: oslogin_cache_refresh[1559]: Refreshing group entry cache
Sep 4 15:41:09.546741 oslogin_cache_refresh[1559]: Refreshing group entry cache
Sep 4 15:41:09.548530 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 4 15:41:09.548641 kernel: kvm_amd: TSC scaling supported
Sep 4 15:41:09.548669 kernel: kvm_amd: Nested Virtualization enabled
Sep 4 15:41:09.548688 kernel: kvm_amd: Nested Paging enabled
Sep 4 15:41:09.548700 kernel: kvm_amd: LBR virtualization supported
Sep 4 15:41:09.550689 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
Sep 4 15:41:09.550717 kernel: kvm_amd: Virtual GIF supported
Sep 4 15:41:09.551156 extend-filesystems[1558]: Found /dev/vda6
Sep 4 15:41:09.555527 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 4 15:41:09.557053 extend-filesystems[1558]: Found /dev/vda9
Sep 4 15:41:09.558559 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 4 15:41:09.559316 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 4 15:41:09.562253 google_oslogin_nss_cache[1559]: oslogin_cache_refresh[1559]: Failure getting groups, quitting
Sep 4 15:41:09.562253 google_oslogin_nss_cache[1559]: oslogin_cache_refresh[1559]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 4 15:41:09.561312 oslogin_cache_refresh[1559]: Failure getting groups, quitting
Sep 4 15:41:09.561323 oslogin_cache_refresh[1559]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 4 15:41:09.563552 systemd[1]: Starting update-engine.service - Update Engine...
Sep 4 15:41:09.564479 extend-filesystems[1558]: Checking size of /dev/vda9
Sep 4 15:41:09.569351 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 4 15:41:09.578909 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 4 15:41:09.582074 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 4 15:41:09.582346 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 4 15:41:09.582794 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Sep 4 15:41:09.584665 jq[1580]: true
Sep 4 15:41:09.584872 extend-filesystems[1558]: Resized partition /dev/vda9
Sep 4 15:41:09.584918 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Sep 4 15:41:09.587550 systemd[1]: motdgen.service: Deactivated successfully.
Sep 4 15:41:09.587795 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 4 15:41:09.590350 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 4 15:41:09.590668 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 4 15:41:09.596257 extend-filesystems[1589]: resize2fs 1.47.3 (8-Jul-2025)
Sep 4 15:41:09.599245 update_engine[1577]: I20250904 15:41:09.599002 1577 main.cc:92] Flatcar Update Engine starting
Sep 4 15:41:09.602798 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
Sep 4 15:41:09.618411 jq[1590]: true
Sep 4 15:41:09.633694 tar[1588]: linux-amd64/LICENSE
Sep 4 15:41:09.635644 (ntainerd)[1599]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 4 15:41:09.637831 tar[1588]: linux-amd64/helm
Sep 4 15:41:09.655232 kernel: EXT4-fs (vda9): resized filesystem to 1864699
Sep 4 15:41:09.667918 dbus-daemon[1555]: [system] SELinux support is enabled
Sep 4 15:41:09.668108 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 4 15:41:09.670755 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 4 15:41:09.670967 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 4 15:41:09.671131 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 4 15:41:09.671147 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 4 15:41:09.678256 extend-filesystems[1589]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Sep 4 15:41:09.678256 extend-filesystems[1589]: old_desc_blocks = 1, new_desc_blocks = 1
Sep 4 15:41:09.678256 extend-filesystems[1589]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
Sep 4 15:41:09.682259 update_engine[1577]: I20250904 15:41:09.681527 1577 update_check_scheduler.cc:74] Next update check in 7m37s
Sep 4 15:41:09.682741 systemd[1]: Started update-engine.service - Update Engine.
Sep 4 15:41:09.689234 bash[1619]: Updated "/home/core/.ssh/authorized_keys"
Sep 4 15:41:09.689383 extend-filesystems[1558]: Resized filesystem in /dev/vda9
Sep 4 15:41:09.688821 systemd-logind[1569]: Watching system buttons on /dev/input/event2 (Power Button)
Sep 4 15:41:09.688844 systemd-logind[1569]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Sep 4 15:41:09.690865 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 4 15:41:09.692458 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 4 15:41:09.695332 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 4 15:41:09.697421 systemd-logind[1569]: New seat seat0.
Sep 4 15:41:09.697879 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 4 15:41:09.698193 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 4 15:41:09.716240 kernel: EDAC MC: Ver: 3.0.0
Sep 4 15:41:09.784408 locksmithd[1621]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 4 15:41:09.859982 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 4 15:41:09.866080 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 4 15:41:09.974701 containerd[1599]: time="2025-09-04T15:41:09Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 4 15:41:09.975575 containerd[1599]: time="2025-09-04T15:41:09.975547346Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 4 15:41:09.987434 containerd[1599]: time="2025-09-04T15:41:09.987363351Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="13.616µs"
Sep 4 15:41:09.987434 containerd[1599]: time="2025-09-04T15:41:09.987410910Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 4 15:41:09.987434 containerd[1599]: time="2025-09-04T15:41:09.987433633Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 4 15:41:09.987698 containerd[1599]: time="2025-09-04T15:41:09.987665658Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 4 15:41:09.987698 containerd[1599]: time="2025-09-04T15:41:09.987689182Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 4 15:41:09.987761 containerd[1599]: time="2025-09-04T15:41:09.987716733Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 4 15:41:09.987848 containerd[1599]: time="2025-09-04T15:41:09.987823674Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 4 15:41:09.987848 containerd[1599]: time="2025-09-04T15:41:09.987840455Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 4 15:41:09.988242 containerd[1599]: time="2025-09-04T15:41:09.988192325Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 4 15:41:09.988242 containerd[1599]: time="2025-09-04T15:41:09.988237380Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 4 15:41:09.988294 containerd[1599]: time="2025-09-04T15:41:09.988281793Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 4 15:41:09.988294 containerd[1599]: time="2025-09-04T15:41:09.988291531Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 4 15:41:09.988439 containerd[1599]: time="2025-09-04T15:41:09.988409142Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 4 15:41:09.988780 containerd[1599]: time="2025-09-04T15:41:09.988752566Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 4 15:41:09.988814 containerd[1599]: time="2025-09-04T15:41:09.988789876Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 4 15:41:09.988814 containerd[1599]: time="2025-09-04T15:41:09.988799664Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 4 15:41:09.988876 containerd[1599]: time="2025-09-04T15:41:09.988854978Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 4 15:41:09.989140 containerd[1599]: time="2025-09-04T15:41:09.989108212Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 4 15:41:09.989220 containerd[1599]: time="2025-09-04T15:41:09.989195947Z" level=info msg="metadata content store policy set" policy=shared
Sep 4 15:41:09.997491 containerd[1599]: time="2025-09-04T15:41:09.997430577Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 4 15:41:09.997532 containerd[1599]: time="2025-09-04T15:41:09.997513162Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 4 15:41:09.997553 containerd[1599]: time="2025-09-04T15:41:09.997532137Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 4 15:41:09.997605 containerd[1599]: time="2025-09-04T15:41:09.997546294Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 4 15:41:09.997669 containerd[1599]: time="2025-09-04T15:41:09.997612378Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 4 15:41:09.997669 containerd[1599]: time="2025-09-04T15:41:09.997625182Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 4 15:41:09.997669 containerd[1599]: time="2025-09-04T15:41:09.997653936Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 4 15:41:09.997669 containerd[1599]: time="2025-09-04T15:41:09.997667180Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 4 15:41:09.997744 containerd[1599]: time="2025-09-04T15:41:09.997679163Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 4 15:41:09.997744 containerd[1599]: time="2025-09-04T15:41:09.997689402Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 4 15:41:09.997744 containerd[1599]: time="2025-09-04T15:41:09.997698770Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 4 15:41:09.997744 containerd[1599]: time="2025-09-04T15:41:09.997712295Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 4 15:41:09.998027 containerd[1599]: time="2025-09-04T15:41:09.997886532Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 4 15:41:09.998027 containerd[1599]: time="2025-09-04T15:41:09.997913352Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 4 15:41:09.998027 containerd[1599]: time="2025-09-04T15:41:09.997989375Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 4 15:41:09.998092 containerd[1599]: time="2025-09-04T15:41:09.998039188Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 4 15:41:09.998092 containerd[1599]: time="2025-09-04T15:41:09.998053014Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 4 15:41:09.998092 containerd[1599]: time="2025-09-04T15:41:09.998064075Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 4 15:41:09.998092 containerd[1599]: time="2025-09-04T15:41:09.998075185Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 4 15:41:09.998092 containerd[1599]: time="2025-09-04T15:41:09.998085405Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 4 15:41:09.998193 containerd[1599]: time="2025-09-04T15:41:09.998095343Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 4 15:41:09.998193 containerd[1599]: time="2025-09-04T15:41:09.998105602Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 4 15:41:09.998193 containerd[1599]: time="2025-09-04T15:41:09.998127263Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 4 15:41:09.998272 containerd[1599]: time="2025-09-04T15:41:09.998262437Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 4 15:41:09.998294 containerd[1599]: time="2025-09-04T15:41:09.998279258Z" level=info msg="Start snapshots syncer"
Sep 4 15:41:09.998344 containerd[1599]: time="2025-09-04T15:41:09.998335103Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 4 15:41:09.998674 containerd[1599]: time="2025-09-04T15:41:09.998625658Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 4 15:41:09.998868 containerd[1599]: time="2025-09-04T15:41:09.998684518Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 4 15:41:10.000815 containerd[1599]: time="2025-09-04T15:41:10.000789745Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 4 15:41:10.000945 containerd[1599]: time="2025-09-04T15:41:10.000914038Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 4 15:41:10.000945 containerd[1599]: time="2025-09-04T15:41:10.000938444Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 4 15:41:10.000993 containerd[1599]: time="2025-09-04T15:41:10.000949635Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 4 15:41:10.001014 containerd[1599]: time="2025-09-04T15:41:10.000998867Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 4 15:41:10.001045 containerd[1599]: time="2025-09-04T15:41:10.001028122Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 4 15:41:10.001066 containerd[1599]: time="2025-09-04T15:41:10.001047739Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 4 15:41:10.001066 containerd[1599]: time="2025-09-04T15:41:10.001059441Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 4 15:41:10.001107 containerd[1599]: time="2025-09-04T15:41:10.001098685Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 4 15:41:10.001129 containerd[1599]: time="2025-09-04T15:41:10.001110998Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 4 15:41:10.001129 containerd[1599]: time="2025-09-04T15:41:10.001122690Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 4 15:41:10.001170 containerd[1599]: time="2025-09-04T15:41:10.001161001Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 4 15:41:10.001286 containerd[1599]: time="2025-09-04T15:41:10.001231383Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 4 15:41:10.001286 containerd[1599]: time="2025-09-04T15:41:10.001249888Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 4 15:41:10.001286 containerd[1599]: time="2025-09-04T15:41:10.001260007Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 4 15:41:10.001286 containerd[1599]: time="2025-09-04T15:41:10.001267681Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 4 15:41:10.001286 containerd[1599]: time="2025-09-04T15:41:10.001276608Z" level=info msg="loading plugin"
id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 4 15:41:10.001286 containerd[1599]: time="2025-09-04T15:41:10.001287238Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 4 15:41:10.001433 containerd[1599]: time="2025-09-04T15:41:10.001306444Z" level=info msg="runtime interface created" Sep 4 15:41:10.001433 containerd[1599]: time="2025-09-04T15:41:10.001312045Z" level=info msg="created NRI interface" Sep 4 15:41:10.001433 containerd[1599]: time="2025-09-04T15:41:10.001319448Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 4 15:41:10.001433 containerd[1599]: time="2025-09-04T15:41:10.001330299Z" level=info msg="Connect containerd service" Sep 4 15:41:10.001433 containerd[1599]: time="2025-09-04T15:41:10.001352530Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 4 15:41:10.002591 containerd[1599]: time="2025-09-04T15:41:10.002556819Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 4 15:41:10.080808 sshd_keygen[1584]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 4 15:41:10.109235 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 4 15:41:10.113053 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 4 15:41:10.136167 systemd[1]: issuegen.service: Deactivated successfully. Sep 4 15:41:10.138060 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 4 15:41:10.145357 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Sep 4 15:41:10.147658 tar[1588]: linux-amd64/README.md Sep 4 15:41:10.176861 containerd[1599]: time="2025-09-04T15:41:10.176653870Z" level=info msg="Start subscribing containerd event" Sep 4 15:41:10.176861 containerd[1599]: time="2025-09-04T15:41:10.176760080Z" level=info msg="Start recovering state" Sep 4 15:41:10.176861 containerd[1599]: time="2025-09-04T15:41:10.176862912Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 4 15:41:10.177086 containerd[1599]: time="2025-09-04T15:41:10.177065142Z" level=info msg="Start event monitor" Sep 4 15:41:10.177115 containerd[1599]: time="2025-09-04T15:41:10.177089748Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 4 15:41:10.177171 containerd[1599]: time="2025-09-04T15:41:10.177126557Z" level=info msg="Start cni network conf syncer for default" Sep 4 15:41:10.177171 containerd[1599]: time="2025-09-04T15:41:10.177142537Z" level=info msg="Start streaming server" Sep 4 15:41:10.177171 containerd[1599]: time="2025-09-04T15:41:10.177155782Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 4 15:41:10.177245 containerd[1599]: time="2025-09-04T15:41:10.177175579Z" level=info msg="runtime interface starting up..." Sep 4 15:41:10.177245 containerd[1599]: time="2025-09-04T15:41:10.177182862Z" level=info msg="starting plugins..." Sep 4 15:41:10.177245 containerd[1599]: time="2025-09-04T15:41:10.177202750Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 4 15:41:10.177854 containerd[1599]: time="2025-09-04T15:41:10.177806031Z" level=info msg="containerd successfully booted in 0.203807s" Sep 4 15:41:10.178119 systemd[1]: Started containerd.service - containerd container runtime. Sep 4 15:41:10.203033 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 4 15:41:10.210024 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 4 15:41:10.213028 systemd[1]: Started getty@tty1.service - Getty on tty1. 
Sep 4 15:41:10.215197 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 4 15:41:10.216573 systemd[1]: Reached target getty.target - Login Prompts. Sep 4 15:41:10.575487 systemd-networkd[1489]: eth0: Gained IPv6LL Sep 4 15:41:10.578921 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 4 15:41:10.580728 systemd[1]: Reached target network-online.target - Network is Online. Sep 4 15:41:10.583350 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Sep 4 15:41:10.585776 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 15:41:10.610681 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 4 15:41:10.630005 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 4 15:41:10.630322 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Sep 4 15:41:10.631905 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 4 15:41:10.635396 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 4 15:41:11.110839 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 4 15:41:11.114531 systemd[1]: Started sshd@0-10.0.0.9:22-10.0.0.1:41488.service - OpenSSH per-connection server daemon (10.0.0.1:41488). Sep 4 15:41:11.218412 sshd[1694]: Accepted publickey for core from 10.0.0.1 port 41488 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:41:11.220686 sshd-session[1694]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:41:11.227345 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 4 15:41:11.229522 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 4 15:41:11.237170 systemd-logind[1569]: New session 1 of user core. 
Sep 4 15:41:11.251846 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 4 15:41:11.275609 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 4 15:41:11.297544 (systemd)[1699]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 4 15:41:11.300071 systemd-logind[1569]: New session c1 of user core. Sep 4 15:41:11.461021 systemd[1699]: Queued start job for default target default.target. Sep 4 15:41:11.478513 systemd[1699]: Created slice app.slice - User Application Slice. Sep 4 15:41:11.478541 systemd[1699]: Reached target paths.target - Paths. Sep 4 15:41:11.478584 systemd[1699]: Reached target timers.target - Timers. Sep 4 15:41:11.480142 systemd[1699]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 4 15:41:11.517564 systemd[1699]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 4 15:41:11.517739 systemd[1699]: Reached target sockets.target - Sockets. Sep 4 15:41:11.517801 systemd[1699]: Reached target basic.target - Basic System. Sep 4 15:41:11.517855 systemd[1699]: Reached target default.target - Main User Target. Sep 4 15:41:11.517895 systemd[1699]: Startup finished in 209ms. Sep 4 15:41:11.518302 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 4 15:41:11.526348 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 4 15:41:11.592103 systemd[1]: Started sshd@1-10.0.0.9:22-10.0.0.1:41500.service - OpenSSH per-connection server daemon (10.0.0.1:41500). Sep 4 15:41:11.657321 sshd[1710]: Accepted publickey for core from 10.0.0.1 port 41500 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:41:11.658730 sshd-session[1710]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:41:11.663107 systemd-logind[1569]: New session 2 of user core. Sep 4 15:41:11.676366 systemd[1]: Started session-2.scope - Session 2 of User core. 
Sep 4 15:41:11.678553 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 15:41:11.681932 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 4 15:41:11.683251 systemd[1]: Startup finished in 3.022s (kernel) + 8.805s (initrd) + 4.702s (userspace) = 16.530s. Sep 4 15:41:11.683811 (kubelet)[1717]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 15:41:11.737405 sshd[1719]: Connection closed by 10.0.0.1 port 41500 Sep 4 15:41:11.739493 sshd-session[1710]: pam_unix(sshd:session): session closed for user core Sep 4 15:41:11.748817 systemd[1]: sshd@1-10.0.0.9:22-10.0.0.1:41500.service: Deactivated successfully. Sep 4 15:41:11.750726 systemd[1]: session-2.scope: Deactivated successfully. Sep 4 15:41:11.751786 systemd-logind[1569]: Session 2 logged out. Waiting for processes to exit. Sep 4 15:41:11.754588 systemd[1]: Started sshd@2-10.0.0.9:22-10.0.0.1:41510.service - OpenSSH per-connection server daemon (10.0.0.1:41510). Sep 4 15:41:11.755496 systemd-logind[1569]: Removed session 2. Sep 4 15:41:11.818368 sshd[1729]: Accepted publickey for core from 10.0.0.1 port 41510 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:41:11.819731 sshd-session[1729]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:41:11.825510 systemd-logind[1569]: New session 3 of user core. Sep 4 15:41:11.832347 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 4 15:41:11.888958 sshd[1737]: Connection closed by 10.0.0.1 port 41510 Sep 4 15:41:11.890118 sshd-session[1729]: pam_unix(sshd:session): session closed for user core Sep 4 15:41:11.901991 systemd[1]: sshd@2-10.0.0.9:22-10.0.0.1:41510.service: Deactivated successfully. Sep 4 15:41:11.903845 systemd[1]: session-3.scope: Deactivated successfully. Sep 4 15:41:11.904801 systemd-logind[1569]: Session 3 logged out. Waiting for processes to exit. 
Sep 4 15:41:11.907771 systemd[1]: Started sshd@3-10.0.0.9:22-10.0.0.1:41524.service - OpenSSH per-connection server daemon (10.0.0.1:41524). Sep 4 15:41:11.908716 systemd-logind[1569]: Removed session 3. Sep 4 15:41:11.964267 sshd[1743]: Accepted publickey for core from 10.0.0.1 port 41524 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:41:11.965520 sshd-session[1743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:41:11.969907 systemd-logind[1569]: New session 4 of user core. Sep 4 15:41:11.979375 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 4 15:41:12.036737 sshd[1747]: Connection closed by 10.0.0.1 port 41524 Sep 4 15:41:12.037095 sshd-session[1743]: pam_unix(sshd:session): session closed for user core Sep 4 15:41:12.056908 systemd[1]: sshd@3-10.0.0.9:22-10.0.0.1:41524.service: Deactivated successfully. Sep 4 15:41:12.058918 systemd[1]: session-4.scope: Deactivated successfully. Sep 4 15:41:12.059767 systemd-logind[1569]: Session 4 logged out. Waiting for processes to exit. Sep 4 15:41:12.062830 systemd[1]: Started sshd@4-10.0.0.9:22-10.0.0.1:41540.service - OpenSSH per-connection server daemon (10.0.0.1:41540). Sep 4 15:41:12.063968 systemd-logind[1569]: Removed session 4. Sep 4 15:41:12.129557 sshd[1753]: Accepted publickey for core from 10.0.0.1 port 41540 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:41:12.130972 sshd-session[1753]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:41:12.135841 systemd-logind[1569]: New session 5 of user core. Sep 4 15:41:12.145390 systemd[1]: Started session-5.scope - Session 5 of User core. 
Sep 4 15:41:12.170223 kubelet[1717]: E0904 15:41:12.170157 1717 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 15:41:12.174361 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 15:41:12.174580 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 15:41:12.174996 systemd[1]: kubelet.service: Consumed 1.384s CPU time, 265.1M memory peak. Sep 4 15:41:12.207342 sudo[1758]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 4 15:41:12.207690 sudo[1758]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 15:41:12.222879 sudo[1758]: pam_unix(sudo:session): session closed for user root Sep 4 15:41:12.224736 sshd[1756]: Connection closed by 10.0.0.1 port 41540 Sep 4 15:41:12.225158 sshd-session[1753]: pam_unix(sshd:session): session closed for user core Sep 4 15:41:12.239943 systemd[1]: sshd@4-10.0.0.9:22-10.0.0.1:41540.service: Deactivated successfully. Sep 4 15:41:12.241821 systemd[1]: session-5.scope: Deactivated successfully. Sep 4 15:41:12.242565 systemd-logind[1569]: Session 5 logged out. Waiting for processes to exit. Sep 4 15:41:12.245542 systemd[1]: Started sshd@5-10.0.0.9:22-10.0.0.1:41544.service - OpenSSH per-connection server daemon (10.0.0.1:41544). Sep 4 15:41:12.246084 systemd-logind[1569]: Removed session 5. Sep 4 15:41:12.314944 sshd[1764]: Accepted publickey for core from 10.0.0.1 port 41544 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:41:12.316149 sshd-session[1764]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:41:12.320663 systemd-logind[1569]: New session 6 of user core. 
Sep 4 15:41:12.330562 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 4 15:41:12.388949 sudo[1769]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 4 15:41:12.389355 sudo[1769]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 15:41:12.397465 sudo[1769]: pam_unix(sudo:session): session closed for user root Sep 4 15:41:12.405500 sudo[1768]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 4 15:41:12.405864 sudo[1768]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 15:41:12.417835 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 4 15:41:12.479376 augenrules[1791]: No rules Sep 4 15:41:12.481499 systemd[1]: audit-rules.service: Deactivated successfully. Sep 4 15:41:12.481846 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 4 15:41:12.483253 sudo[1768]: pam_unix(sudo:session): session closed for user root Sep 4 15:41:12.484854 sshd[1767]: Connection closed by 10.0.0.1 port 41544 Sep 4 15:41:12.485179 sshd-session[1764]: pam_unix(sshd:session): session closed for user core Sep 4 15:41:12.499719 systemd[1]: sshd@5-10.0.0.9:22-10.0.0.1:41544.service: Deactivated successfully. Sep 4 15:41:12.501979 systemd[1]: session-6.scope: Deactivated successfully. Sep 4 15:41:12.502915 systemd-logind[1569]: Session 6 logged out. Waiting for processes to exit. Sep 4 15:41:12.505774 systemd[1]: Started sshd@6-10.0.0.9:22-10.0.0.1:41556.service - OpenSSH per-connection server daemon (10.0.0.1:41556). Sep 4 15:41:12.506700 systemd-logind[1569]: Removed session 6. 
Sep 4 15:41:12.565774 sshd[1800]: Accepted publickey for core from 10.0.0.1 port 41556 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:41:12.567059 sshd-session[1800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:41:12.571646 systemd-logind[1569]: New session 7 of user core. Sep 4 15:41:12.585390 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 4 15:41:12.639716 sudo[1804]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 4 15:41:12.640095 sudo[1804]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 15:41:13.109180 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 4 15:41:13.126653 (dockerd)[1824]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 4 15:41:13.439517 dockerd[1824]: time="2025-09-04T15:41:13.439353191Z" level=info msg="Starting up" Sep 4 15:41:13.440302 dockerd[1824]: time="2025-09-04T15:41:13.440277294Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 4 15:41:13.458042 dockerd[1824]: time="2025-09-04T15:41:13.457985396Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 4 15:41:13.929333 dockerd[1824]: time="2025-09-04T15:41:13.929154739Z" level=info msg="Loading containers: start." Sep 4 15:41:13.941239 kernel: Initializing XFRM netlink socket Sep 4 15:41:14.209019 systemd-networkd[1489]: docker0: Link UP Sep 4 15:41:14.215027 dockerd[1824]: time="2025-09-04T15:41:14.214978378Z" level=info msg="Loading containers: done." Sep 4 15:41:14.291056 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3568621785-merged.mount: Deactivated successfully. 
Sep 4 15:41:14.293386 dockerd[1824]: time="2025-09-04T15:41:14.293327783Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 4 15:41:14.293475 dockerd[1824]: time="2025-09-04T15:41:14.293424284Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 4 15:41:14.293538 dockerd[1824]: time="2025-09-04T15:41:14.293520574Z" level=info msg="Initializing buildkit" Sep 4 15:41:14.328569 dockerd[1824]: time="2025-09-04T15:41:14.328532732Z" level=info msg="Completed buildkit initialization" Sep 4 15:41:14.334347 dockerd[1824]: time="2025-09-04T15:41:14.334307991Z" level=info msg="Daemon has completed initialization" Sep 4 15:41:14.334437 dockerd[1824]: time="2025-09-04T15:41:14.334362052Z" level=info msg="API listen on /run/docker.sock" Sep 4 15:41:14.334721 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 4 15:41:15.323532 containerd[1599]: time="2025-09-04T15:41:15.323464539Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\"" Sep 4 15:41:16.372655 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2110910675.mount: Deactivated successfully. 
Sep 4 15:41:17.514436 containerd[1599]: time="2025-09-04T15:41:17.514367021Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:17.515084 containerd[1599]: time="2025-09-04T15:41:17.515029272Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.8: active requests=0, bytes read=28800687" Sep 4 15:41:17.516221 containerd[1599]: time="2025-09-04T15:41:17.516172276Z" level=info msg="ImageCreate event name:\"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:17.518800 containerd[1599]: time="2025-09-04T15:41:17.518763123Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:17.519934 containerd[1599]: time="2025-09-04T15:41:17.519899955Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.8\" with image id \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\", size \"28797487\" in 2.196380483s" Sep 4 15:41:17.519991 containerd[1599]: time="2025-09-04T15:41:17.519933678Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\" returns image reference \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\"" Sep 4 15:41:17.520565 containerd[1599]: time="2025-09-04T15:41:17.520539544Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\"" Sep 4 15:41:18.864244 containerd[1599]: time="2025-09-04T15:41:18.864163138Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.8\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:18.865073 containerd[1599]: time="2025-09-04T15:41:18.864993876Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.8: active requests=0, bytes read=24784128" Sep 4 15:41:18.866105 containerd[1599]: time="2025-09-04T15:41:18.866044365Z" level=info msg="ImageCreate event name:\"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:18.868681 containerd[1599]: time="2025-09-04T15:41:18.868617450Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:18.869559 containerd[1599]: time="2025-09-04T15:41:18.869516716Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.8\" with image id \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\", size \"26387322\" in 1.348948828s" Sep 4 15:41:18.869621 containerd[1599]: time="2025-09-04T15:41:18.869560248Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\" returns image reference \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\"" Sep 4 15:41:18.870122 containerd[1599]: time="2025-09-04T15:41:18.870089871Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\"" Sep 4 15:41:20.214058 containerd[1599]: time="2025-09-04T15:41:20.213940320Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:20.214901 containerd[1599]: time="2025-09-04T15:41:20.214782749Z" level=info 
msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.8: active requests=0, bytes read=19175036" Sep 4 15:41:20.216324 containerd[1599]: time="2025-09-04T15:41:20.216182634Z" level=info msg="ImageCreate event name:\"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:20.219986 containerd[1599]: time="2025-09-04T15:41:20.219940761Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:20.220882 containerd[1599]: time="2025-09-04T15:41:20.220837141Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.8\" with image id \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\", size \"20778248\" in 1.350714489s" Sep 4 15:41:20.220882 containerd[1599]: time="2025-09-04T15:41:20.220874922Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\" returns image reference \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\"" Sep 4 15:41:20.221450 containerd[1599]: time="2025-09-04T15:41:20.221345825Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\"" Sep 4 15:41:21.323175 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount864732262.mount: Deactivated successfully. 
Sep 4 15:41:21.840961 containerd[1599]: time="2025-09-04T15:41:21.840899548Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:21.841839 containerd[1599]: time="2025-09-04T15:41:21.841807410Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.8: active requests=0, bytes read=30897170" Sep 4 15:41:21.842957 containerd[1599]: time="2025-09-04T15:41:21.842926218Z" level=info msg="ImageCreate event name:\"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:21.844894 containerd[1599]: time="2025-09-04T15:41:21.844848773Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:21.845575 containerd[1599]: time="2025-09-04T15:41:21.845525802Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.8\" with image id \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\", repo tag \"registry.k8s.io/kube-proxy:v1.32.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\", size \"30896189\" in 1.624133229s" Sep 4 15:41:21.845617 containerd[1599]: time="2025-09-04T15:41:21.845571588Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\" returns image reference \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\"" Sep 4 15:41:21.846040 containerd[1599]: time="2025-09-04T15:41:21.846017614Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 4 15:41:22.258056 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 4 15:41:22.259751 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Sep 4 15:41:22.508528 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 15:41:22.513132 (kubelet)[2123]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 15:41:22.555920 kubelet[2123]: E0904 15:41:22.555842 2123 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 15:41:22.562737 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 15:41:22.563001 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 15:41:22.563432 systemd[1]: kubelet.service: Consumed 247ms CPU time, 108.3M memory peak. Sep 4 15:41:22.669269 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3538082570.mount: Deactivated successfully. 
Sep 4 15:41:23.321085 containerd[1599]: time="2025-09-04T15:41:23.320998002Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:23.321852 containerd[1599]: time="2025-09-04T15:41:23.321806288Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Sep 4 15:41:23.323284 containerd[1599]: time="2025-09-04T15:41:23.323258310Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:23.325849 containerd[1599]: time="2025-09-04T15:41:23.325794466Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:23.326721 containerd[1599]: time="2025-09-04T15:41:23.326693211Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.480649368s" Sep 4 15:41:23.326776 containerd[1599]: time="2025-09-04T15:41:23.326724319Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 4 15:41:23.327531 containerd[1599]: time="2025-09-04T15:41:23.327445682Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 4 15:41:23.830571 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3587078687.mount: Deactivated successfully. 
Sep 4 15:41:23.836616 containerd[1599]: time="2025-09-04T15:41:23.836575009Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 4 15:41:23.837313 containerd[1599]: time="2025-09-04T15:41:23.837249934Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Sep 4 15:41:23.838620 containerd[1599]: time="2025-09-04T15:41:23.838575810Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 4 15:41:23.840540 containerd[1599]: time="2025-09-04T15:41:23.840501952Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 4 15:41:23.841270 containerd[1599]: time="2025-09-04T15:41:23.841235197Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 513.710888ms"
Sep 4 15:41:23.841270 containerd[1599]: time="2025-09-04T15:41:23.841269551Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 4 15:41:23.841773 containerd[1599]: time="2025-09-04T15:41:23.841733481Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Sep 4 15:41:24.381438 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2002031713.mount: Deactivated successfully.
Sep 4 15:41:26.454181 containerd[1599]: time="2025-09-04T15:41:26.454092163Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:41:26.454866 containerd[1599]: time="2025-09-04T15:41:26.454682830Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682056"
Sep 4 15:41:26.456169 containerd[1599]: time="2025-09-04T15:41:26.456112782Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:41:26.461621 containerd[1599]: time="2025-09-04T15:41:26.461570194Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:41:26.462811 containerd[1599]: time="2025-09-04T15:41:26.462732964Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 2.620946293s"
Sep 4 15:41:26.462811 containerd[1599]: time="2025-09-04T15:41:26.462796263Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\""
Sep 4 15:41:29.883608 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 15:41:29.883823 systemd[1]: kubelet.service: Consumed 247ms CPU time, 108.3M memory peak.
Sep 4 15:41:29.886149 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 15:41:29.911337 systemd[1]: Reload requested from client PID 2271 ('systemctl') (unit session-7.scope)...
Sep 4 15:41:29.911354 systemd[1]: Reloading...
Sep 4 15:41:29.992260 zram_generator::config[2314]: No configuration found.
Sep 4 15:41:30.283139 systemd[1]: Reloading finished in 371 ms.
Sep 4 15:41:30.345847 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 4 15:41:30.345950 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 4 15:41:30.346331 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 15:41:30.346380 systemd[1]: kubelet.service: Consumed 156ms CPU time, 98.4M memory peak.
Sep 4 15:41:30.347994 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 15:41:30.527828 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 15:41:30.532746 (kubelet)[2361]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 4 15:41:30.579611 kubelet[2361]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 4 15:41:30.579611 kubelet[2361]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 4 15:41:30.579611 kubelet[2361]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 4 15:41:30.579986 kubelet[2361]: I0904 15:41:30.579592 2361 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 4 15:41:30.839441 kubelet[2361]: I0904 15:41:30.839318 2361 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 4 15:41:30.839441 kubelet[2361]: I0904 15:41:30.839349 2361 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 4 15:41:30.839676 kubelet[2361]: I0904 15:41:30.839648 2361 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 4 15:41:30.864525 kubelet[2361]: I0904 15:41:30.864485 2361 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 4 15:41:30.864639 kubelet[2361]: E0904 15:41:30.864535 2361 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.9:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.9:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:41:30.871652 kubelet[2361]: I0904 15:41:30.871599 2361 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 4 15:41:30.877252 kubelet[2361]: I0904 15:41:30.877221 2361 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 4 15:41:30.879280 kubelet[2361]: I0904 15:41:30.879225 2361 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 4 15:41:30.879438 kubelet[2361]: I0904 15:41:30.879261 2361 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 4 15:41:30.879549 kubelet[2361]: I0904 15:41:30.879451 2361 topology_manager.go:138] "Creating topology manager with none policy"
Sep 4 15:41:30.879549 kubelet[2361]: I0904 15:41:30.879460 2361 container_manager_linux.go:304] "Creating device plugin manager"
Sep 4 15:41:30.879662 kubelet[2361]: I0904 15:41:30.879602 2361 state_mem.go:36] "Initialized new in-memory state store"
Sep 4 15:41:30.882593 kubelet[2361]: I0904 15:41:30.882458 2361 kubelet.go:446] "Attempting to sync node with API server"
Sep 4 15:41:30.882593 kubelet[2361]: I0904 15:41:30.882488 2361 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 4 15:41:30.882593 kubelet[2361]: I0904 15:41:30.882516 2361 kubelet.go:352] "Adding apiserver pod source"
Sep 4 15:41:30.882593 kubelet[2361]: I0904 15:41:30.882528 2361 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 4 15:41:30.885757 kubelet[2361]: I0904 15:41:30.885727 2361 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 4 15:41:30.886716 kubelet[2361]: W0904 15:41:30.886003 2361 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.9:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.9:6443: connect: connection refused
Sep 4 15:41:30.886716 kubelet[2361]: W0904 15:41:30.886045 2361 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.9:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.9:6443: connect: connection refused
Sep 4 15:41:30.886716 kubelet[2361]: E0904 15:41:30.886094 2361 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.9:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.9:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:41:30.886716 kubelet[2361]: E0904 15:41:30.886096 2361 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.9:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.9:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:41:30.886716 kubelet[2361]: I0904 15:41:30.886154 2361 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 4 15:41:30.886716 kubelet[2361]: W0904 15:41:30.886706 2361 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 4 15:41:30.888514 kubelet[2361]: I0904 15:41:30.888482 2361 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 4 15:41:30.888576 kubelet[2361]: I0904 15:41:30.888524 2361 server.go:1287] "Started kubelet"
Sep 4 15:41:30.890836 kubelet[2361]: I0904 15:41:30.890668 2361 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 4 15:41:30.890836 kubelet[2361]: I0904 15:41:30.890707 2361 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 4 15:41:30.891057 kubelet[2361]: I0904 15:41:30.891035 2361 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 4 15:41:30.891114 kubelet[2361]: I0904 15:41:30.891059 2361 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 4 15:41:30.891928 kubelet[2361]: I0904 15:41:30.891895 2361 server.go:479] "Adding debug handlers to kubelet server"
Sep 4 15:41:30.893318 kubelet[2361]: I0904 15:41:30.892895 2361 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 4 15:41:30.894599 kubelet[2361]: E0904 15:41:30.893927 2361 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 4 15:41:30.894763 kubelet[2361]: E0904 15:41:30.894736 2361 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 4 15:41:30.894821 kubelet[2361]: I0904 15:41:30.894772 2361 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 4 15:41:30.894977 kubelet[2361]: I0904 15:41:30.894943 2361 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 4 15:41:30.895796 kubelet[2361]: I0904 15:41:30.895024 2361 reconciler.go:26] "Reconciler: start to sync state"
Sep 4 15:41:30.895796 kubelet[2361]: W0904 15:41:30.895357 2361 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.9:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.9:6443: connect: connection refused
Sep 4 15:41:30.895796 kubelet[2361]: E0904 15:41:30.895404 2361 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.9:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.9:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:41:30.895796 kubelet[2361]: I0904 15:41:30.895566 2361 factory.go:221] Registration of the systemd container factory successfully
Sep 4 15:41:30.895796 kubelet[2361]: I0904 15:41:30.895644 2361 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 4 15:41:30.896416 kubelet[2361]: I0904 15:41:30.896389 2361 factory.go:221] Registration of the containerd container factory successfully
Sep 4 15:41:30.896561 kubelet[2361]: E0904 15:41:30.896494 2361 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.9:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.9:6443: connect: connection refused" interval="200ms"
Sep 4 15:41:30.898257 kubelet[2361]: E0904 15:41:30.896240 2361 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.9:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.9:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.18621ea33198925e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-04 15:41:30.888499806 +0000 UTC m=+0.350915237,LastTimestamp:2025-09-04 15:41:30.888499806 +0000 UTC m=+0.350915237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 4 15:41:30.916262 kubelet[2361]: I0904 15:41:30.915191 2361 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 4 15:41:30.916262 kubelet[2361]: I0904 15:41:30.915291 2361 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 4 15:41:30.916262 kubelet[2361]: I0904 15:41:30.915303 2361 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 4 15:41:30.916262 kubelet[2361]: I0904 15:41:30.915321 2361 state_mem.go:36] "Initialized new in-memory state store"
Sep 4 15:41:30.916768 kubelet[2361]: I0904 15:41:30.916743 2361 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 4 15:41:30.916828 kubelet[2361]: I0904 15:41:30.916776 2361 status_manager.go:227] "Starting to sync pod status with apiserver"
Sep 4 15:41:30.916828 kubelet[2361]: I0904 15:41:30.916803 2361 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 4 15:41:30.916828 kubelet[2361]: I0904 15:41:30.916815 2361 kubelet.go:2382] "Starting kubelet main sync loop"
Sep 4 15:41:30.916939 kubelet[2361]: E0904 15:41:30.916874 2361 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 4 15:41:30.995734 kubelet[2361]: E0904 15:41:30.995682 2361 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 4 15:41:31.017992 kubelet[2361]: E0904 15:41:31.017952 2361 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 4 15:41:31.095992 kubelet[2361]: E0904 15:41:31.095877 2361 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 4 15:41:31.097402 kubelet[2361]: E0904 15:41:31.097357 2361 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.9:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.9:6443: connect: connection refused" interval="400ms"
Sep 4 15:41:31.196623 kubelet[2361]: E0904 15:41:31.196576 2361 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 4 15:41:31.207315 kubelet[2361]: I0904 15:41:31.207286 2361 policy_none.go:49] "None policy: Start"
Sep 4 15:41:31.207315 kubelet[2361]: I0904 15:41:31.207307 2361 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 4 15:41:31.207412 kubelet[2361]: I0904 15:41:31.207326 2361 state_mem.go:35] "Initializing new in-memory state store"
Sep 4 15:41:31.208287 kubelet[2361]: W0904 15:41:31.208227 2361 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.9:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.9:6443: connect: connection refused
Sep 4 15:41:31.208287 kubelet[2361]: E0904 15:41:31.208277 2361 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.9:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.9:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:41:31.213147 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 4 15:41:31.219054 kubelet[2361]: E0904 15:41:31.219027 2361 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 4 15:41:31.227110 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 4 15:41:31.230115 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 4 15:41:31.254052 kubelet[2361]: I0904 15:41:31.253992 2361 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 4 15:41:31.254231 kubelet[2361]: I0904 15:41:31.254197 2361 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 4 15:41:31.254294 kubelet[2361]: I0904 15:41:31.254231 2361 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 4 15:41:31.254519 kubelet[2361]: I0904 15:41:31.254452 2361 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 4 15:41:31.255472 kubelet[2361]: E0904 15:41:31.255449 2361 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 4 15:41:31.255524 kubelet[2361]: E0904 15:41:31.255500 2361 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Sep 4 15:41:31.355501 kubelet[2361]: I0904 15:41:31.355407 2361 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 4 15:41:31.355835 kubelet[2361]: E0904 15:41:31.355805 2361 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.9:6443/api/v1/nodes\": dial tcp 10.0.0.9:6443: connect: connection refused" node="localhost"
Sep 4 15:41:31.498580 kubelet[2361]: E0904 15:41:31.498544 2361 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.9:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.9:6443: connect: connection refused" interval="800ms"
Sep 4 15:41:31.557875 kubelet[2361]: I0904 15:41:31.557834 2361 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 4 15:41:31.558195 kubelet[2361]: E0904 15:41:31.558160 2361 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.9:6443/api/v1/nodes\": dial tcp 10.0.0.9:6443: connect: connection refused" node="localhost"
Sep 4 15:41:31.628579 systemd[1]: Created slice kubepods-burstable-pod2a87d88508b8cef7d49198c6d84b4457.slice - libcontainer container kubepods-burstable-pod2a87d88508b8cef7d49198c6d84b4457.slice.
Sep 4 15:41:31.646848 kubelet[2361]: E0904 15:41:31.646815 2361 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 4 15:41:31.648806 systemd[1]: Created slice kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice - libcontainer container kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice.
Sep 4 15:41:31.664609 kubelet[2361]: E0904 15:41:31.664552 2361 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 4 15:41:31.667610 systemd[1]: Created slice kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice - libcontainer container kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice.
Sep 4 15:41:31.670234 kubelet[2361]: E0904 15:41:31.670188 2361 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 4 15:41:31.698866 kubelet[2361]: I0904 15:41:31.698835 2361 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2a87d88508b8cef7d49198c6d84b4457-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"2a87d88508b8cef7d49198c6d84b4457\") " pod="kube-system/kube-apiserver-localhost"
Sep 4 15:41:31.698997 kubelet[2361]: I0904 15:41:31.698878 2361 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 4 15:41:31.698997 kubelet[2361]: I0904 15:41:31.698903 2361 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 4 15:41:31.698997 kubelet[2361]: I0904 15:41:31.698944 2361 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost"
Sep 4 15:41:31.698997 kubelet[2361]: I0904 15:41:31.698980 2361 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2a87d88508b8cef7d49198c6d84b4457-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"2a87d88508b8cef7d49198c6d84b4457\") " pod="kube-system/kube-apiserver-localhost"
Sep 4 15:41:31.699139 kubelet[2361]: W0904 15:41:31.698981 2361 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.9:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.9:6443: connect: connection refused
Sep 4 15:41:31.699139 kubelet[2361]: I0904 15:41:31.699009 2361 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2a87d88508b8cef7d49198c6d84b4457-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"2a87d88508b8cef7d49198c6d84b4457\") " pod="kube-system/kube-apiserver-localhost"
Sep 4 15:41:31.699139 kubelet[2361]: I0904 15:41:31.699030 2361 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 4 15:41:31.699139 kubelet[2361]: I0904 15:41:31.699049 2361 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 4 15:41:31.699139 kubelet[2361]: E0904 15:41:31.699039 2361 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.9:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.9:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:41:31.699330 kubelet[2361]: I0904 15:41:31.699075 2361 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 4 15:41:31.709617 kubelet[2361]: W0904 15:41:31.709569 2361 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.9:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.9:6443: connect: connection refused
Sep 4 15:41:31.709617 kubelet[2361]: E0904 15:41:31.709607 2361 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.9:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.9:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:41:31.797796 kubelet[2361]: W0904 15:41:31.797712 2361 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.9:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.9:6443: connect: connection refused
Sep 4 15:41:31.797796 kubelet[2361]: E0904 15:41:31.797791 2361 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.9:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.9:6443: connect: connection refused" logger="UnhandledError"
Sep 4 15:41:31.948616 kubelet[2361]: E0904 15:41:31.948473 2361 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:31.949432 containerd[1599]: time="2025-09-04T15:41:31.949380828Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:2a87d88508b8cef7d49198c6d84b4457,Namespace:kube-system,Attempt:0,}"
Sep 4 15:41:31.960305 kubelet[2361]: I0904 15:41:31.960260 2361 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 4 15:41:31.960797 kubelet[2361]: E0904 15:41:31.960757 2361 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.9:6443/api/v1/nodes\": dial tcp 10.0.0.9:6443: connect: connection refused" node="localhost"
Sep 4 15:41:31.966052 kubelet[2361]: E0904 15:41:31.966018 2361 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:31.966582 containerd[1599]: time="2025-09-04T15:41:31.966540422Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,}"
Sep 4 15:41:31.971031 kubelet[2361]: E0904 15:41:31.970987 2361 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:31.971660 containerd[1599]: time="2025-09-04T15:41:31.971602123Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,}"
Sep 4 15:41:32.299687 kubelet[2361]: E0904 15:41:32.299628 2361 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.9:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.9:6443: connect: connection refused" interval="1.6s"
Sep 4 15:41:32.333157 containerd[1599]: time="2025-09-04T15:41:32.333103488Z" level=info msg="connecting to shim ddd220898a9795e02a38eec037c31f8a9689f8d1d8f8c1abf7efdaad1e7905ae" address="unix:///run/containerd/s/3cb8fae0c6f8ebe3cdc350181081321475e93d026fecc2b375a771d8773d6ed7" namespace=k8s.io protocol=ttrpc version=3
Sep 4 15:41:32.379642 containerd[1599]: time="2025-09-04T15:41:32.379572016Z" level=info msg="connecting to shim d606c01741fa3594eeb57ddb72bae60bc6f3891a98765b509a6798ba602b82b6" address="unix:///run/containerd/s/70cb4b14eed8ff9509c837e53077badbd846b59c6e72d05d70ecbc23a6b943b9" namespace=k8s.io protocol=ttrpc version=3
Sep 4 15:41:32.389166 containerd[1599]: time="2025-09-04T15:41:32.389089111Z" level=info msg="connecting to shim e738c999dd65779267802d55ee1c7acc47eadc7f7fb29128306454de52246665" address="unix:///run/containerd/s/a24f0196dec82dfed56b0de8060bd7822c9be55623e80d41317251c9f0a6e5a4" namespace=k8s.io protocol=ttrpc version=3
Sep 4 15:41:32.412566 systemd[1]: Started cri-containerd-ddd220898a9795e02a38eec037c31f8a9689f8d1d8f8c1abf7efdaad1e7905ae.scope - libcontainer container ddd220898a9795e02a38eec037c31f8a9689f8d1d8f8c1abf7efdaad1e7905ae.
Sep 4 15:41:32.434336 systemd[1]: Started cri-containerd-e738c999dd65779267802d55ee1c7acc47eadc7f7fb29128306454de52246665.scope - libcontainer container e738c999dd65779267802d55ee1c7acc47eadc7f7fb29128306454de52246665.
Sep 4 15:41:32.438106 systemd[1]: Started cri-containerd-d606c01741fa3594eeb57ddb72bae60bc6f3891a98765b509a6798ba602b82b6.scope - libcontainer container d606c01741fa3594eeb57ddb72bae60bc6f3891a98765b509a6798ba602b82b6.
Sep 4 15:41:32.553368 containerd[1599]: time="2025-09-04T15:41:32.515903197Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,} returns sandbox id \"e738c999dd65779267802d55ee1c7acc47eadc7f7fb29128306454de52246665\"" Sep 4 15:41:32.553368 containerd[1599]: time="2025-09-04T15:41:32.522833321Z" level=info msg="CreateContainer within sandbox \"e738c999dd65779267802d55ee1c7acc47eadc7f7fb29128306454de52246665\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 4 15:41:32.553534 kubelet[2361]: E0904 15:41:32.518554 2361 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:41:32.563414 containerd[1599]: time="2025-09-04T15:41:32.563366881Z" level=info msg="Container 36e41718d917721f3bc8562fcd8005e785292a41987db6d88778a059045d0660: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:41:32.563414 containerd[1599]: time="2025-09-04T15:41:32.563403570Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,} returns sandbox id \"ddd220898a9795e02a38eec037c31f8a9689f8d1d8f8c1abf7efdaad1e7905ae\"" Sep 4 15:41:32.564241 kubelet[2361]: E0904 15:41:32.564192 2361 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:41:32.566108 containerd[1599]: time="2025-09-04T15:41:32.566074218Z" level=info msg="CreateContainer within sandbox \"ddd220898a9795e02a38eec037c31f8a9689f8d1d8f8c1abf7efdaad1e7905ae\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 4 15:41:32.566320 containerd[1599]: time="2025-09-04T15:41:32.566297516Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:2a87d88508b8cef7d49198c6d84b4457,Namespace:kube-system,Attempt:0,} returns sandbox id \"d606c01741fa3594eeb57ddb72bae60bc6f3891a98765b509a6798ba602b82b6\"" Sep 4 15:41:32.566895 kubelet[2361]: E0904 15:41:32.566860 2361 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:41:32.568307 containerd[1599]: time="2025-09-04T15:41:32.568266328Z" level=info msg="CreateContainer within sandbox \"d606c01741fa3594eeb57ddb72bae60bc6f3891a98765b509a6798ba602b82b6\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 4 15:41:32.573615 containerd[1599]: time="2025-09-04T15:41:32.573582846Z" level=info msg="CreateContainer within sandbox \"e738c999dd65779267802d55ee1c7acc47eadc7f7fb29128306454de52246665\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"36e41718d917721f3bc8562fcd8005e785292a41987db6d88778a059045d0660\"" Sep 4 15:41:32.574092 containerd[1599]: time="2025-09-04T15:41:32.574052918Z" level=info msg="StartContainer for \"36e41718d917721f3bc8562fcd8005e785292a41987db6d88778a059045d0660\"" Sep 4 15:41:32.575145 containerd[1599]: time="2025-09-04T15:41:32.575109449Z" level=info msg="connecting to shim 36e41718d917721f3bc8562fcd8005e785292a41987db6d88778a059045d0660" address="unix:///run/containerd/s/a24f0196dec82dfed56b0de8060bd7822c9be55623e80d41317251c9f0a6e5a4" protocol=ttrpc version=3 Sep 4 15:41:32.577980 containerd[1599]: time="2025-09-04T15:41:32.577955445Z" level=info msg="Container 7fd29cd9a3a080ea873c392bc02387c167122ab18029094032bd76f37d69230b: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:41:32.581266 containerd[1599]: time="2025-09-04T15:41:32.580817221Z" level=info msg="Container e72bbf020456ca1de3676d59bcad2e65f131652d0f57324dd7d874b45a90241d: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:41:32.585651 
containerd[1599]: time="2025-09-04T15:41:32.585591242Z" level=info msg="CreateContainer within sandbox \"ddd220898a9795e02a38eec037c31f8a9689f8d1d8f8c1abf7efdaad1e7905ae\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"7fd29cd9a3a080ea873c392bc02387c167122ab18029094032bd76f37d69230b\"" Sep 4 15:41:32.586530 containerd[1599]: time="2025-09-04T15:41:32.586491771Z" level=info msg="StartContainer for \"7fd29cd9a3a080ea873c392bc02387c167122ab18029094032bd76f37d69230b\"" Sep 4 15:41:32.587862 containerd[1599]: time="2025-09-04T15:41:32.587838085Z" level=info msg="connecting to shim 7fd29cd9a3a080ea873c392bc02387c167122ab18029094032bd76f37d69230b" address="unix:///run/containerd/s/3cb8fae0c6f8ebe3cdc350181081321475e93d026fecc2b375a771d8773d6ed7" protocol=ttrpc version=3 Sep 4 15:41:32.591166 containerd[1599]: time="2025-09-04T15:41:32.591119257Z" level=info msg="CreateContainer within sandbox \"d606c01741fa3594eeb57ddb72bae60bc6f3891a98765b509a6798ba602b82b6\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"e72bbf020456ca1de3676d59bcad2e65f131652d0f57324dd7d874b45a90241d\"" Sep 4 15:41:32.591847 containerd[1599]: time="2025-09-04T15:41:32.591794724Z" level=info msg="StartContainer for \"e72bbf020456ca1de3676d59bcad2e65f131652d0f57324dd7d874b45a90241d\"" Sep 4 15:41:32.594446 containerd[1599]: time="2025-09-04T15:41:32.594396913Z" level=info msg="connecting to shim e72bbf020456ca1de3676d59bcad2e65f131652d0f57324dd7d874b45a90241d" address="unix:///run/containerd/s/70cb4b14eed8ff9509c837e53077badbd846b59c6e72d05d70ecbc23a6b943b9" protocol=ttrpc version=3 Sep 4 15:41:32.600376 systemd[1]: Started cri-containerd-36e41718d917721f3bc8562fcd8005e785292a41987db6d88778a059045d0660.scope - libcontainer container 36e41718d917721f3bc8562fcd8005e785292a41987db6d88778a059045d0660. 
Sep 4 15:41:32.614511 systemd[1]: Started cri-containerd-7fd29cd9a3a080ea873c392bc02387c167122ab18029094032bd76f37d69230b.scope - libcontainer container 7fd29cd9a3a080ea873c392bc02387c167122ab18029094032bd76f37d69230b. Sep 4 15:41:32.637351 systemd[1]: Started cri-containerd-e72bbf020456ca1de3676d59bcad2e65f131652d0f57324dd7d874b45a90241d.scope - libcontainer container e72bbf020456ca1de3676d59bcad2e65f131652d0f57324dd7d874b45a90241d. Sep 4 15:41:32.665489 containerd[1599]: time="2025-09-04T15:41:32.665417246Z" level=info msg="StartContainer for \"36e41718d917721f3bc8562fcd8005e785292a41987db6d88778a059045d0660\" returns successfully" Sep 4 15:41:32.686539 kubelet[2361]: W0904 15:41:32.686452 2361 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.9:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.9:6443: connect: connection refused Sep 4 15:41:32.686539 kubelet[2361]: E0904 15:41:32.686541 2361 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.9:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.9:6443: connect: connection refused" logger="UnhandledError" Sep 4 15:41:32.709866 containerd[1599]: time="2025-09-04T15:41:32.709018819Z" level=info msg="StartContainer for \"7fd29cd9a3a080ea873c392bc02387c167122ab18029094032bd76f37d69230b\" returns successfully" Sep 4 15:41:32.710004 containerd[1599]: time="2025-09-04T15:41:32.709955655Z" level=info msg="StartContainer for \"e72bbf020456ca1de3676d59bcad2e65f131652d0f57324dd7d874b45a90241d\" returns successfully" Sep 4 15:41:32.769281 kubelet[2361]: I0904 15:41:32.762826 2361 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 4 15:41:32.928621 kubelet[2361]: E0904 15:41:32.928481 2361 kubelet.go:3190] "No need to create a mirror pod, since 
failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:41:32.928621 kubelet[2361]: E0904 15:41:32.928602 2361 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:41:32.933782 kubelet[2361]: E0904 15:41:32.933758 2361 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:41:32.933925 kubelet[2361]: E0904 15:41:32.933848 2361 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:41:32.934073 kubelet[2361]: E0904 15:41:32.934042 2361 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:41:32.934156 kubelet[2361]: E0904 15:41:32.934122 2361 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:41:33.936245 kubelet[2361]: E0904 15:41:33.936188 2361 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:41:33.936634 kubelet[2361]: E0904 15:41:33.936341 2361 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:41:33.936634 kubelet[2361]: E0904 15:41:33.936238 2361 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 15:41:33.936634 kubelet[2361]: E0904 15:41:33.936452 2361 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits 
were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:41:33.936634 kubelet[2361]: E0904 15:41:33.936496 2361 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:41:33.936634 kubelet[2361]: E0904 15:41:33.936632 2361 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:41:34.433454 kubelet[2361]: E0904 15:41:34.433389 2361 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 4 15:41:34.475263 kubelet[2361]: E0904 15:41:34.475116 2361 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{localhost.18621ea33198925e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-04 15:41:30.888499806 +0000 UTC m=+0.350915237,LastTimestamp:2025-09-04 15:41:30.888499806 +0000 UTC m=+0.350915237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Sep 4 15:41:34.516036 kubelet[2361]: I0904 15:41:34.515978 2361 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 4 15:41:34.596456 kubelet[2361]: I0904 15:41:34.596384 2361 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 4 15:41:34.600449 kubelet[2361]: E0904 15:41:34.600424 2361 kubelet.go:3196] "Failed creating a mirror pod" err="pods 
\"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 4 15:41:34.600449 kubelet[2361]: I0904 15:41:34.600444 2361 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 4 15:41:34.602769 kubelet[2361]: E0904 15:41:34.602743 2361 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 4 15:41:34.602769 kubelet[2361]: I0904 15:41:34.602758 2361 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 4 15:41:34.604057 kubelet[2361]: E0904 15:41:34.604031 2361 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 4 15:41:34.886679 kubelet[2361]: I0904 15:41:34.886645 2361 apiserver.go:52] "Watching apiserver" Sep 4 15:41:34.896103 kubelet[2361]: I0904 15:41:34.896063 2361 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 4 15:41:36.478256 systemd[1]: Reload requested from client PID 2633 ('systemctl') (unit session-7.scope)... Sep 4 15:41:36.478283 systemd[1]: Reloading... Sep 4 15:41:36.569257 zram_generator::config[2679]: No configuration found. Sep 4 15:41:36.794411 systemd[1]: Reloading finished in 315 ms. Sep 4 15:41:36.817899 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 15:41:36.832555 systemd[1]: kubelet.service: Deactivated successfully. Sep 4 15:41:36.832907 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 15:41:36.832956 systemd[1]: kubelet.service: Consumed 858ms CPU time, 132.9M memory peak. 
Sep 4 15:41:36.834644 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 15:41:37.047589 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 15:41:37.052430 (kubelet)[2721]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 4 15:41:37.133117 kubelet[2721]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 15:41:37.133117 kubelet[2721]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 4 15:41:37.133117 kubelet[2721]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 15:41:37.133570 kubelet[2721]: I0904 15:41:37.133238 2721 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 4 15:41:37.140666 kubelet[2721]: I0904 15:41:37.140634 2721 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 4 15:41:37.140666 kubelet[2721]: I0904 15:41:37.140654 2721 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 4 15:41:37.140920 kubelet[2721]: I0904 15:41:37.140895 2721 server.go:954] "Client rotation is on, will bootstrap in background" Sep 4 15:41:37.142089 kubelet[2721]: I0904 15:41:37.142064 2721 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 4 15:41:37.144502 kubelet[2721]: I0904 15:41:37.144456 2721 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 4 15:41:37.150015 kubelet[2721]: I0904 15:41:37.149989 2721 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 4 15:41:37.155293 kubelet[2721]: I0904 15:41:37.155266 2721 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 4 15:41:37.155552 kubelet[2721]: I0904 15:41:37.155510 2721 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 4 15:41:37.155744 kubelet[2721]: I0904 15:41:37.155541 2721 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPol
icyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 4 15:41:37.155852 kubelet[2721]: I0904 15:41:37.155749 2721 topology_manager.go:138] "Creating topology manager with none policy" Sep 4 15:41:37.155852 kubelet[2721]: I0904 15:41:37.155758 2721 container_manager_linux.go:304] "Creating device plugin manager" Sep 4 15:41:37.155852 kubelet[2721]: I0904 15:41:37.155810 2721 state_mem.go:36] "Initialized new in-memory state store" Sep 4 15:41:37.156001 kubelet[2721]: I0904 15:41:37.155985 2721 kubelet.go:446] "Attempting to sync node with API server" Sep 4 15:41:37.156049 kubelet[2721]: I0904 15:41:37.156019 2721 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 4 15:41:37.156049 kubelet[2721]: I0904 15:41:37.156043 2721 kubelet.go:352] "Adding apiserver pod source" Sep 4 15:41:37.156106 kubelet[2721]: I0904 15:41:37.156052 2721 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 4 15:41:37.158355 kubelet[2721]: I0904 15:41:37.158317 2721 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 4 15:41:37.158746 kubelet[2721]: I0904 15:41:37.158721 2721 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 4 15:41:37.159256 kubelet[2721]: I0904 15:41:37.159233 2721 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 4 15:41:37.159478 kubelet[2721]: I0904 15:41:37.159265 2721 server.go:1287] "Started kubelet" Sep 4 15:41:37.159908 kubelet[2721]: I0904 15:41:37.159828 2721 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 4 15:41:37.162229 kubelet[2721]: 
I0904 15:41:37.161909 2721 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 4 15:41:37.163401 kubelet[2721]: I0904 15:41:37.163369 2721 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 4 15:41:37.166328 kubelet[2721]: I0904 15:41:37.166303 2721 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 4 15:41:37.166423 kubelet[2721]: I0904 15:41:37.166401 2721 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 4 15:41:37.167257 kubelet[2721]: I0904 15:41:37.167229 2721 server.go:479] "Adding debug handlers to kubelet server" Sep 4 15:41:37.168409 kubelet[2721]: E0904 15:41:37.168350 2721 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 15:41:37.168453 kubelet[2721]: I0904 15:41:37.168416 2721 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 4 15:41:37.168635 kubelet[2721]: I0904 15:41:37.168614 2721 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 4 15:41:37.168835 kubelet[2721]: I0904 15:41:37.168807 2721 reconciler.go:26] "Reconciler: start to sync state" Sep 4 15:41:37.175707 kubelet[2721]: I0904 15:41:37.175614 2721 factory.go:221] Registration of the systemd container factory successfully Sep 4 15:41:37.175758 kubelet[2721]: I0904 15:41:37.175732 2721 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 4 15:41:37.177276 kubelet[2721]: E0904 15:41:37.177239 2721 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 4 15:41:37.177276 kubelet[2721]: I0904 15:41:37.176855 2721 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 4 15:41:37.178568 kubelet[2721]: I0904 15:41:37.178537 2721 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 4 15:41:37.178631 kubelet[2721]: I0904 15:41:37.178575 2721 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 4 15:41:37.178631 kubelet[2721]: I0904 15:41:37.178597 2721 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 4 15:41:37.178631 kubelet[2721]: I0904 15:41:37.178605 2721 kubelet.go:2382] "Starting kubelet main sync loop" Sep 4 15:41:37.178708 kubelet[2721]: E0904 15:41:37.178660 2721 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 4 15:41:37.180242 kubelet[2721]: I0904 15:41:37.179768 2721 factory.go:221] Registration of the containerd container factory successfully Sep 4 15:41:37.212161 kubelet[2721]: I0904 15:41:37.212130 2721 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 4 15:41:37.212161 kubelet[2721]: I0904 15:41:37.212146 2721 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 4 15:41:37.212161 kubelet[2721]: I0904 15:41:37.212166 2721 state_mem.go:36] "Initialized new in-memory state store" Sep 4 15:41:37.212348 kubelet[2721]: I0904 15:41:37.212326 2721 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 4 15:41:37.212376 kubelet[2721]: I0904 15:41:37.212342 2721 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 4 15:41:37.212376 kubelet[2721]: I0904 15:41:37.212361 2721 policy_none.go:49] "None policy: Start" Sep 4 15:41:37.212376 kubelet[2721]: I0904 15:41:37.212370 2721 memory_manager.go:186] "Starting memorymanager" 
policy="None" Sep 4 15:41:37.212448 kubelet[2721]: I0904 15:41:37.212380 2721 state_mem.go:35] "Initializing new in-memory state store" Sep 4 15:41:37.212492 kubelet[2721]: I0904 15:41:37.212475 2721 state_mem.go:75] "Updated machine memory state" Sep 4 15:41:37.217205 kubelet[2721]: I0904 15:41:37.216887 2721 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 4 15:41:37.217205 kubelet[2721]: I0904 15:41:37.217074 2721 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 4 15:41:37.217205 kubelet[2721]: I0904 15:41:37.217087 2721 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 4 15:41:37.217796 kubelet[2721]: I0904 15:41:37.217766 2721 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 4 15:41:37.218916 kubelet[2721]: E0904 15:41:37.218887 2721 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 4 15:41:37.280356 kubelet[2721]: I0904 15:41:37.280283 2721 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 4 15:41:37.280866 kubelet[2721]: I0904 15:41:37.280529 2721 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 4 15:41:37.280866 kubelet[2721]: I0904 15:41:37.280614 2721 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 4 15:41:37.322403 kubelet[2721]: I0904 15:41:37.322282 2721 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 4 15:41:37.332224 kubelet[2721]: I0904 15:41:37.332160 2721 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 4 15:41:37.332354 kubelet[2721]: I0904 15:41:37.332244 2721 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 4 15:41:37.370100 kubelet[2721]: I0904 15:41:37.370071 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 15:41:37.370201 kubelet[2721]: I0904 15:41:37.370104 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2a87d88508b8cef7d49198c6d84b4457-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"2a87d88508b8cef7d49198c6d84b4457\") " pod="kube-system/kube-apiserver-localhost" Sep 4 15:41:37.370201 kubelet[2721]: I0904 15:41:37.370126 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/2a87d88508b8cef7d49198c6d84b4457-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"2a87d88508b8cef7d49198c6d84b4457\") " pod="kube-system/kube-apiserver-localhost" Sep 4 15:41:37.370201 kubelet[2721]: I0904 15:41:37.370146 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 15:41:37.370201 kubelet[2721]: I0904 15:41:37.370164 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 15:41:37.370201 kubelet[2721]: I0904 15:41:37.370181 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 15:41:37.370351 kubelet[2721]: I0904 15:41:37.370198 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 15:41:37.370351 kubelet[2721]: I0904 15:41:37.370235 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost" Sep 4 15:41:37.370351 kubelet[2721]: I0904 15:41:37.370253 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2a87d88508b8cef7d49198c6d84b4457-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"2a87d88508b8cef7d49198c6d84b4457\") " pod="kube-system/kube-apiserver-localhost" Sep 4 15:41:37.585342 kubelet[2721]: E0904 15:41:37.585186 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:41:37.585492 kubelet[2721]: E0904 15:41:37.585369 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:41:37.587376 kubelet[2721]: E0904 15:41:37.587345 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:41:38.160361 kubelet[2721]: I0904 15:41:38.160302 2721 apiserver.go:52] "Watching apiserver" Sep 4 15:41:38.169395 kubelet[2721]: I0904 15:41:38.169339 2721 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 4 15:41:38.193239 kubelet[2721]: E0904 15:41:38.193123 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:41:38.193239 kubelet[2721]: I0904 15:41:38.193170 2721 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 4 15:41:38.193555 
kubelet[2721]: I0904 15:41:38.193538 2721 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Sep 4 15:41:38.199849 kubelet[2721]: E0904 15:41:38.199798 2721 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Sep 4 15:41:38.199849 kubelet[2721]: E0904 15:41:38.199953 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:38.200229 kubelet[2721]: E0904 15:41:38.200070 2721 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost"
Sep 4 15:41:38.200229 kubelet[2721]: E0904 15:41:38.200148 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:38.212928 kubelet[2721]: I0904 15:41:38.212874 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.212859408 podStartE2EDuration="1.212859408s" podCreationTimestamp="2025-09-04 15:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:41:38.212248994 +0000 UTC m=+1.122721393" watchObservedRunningTime="2025-09-04 15:41:38.212859408 +0000 UTC m=+1.123331807"
Sep 4 15:41:38.224535 kubelet[2721]: I0904 15:41:38.224451 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.224430023 podStartE2EDuration="1.224430023s" podCreationTimestamp="2025-09-04 15:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:41:38.224269142 +0000 UTC m=+1.134741541" watchObservedRunningTime="2025-09-04 15:41:38.224430023 +0000 UTC m=+1.134902422"
Sep 4 15:41:38.224610 kubelet[2721]: I0904 15:41:38.224568 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.224563814 podStartE2EDuration="1.224563814s" podCreationTimestamp="2025-09-04 15:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:41:38.218121385 +0000 UTC m=+1.128593784" watchObservedRunningTime="2025-09-04 15:41:38.224563814 +0000 UTC m=+1.135036213"
Sep 4 15:41:39.194689 kubelet[2721]: E0904 15:41:39.194367 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:39.194689 kubelet[2721]: E0904 15:41:39.194591 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:41.966239 kubelet[2721]: I0904 15:41:41.966174 2721 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 4 15:41:41.966777 containerd[1599]: time="2025-09-04T15:41:41.966744552Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 4 15:41:41.967052 kubelet[2721]: I0904 15:41:41.967013 2721 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 4 15:41:42.627903 systemd[1]: Created slice kubepods-besteffort-pod1708dbfa_d070_44d9_bed8_13fd00a79518.slice - libcontainer container kubepods-besteffort-pod1708dbfa_d070_44d9_bed8_13fd00a79518.slice.
Sep 4 15:41:42.712812 kubelet[2721]: I0904 15:41:42.712759 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1708dbfa-d070-44d9-bed8-13fd00a79518-lib-modules\") pod \"kube-proxy-pgknt\" (UID: \"1708dbfa-d070-44d9-bed8-13fd00a79518\") " pod="kube-system/kube-proxy-pgknt"
Sep 4 15:41:42.712812 kubelet[2721]: I0904 15:41:42.712799 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d6tz\" (UniqueName: \"kubernetes.io/projected/1708dbfa-d070-44d9-bed8-13fd00a79518-kube-api-access-9d6tz\") pod \"kube-proxy-pgknt\" (UID: \"1708dbfa-d070-44d9-bed8-13fd00a79518\") " pod="kube-system/kube-proxy-pgknt"
Sep 4 15:41:42.712998 kubelet[2721]: I0904 15:41:42.712838 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/1708dbfa-d070-44d9-bed8-13fd00a79518-kube-proxy\") pod \"kube-proxy-pgknt\" (UID: \"1708dbfa-d070-44d9-bed8-13fd00a79518\") " pod="kube-system/kube-proxy-pgknt"
Sep 4 15:41:42.712998 kubelet[2721]: I0904 15:41:42.712869 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1708dbfa-d070-44d9-bed8-13fd00a79518-xtables-lock\") pod \"kube-proxy-pgknt\" (UID: \"1708dbfa-d070-44d9-bed8-13fd00a79518\") " pod="kube-system/kube-proxy-pgknt"
Sep 4 15:41:42.819684 kubelet[2721]: E0904 15:41:42.819621 2721 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found
Sep 4 15:41:42.819684 kubelet[2721]: E0904 15:41:42.819679 2721 projected.go:194] Error preparing data for projected volume kube-api-access-9d6tz for pod kube-system/kube-proxy-pgknt: configmap "kube-root-ca.crt" not found
Sep 4 15:41:42.819878 kubelet[2721]: E0904 15:41:42.819787 2721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1708dbfa-d070-44d9-bed8-13fd00a79518-kube-api-access-9d6tz podName:1708dbfa-d070-44d9-bed8-13fd00a79518 nodeName:}" failed. No retries permitted until 2025-09-04 15:41:43.319760688 +0000 UTC m=+6.230233087 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9d6tz" (UniqueName: "kubernetes.io/projected/1708dbfa-d070-44d9-bed8-13fd00a79518-kube-api-access-9d6tz") pod "kube-proxy-pgknt" (UID: "1708dbfa-d070-44d9-bed8-13fd00a79518") : configmap "kube-root-ca.crt" not found
Sep 4 15:41:42.988618 systemd[1]: Created slice kubepods-besteffort-pod828d397a_e2d7_4d78_9fd4_3ebccea9a3e9.slice - libcontainer container kubepods-besteffort-pod828d397a_e2d7_4d78_9fd4_3ebccea9a3e9.slice.
Sep 4 15:41:43.116062 kubelet[2721]: I0904 15:41:43.116002 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/828d397a-e2d7-4d78-9fd4-3ebccea9a3e9-var-lib-calico\") pod \"tigera-operator-755d956888-wwwqh\" (UID: \"828d397a-e2d7-4d78-9fd4-3ebccea9a3e9\") " pod="tigera-operator/tigera-operator-755d956888-wwwqh"
Sep 4 15:41:43.116062 kubelet[2721]: I0904 15:41:43.116047 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s84b\" (UniqueName: \"kubernetes.io/projected/828d397a-e2d7-4d78-9fd4-3ebccea9a3e9-kube-api-access-2s84b\") pod \"tigera-operator-755d956888-wwwqh\" (UID: \"828d397a-e2d7-4d78-9fd4-3ebccea9a3e9\") " pod="tigera-operator/tigera-operator-755d956888-wwwqh"
Sep 4 15:41:43.295303 containerd[1599]: time="2025-09-04T15:41:43.295242327Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-wwwqh,Uid:828d397a-e2d7-4d78-9fd4-3ebccea9a3e9,Namespace:tigera-operator,Attempt:0,}"
Sep 4 15:41:43.321255 containerd[1599]: time="2025-09-04T15:41:43.321176018Z" level=info msg="connecting to shim 7f2dc2e83c0485606e10e98dc128422278b6f7dbeb7cc41d73248103a8a7f50a" address="unix:///run/containerd/s/5d0a06eed0ef3358fc1339c192450554936d27b931250dc5841e500c848278d8" namespace=k8s.io protocol=ttrpc version=3
Sep 4 15:41:43.404385 systemd[1]: Started cri-containerd-7f2dc2e83c0485606e10e98dc128422278b6f7dbeb7cc41d73248103a8a7f50a.scope - libcontainer container 7f2dc2e83c0485606e10e98dc128422278b6f7dbeb7cc41d73248103a8a7f50a.
Sep 4 15:41:43.462086 containerd[1599]: time="2025-09-04T15:41:43.461962263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-wwwqh,Uid:828d397a-e2d7-4d78-9fd4-3ebccea9a3e9,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"7f2dc2e83c0485606e10e98dc128422278b6f7dbeb7cc41d73248103a8a7f50a\""
Sep 4 15:41:43.469008 containerd[1599]: time="2025-09-04T15:41:43.468914654Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 4 15:41:43.539930 kubelet[2721]: E0904 15:41:43.539876 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:43.540574 containerd[1599]: time="2025-09-04T15:41:43.540535404Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-pgknt,Uid:1708dbfa-d070-44d9-bed8-13fd00a79518,Namespace:kube-system,Attempt:0,}"
Sep 4 15:41:43.566785 containerd[1599]: time="2025-09-04T15:41:43.566563535Z" level=info msg="connecting to shim ec5cddbbd85a558c6cac998c939c8ebe4385359f778ffd922abb54dc6f64df99" address="unix:///run/containerd/s/f430cbbafa2e197cab97b14c7f11913a62faf3291d55378bb8d188735b452d31" namespace=k8s.io protocol=ttrpc version=3
Sep 4 15:41:43.601552 systemd[1]: Started cri-containerd-ec5cddbbd85a558c6cac998c939c8ebe4385359f778ffd922abb54dc6f64df99.scope - libcontainer container ec5cddbbd85a558c6cac998c939c8ebe4385359f778ffd922abb54dc6f64df99.
Sep 4 15:41:43.632605 containerd[1599]: time="2025-09-04T15:41:43.632563215Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-pgknt,Uid:1708dbfa-d070-44d9-bed8-13fd00a79518,Namespace:kube-system,Attempt:0,} returns sandbox id \"ec5cddbbd85a558c6cac998c939c8ebe4385359f778ffd922abb54dc6f64df99\""
Sep 4 15:41:43.633527 kubelet[2721]: E0904 15:41:43.633483 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:43.635795 containerd[1599]: time="2025-09-04T15:41:43.635746264Z" level=info msg="CreateContainer within sandbox \"ec5cddbbd85a558c6cac998c939c8ebe4385359f778ffd922abb54dc6f64df99\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 4 15:41:43.648667 containerd[1599]: time="2025-09-04T15:41:43.648612477Z" level=info msg="Container 359013a571d3bfcbaeb2a316ed81ad0b0d55eab4e8027e39d3d60369355235e9: CDI devices from CRI Config.CDIDevices: []"
Sep 4 15:41:43.656552 containerd[1599]: time="2025-09-04T15:41:43.656510770Z" level=info msg="CreateContainer within sandbox \"ec5cddbbd85a558c6cac998c939c8ebe4385359f778ffd922abb54dc6f64df99\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"359013a571d3bfcbaeb2a316ed81ad0b0d55eab4e8027e39d3d60369355235e9\""
Sep 4 15:41:43.656997 containerd[1599]: time="2025-09-04T15:41:43.656964339Z" level=info msg="StartContainer for \"359013a571d3bfcbaeb2a316ed81ad0b0d55eab4e8027e39d3d60369355235e9\""
Sep 4 15:41:43.658369 containerd[1599]: time="2025-09-04T15:41:43.658329956Z" level=info msg="connecting to shim 359013a571d3bfcbaeb2a316ed81ad0b0d55eab4e8027e39d3d60369355235e9" address="unix:///run/containerd/s/f430cbbafa2e197cab97b14c7f11913a62faf3291d55378bb8d188735b452d31" protocol=ttrpc version=3
Sep 4 15:41:43.685385 systemd[1]: Started cri-containerd-359013a571d3bfcbaeb2a316ed81ad0b0d55eab4e8027e39d3d60369355235e9.scope - libcontainer container 359013a571d3bfcbaeb2a316ed81ad0b0d55eab4e8027e39d3d60369355235e9.
Sep 4 15:41:43.727902 containerd[1599]: time="2025-09-04T15:41:43.727830724Z" level=info msg="StartContainer for \"359013a571d3bfcbaeb2a316ed81ad0b0d55eab4e8027e39d3d60369355235e9\" returns successfully"
Sep 4 15:41:44.078054 kubelet[2721]: E0904 15:41:44.077973 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:44.088970 kubelet[2721]: E0904 15:41:44.088934 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:44.204043 kubelet[2721]: E0904 15:41:44.203472 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:44.205295 kubelet[2721]: E0904 15:41:44.204988 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:44.205295 kubelet[2721]: E0904 15:41:44.205174 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:44.223461 kubelet[2721]: I0904 15:41:44.223309 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-pgknt" podStartSLOduration=2.223290205 podStartE2EDuration="2.223290205s" podCreationTimestamp="2025-09-04 15:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:41:44.213574303 +0000 UTC m=+7.124046743" watchObservedRunningTime="2025-09-04 15:41:44.223290205 +0000 UTC m=+7.133762604"
Sep 4 15:41:44.982018 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2615592926.mount: Deactivated successfully.
Sep 4 15:41:45.206159 kubelet[2721]: E0904 15:41:45.206125 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:45.316430 containerd[1599]: time="2025-09-04T15:41:45.316370619Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:41:45.317249 containerd[1599]: time="2025-09-04T15:41:45.317225473Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609"
Sep 4 15:41:45.318442 containerd[1599]: time="2025-09-04T15:41:45.318412852Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:41:45.320488 containerd[1599]: time="2025-09-04T15:41:45.320444103Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:41:45.320865 containerd[1599]: time="2025-09-04T15:41:45.320832115Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 1.851860421s"
Sep 4 15:41:45.320893 containerd[1599]: time="2025-09-04T15:41:45.320864076Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\""
Sep 4 15:41:45.324018 containerd[1599]: time="2025-09-04T15:41:45.323970824Z" level=info msg="CreateContainer within sandbox \"7f2dc2e83c0485606e10e98dc128422278b6f7dbeb7cc41d73248103a8a7f50a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 4 15:41:45.332125 containerd[1599]: time="2025-09-04T15:41:45.332085341Z" level=info msg="Container 13f73992afc0ba9f2c8562adbd5504786923a245c1cc1207ce4fb408db7c0b63: CDI devices from CRI Config.CDIDevices: []"
Sep 4 15:41:45.338377 containerd[1599]: time="2025-09-04T15:41:45.338343290Z" level=info msg="CreateContainer within sandbox \"7f2dc2e83c0485606e10e98dc128422278b6f7dbeb7cc41d73248103a8a7f50a\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"13f73992afc0ba9f2c8562adbd5504786923a245c1cc1207ce4fb408db7c0b63\""
Sep 4 15:41:45.338680 containerd[1599]: time="2025-09-04T15:41:45.338649606Z" level=info msg="StartContainer for \"13f73992afc0ba9f2c8562adbd5504786923a245c1cc1207ce4fb408db7c0b63\""
Sep 4 15:41:45.339652 containerd[1599]: time="2025-09-04T15:41:45.339575335Z" level=info msg="connecting to shim 13f73992afc0ba9f2c8562adbd5504786923a245c1cc1207ce4fb408db7c0b63" address="unix:///run/containerd/s/5d0a06eed0ef3358fc1339c192450554936d27b931250dc5841e500c848278d8" protocol=ttrpc version=3
Sep 4 15:41:45.388422 systemd[1]: Started cri-containerd-13f73992afc0ba9f2c8562adbd5504786923a245c1cc1207ce4fb408db7c0b63.scope - libcontainer container 13f73992afc0ba9f2c8562adbd5504786923a245c1cc1207ce4fb408db7c0b63.
Sep 4 15:41:45.419556 containerd[1599]: time="2025-09-04T15:41:45.419510628Z" level=info msg="StartContainer for \"13f73992afc0ba9f2c8562adbd5504786923a245c1cc1207ce4fb408db7c0b63\" returns successfully"
Sep 4 15:41:46.219669 kubelet[2721]: I0904 15:41:46.219585 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-wwwqh" podStartSLOduration=2.363900524 podStartE2EDuration="4.219537888s" podCreationTimestamp="2025-09-04 15:41:42 +0000 UTC" firstStartedPulling="2025-09-04 15:41:43.46594274 +0000 UTC m=+6.376415139" lastFinishedPulling="2025-09-04 15:41:45.321580114 +0000 UTC m=+8.232052503" observedRunningTime="2025-09-04 15:41:46.218956588 +0000 UTC m=+9.129428987" watchObservedRunningTime="2025-09-04 15:41:46.219537888 +0000 UTC m=+9.130010287"
Sep 4 15:41:47.413238 kubelet[2721]: E0904 15:41:47.413146 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:48.215694 kubelet[2721]: E0904 15:41:48.215650 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:50.573157 sudo[1804]: pam_unix(sudo:session): session closed for user root
Sep 4 15:41:50.575944 sshd[1803]: Connection closed by 10.0.0.1 port 41556
Sep 4 15:41:50.575866 sshd-session[1800]: pam_unix(sshd:session): session closed for user core
Sep 4 15:41:50.581719 systemd[1]: sshd@6-10.0.0.9:22-10.0.0.1:41556.service: Deactivated successfully.
Sep 4 15:41:50.587677 systemd[1]: session-7.scope: Deactivated successfully.
Sep 4 15:41:50.588353 systemd[1]: session-7.scope: Consumed 5.830s CPU time, 227.8M memory peak.
Sep 4 15:41:50.592141 systemd-logind[1569]: Session 7 logged out. Waiting for processes to exit.
Sep 4 15:41:50.594633 systemd-logind[1569]: Removed session 7.
Sep 4 15:41:53.104239 systemd[1]: Created slice kubepods-besteffort-pod73534e43_5af3_4d9b_b26a_44efeb1f0c1c.slice - libcontainer container kubepods-besteffort-pod73534e43_5af3_4d9b_b26a_44efeb1f0c1c.slice.
Sep 4 15:41:53.179749 kubelet[2721]: I0904 15:41:53.179658 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/73534e43-5af3-4d9b-b26a-44efeb1f0c1c-typha-certs\") pod \"calico-typha-57676689d7-hrzl5\" (UID: \"73534e43-5af3-4d9b-b26a-44efeb1f0c1c\") " pod="calico-system/calico-typha-57676689d7-hrzl5"
Sep 4 15:41:53.179749 kubelet[2721]: I0904 15:41:53.179719 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnrxz\" (UniqueName: \"kubernetes.io/projected/73534e43-5af3-4d9b-b26a-44efeb1f0c1c-kube-api-access-hnrxz\") pod \"calico-typha-57676689d7-hrzl5\" (UID: \"73534e43-5af3-4d9b-b26a-44efeb1f0c1c\") " pod="calico-system/calico-typha-57676689d7-hrzl5"
Sep 4 15:41:53.179749 kubelet[2721]: I0904 15:41:53.179745 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73534e43-5af3-4d9b-b26a-44efeb1f0c1c-tigera-ca-bundle\") pod \"calico-typha-57676689d7-hrzl5\" (UID: \"73534e43-5af3-4d9b-b26a-44efeb1f0c1c\") " pod="calico-system/calico-typha-57676689d7-hrzl5"
Sep 4 15:41:53.389171 systemd[1]: Created slice kubepods-besteffort-podf303361c_2d97_4ca2_a547_7fe4f5faa7a0.slice - libcontainer container kubepods-besteffort-podf303361c_2d97_4ca2_a547_7fe4f5faa7a0.slice.
Sep 4 15:41:53.409636 kubelet[2721]: E0904 15:41:53.409595 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:53.410232 containerd[1599]: time="2025-09-04T15:41:53.410144621Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-57676689d7-hrzl5,Uid:73534e43-5af3-4d9b-b26a-44efeb1f0c1c,Namespace:calico-system,Attempt:0,}"
Sep 4 15:41:53.481461 kubelet[2721]: I0904 15:41:53.481409 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/f303361c-2d97-4ca2-a547-7fe4f5faa7a0-policysync\") pod \"calico-node-5sg7z\" (UID: \"f303361c-2d97-4ca2-a547-7fe4f5faa7a0\") " pod="calico-system/calico-node-5sg7z"
Sep 4 15:41:53.481609 kubelet[2721]: I0904 15:41:53.481471 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtl4j\" (UniqueName: \"kubernetes.io/projected/f303361c-2d97-4ca2-a547-7fe4f5faa7a0-kube-api-access-gtl4j\") pod \"calico-node-5sg7z\" (UID: \"f303361c-2d97-4ca2-a547-7fe4f5faa7a0\") " pod="calico-system/calico-node-5sg7z"
Sep 4 15:41:53.481609 kubelet[2721]: I0904 15:41:53.481494 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/f303361c-2d97-4ca2-a547-7fe4f5faa7a0-cni-log-dir\") pod \"calico-node-5sg7z\" (UID: \"f303361c-2d97-4ca2-a547-7fe4f5faa7a0\") " pod="calico-system/calico-node-5sg7z"
Sep 4 15:41:53.481609 kubelet[2721]: I0904 15:41:53.481516 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f303361c-2d97-4ca2-a547-7fe4f5faa7a0-var-lib-calico\") pod \"calico-node-5sg7z\" (UID: \"f303361c-2d97-4ca2-a547-7fe4f5faa7a0\") " pod="calico-system/calico-node-5sg7z"
Sep 4 15:41:53.481609 kubelet[2721]: I0904 15:41:53.481534 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/f303361c-2d97-4ca2-a547-7fe4f5faa7a0-cni-bin-dir\") pod \"calico-node-5sg7z\" (UID: \"f303361c-2d97-4ca2-a547-7fe4f5faa7a0\") " pod="calico-system/calico-node-5sg7z"
Sep 4 15:41:53.481609 kubelet[2721]: I0904 15:41:53.481555 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/f303361c-2d97-4ca2-a547-7fe4f5faa7a0-var-run-calico\") pod \"calico-node-5sg7z\" (UID: \"f303361c-2d97-4ca2-a547-7fe4f5faa7a0\") " pod="calico-system/calico-node-5sg7z"
Sep 4 15:41:53.481747 kubelet[2721]: I0904 15:41:53.481590 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f303361c-2d97-4ca2-a547-7fe4f5faa7a0-lib-modules\") pod \"calico-node-5sg7z\" (UID: \"f303361c-2d97-4ca2-a547-7fe4f5faa7a0\") " pod="calico-system/calico-node-5sg7z"
Sep 4 15:41:53.481747 kubelet[2721]: I0904 15:41:53.481620 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/f303361c-2d97-4ca2-a547-7fe4f5faa7a0-cni-net-dir\") pod \"calico-node-5sg7z\" (UID: \"f303361c-2d97-4ca2-a547-7fe4f5faa7a0\") " pod="calico-system/calico-node-5sg7z"
Sep 4 15:41:53.481747 kubelet[2721]: I0904 15:41:53.481643 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/f303361c-2d97-4ca2-a547-7fe4f5faa7a0-node-certs\") pod \"calico-node-5sg7z\" (UID: \"f303361c-2d97-4ca2-a547-7fe4f5faa7a0\") " pod="calico-system/calico-node-5sg7z"
Sep 4 15:41:53.481747 kubelet[2721]: I0904 15:41:53.481657 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f303361c-2d97-4ca2-a547-7fe4f5faa7a0-xtables-lock\") pod \"calico-node-5sg7z\" (UID: \"f303361c-2d97-4ca2-a547-7fe4f5faa7a0\") " pod="calico-system/calico-node-5sg7z"
Sep 4 15:41:53.481747 kubelet[2721]: I0904 15:41:53.481670 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/f303361c-2d97-4ca2-a547-7fe4f5faa7a0-flexvol-driver-host\") pod \"calico-node-5sg7z\" (UID: \"f303361c-2d97-4ca2-a547-7fe4f5faa7a0\") " pod="calico-system/calico-node-5sg7z"
Sep 4 15:41:53.481912 kubelet[2721]: I0904 15:41:53.481687 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f303361c-2d97-4ca2-a547-7fe4f5faa7a0-tigera-ca-bundle\") pod \"calico-node-5sg7z\" (UID: \"f303361c-2d97-4ca2-a547-7fe4f5faa7a0\") " pod="calico-system/calico-node-5sg7z"
Sep 4 15:41:53.499719 containerd[1599]: time="2025-09-04T15:41:53.499659190Z" level=info msg="connecting to shim a9562640afd9ee21fac5b7341ce685e6cf75c96c96c4b9562eb625baaa956b8e" address="unix:///run/containerd/s/4e377751b02db206e33456780a59a443ce1c15e3906ebee8f8e08fec1dc732c7" namespace=k8s.io protocol=ttrpc version=3
Sep 4 15:41:53.530361 systemd[1]: Started cri-containerd-a9562640afd9ee21fac5b7341ce685e6cf75c96c96c4b9562eb625baaa956b8e.scope - libcontainer container a9562640afd9ee21fac5b7341ce685e6cf75c96c96c4b9562eb625baaa956b8e.
Sep 4 15:41:53.575069 containerd[1599]: time="2025-09-04T15:41:53.575024162Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-57676689d7-hrzl5,Uid:73534e43-5af3-4d9b-b26a-44efeb1f0c1c,Namespace:calico-system,Attempt:0,} returns sandbox id \"a9562640afd9ee21fac5b7341ce685e6cf75c96c96c4b9562eb625baaa956b8e\""
Sep 4 15:41:53.575679 kubelet[2721]: E0904 15:41:53.575653 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:41:53.576914 containerd[1599]: time="2025-09-04T15:41:53.576877337Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 4 15:41:53.583387 kubelet[2721]: E0904 15:41:53.583362 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:41:53.583507 kubelet[2721]: W0904 15:41:53.583476 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:41:53.583635 kubelet[2721]: E0904 15:41:53.583518 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 15:41:53.583748 kubelet[2721]: E0904 15:41:53.583732 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:41:53.583748 kubelet[2721]: W0904 15:41:53.583743 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:41:53.583833 kubelet[2721]: E0904 15:41:53.583759 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 15:41:53.584004 kubelet[2721]: E0904 15:41:53.583976 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:41:53.584004 kubelet[2721]: W0904 15:41:53.583999 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:41:53.584119 kubelet[2721]: E0904 15:41:53.584033 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 15:41:53.584556 kubelet[2721]: E0904 15:41:53.584535 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:41:53.584556 kubelet[2721]: W0904 15:41:53.584549 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:41:53.584648 kubelet[2721]: E0904 15:41:53.584569 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 15:41:53.584802 kubelet[2721]: E0904 15:41:53.584778 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:41:53.584802 kubelet[2721]: W0904 15:41:53.584798 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:41:53.584869 kubelet[2721]: E0904 15:41:53.584808 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 15:41:53.585030 kubelet[2721]: E0904 15:41:53.585014 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:41:53.585030 kubelet[2721]: W0904 15:41:53.585026 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:41:53.585187 kubelet[2721]: E0904 15:41:53.585165 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 15:41:53.585254 kubelet[2721]: E0904 15:41:53.585238 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:41:53.585254 kubelet[2721]: W0904 15:41:53.585246 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:41:53.585345 kubelet[2721]: E0904 15:41:53.585272 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 15:41:53.585576 kubelet[2721]: E0904 15:41:53.585550 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:41:53.585576 kubelet[2721]: W0904 15:41:53.585562 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:41:53.585657 kubelet[2721]: E0904 15:41:53.585642 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 15:41:53.585836 kubelet[2721]: E0904 15:41:53.585820 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:41:53.585836 kubelet[2721]: W0904 15:41:53.585831 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:41:53.585971 kubelet[2721]: E0904 15:41:53.585930 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 15:41:53.586048 kubelet[2721]: E0904 15:41:53.586033 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:41:53.586048 kubelet[2721]: W0904 15:41:53.586044 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:41:53.586108 kubelet[2721]: E0904 15:41:53.586058 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 15:41:53.589307 kubelet[2721]: E0904 15:41:53.589289 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:41:53.589307 kubelet[2721]: W0904 15:41:53.589302 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:41:53.589373 kubelet[2721]: E0904 15:41:53.589328 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 15:41:53.589565 kubelet[2721]: E0904 15:41:53.589550 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:41:53.589565 kubelet[2721]: W0904 15:41:53.589561 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:41:53.589629 kubelet[2721]: E0904 15:41:53.589574 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 15:41:53.590052 kubelet[2721]: E0904 15:41:53.589822 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:41:53.590052 kubelet[2721]: W0904 15:41:53.589838 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:41:53.590052 kubelet[2721]: E0904 15:41:53.589964 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 15:41:53.590689 kubelet[2721]: E0904 15:41:53.590673 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:41:53.590689 kubelet[2721]: W0904 15:41:53.590685 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:41:53.590756 kubelet[2721]: E0904 15:41:53.590700 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 15:41:53.590959 kubelet[2721]: E0904 15:41:53.590943 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 15:41:53.590959 kubelet[2721]: W0904 15:41:53.590957 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 15:41:53.591025 kubelet[2721]: E0904 15:41:53.590973 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 4 15:41:53.591274 kubelet[2721]: E0904 15:41:53.591256 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.591274 kubelet[2721]: W0904 15:41:53.591268 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.591387 kubelet[2721]: E0904 15:41:53.591359 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.591519 kubelet[2721]: E0904 15:41:53.591504 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.591519 kubelet[2721]: W0904 15:41:53.591515 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.591570 kubelet[2721]: E0904 15:41:53.591545 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.591723 kubelet[2721]: E0904 15:41:53.591706 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.591723 kubelet[2721]: W0904 15:41:53.591717 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.591792 kubelet[2721]: E0904 15:41:53.591726 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.597283 kubelet[2721]: E0904 15:41:53.597183 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.597283 kubelet[2721]: W0904 15:41:53.597272 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.597355 kubelet[2721]: E0904 15:41:53.597288 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.634864 kubelet[2721]: E0904 15:41:53.634485 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g5qkw" podUID="1972dac2-e9c7-4fa0-a718-6b8d76c15f8c" Sep 4 15:41:53.687290 kubelet[2721]: E0904 15:41:53.687143 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.687290 kubelet[2721]: W0904 15:41:53.687178 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.689665 kubelet[2721]: E0904 15:41:53.689629 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.690111 kubelet[2721]: E0904 15:41:53.690035 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.690111 kubelet[2721]: W0904 15:41:53.690106 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.690266 kubelet[2721]: E0904 15:41:53.690121 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.691179 kubelet[2721]: E0904 15:41:53.691155 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.691276 kubelet[2721]: W0904 15:41:53.691190 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.691276 kubelet[2721]: E0904 15:41:53.691206 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.691563 kubelet[2721]: E0904 15:41:53.691543 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.691597 kubelet[2721]: W0904 15:41:53.691580 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.691623 kubelet[2721]: E0904 15:41:53.691596 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.692377 kubelet[2721]: E0904 15:41:53.692359 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.692377 kubelet[2721]: W0904 15:41:53.692372 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.692562 kubelet[2721]: E0904 15:41:53.692533 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.694048 kubelet[2721]: E0904 15:41:53.693990 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.694048 kubelet[2721]: W0904 15:41:53.694006 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.694048 kubelet[2721]: E0904 15:41:53.694018 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.695286 kubelet[2721]: E0904 15:41:53.695264 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.695286 kubelet[2721]: W0904 15:41:53.695279 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.695380 kubelet[2721]: E0904 15:41:53.695290 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.695574 kubelet[2721]: E0904 15:41:53.695542 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.695574 kubelet[2721]: W0904 15:41:53.695556 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.695574 kubelet[2721]: E0904 15:41:53.695565 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.695811 kubelet[2721]: E0904 15:41:53.695794 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.695811 kubelet[2721]: W0904 15:41:53.695806 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.695871 kubelet[2721]: E0904 15:41:53.695815 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.696030 kubelet[2721]: E0904 15:41:53.696009 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.696030 kubelet[2721]: W0904 15:41:53.696023 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.696030 kubelet[2721]: E0904 15:41:53.696032 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.696323 kubelet[2721]: E0904 15:41:53.696292 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.696323 kubelet[2721]: W0904 15:41:53.696319 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.696445 kubelet[2721]: E0904 15:41:53.696349 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.696614 kubelet[2721]: E0904 15:41:53.696597 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.696614 kubelet[2721]: W0904 15:41:53.696609 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.696719 kubelet[2721]: E0904 15:41:53.696617 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.697017 kubelet[2721]: E0904 15:41:53.696997 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.697017 kubelet[2721]: W0904 15:41:53.697015 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.697105 kubelet[2721]: E0904 15:41:53.697026 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.697355 kubelet[2721]: E0904 15:41:53.697196 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.697355 kubelet[2721]: W0904 15:41:53.697241 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.697355 kubelet[2721]: E0904 15:41:53.697250 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.697444 kubelet[2721]: E0904 15:41:53.697411 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.697444 kubelet[2721]: W0904 15:41:53.697418 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.697444 kubelet[2721]: E0904 15:41:53.697426 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.697948 kubelet[2721]: E0904 15:41:53.697599 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.697948 kubelet[2721]: W0904 15:41:53.697614 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.697948 kubelet[2721]: E0904 15:41:53.697622 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.697948 kubelet[2721]: E0904 15:41:53.697815 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.697948 kubelet[2721]: W0904 15:41:53.697822 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.697948 kubelet[2721]: E0904 15:41:53.697830 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.698134 kubelet[2721]: E0904 15:41:53.697988 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.698134 kubelet[2721]: W0904 15:41:53.697995 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.698134 kubelet[2721]: E0904 15:41:53.698003 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.698231 kubelet[2721]: E0904 15:41:53.698193 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.698231 kubelet[2721]: W0904 15:41:53.698201 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.698285 kubelet[2721]: E0904 15:41:53.698240 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.699242 kubelet[2721]: E0904 15:41:53.698410 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.699242 kubelet[2721]: W0904 15:41:53.698423 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.699242 kubelet[2721]: E0904 15:41:53.698430 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.699345 containerd[1599]: time="2025-09-04T15:41:53.698633531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5sg7z,Uid:f303361c-2d97-4ca2-a547-7fe4f5faa7a0,Namespace:calico-system,Attempt:0,}" Sep 4 15:41:53.730916 containerd[1599]: time="2025-09-04T15:41:53.730848598Z" level=info msg="connecting to shim f4e313270e4924c8a934db014b82125d88cd0ffad9d85568070cdccf8f511fe3" address="unix:///run/containerd/s/29b1b5b42ae4b54fbc20c224b75fd151facb48442b2abf6a48e37ff68b80f040" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:41:53.755695 systemd[1]: Started cri-containerd-f4e313270e4924c8a934db014b82125d88cd0ffad9d85568070cdccf8f511fe3.scope - libcontainer container f4e313270e4924c8a934db014b82125d88cd0ffad9d85568070cdccf8f511fe3. Sep 4 15:41:53.784287 kubelet[2721]: E0904 15:41:53.784240 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.784287 kubelet[2721]: W0904 15:41:53.784275 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.784287 kubelet[2721]: E0904 15:41:53.784297 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.784473 kubelet[2721]: I0904 15:41:53.784331 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/1972dac2-e9c7-4fa0-a718-6b8d76c15f8c-varrun\") pod \"csi-node-driver-g5qkw\" (UID: \"1972dac2-e9c7-4fa0-a718-6b8d76c15f8c\") " pod="calico-system/csi-node-driver-g5qkw" Sep 4 15:41:53.784631 kubelet[2721]: E0904 15:41:53.784613 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.784631 kubelet[2721]: W0904 15:41:53.784627 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.784699 kubelet[2721]: E0904 15:41:53.784641 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.784699 kubelet[2721]: I0904 15:41:53.784656 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4jsq\" (UniqueName: \"kubernetes.io/projected/1972dac2-e9c7-4fa0-a718-6b8d76c15f8c-kube-api-access-w4jsq\") pod \"csi-node-driver-g5qkw\" (UID: \"1972dac2-e9c7-4fa0-a718-6b8d76c15f8c\") " pod="calico-system/csi-node-driver-g5qkw" Sep 4 15:41:53.785024 kubelet[2721]: E0904 15:41:53.784984 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.785024 kubelet[2721]: W0904 15:41:53.785019 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.785113 kubelet[2721]: E0904 15:41:53.785052 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.785350 kubelet[2721]: E0904 15:41:53.785296 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.785350 kubelet[2721]: W0904 15:41:53.785307 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.785350 kubelet[2721]: E0904 15:41:53.785341 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.785700 kubelet[2721]: E0904 15:41:53.785657 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.785737 kubelet[2721]: W0904 15:41:53.785700 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.785737 kubelet[2721]: E0904 15:41:53.785720 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.785793 kubelet[2721]: I0904 15:41:53.785755 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1972dac2-e9c7-4fa0-a718-6b8d76c15f8c-kubelet-dir\") pod \"csi-node-driver-g5qkw\" (UID: \"1972dac2-e9c7-4fa0-a718-6b8d76c15f8c\") " pod="calico-system/csi-node-driver-g5qkw" Sep 4 15:41:53.786016 kubelet[2721]: E0904 15:41:53.785995 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.786016 kubelet[2721]: W0904 15:41:53.786012 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.786129 kubelet[2721]: E0904 15:41:53.786040 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.786129 kubelet[2721]: I0904 15:41:53.786113 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1972dac2-e9c7-4fa0-a718-6b8d76c15f8c-registration-dir\") pod \"csi-node-driver-g5qkw\" (UID: \"1972dac2-e9c7-4fa0-a718-6b8d76c15f8c\") " pod="calico-system/csi-node-driver-g5qkw" Sep 4 15:41:53.787012 kubelet[2721]: E0904 15:41:53.786279 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.787012 kubelet[2721]: W0904 15:41:53.786288 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.787012 kubelet[2721]: E0904 15:41:53.786318 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.787012 kubelet[2721]: E0904 15:41:53.786457 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.787012 kubelet[2721]: W0904 15:41:53.786464 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.787012 kubelet[2721]: E0904 15:41:53.786478 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.787012 kubelet[2721]: E0904 15:41:53.786672 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.787012 kubelet[2721]: W0904 15:41:53.786681 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.787012 kubelet[2721]: E0904 15:41:53.786696 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.787229 kubelet[2721]: I0904 15:41:53.786711 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1972dac2-e9c7-4fa0-a718-6b8d76c15f8c-socket-dir\") pod \"csi-node-driver-g5qkw\" (UID: \"1972dac2-e9c7-4fa0-a718-6b8d76c15f8c\") " pod="calico-system/csi-node-driver-g5qkw" Sep 4 15:41:53.787229 kubelet[2721]: E0904 15:41:53.786920 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.787229 kubelet[2721]: W0904 15:41:53.786930 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.787229 kubelet[2721]: E0904 15:41:53.786944 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.787229 kubelet[2721]: E0904 15:41:53.787130 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.787229 kubelet[2721]: W0904 15:41:53.787142 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.787229 kubelet[2721]: E0904 15:41:53.787153 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.787428 kubelet[2721]: E0904 15:41:53.787404 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.787428 kubelet[2721]: W0904 15:41:53.787422 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.787502 kubelet[2721]: E0904 15:41:53.787436 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.787666 kubelet[2721]: E0904 15:41:53.787625 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.787666 kubelet[2721]: W0904 15:41:53.787650 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.787666 kubelet[2721]: E0904 15:41:53.787660 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.787887 kubelet[2721]: E0904 15:41:53.787867 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.787887 kubelet[2721]: W0904 15:41:53.787883 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.787950 kubelet[2721]: E0904 15:41:53.787893 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.788148 kubelet[2721]: E0904 15:41:53.788082 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.788148 kubelet[2721]: W0904 15:41:53.788108 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.788148 kubelet[2721]: E0904 15:41:53.788117 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.790514 containerd[1599]: time="2025-09-04T15:41:53.790472687Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5sg7z,Uid:f303361c-2d97-4ca2-a547-7fe4f5faa7a0,Namespace:calico-system,Attempt:0,} returns sandbox id \"f4e313270e4924c8a934db014b82125d88cd0ffad9d85568070cdccf8f511fe3\"" Sep 4 15:41:53.889004 kubelet[2721]: E0904 15:41:53.888952 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.889004 kubelet[2721]: W0904 15:41:53.888975 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.889004 kubelet[2721]: E0904 15:41:53.889003 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.889248 kubelet[2721]: E0904 15:41:53.889232 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.889248 kubelet[2721]: W0904 15:41:53.889244 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.889300 kubelet[2721]: E0904 15:41:53.889259 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.889468 kubelet[2721]: E0904 15:41:53.889453 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.889468 kubelet[2721]: W0904 15:41:53.889463 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.889524 kubelet[2721]: E0904 15:41:53.889476 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.889824 kubelet[2721]: E0904 15:41:53.889795 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.889824 kubelet[2721]: W0904 15:41:53.889826 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.889917 kubelet[2721]: E0904 15:41:53.889854 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.890166 kubelet[2721]: E0904 15:41:53.890122 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.890166 kubelet[2721]: W0904 15:41:53.890146 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.890166 kubelet[2721]: E0904 15:41:53.890178 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.890584 kubelet[2721]: E0904 15:41:53.890553 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.890584 kubelet[2721]: W0904 15:41:53.890579 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.890656 kubelet[2721]: E0904 15:41:53.890602 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.890879 kubelet[2721]: E0904 15:41:53.890857 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.890879 kubelet[2721]: W0904 15:41:53.890872 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.890965 kubelet[2721]: E0904 15:41:53.890889 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.891176 kubelet[2721]: E0904 15:41:53.891159 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.891176 kubelet[2721]: W0904 15:41:53.891171 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.891279 kubelet[2721]: E0904 15:41:53.891202 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.891451 kubelet[2721]: E0904 15:41:53.891434 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.891451 kubelet[2721]: W0904 15:41:53.891445 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.891520 kubelet[2721]: E0904 15:41:53.891493 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.891668 kubelet[2721]: E0904 15:41:53.891645 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.891668 kubelet[2721]: W0904 15:41:53.891657 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.891713 kubelet[2721]: E0904 15:41:53.891690 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.891885 kubelet[2721]: E0904 15:41:53.891868 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.891885 kubelet[2721]: W0904 15:41:53.891879 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.891943 kubelet[2721]: E0904 15:41:53.891921 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.892083 kubelet[2721]: E0904 15:41:53.892068 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.892083 kubelet[2721]: W0904 15:41:53.892079 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.892130 kubelet[2721]: E0904 15:41:53.892092 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.892332 kubelet[2721]: E0904 15:41:53.892315 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.892332 kubelet[2721]: W0904 15:41:53.892328 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.892377 kubelet[2721]: E0904 15:41:53.892347 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.892527 kubelet[2721]: E0904 15:41:53.892510 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.892527 kubelet[2721]: W0904 15:41:53.892521 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.892578 kubelet[2721]: E0904 15:41:53.892534 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.892714 kubelet[2721]: E0904 15:41:53.892698 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.892714 kubelet[2721]: W0904 15:41:53.892709 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.892791 kubelet[2721]: E0904 15:41:53.892737 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.892932 kubelet[2721]: E0904 15:41:53.892911 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.892932 kubelet[2721]: W0904 15:41:53.892923 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.892991 kubelet[2721]: E0904 15:41:53.892951 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.893139 kubelet[2721]: E0904 15:41:53.893121 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.893139 kubelet[2721]: W0904 15:41:53.893132 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.893202 kubelet[2721]: E0904 15:41:53.893157 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.893325 kubelet[2721]: E0904 15:41:53.893308 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.893325 kubelet[2721]: W0904 15:41:53.893318 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.893395 kubelet[2721]: E0904 15:41:53.893338 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.893489 kubelet[2721]: E0904 15:41:53.893475 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.893489 kubelet[2721]: W0904 15:41:53.893485 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.893541 kubelet[2721]: E0904 15:41:53.893505 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.893650 kubelet[2721]: E0904 15:41:53.893633 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.893650 kubelet[2721]: W0904 15:41:53.893643 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.893712 kubelet[2721]: E0904 15:41:53.893654 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.893858 kubelet[2721]: E0904 15:41:53.893840 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.893858 kubelet[2721]: W0904 15:41:53.893850 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.893929 kubelet[2721]: E0904 15:41:53.893862 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.894112 kubelet[2721]: E0904 15:41:53.894086 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.894112 kubelet[2721]: W0904 15:41:53.894104 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.894191 kubelet[2721]: E0904 15:41:53.894130 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.894497 kubelet[2721]: E0904 15:41:53.894474 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.894497 kubelet[2721]: W0904 15:41:53.894493 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.894595 kubelet[2721]: E0904 15:41:53.894512 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.894735 kubelet[2721]: E0904 15:41:53.894708 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.894735 kubelet[2721]: W0904 15:41:53.894720 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.894735 kubelet[2721]: E0904 15:41:53.894733 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:53.894992 kubelet[2721]: E0904 15:41:53.894962 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.894992 kubelet[2721]: W0904 15:41:53.894976 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.894992 kubelet[2721]: E0904 15:41:53.894990 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:41:53.903252 kubelet[2721]: E0904 15:41:53.901889 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:41:53.903252 kubelet[2721]: W0904 15:41:53.901911 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:41:53.903252 kubelet[2721]: E0904 15:41:53.901922 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:41:55.154795 update_engine[1577]: I20250904 15:41:55.154672 1577 update_attempter.cc:509] Updating boot flags... Sep 4 15:41:55.181233 kubelet[2721]: E0904 15:41:55.180335 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g5qkw" podUID="1972dac2-e9c7-4fa0-a718-6b8d76c15f8c" Sep 4 15:41:57.180038 kubelet[2721]: E0904 15:41:57.179972 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g5qkw" podUID="1972dac2-e9c7-4fa0-a718-6b8d76c15f8c" Sep 4 15:41:57.334678 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1383175862.mount: Deactivated successfully. 
Sep 4 15:41:59.182233 kubelet[2721]: E0904 15:41:59.182065 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g5qkw" podUID="1972dac2-e9c7-4fa0-a718-6b8d76c15f8c" Sep 4 15:41:59.300611 containerd[1599]: time="2025-09-04T15:41:59.300557614Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:59.301481 containerd[1599]: time="2025-09-04T15:41:59.301441835Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 4 15:41:59.302798 containerd[1599]: time="2025-09-04T15:41:59.302770035Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:59.305061 containerd[1599]: time="2025-09-04T15:41:59.305015408Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:41:59.305593 containerd[1599]: time="2025-09-04T15:41:59.305570246Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 5.728651623s" Sep 4 15:41:59.305707 containerd[1599]: time="2025-09-04T15:41:59.305669724Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference 
\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 4 15:41:59.307109 containerd[1599]: time="2025-09-04T15:41:59.307061935Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 4 15:41:59.319075 containerd[1599]: time="2025-09-04T15:41:59.319031523Z" level=info msg="CreateContainer within sandbox \"a9562640afd9ee21fac5b7341ce685e6cf75c96c96c4b9562eb625baaa956b8e\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 4 15:41:59.327661 containerd[1599]: time="2025-09-04T15:41:59.327630632Z" level=info msg="Container 630a9e36b3cc32b272d9e691f699bcf76e20dbbd01e8dd720fb640421ab7e3f1: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:41:59.339135 containerd[1599]: time="2025-09-04T15:41:59.339079757Z" level=info msg="CreateContainer within sandbox \"a9562640afd9ee21fac5b7341ce685e6cf75c96c96c4b9562eb625baaa956b8e\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"630a9e36b3cc32b272d9e691f699bcf76e20dbbd01e8dd720fb640421ab7e3f1\"" Sep 4 15:41:59.339781 containerd[1599]: time="2025-09-04T15:41:59.339588417Z" level=info msg="StartContainer for \"630a9e36b3cc32b272d9e691f699bcf76e20dbbd01e8dd720fb640421ab7e3f1\"" Sep 4 15:41:59.340660 containerd[1599]: time="2025-09-04T15:41:59.340615117Z" level=info msg="connecting to shim 630a9e36b3cc32b272d9e691f699bcf76e20dbbd01e8dd720fb640421ab7e3f1" address="unix:///run/containerd/s/4e377751b02db206e33456780a59a443ce1c15e3906ebee8f8e08fec1dc732c7" protocol=ttrpc version=3 Sep 4 15:41:59.372374 systemd[1]: Started cri-containerd-630a9e36b3cc32b272d9e691f699bcf76e20dbbd01e8dd720fb640421ab7e3f1.scope - libcontainer container 630a9e36b3cc32b272d9e691f699bcf76e20dbbd01e8dd720fb640421ab7e3f1. 
Sep 4 15:41:59.423633 containerd[1599]: time="2025-09-04T15:41:59.423582164Z" level=info msg="StartContainer for \"630a9e36b3cc32b272d9e691f699bcf76e20dbbd01e8dd720fb640421ab7e3f1\" returns successfully" Sep 4 15:42:00.242670 kubelet[2721]: E0904 15:42:00.242619 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:00.255252 kubelet[2721]: I0904 15:42:00.255123 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-57676689d7-hrzl5" podStartSLOduration=1.524691462 podStartE2EDuration="7.255104816s" podCreationTimestamp="2025-09-04 15:41:53 +0000 UTC" firstStartedPulling="2025-09-04 15:41:53.576315231 +0000 UTC m=+16.486787630" lastFinishedPulling="2025-09-04 15:41:59.306728595 +0000 UTC m=+22.217200984" observedRunningTime="2025-09-04 15:42:00.25488473 +0000 UTC m=+23.165357129" watchObservedRunningTime="2025-09-04 15:42:00.255104816 +0000 UTC m=+23.165577215" Sep 4 15:42:00.338773 kubelet[2721]: E0904 15:42:00.338706 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.338773 kubelet[2721]: W0904 15:42:00.338748 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.338773 kubelet[2721]: E0904 15:42:00.338780 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.339011 kubelet[2721]: E0904 15:42:00.338989 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.339011 kubelet[2721]: W0904 15:42:00.339000 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.339011 kubelet[2721]: E0904 15:42:00.339009 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.339184 kubelet[2721]: E0904 15:42:00.339158 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.339184 kubelet[2721]: W0904 15:42:00.339168 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.339184 kubelet[2721]: E0904 15:42:00.339176 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.339425 kubelet[2721]: E0904 15:42:00.339400 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.339425 kubelet[2721]: W0904 15:42:00.339411 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.339425 kubelet[2721]: E0904 15:42:00.339420 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.339590 kubelet[2721]: E0904 15:42:00.339575 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.339590 kubelet[2721]: W0904 15:42:00.339585 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.339696 kubelet[2721]: E0904 15:42:00.339593 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.339826 kubelet[2721]: E0904 15:42:00.339804 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.339826 kubelet[2721]: W0904 15:42:00.339820 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.339879 kubelet[2721]: E0904 15:42:00.339831 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.340004 kubelet[2721]: E0904 15:42:00.339973 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.340004 kubelet[2721]: W0904 15:42:00.339984 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.340004 kubelet[2721]: E0904 15:42:00.339991 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.340154 kubelet[2721]: E0904 15:42:00.340136 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.340154 kubelet[2721]: W0904 15:42:00.340146 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.340154 kubelet[2721]: E0904 15:42:00.340154 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.340455 kubelet[2721]: E0904 15:42:00.340404 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.340455 kubelet[2721]: W0904 15:42:00.340433 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.340455 kubelet[2721]: E0904 15:42:00.340463 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.340708 kubelet[2721]: E0904 15:42:00.340663 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.340708 kubelet[2721]: W0904 15:42:00.340672 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.340708 kubelet[2721]: E0904 15:42:00.340680 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.340916 kubelet[2721]: E0904 15:42:00.340899 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.340916 kubelet[2721]: W0904 15:42:00.340911 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.340969 kubelet[2721]: E0904 15:42:00.340920 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.341107 kubelet[2721]: E0904 15:42:00.341092 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.341107 kubelet[2721]: W0904 15:42:00.341104 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.341154 kubelet[2721]: E0904 15:42:00.341113 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.341336 kubelet[2721]: E0904 15:42:00.341318 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.341336 kubelet[2721]: W0904 15:42:00.341331 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.341405 kubelet[2721]: E0904 15:42:00.341339 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.341552 kubelet[2721]: E0904 15:42:00.341531 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.341552 kubelet[2721]: W0904 15:42:00.341544 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.341552 kubelet[2721]: E0904 15:42:00.341554 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.341754 kubelet[2721]: E0904 15:42:00.341726 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.341754 kubelet[2721]: W0904 15:42:00.341746 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.341754 kubelet[2721]: E0904 15:42:00.341756 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.436829 kubelet[2721]: E0904 15:42:00.436775 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.436829 kubelet[2721]: W0904 15:42:00.436799 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.436829 kubelet[2721]: E0904 15:42:00.436823 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.437080 kubelet[2721]: E0904 15:42:00.437063 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.437080 kubelet[2721]: W0904 15:42:00.437077 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.437154 kubelet[2721]: E0904 15:42:00.437094 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.437362 kubelet[2721]: E0904 15:42:00.437335 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.437362 kubelet[2721]: W0904 15:42:00.437351 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.437425 kubelet[2721]: E0904 15:42:00.437368 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.437592 kubelet[2721]: E0904 15:42:00.437573 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.437592 kubelet[2721]: W0904 15:42:00.437588 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.437652 kubelet[2721]: E0904 15:42:00.437604 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.437911 kubelet[2721]: E0904 15:42:00.437885 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.437911 kubelet[2721]: W0904 15:42:00.437898 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.437969 kubelet[2721]: E0904 15:42:00.437915 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.438144 kubelet[2721]: E0904 15:42:00.438122 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.438144 kubelet[2721]: W0904 15:42:00.438134 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.438191 kubelet[2721]: E0904 15:42:00.438148 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.438403 kubelet[2721]: E0904 15:42:00.438381 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.438403 kubelet[2721]: W0904 15:42:00.438393 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.438454 kubelet[2721]: E0904 15:42:00.438434 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.438604 kubelet[2721]: E0904 15:42:00.438588 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.438604 kubelet[2721]: W0904 15:42:00.438599 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.438645 kubelet[2721]: E0904 15:42:00.438628 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.438811 kubelet[2721]: E0904 15:42:00.438794 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.438811 kubelet[2721]: W0904 15:42:00.438806 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.438854 kubelet[2721]: E0904 15:42:00.438835 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.439040 kubelet[2721]: E0904 15:42:00.439023 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.439040 kubelet[2721]: W0904 15:42:00.439034 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.439088 kubelet[2721]: E0904 15:42:00.439050 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.439314 kubelet[2721]: E0904 15:42:00.439295 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.439314 kubelet[2721]: W0904 15:42:00.439309 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.439391 kubelet[2721]: E0904 15:42:00.439326 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.439508 kubelet[2721]: E0904 15:42:00.439489 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.439508 kubelet[2721]: W0904 15:42:00.439500 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.439560 kubelet[2721]: E0904 15:42:00.439513 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.439741 kubelet[2721]: E0904 15:42:00.439715 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.439741 kubelet[2721]: W0904 15:42:00.439738 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.439792 kubelet[2721]: E0904 15:42:00.439756 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.439972 kubelet[2721]: E0904 15:42:00.439957 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.439972 kubelet[2721]: W0904 15:42:00.439970 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.440029 kubelet[2721]: E0904 15:42:00.439986 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.440235 kubelet[2721]: E0904 15:42:00.440199 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.440259 kubelet[2721]: W0904 15:42:00.440233 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.440259 kubelet[2721]: E0904 15:42:00.440251 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.440514 kubelet[2721]: E0904 15:42:00.440496 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.440547 kubelet[2721]: W0904 15:42:00.440508 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.440547 kubelet[2721]: E0904 15:42:00.440540 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:00.440752 kubelet[2721]: E0904 15:42:00.440724 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.440752 kubelet[2721]: W0904 15:42:00.440747 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.440814 kubelet[2721]: E0904 15:42:00.440758 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:00.440985 kubelet[2721]: E0904 15:42:00.440971 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:00.441011 kubelet[2721]: W0904 15:42:00.440984 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:00.441011 kubelet[2721]: E0904 15:42:00.440994 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:01.179873 kubelet[2721]: E0904 15:42:01.179817 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g5qkw" podUID="1972dac2-e9c7-4fa0-a718-6b8d76c15f8c" Sep 4 15:42:01.243966 kubelet[2721]: I0904 15:42:01.243915 2721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 4 15:42:01.244526 kubelet[2721]: E0904 15:42:01.244280 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:01.246623 kubelet[2721]: E0904 15:42:01.246586 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.246623 kubelet[2721]: W0904 15:42:01.246609 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.246843 kubelet[2721]: E0904 15:42:01.246634 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:01.246893 kubelet[2721]: E0904 15:42:01.246872 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.246893 kubelet[2721]: W0904 15:42:01.246883 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.246957 kubelet[2721]: E0904 15:42:01.246894 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:01.247133 kubelet[2721]: E0904 15:42:01.247111 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.247133 kubelet[2721]: W0904 15:42:01.247126 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.247206 kubelet[2721]: E0904 15:42:01.247137 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:01.247355 kubelet[2721]: E0904 15:42:01.247336 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.247355 kubelet[2721]: W0904 15:42:01.247347 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.247355 kubelet[2721]: E0904 15:42:01.247355 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:01.247526 kubelet[2721]: E0904 15:42:01.247509 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.247526 kubelet[2721]: W0904 15:42:01.247519 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.247526 kubelet[2721]: E0904 15:42:01.247527 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:01.247679 kubelet[2721]: E0904 15:42:01.247659 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.247679 kubelet[2721]: W0904 15:42:01.247672 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.247679 kubelet[2721]: E0904 15:42:01.247681 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:01.247932 kubelet[2721]: E0904 15:42:01.247897 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.247932 kubelet[2721]: W0904 15:42:01.247916 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.248030 kubelet[2721]: E0904 15:42:01.247938 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:01.248235 kubelet[2721]: E0904 15:42:01.248205 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.248235 kubelet[2721]: W0904 15:42:01.248232 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.248319 kubelet[2721]: E0904 15:42:01.248244 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:01.248458 kubelet[2721]: E0904 15:42:01.248437 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.248458 kubelet[2721]: W0904 15:42:01.248447 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.248458 kubelet[2721]: E0904 15:42:01.248455 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:01.248612 kubelet[2721]: E0904 15:42:01.248600 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.248612 kubelet[2721]: W0904 15:42:01.248609 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.248659 kubelet[2721]: E0904 15:42:01.248617 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:01.248825 kubelet[2721]: E0904 15:42:01.248813 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.248825 kubelet[2721]: W0904 15:42:01.248822 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.248872 kubelet[2721]: E0904 15:42:01.248830 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:01.248999 kubelet[2721]: E0904 15:42:01.248988 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.248999 kubelet[2721]: W0904 15:42:01.248997 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.249043 kubelet[2721]: E0904 15:42:01.249005 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:01.249187 kubelet[2721]: E0904 15:42:01.249168 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.249187 kubelet[2721]: W0904 15:42:01.249178 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.249187 kubelet[2721]: E0904 15:42:01.249185 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:01.249406 kubelet[2721]: E0904 15:42:01.249378 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.249406 kubelet[2721]: W0904 15:42:01.249400 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.249459 kubelet[2721]: E0904 15:42:01.249409 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:01.249615 kubelet[2721]: E0904 15:42:01.249596 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.249615 kubelet[2721]: W0904 15:42:01.249611 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.249615 kubelet[2721]: E0904 15:42:01.249623 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:01.301917 containerd[1599]: time="2025-09-04T15:42:01.301860172Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:01.303064 containerd[1599]: time="2025-09-04T15:42:01.303016936Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 4 15:42:01.303981 containerd[1599]: time="2025-09-04T15:42:01.303955438Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:01.305860 containerd[1599]: time="2025-09-04T15:42:01.305819908Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:01.306345 containerd[1599]: time="2025-09-04T15:42:01.306313659Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.999223181s" Sep 4 15:42:01.306398 containerd[1599]: time="2025-09-04T15:42:01.306346031Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 4 15:42:01.308296 containerd[1599]: time="2025-09-04T15:42:01.308259904Z" level=info msg="CreateContainer within sandbox \"f4e313270e4924c8a934db014b82125d88cd0ffad9d85568070cdccf8f511fe3\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 4 15:42:01.317823 containerd[1599]: time="2025-09-04T15:42:01.317773444Z" level=info msg="Container e074419068341a81ffead271aa72c28c0656e83f9dc48a10c073dbee80e387f7: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:42:01.327367 containerd[1599]: time="2025-09-04T15:42:01.327328322Z" level=info msg="CreateContainer within sandbox \"f4e313270e4924c8a934db014b82125d88cd0ffad9d85568070cdccf8f511fe3\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e074419068341a81ffead271aa72c28c0656e83f9dc48a10c073dbee80e387f7\"" Sep 4 15:42:01.327845 containerd[1599]: time="2025-09-04T15:42:01.327816273Z" level=info msg="StartContainer for \"e074419068341a81ffead271aa72c28c0656e83f9dc48a10c073dbee80e387f7\"" Sep 4 15:42:01.329145 containerd[1599]: time="2025-09-04T15:42:01.329112040Z" level=info msg="connecting to shim e074419068341a81ffead271aa72c28c0656e83f9dc48a10c073dbee80e387f7" address="unix:///run/containerd/s/29b1b5b42ae4b54fbc20c224b75fd151facb48442b2abf6a48e37ff68b80f040" protocol=ttrpc version=3 Sep 4 15:42:01.343933 kubelet[2721]: E0904 15:42:01.343905 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.344159 kubelet[2721]: W0904 15:42:01.344027 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.344159 kubelet[2721]: E0904 15:42:01.344052 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:01.344399 kubelet[2721]: E0904 15:42:01.344304 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.344399 kubelet[2721]: W0904 15:42:01.344316 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.344399 kubelet[2721]: E0904 15:42:01.344329 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:01.344666 kubelet[2721]: E0904 15:42:01.344653 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.344733 kubelet[2721]: W0904 15:42:01.344712 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.344848 kubelet[2721]: E0904 15:42:01.344797 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:01.345066 kubelet[2721]: E0904 15:42:01.345053 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.345121 kubelet[2721]: W0904 15:42:01.345111 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.345227 kubelet[2721]: E0904 15:42:01.345175 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:01.345447 kubelet[2721]: E0904 15:42:01.345434 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.345503 kubelet[2721]: W0904 15:42:01.345493 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.345628 kubelet[2721]: E0904 15:42:01.345606 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 15:42:01.345897 kubelet[2721]: E0904 15:42:01.345848 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 15:42:01.345897 kubelet[2721]: W0904 15:42:01.345860 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 15:42:01.345976 kubelet[2721]: E0904 15:42:01.345896 2721 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 15:42:01.352537 systemd[1]: Started cri-containerd-e074419068341a81ffead271aa72c28c0656e83f9dc48a10c073dbee80e387f7.scope - libcontainer container e074419068341a81ffead271aa72c28c0656e83f9dc48a10c073dbee80e387f7. Sep 4 15:42:01.452703 systemd[1]: cri-containerd-e074419068341a81ffead271aa72c28c0656e83f9dc48a10c073dbee80e387f7.scope: Deactivated successfully. Sep 4 15:42:01.453775 systemd[1]: cri-containerd-e074419068341a81ffead271aa72c28c0656e83f9dc48a10c073dbee80e387f7.scope: Consumed 36ms CPU time, 6.3M memory peak, 4.6M written to disk. 
Sep 4 15:42:01.457398 containerd[1599]: time="2025-09-04T15:42:01.457361906Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e074419068341a81ffead271aa72c28c0656e83f9dc48a10c073dbee80e387f7\" id:\"e074419068341a81ffead271aa72c28c0656e83f9dc48a10c073dbee80e387f7\" pid:3476 exited_at:{seconds:1757000521 nanos:456841524}" Sep 4 15:42:01.496013 containerd[1599]: time="2025-09-04T15:42:01.495949153Z" level=info msg="received exit event container_id:\"e074419068341a81ffead271aa72c28c0656e83f9dc48a10c073dbee80e387f7\" id:\"e074419068341a81ffead271aa72c28c0656e83f9dc48a10c073dbee80e387f7\" pid:3476 exited_at:{seconds:1757000521 nanos:456841524}" Sep 4 15:42:01.497528 containerd[1599]: time="2025-09-04T15:42:01.497496334Z" level=info msg="StartContainer for \"e074419068341a81ffead271aa72c28c0656e83f9dc48a10c073dbee80e387f7\" returns successfully" Sep 4 15:42:01.520255 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e074419068341a81ffead271aa72c28c0656e83f9dc48a10c073dbee80e387f7-rootfs.mount: Deactivated successfully. 
Sep 4 15:42:02.248873 containerd[1599]: time="2025-09-04T15:42:02.248665250Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 4 15:42:03.179299 kubelet[2721]: E0904 15:42:03.179205 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g5qkw" podUID="1972dac2-e9c7-4fa0-a718-6b8d76c15f8c" Sep 4 15:42:11.016002 containerd[1599]: time="2025-09-04T15:42:11.015931429Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:11.016688 containerd[1599]: time="2025-09-04T15:42:11.016653037Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 4 15:42:11.017682 containerd[1599]: time="2025-09-04T15:42:11.017650193Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:11.019634 containerd[1599]: time="2025-09-04T15:42:11.019591306Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:11.020125 containerd[1599]: time="2025-09-04T15:42:11.020095285Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 8.771388997s" Sep 4 15:42:11.020164 containerd[1599]: time="2025-09-04T15:42:11.020123297Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 4 15:42:11.022355 containerd[1599]: time="2025-09-04T15:42:11.022321644Z" level=info msg="CreateContainer within sandbox \"f4e313270e4924c8a934db014b82125d88cd0ffad9d85568070cdccf8f511fe3\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 4 15:42:11.031493 containerd[1599]: time="2025-09-04T15:42:11.031453077Z" level=info msg="Container de7a748748ed2329d1a9b914ad514209181c35fbc8fc1ff6e2d8b46116d2d8d4: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:42:11.041883 containerd[1599]: time="2025-09-04T15:42:11.041840743Z" level=info msg="CreateContainer within sandbox \"f4e313270e4924c8a934db014b82125d88cd0ffad9d85568070cdccf8f511fe3\" for 
&ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"de7a748748ed2329d1a9b914ad514209181c35fbc8fc1ff6e2d8b46116d2d8d4\"" Sep 4 15:42:11.043269 containerd[1599]: time="2025-09-04T15:42:11.042304116Z" level=info msg="StartContainer for \"de7a748748ed2329d1a9b914ad514209181c35fbc8fc1ff6e2d8b46116d2d8d4\"" Sep 4 15:42:11.043717 containerd[1599]: time="2025-09-04T15:42:11.043691066Z" level=info msg="connecting to shim de7a748748ed2329d1a9b914ad514209181c35fbc8fc1ff6e2d8b46116d2d8d4" address="unix:///run/containerd/s/29b1b5b42ae4b54fbc20c224b75fd151facb48442b2abf6a48e37ff68b80f040" protocol=ttrpc version=3 Sep 4 15:42:11.067332 systemd[1]: Started cri-containerd-de7a748748ed2329d1a9b914ad514209181c35fbc8fc1ff6e2d8b46116d2d8d4.scope - libcontainer container de7a748748ed2329d1a9b914ad514209181c35fbc8fc1ff6e2d8b46116d2d8d4. Sep 4 15:42:11.110531 containerd[1599]: time="2025-09-04T15:42:11.110490976Z" level=info msg="StartContainer for \"de7a748748ed2329d1a9b914ad514209181c35fbc8fc1ff6e2d8b46116d2d8d4\" returns successfully" Sep 4 15:42:11.180805 kubelet[2721]: E0904 15:42:11.180744 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-g5qkw" podUID="1972dac2-e9c7-4fa0-a718-6b8d76c15f8c" Sep 4 15:42:12.100640 systemd[1]: cri-containerd-de7a748748ed2329d1a9b914ad514209181c35fbc8fc1ff6e2d8b46116d2d8d4.scope: Deactivated successfully. Sep 4 15:42:12.101435 systemd[1]: cri-containerd-de7a748748ed2329d1a9b914ad514209181c35fbc8fc1ff6e2d8b46116d2d8d4.scope: Consumed 648ms CPU time, 179.7M memory peak, 4.3M read from disk, 171.3M written to disk. 
Sep 4 15:42:12.102054 containerd[1599]: time="2025-09-04T15:42:12.102015595Z" level=info msg="received exit event container_id:\"de7a748748ed2329d1a9b914ad514209181c35fbc8fc1ff6e2d8b46116d2d8d4\" id:\"de7a748748ed2329d1a9b914ad514209181c35fbc8fc1ff6e2d8b46116d2d8d4\" pid:3534 exited_at:{seconds:1757000532 nanos:101791443}" Sep 4 15:42:12.102389 containerd[1599]: time="2025-09-04T15:42:12.102366464Z" level=info msg="TaskExit event in podsandbox handler container_id:\"de7a748748ed2329d1a9b914ad514209181c35fbc8fc1ff6e2d8b46116d2d8d4\" id:\"de7a748748ed2329d1a9b914ad514209181c35fbc8fc1ff6e2d8b46116d2d8d4\" pid:3534 exited_at:{seconds:1757000532 nanos:101791443}" Sep 4 15:42:12.124860 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-de7a748748ed2329d1a9b914ad514209181c35fbc8fc1ff6e2d8b46116d2d8d4-rootfs.mount: Deactivated successfully. Sep 4 15:42:12.131149 kubelet[2721]: I0904 15:42:12.131114 2721 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 4 15:42:12.195986 systemd[1]: Created slice kubepods-burstable-podfe764908_ae86_49c6_b09b_c3ac12014f67.slice - libcontainer container kubepods-burstable-podfe764908_ae86_49c6_b09b_c3ac12014f67.slice. Sep 4 15:42:12.212727 systemd[1]: Created slice kubepods-burstable-pod1346250c_c62f_497e_b346_045a6d8429cd.slice - libcontainer container kubepods-burstable-pod1346250c_c62f_497e_b346_045a6d8429cd.slice. 
Sep 4 15:42:12.221332 kubelet[2721]: I0904 15:42:12.221150 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6ef84674-6ef9-43c1-a60c-d61b98c3020c-calico-apiserver-certs\") pod \"calico-apiserver-586bb8599c-jft4n\" (UID: \"6ef84674-6ef9-43c1-a60c-d61b98c3020c\") " pod="calico-apiserver/calico-apiserver-586bb8599c-jft4n" Sep 4 15:42:12.221332 kubelet[2721]: I0904 15:42:12.221326 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96b23ce9-24eb-409d-ab7e-65c59ba9d2a2-whisker-ca-bundle\") pod \"whisker-58786fb5c7-glf82\" (UID: \"96b23ce9-24eb-409d-ab7e-65c59ba9d2a2\") " pod="calico-system/whisker-58786fb5c7-glf82" Sep 4 15:42:12.221332 kubelet[2721]: I0904 15:42:12.221344 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61351a35-3590-4d07-b44e-2b184f6950c6-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-7qmzd\" (UID: \"61351a35-3590-4d07-b44e-2b184f6950c6\") " pod="calico-system/goldmane-54d579b49d-7qmzd" Sep 4 15:42:12.222758 kubelet[2721]: I0904 15:42:12.221361 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1346250c-c62f-497e-b346-045a6d8429cd-config-volume\") pod \"coredns-668d6bf9bc-dzhx5\" (UID: \"1346250c-c62f-497e-b346-045a6d8429cd\") " pod="kube-system/coredns-668d6bf9bc-dzhx5" Sep 4 15:42:12.222758 kubelet[2721]: I0904 15:42:12.221377 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2gcx\" (UniqueName: \"kubernetes.io/projected/1346250c-c62f-497e-b346-045a6d8429cd-kube-api-access-m2gcx\") pod \"coredns-668d6bf9bc-dzhx5\" (UID: 
\"1346250c-c62f-497e-b346-045a6d8429cd\") " pod="kube-system/coredns-668d6bf9bc-dzhx5" Sep 4 15:42:12.222758 kubelet[2721]: I0904 15:42:12.221392 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp9wk\" (UniqueName: \"kubernetes.io/projected/68845f40-ebfe-4bf1-9325-43434676a6f4-kube-api-access-qp9wk\") pod \"calico-apiserver-586bb8599c-9bzlz\" (UID: \"68845f40-ebfe-4bf1-9325-43434676a6f4\") " pod="calico-apiserver/calico-apiserver-586bb8599c-9bzlz" Sep 4 15:42:12.222758 kubelet[2721]: I0904 15:42:12.221408 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/96b23ce9-24eb-409d-ab7e-65c59ba9d2a2-whisker-backend-key-pair\") pod \"whisker-58786fb5c7-glf82\" (UID: \"96b23ce9-24eb-409d-ab7e-65c59ba9d2a2\") " pod="calico-system/whisker-58786fb5c7-glf82" Sep 4 15:42:12.222758 kubelet[2721]: I0904 15:42:12.221422 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x5c7\" (UniqueName: \"kubernetes.io/projected/96b23ce9-24eb-409d-ab7e-65c59ba9d2a2-kube-api-access-4x5c7\") pod \"whisker-58786fb5c7-glf82\" (UID: \"96b23ce9-24eb-409d-ab7e-65c59ba9d2a2\") " pod="calico-system/whisker-58786fb5c7-glf82" Sep 4 15:42:12.222881 kubelet[2721]: I0904 15:42:12.221435 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61351a35-3590-4d07-b44e-2b184f6950c6-config\") pod \"goldmane-54d579b49d-7qmzd\" (UID: \"61351a35-3590-4d07-b44e-2b184f6950c6\") " pod="calico-system/goldmane-54d579b49d-7qmzd" Sep 4 15:42:12.222881 kubelet[2721]: I0904 15:42:12.221450 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: 
\"kubernetes.io/secret/61351a35-3590-4d07-b44e-2b184f6950c6-goldmane-key-pair\") pod \"goldmane-54d579b49d-7qmzd\" (UID: \"61351a35-3590-4d07-b44e-2b184f6950c6\") " pod="calico-system/goldmane-54d579b49d-7qmzd" Sep 4 15:42:12.222881 kubelet[2721]: I0904 15:42:12.221465 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6mn5\" (UniqueName: \"kubernetes.io/projected/40ed457a-a70c-44bd-9d09-2cbd7ed7fa67-kube-api-access-d6mn5\") pod \"calico-kube-controllers-cd6bff945-8bhdp\" (UID: \"40ed457a-a70c-44bd-9d09-2cbd7ed7fa67\") " pod="calico-system/calico-kube-controllers-cd6bff945-8bhdp" Sep 4 15:42:12.222881 kubelet[2721]: I0904 15:42:12.221484 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fe764908-ae86-49c6-b09b-c3ac12014f67-config-volume\") pod \"coredns-668d6bf9bc-6cbdm\" (UID: \"fe764908-ae86-49c6-b09b-c3ac12014f67\") " pod="kube-system/coredns-668d6bf9bc-6cbdm" Sep 4 15:42:12.222881 kubelet[2721]: I0904 15:42:12.221507 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f82t\" (UniqueName: \"kubernetes.io/projected/6ef84674-6ef9-43c1-a60c-d61b98c3020c-kube-api-access-6f82t\") pod \"calico-apiserver-586bb8599c-jft4n\" (UID: \"6ef84674-6ef9-43c1-a60c-d61b98c3020c\") " pod="calico-apiserver/calico-apiserver-586bb8599c-jft4n" Sep 4 15:42:12.223057 kubelet[2721]: I0904 15:42:12.221523 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56l5c\" (UniqueName: \"kubernetes.io/projected/fe764908-ae86-49c6-b09b-c3ac12014f67-kube-api-access-56l5c\") pod \"coredns-668d6bf9bc-6cbdm\" (UID: \"fe764908-ae86-49c6-b09b-c3ac12014f67\") " pod="kube-system/coredns-668d6bf9bc-6cbdm" Sep 4 15:42:12.223057 kubelet[2721]: I0904 15:42:12.221539 2721 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw8zf\" (UniqueName: \"kubernetes.io/projected/61351a35-3590-4d07-b44e-2b184f6950c6-kube-api-access-hw8zf\") pod \"goldmane-54d579b49d-7qmzd\" (UID: \"61351a35-3590-4d07-b44e-2b184f6950c6\") " pod="calico-system/goldmane-54d579b49d-7qmzd" Sep 4 15:42:12.223057 kubelet[2721]: I0904 15:42:12.221562 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40ed457a-a70c-44bd-9d09-2cbd7ed7fa67-tigera-ca-bundle\") pod \"calico-kube-controllers-cd6bff945-8bhdp\" (UID: \"40ed457a-a70c-44bd-9d09-2cbd7ed7fa67\") " pod="calico-system/calico-kube-controllers-cd6bff945-8bhdp" Sep 4 15:42:12.223057 kubelet[2721]: I0904 15:42:12.221579 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/68845f40-ebfe-4bf1-9325-43434676a6f4-calico-apiserver-certs\") pod \"calico-apiserver-586bb8599c-9bzlz\" (UID: \"68845f40-ebfe-4bf1-9325-43434676a6f4\") " pod="calico-apiserver/calico-apiserver-586bb8599c-9bzlz" Sep 4 15:42:12.223627 systemd[1]: Created slice kubepods-besteffort-pod40ed457a_a70c_44bd_9d09_2cbd7ed7fa67.slice - libcontainer container kubepods-besteffort-pod40ed457a_a70c_44bd_9d09_2cbd7ed7fa67.slice. Sep 4 15:42:12.233569 systemd[1]: Created slice kubepods-besteffort-pod68845f40_ebfe_4bf1_9325_43434676a6f4.slice - libcontainer container kubepods-besteffort-pod68845f40_ebfe_4bf1_9325_43434676a6f4.slice. Sep 4 15:42:12.243994 systemd[1]: Created slice kubepods-besteffort-pod96b23ce9_24eb_409d_ab7e_65c59ba9d2a2.slice - libcontainer container kubepods-besteffort-pod96b23ce9_24eb_409d_ab7e_65c59ba9d2a2.slice. 
Sep 4 15:42:12.250303 systemd[1]: Created slice kubepods-besteffort-pod6ef84674_6ef9_43c1_a60c_d61b98c3020c.slice - libcontainer container kubepods-besteffort-pod6ef84674_6ef9_43c1_a60c_d61b98c3020c.slice. Sep 4 15:42:12.256854 systemd[1]: Created slice kubepods-besteffort-pod61351a35_3590_4d07_b44e_2b184f6950c6.slice - libcontainer container kubepods-besteffort-pod61351a35_3590_4d07_b44e_2b184f6950c6.slice. Sep 4 15:42:12.273727 containerd[1599]: time="2025-09-04T15:42:12.273613234Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 4 15:42:12.498786 kubelet[2721]: E0904 15:42:12.498612 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:12.499746 containerd[1599]: time="2025-09-04T15:42:12.499419378Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6cbdm,Uid:fe764908-ae86-49c6-b09b-c3ac12014f67,Namespace:kube-system,Attempt:0,}" Sep 4 15:42:12.519519 kubelet[2721]: E0904 15:42:12.519472 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:12.520130 containerd[1599]: time="2025-09-04T15:42:12.520089265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dzhx5,Uid:1346250c-c62f-497e-b346-045a6d8429cd,Namespace:kube-system,Attempt:0,}" Sep 4 15:42:12.531192 containerd[1599]: time="2025-09-04T15:42:12.531130526Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd6bff945-8bhdp,Uid:40ed457a-a70c-44bd-9d09-2cbd7ed7fa67,Namespace:calico-system,Attempt:0,}" Sep 4 15:42:12.537975 containerd[1599]: time="2025-09-04T15:42:12.537897567Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-586bb8599c-9bzlz,Uid:68845f40-ebfe-4bf1-9325-43434676a6f4,Namespace:calico-apiserver,Attempt:0,}" Sep 4 15:42:12.547854 containerd[1599]: time="2025-09-04T15:42:12.547757648Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58786fb5c7-glf82,Uid:96b23ce9-24eb-409d-ab7e-65c59ba9d2a2,Namespace:calico-system,Attempt:0,}" Sep 4 15:42:12.554796 containerd[1599]: time="2025-09-04T15:42:12.554632180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-586bb8599c-jft4n,Uid:6ef84674-6ef9-43c1-a60c-d61b98c3020c,Namespace:calico-apiserver,Attempt:0,}" Sep 4 15:42:12.563360 containerd[1599]: time="2025-09-04T15:42:12.563314264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-7qmzd,Uid:61351a35-3590-4d07-b44e-2b184f6950c6,Namespace:calico-system,Attempt:0,}" Sep 4 15:42:12.613111 containerd[1599]: time="2025-09-04T15:42:12.613040114Z" level=error msg="Failed to destroy network for sandbox \"d0e2d1a178b6f1e8ad6c95b372cfaf237134d5c96f936e65d45baa09f85f599f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.621328 containerd[1599]: time="2025-09-04T15:42:12.621269406Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6cbdm,Uid:fe764908-ae86-49c6-b09b-c3ac12014f67,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0e2d1a178b6f1e8ad6c95b372cfaf237134d5c96f936e65d45baa09f85f599f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.621670 kubelet[2721]: E0904 15:42:12.621620 2721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"d0e2d1a178b6f1e8ad6c95b372cfaf237134d5c96f936e65d45baa09f85f599f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.621746 kubelet[2721]: E0904 15:42:12.621702 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0e2d1a178b6f1e8ad6c95b372cfaf237134d5c96f936e65d45baa09f85f599f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6cbdm" Sep 4 15:42:12.621746 kubelet[2721]: E0904 15:42:12.621727 2721 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0e2d1a178b6f1e8ad6c95b372cfaf237134d5c96f936e65d45baa09f85f599f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6cbdm" Sep 4 15:42:12.621812 kubelet[2721]: E0904 15:42:12.621771 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-6cbdm_kube-system(fe764908-ae86-49c6-b09b-c3ac12014f67)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-6cbdm_kube-system(fe764908-ae86-49c6-b09b-c3ac12014f67)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d0e2d1a178b6f1e8ad6c95b372cfaf237134d5c96f936e65d45baa09f85f599f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-668d6bf9bc-6cbdm" podUID="fe764908-ae86-49c6-b09b-c3ac12014f67" Sep 4 15:42:12.626201 containerd[1599]: time="2025-09-04T15:42:12.626156360Z" level=error msg="Failed to destroy network for sandbox \"e87ab1957a2e3d4de8484e7c5b09f914691fbb58d1079375484d2542688e7988\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.628254 containerd[1599]: time="2025-09-04T15:42:12.628193142Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dzhx5,Uid:1346250c-c62f-497e-b346-045a6d8429cd,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e87ab1957a2e3d4de8484e7c5b09f914691fbb58d1079375484d2542688e7988\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.628455 kubelet[2721]: E0904 15:42:12.628425 2721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e87ab1957a2e3d4de8484e7c5b09f914691fbb58d1079375484d2542688e7988\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.628538 kubelet[2721]: E0904 15:42:12.628513 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e87ab1957a2e3d4de8484e7c5b09f914691fbb58d1079375484d2542688e7988\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dzhx5" Sep 4 
15:42:12.628596 kubelet[2721]: E0904 15:42:12.628568 2721 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e87ab1957a2e3d4de8484e7c5b09f914691fbb58d1079375484d2542688e7988\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dzhx5" Sep 4 15:42:12.628627 kubelet[2721]: E0904 15:42:12.628612 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-dzhx5_kube-system(1346250c-c62f-497e-b346-045a6d8429cd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-dzhx5_kube-system(1346250c-c62f-497e-b346-045a6d8429cd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e87ab1957a2e3d4de8484e7c5b09f914691fbb58d1079375484d2542688e7988\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-dzhx5" podUID="1346250c-c62f-497e-b346-045a6d8429cd" Sep 4 15:42:12.651185 containerd[1599]: time="2025-09-04T15:42:12.651035616Z" level=error msg="Failed to destroy network for sandbox \"8a3b1feda4b86a16d1790a164af5cb90837589dbdb1dff9ec66d881b81a1b92c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.653928 containerd[1599]: time="2025-09-04T15:42:12.653877653Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd6bff945-8bhdp,Uid:40ed457a-a70c-44bd-9d09-2cbd7ed7fa67,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"8a3b1feda4b86a16d1790a164af5cb90837589dbdb1dff9ec66d881b81a1b92c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.654286 kubelet[2721]: E0904 15:42:12.654188 2721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a3b1feda4b86a16d1790a164af5cb90837589dbdb1dff9ec66d881b81a1b92c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.654365 kubelet[2721]: E0904 15:42:12.654339 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a3b1feda4b86a16d1790a164af5cb90837589dbdb1dff9ec66d881b81a1b92c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cd6bff945-8bhdp" Sep 4 15:42:12.654394 kubelet[2721]: E0904 15:42:12.654364 2721 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a3b1feda4b86a16d1790a164af5cb90837589dbdb1dff9ec66d881b81a1b92c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cd6bff945-8bhdp" Sep 4 15:42:12.654483 kubelet[2721]: E0904 15:42:12.654446 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-cd6bff945-8bhdp_calico-system(40ed457a-a70c-44bd-9d09-2cbd7ed7fa67)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-cd6bff945-8bhdp_calico-system(40ed457a-a70c-44bd-9d09-2cbd7ed7fa67)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8a3b1feda4b86a16d1790a164af5cb90837589dbdb1dff9ec66d881b81a1b92c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-cd6bff945-8bhdp" podUID="40ed457a-a70c-44bd-9d09-2cbd7ed7fa67" Sep 4 15:42:12.666847 containerd[1599]: time="2025-09-04T15:42:12.666786047Z" level=error msg="Failed to destroy network for sandbox \"47af7ed691dec86933e2024284d1d05182c76cc0013eb0229cc1b1c9b3beeb46\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.669827 containerd[1599]: time="2025-09-04T15:42:12.668228611Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-586bb8599c-jft4n,Uid:6ef84674-6ef9-43c1-a60c-d61b98c3020c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"47af7ed691dec86933e2024284d1d05182c76cc0013eb0229cc1b1c9b3beeb46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.670032 containerd[1599]: time="2025-09-04T15:42:12.670004363Z" level=error msg="Failed to destroy network for sandbox \"544ff9a4c7a9838143bfd8742307c4ffe70dcec33b6d4bec4d7c54008c9b32f3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.670129 kubelet[2721]: 
E0904 15:42:12.670070 2721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47af7ed691dec86933e2024284d1d05182c76cc0013eb0229cc1b1c9b3beeb46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.670319 kubelet[2721]: E0904 15:42:12.670149 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47af7ed691dec86933e2024284d1d05182c76cc0013eb0229cc1b1c9b3beeb46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-586bb8599c-jft4n" Sep 4 15:42:12.670319 kubelet[2721]: E0904 15:42:12.670193 2721 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47af7ed691dec86933e2024284d1d05182c76cc0013eb0229cc1b1c9b3beeb46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-586bb8599c-jft4n" Sep 4 15:42:12.670319 kubelet[2721]: E0904 15:42:12.670287 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-586bb8599c-jft4n_calico-apiserver(6ef84674-6ef9-43c1-a60c-d61b98c3020c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-586bb8599c-jft4n_calico-apiserver(6ef84674-6ef9-43c1-a60c-d61b98c3020c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"47af7ed691dec86933e2024284d1d05182c76cc0013eb0229cc1b1c9b3beeb46\\\": plugin type=\\\"calico\\\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-586bb8599c-jft4n" podUID="6ef84674-6ef9-43c1-a60c-d61b98c3020c" Sep 4 15:42:12.672365 containerd[1599]: time="2025-09-04T15:42:12.672306794Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-586bb8599c-9bzlz,Uid:68845f40-ebfe-4bf1-9325-43434676a6f4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"544ff9a4c7a9838143bfd8742307c4ffe70dcec33b6d4bec4d7c54008c9b32f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.672691 kubelet[2721]: E0904 15:42:12.672554 2721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"544ff9a4c7a9838143bfd8742307c4ffe70dcec33b6d4bec4d7c54008c9b32f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.672691 kubelet[2721]: E0904 15:42:12.672623 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"544ff9a4c7a9838143bfd8742307c4ffe70dcec33b6d4bec4d7c54008c9b32f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-586bb8599c-9bzlz" Sep 4 15:42:12.672691 kubelet[2721]: E0904 15:42:12.672649 2721 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"544ff9a4c7a9838143bfd8742307c4ffe70dcec33b6d4bec4d7c54008c9b32f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-586bb8599c-9bzlz" Sep 4 15:42:12.674233 kubelet[2721]: E0904 15:42:12.673285 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-586bb8599c-9bzlz_calico-apiserver(68845f40-ebfe-4bf1-9325-43434676a6f4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-586bb8599c-9bzlz_calico-apiserver(68845f40-ebfe-4bf1-9325-43434676a6f4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"544ff9a4c7a9838143bfd8742307c4ffe70dcec33b6d4bec4d7c54008c9b32f3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-586bb8599c-9bzlz" podUID="68845f40-ebfe-4bf1-9325-43434676a6f4" Sep 4 15:42:12.674350 containerd[1599]: time="2025-09-04T15:42:12.674170059Z" level=error msg="Failed to destroy network for sandbox \"ca808a137bbe5b5d0d68884ba52c1b815f6bd52b21afa5001d2bed3f2cca65fb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.675405 containerd[1599]: time="2025-09-04T15:42:12.675333578Z" level=error msg="Failed to destroy network for sandbox \"a713f49174c7f81cfe318662294f9e2f5670272e2c6dd863aa552d00021b1e0d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.675882 containerd[1599]: time="2025-09-04T15:42:12.675797591Z" 
level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-7qmzd,Uid:61351a35-3590-4d07-b44e-2b184f6950c6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca808a137bbe5b5d0d68884ba52c1b815f6bd52b21afa5001d2bed3f2cca65fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.676385 kubelet[2721]: E0904 15:42:12.676346 2721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca808a137bbe5b5d0d68884ba52c1b815f6bd52b21afa5001d2bed3f2cca65fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.676527 kubelet[2721]: E0904 15:42:12.676507 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca808a137bbe5b5d0d68884ba52c1b815f6bd52b21afa5001d2bed3f2cca65fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-7qmzd" Sep 4 15:42:12.676605 kubelet[2721]: E0904 15:42:12.676529 2721 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca808a137bbe5b5d0d68884ba52c1b815f6bd52b21afa5001d2bed3f2cca65fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-7qmzd" Sep 4 15:42:12.676605 kubelet[2721]: E0904 15:42:12.676577 2721 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-7qmzd_calico-system(61351a35-3590-4d07-b44e-2b184f6950c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-7qmzd_calico-system(61351a35-3590-4d07-b44e-2b184f6950c6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ca808a137bbe5b5d0d68884ba52c1b815f6bd52b21afa5001d2bed3f2cca65fb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-7qmzd" podUID="61351a35-3590-4d07-b44e-2b184f6950c6" Sep 4 15:42:12.676880 containerd[1599]: time="2025-09-04T15:42:12.676822530Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58786fb5c7-glf82,Uid:96b23ce9-24eb-409d-ab7e-65c59ba9d2a2,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a713f49174c7f81cfe318662294f9e2f5670272e2c6dd863aa552d00021b1e0d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.676981 kubelet[2721]: E0904 15:42:12.676962 2721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a713f49174c7f81cfe318662294f9e2f5670272e2c6dd863aa552d00021b1e0d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:12.677042 kubelet[2721]: E0904 15:42:12.677018 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"a713f49174c7f81cfe318662294f9e2f5670272e2c6dd863aa552d00021b1e0d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-58786fb5c7-glf82" Sep 4 15:42:12.677075 kubelet[2721]: E0904 15:42:12.677040 2721 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a713f49174c7f81cfe318662294f9e2f5670272e2c6dd863aa552d00021b1e0d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-58786fb5c7-glf82" Sep 4 15:42:12.677118 kubelet[2721]: E0904 15:42:12.677092 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-58786fb5c7-glf82_calico-system(96b23ce9-24eb-409d-ab7e-65c59ba9d2a2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-58786fb5c7-glf82_calico-system(96b23ce9-24eb-409d-ab7e-65c59ba9d2a2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a713f49174c7f81cfe318662294f9e2f5670272e2c6dd863aa552d00021b1e0d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-58786fb5c7-glf82" podUID="96b23ce9-24eb-409d-ab7e-65c59ba9d2a2" Sep 4 15:42:13.185850 systemd[1]: Created slice kubepods-besteffort-pod1972dac2_e9c7_4fa0_a718_6b8d76c15f8c.slice - libcontainer container kubepods-besteffort-pod1972dac2_e9c7_4fa0_a718_6b8d76c15f8c.slice. 
Sep 4 15:42:13.188363 containerd[1599]: time="2025-09-04T15:42:13.188326178Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-g5qkw,Uid:1972dac2-e9c7-4fa0-a718-6b8d76c15f8c,Namespace:calico-system,Attempt:0,}" Sep 4 15:42:13.239872 containerd[1599]: time="2025-09-04T15:42:13.239805987Z" level=error msg="Failed to destroy network for sandbox \"3ba568dd5ddb32c48e28ae88f8f9414ef2cffb2b26dd70a2bb33232cc91725f0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:13.242095 systemd[1]: run-netns-cni\x2d4dbb07b8\x2de0c5\x2d9591\x2db990\x2df9518599b9ae.mount: Deactivated successfully. Sep 4 15:42:13.265336 containerd[1599]: time="2025-09-04T15:42:13.265284750Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-g5qkw,Uid:1972dac2-e9c7-4fa0-a718-6b8d76c15f8c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ba568dd5ddb32c48e28ae88f8f9414ef2cffb2b26dd70a2bb33232cc91725f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:13.265599 kubelet[2721]: E0904 15:42:13.265550 2721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ba568dd5ddb32c48e28ae88f8f9414ef2cffb2b26dd70a2bb33232cc91725f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:13.265873 kubelet[2721]: E0904 15:42:13.265623 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"3ba568dd5ddb32c48e28ae88f8f9414ef2cffb2b26dd70a2bb33232cc91725f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-g5qkw" Sep 4 15:42:13.265873 kubelet[2721]: E0904 15:42:13.265644 2721 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ba568dd5ddb32c48e28ae88f8f9414ef2cffb2b26dd70a2bb33232cc91725f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-g5qkw" Sep 4 15:42:13.265873 kubelet[2721]: E0904 15:42:13.265694 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-g5qkw_calico-system(1972dac2-e9c7-4fa0-a718-6b8d76c15f8c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-g5qkw_calico-system(1972dac2-e9c7-4fa0-a718-6b8d76c15f8c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3ba568dd5ddb32c48e28ae88f8f9414ef2cffb2b26dd70a2bb33232cc91725f0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-g5qkw" podUID="1972dac2-e9c7-4fa0-a718-6b8d76c15f8c" Sep 4 15:42:14.174169 kubelet[2721]: I0904 15:42:14.174106 2721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 4 15:42:14.174587 kubelet[2721]: E0904 15:42:14.174556 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:14.275520 kubelet[2721]: E0904 
15:42:14.275473 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:15.915268 systemd[1]: Started sshd@7-10.0.0.9:22-10.0.0.1:49512.service - OpenSSH per-connection server daemon (10.0.0.1:49512). Sep 4 15:42:15.977733 sshd[3847]: Accepted publickey for core from 10.0.0.1 port 49512 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:42:15.979356 sshd-session[3847]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:42:15.984134 systemd-logind[1569]: New session 8 of user core. Sep 4 15:42:16.002345 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 4 15:42:16.122820 sshd[3850]: Connection closed by 10.0.0.1 port 49512 Sep 4 15:42:16.123156 sshd-session[3847]: pam_unix(sshd:session): session closed for user core Sep 4 15:42:16.128243 systemd[1]: sshd@7-10.0.0.9:22-10.0.0.1:49512.service: Deactivated successfully. Sep 4 15:42:16.130332 systemd[1]: session-8.scope: Deactivated successfully. Sep 4 15:42:16.131141 systemd-logind[1569]: Session 8 logged out. Waiting for processes to exit. Sep 4 15:42:16.132411 systemd-logind[1569]: Removed session 8. Sep 4 15:42:21.135486 systemd[1]: Started sshd@8-10.0.0.9:22-10.0.0.1:44934.service - OpenSSH per-connection server daemon (10.0.0.1:44934). Sep 4 15:42:21.195242 sshd[3866]: Accepted publickey for core from 10.0.0.1 port 44934 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:42:21.196903 sshd-session[3866]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:42:21.201964 systemd-logind[1569]: New session 9 of user core. Sep 4 15:42:21.207430 systemd[1]: Started session-9.scope - Session 9 of User core. 
Sep 4 15:42:21.327764 sshd[3869]: Connection closed by 10.0.0.1 port 44934 Sep 4 15:42:21.328139 sshd-session[3866]: pam_unix(sshd:session): session closed for user core Sep 4 15:42:21.332510 systemd[1]: sshd@8-10.0.0.9:22-10.0.0.1:44934.service: Deactivated successfully. Sep 4 15:42:21.334672 systemd[1]: session-9.scope: Deactivated successfully. Sep 4 15:42:21.335571 systemd-logind[1569]: Session 9 logged out. Waiting for processes to exit. Sep 4 15:42:21.336727 systemd-logind[1569]: Removed session 9. Sep 4 15:42:25.185634 kubelet[2721]: E0904 15:42:25.185593 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:25.186193 containerd[1599]: time="2025-09-04T15:42:25.185758176Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd6bff945-8bhdp,Uid:40ed457a-a70c-44bd-9d09-2cbd7ed7fa67,Namespace:calico-system,Attempt:0,}" Sep 4 15:42:25.187818 containerd[1599]: time="2025-09-04T15:42:25.187791825Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6cbdm,Uid:fe764908-ae86-49c6-b09b-c3ac12014f67,Namespace:kube-system,Attempt:0,}" Sep 4 15:42:25.588937 containerd[1599]: time="2025-09-04T15:42:25.588866214Z" level=error msg="Failed to destroy network for sandbox \"d25c9cdbfcc5af220e3b3d190c696e80195dbde24fcd1d9676e96ecab9c6a768\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:25.591414 systemd[1]: run-netns-cni\x2d06e3d6e4\x2d117c\x2d9acc\x2de511\x2d6e394a0ffe4f.mount: Deactivated successfully. 
Sep 4 15:42:25.626764 containerd[1599]: time="2025-09-04T15:42:25.626708571Z" level=error msg="Failed to destroy network for sandbox \"c37ec47be93b65a06156ca64fd3c3460f01b2310ec4a0fb3f5aa8c942b890a9b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:25.629348 systemd[1]: run-netns-cni\x2d01bdb4a8\x2dfbc6\x2d5295\x2d1a39\x2d3c39796b2313.mount: Deactivated successfully. Sep 4 15:42:25.746240 containerd[1599]: time="2025-09-04T15:42:25.746114284Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd6bff945-8bhdp,Uid:40ed457a-a70c-44bd-9d09-2cbd7ed7fa67,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d25c9cdbfcc5af220e3b3d190c696e80195dbde24fcd1d9676e96ecab9c6a768\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:25.746595 kubelet[2721]: E0904 15:42:25.746537 2721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d25c9cdbfcc5af220e3b3d190c696e80195dbde24fcd1d9676e96ecab9c6a768\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:25.747109 kubelet[2721]: E0904 15:42:25.746611 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d25c9cdbfcc5af220e3b3d190c696e80195dbde24fcd1d9676e96ecab9c6a768\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/calico-kube-controllers-cd6bff945-8bhdp" Sep 4 15:42:25.747109 kubelet[2721]: E0904 15:42:25.746660 2721 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d25c9cdbfcc5af220e3b3d190c696e80195dbde24fcd1d9676e96ecab9c6a768\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cd6bff945-8bhdp" Sep 4 15:42:25.747109 kubelet[2721]: E0904 15:42:25.746721 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-cd6bff945-8bhdp_calico-system(40ed457a-a70c-44bd-9d09-2cbd7ed7fa67)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-cd6bff945-8bhdp_calico-system(40ed457a-a70c-44bd-9d09-2cbd7ed7fa67)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d25c9cdbfcc5af220e3b3d190c696e80195dbde24fcd1d9676e96ecab9c6a768\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-cd6bff945-8bhdp" podUID="40ed457a-a70c-44bd-9d09-2cbd7ed7fa67" Sep 4 15:42:25.752082 containerd[1599]: time="2025-09-04T15:42:25.750439135Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6cbdm,Uid:fe764908-ae86-49c6-b09b-c3ac12014f67,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c37ec47be93b65a06156ca64fd3c3460f01b2310ec4a0fb3f5aa8c942b890a9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 4 15:42:25.752347 kubelet[2721]: E0904 15:42:25.750700 2721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c37ec47be93b65a06156ca64fd3c3460f01b2310ec4a0fb3f5aa8c942b890a9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:25.752347 kubelet[2721]: E0904 15:42:25.750772 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c37ec47be93b65a06156ca64fd3c3460f01b2310ec4a0fb3f5aa8c942b890a9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6cbdm" Sep 4 15:42:25.752347 kubelet[2721]: E0904 15:42:25.750798 2721 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c37ec47be93b65a06156ca64fd3c3460f01b2310ec4a0fb3f5aa8c942b890a9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6cbdm" Sep 4 15:42:25.752472 kubelet[2721]: E0904 15:42:25.750847 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-6cbdm_kube-system(fe764908-ae86-49c6-b09b-c3ac12014f67)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-6cbdm_kube-system(fe764908-ae86-49c6-b09b-c3ac12014f67)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c37ec47be93b65a06156ca64fd3c3460f01b2310ec4a0fb3f5aa8c942b890a9b\\\": plugin type=\\\"calico\\\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-6cbdm" podUID="fe764908-ae86-49c6-b09b-c3ac12014f67" Sep 4 15:42:26.038486 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount901957984.mount: Deactivated successfully. Sep 4 15:42:26.071589 containerd[1599]: time="2025-09-04T15:42:26.071505428Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:26.100532 containerd[1599]: time="2025-09-04T15:42:26.072292456Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 4 15:42:26.100532 containerd[1599]: time="2025-09-04T15:42:26.073358087Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:26.100770 containerd[1599]: time="2025-09-04T15:42:26.075616758Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 13.801968658s" Sep 4 15:42:26.100770 containerd[1599]: time="2025-09-04T15:42:26.100630543Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 4 15:42:26.101227 containerd[1599]: time="2025-09-04T15:42:26.101167420Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 
15:42:26.109885 containerd[1599]: time="2025-09-04T15:42:26.109855773Z" level=info msg="CreateContainer within sandbox \"f4e313270e4924c8a934db014b82125d88cd0ffad9d85568070cdccf8f511fe3\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 4 15:42:26.119839 containerd[1599]: time="2025-09-04T15:42:26.119814381Z" level=info msg="Container ee0644b2c8fcfbc24193a3021d168df3de24e21dbb9f5d14ae2b4c2ff5ebe858: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:42:26.164844 containerd[1599]: time="2025-09-04T15:42:26.164804142Z" level=info msg="CreateContainer within sandbox \"f4e313270e4924c8a934db014b82125d88cd0ffad9d85568070cdccf8f511fe3\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ee0644b2c8fcfbc24193a3021d168df3de24e21dbb9f5d14ae2b4c2ff5ebe858\"" Sep 4 15:42:26.165557 containerd[1599]: time="2025-09-04T15:42:26.165499979Z" level=info msg="StartContainer for \"ee0644b2c8fcfbc24193a3021d168df3de24e21dbb9f5d14ae2b4c2ff5ebe858\"" Sep 4 15:42:26.167519 containerd[1599]: time="2025-09-04T15:42:26.167485566Z" level=info msg="connecting to shim ee0644b2c8fcfbc24193a3021d168df3de24e21dbb9f5d14ae2b4c2ff5ebe858" address="unix:///run/containerd/s/29b1b5b42ae4b54fbc20c224b75fd151facb48442b2abf6a48e37ff68b80f040" protocol=ttrpc version=3 Sep 4 15:42:26.180635 kubelet[2721]: E0904 15:42:26.180289 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:26.182171 containerd[1599]: time="2025-09-04T15:42:26.182138879Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-7qmzd,Uid:61351a35-3590-4d07-b44e-2b184f6950c6,Namespace:calico-system,Attempt:0,}" Sep 4 15:42:26.182791 containerd[1599]: time="2025-09-04T15:42:26.182569758Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-586bb8599c-jft4n,Uid:6ef84674-6ef9-43c1-a60c-d61b98c3020c,Namespace:calico-apiserver,Attempt:0,}" Sep 4 15:42:26.182871 containerd[1599]: time="2025-09-04T15:42:26.182840817Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-g5qkw,Uid:1972dac2-e9c7-4fa0-a718-6b8d76c15f8c,Namespace:calico-system,Attempt:0,}" Sep 4 15:42:26.183052 containerd[1599]: time="2025-09-04T15:42:26.182624451Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dzhx5,Uid:1346250c-c62f-497e-b346-045a6d8429cd,Namespace:kube-system,Attempt:0,}" Sep 4 15:42:26.183144 containerd[1599]: time="2025-09-04T15:42:26.182717435Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58786fb5c7-glf82,Uid:96b23ce9-24eb-409d-ab7e-65c59ba9d2a2,Namespace:calico-system,Attempt:0,}" Sep 4 15:42:26.191385 systemd[1]: Started cri-containerd-ee0644b2c8fcfbc24193a3021d168df3de24e21dbb9f5d14ae2b4c2ff5ebe858.scope - libcontainer container ee0644b2c8fcfbc24193a3021d168df3de24e21dbb9f5d14ae2b4c2ff5ebe858. 
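The image pull a few entries above reports both a byte count and a wall-clock duration (157078201 bytes in 13.801968658s for calico/node:v3.30.3). As a quick sanity check, the effective transfer rate can be derived directly from those two figures; a minimal sketch, using the numbers exactly as they appear in the log:

```python
# Effective throughput of the calico/node image pull reported above.
# Both figures are taken verbatim from the containerd log entry.
size_bytes = 157_078_201      # "size \"157078201\""
duration_s = 13.801968658     # "in 13.801968658s"

mib_per_s = size_bytes / duration_s / (1024 * 1024)
print(f"effective pull rate: {mib_per_s:.1f} MiB/s")  # roughly 10.9 MiB/s
```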
Sep 4 15:42:26.318205 containerd[1599]: time="2025-09-04T15:42:26.318040242Z" level=error msg="Failed to destroy network for sandbox \"38a3399a0adf9b2ed939ceb67e5b9959fd36db9c4cee7a5743d526486456b816\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:26.320265 containerd[1599]: time="2025-09-04T15:42:26.320191101Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dzhx5,Uid:1346250c-c62f-497e-b346-045a6d8429cd,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"38a3399a0adf9b2ed939ceb67e5b9959fd36db9c4cee7a5743d526486456b816\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:26.333202 kubelet[2721]: E0904 15:42:26.332961 2721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38a3399a0adf9b2ed939ceb67e5b9959fd36db9c4cee7a5743d526486456b816\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:26.334383 kubelet[2721]: E0904 15:42:26.333814 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38a3399a0adf9b2ed939ceb67e5b9959fd36db9c4cee7a5743d526486456b816\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dzhx5" Sep 4 15:42:26.334383 kubelet[2721]: E0904 15:42:26.334080 2721 kuberuntime_manager.go:1237] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38a3399a0adf9b2ed939ceb67e5b9959fd36db9c4cee7a5743d526486456b816\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dzhx5" Sep 4 15:42:26.334540 kubelet[2721]: E0904 15:42:26.334511 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-dzhx5_kube-system(1346250c-c62f-497e-b346-045a6d8429cd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-dzhx5_kube-system(1346250c-c62f-497e-b346-045a6d8429cd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"38a3399a0adf9b2ed939ceb67e5b9959fd36db9c4cee7a5743d526486456b816\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-dzhx5" podUID="1346250c-c62f-497e-b346-045a6d8429cd" Sep 4 15:42:26.344566 systemd[1]: Started sshd@9-10.0.0.9:22-10.0.0.1:44938.service - OpenSSH per-connection server daemon (10.0.0.1:44938). 
Sep 4 15:42:26.359056 containerd[1599]: time="2025-09-04T15:42:26.359007210Z" level=info msg="StartContainer for \"ee0644b2c8fcfbc24193a3021d168df3de24e21dbb9f5d14ae2b4c2ff5ebe858\" returns successfully" Sep 4 15:42:26.377489 containerd[1599]: time="2025-09-04T15:42:26.377405414Z" level=error msg="Failed to destroy network for sandbox \"65784c43f4e579a4130c72e4a8d4f387245578e61afbf2999cee229ded97d263\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:26.380657 containerd[1599]: time="2025-09-04T15:42:26.380577069Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-7qmzd,Uid:61351a35-3590-4d07-b44e-2b184f6950c6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"65784c43f4e579a4130c72e4a8d4f387245578e61afbf2999cee229ded97d263\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:26.381067 kubelet[2721]: E0904 15:42:26.380922 2721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"65784c43f4e579a4130c72e4a8d4f387245578e61afbf2999cee229ded97d263\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:26.381166 kubelet[2721]: E0904 15:42:26.381127 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"65784c43f4e579a4130c72e4a8d4f387245578e61afbf2999cee229ded97d263\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-7qmzd" Sep 4 15:42:26.381166 kubelet[2721]: E0904 15:42:26.381157 2721 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"65784c43f4e579a4130c72e4a8d4f387245578e61afbf2999cee229ded97d263\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-7qmzd" Sep 4 15:42:26.382281 kubelet[2721]: E0904 15:42:26.382236 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-7qmzd_calico-system(61351a35-3590-4d07-b44e-2b184f6950c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-7qmzd_calico-system(61351a35-3590-4d07-b44e-2b184f6950c6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"65784c43f4e579a4130c72e4a8d4f387245578e61afbf2999cee229ded97d263\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-7qmzd" podUID="61351a35-3590-4d07-b44e-2b184f6950c6" Sep 4 15:42:26.382756 containerd[1599]: time="2025-09-04T15:42:26.382720673Z" level=error msg="Failed to destroy network for sandbox \"8155c4ddeaeec3192beb0b76bc4893ddfa802e4636163c08b5946048caf52b40\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:26.384687 containerd[1599]: time="2025-09-04T15:42:26.384651158Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-58786fb5c7-glf82,Uid:96b23ce9-24eb-409d-ab7e-65c59ba9d2a2,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8155c4ddeaeec3192beb0b76bc4893ddfa802e4636163c08b5946048caf52b40\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:26.385528 containerd[1599]: time="2025-09-04T15:42:26.385483921Z" level=error msg="Failed to destroy network for sandbox \"8fa28f2b396ea60f1f3c261807294131e1e2d06b8ac7994857f481ea961dc647\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:26.385714 kubelet[2721]: E0904 15:42:26.385679 2721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8155c4ddeaeec3192beb0b76bc4893ddfa802e4636163c08b5946048caf52b40\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:26.385773 kubelet[2721]: E0904 15:42:26.385721 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8155c4ddeaeec3192beb0b76bc4893ddfa802e4636163c08b5946048caf52b40\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-58786fb5c7-glf82" Sep 4 15:42:26.385773 kubelet[2721]: E0904 15:42:26.385741 2721 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"8155c4ddeaeec3192beb0b76bc4893ddfa802e4636163c08b5946048caf52b40\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-58786fb5c7-glf82" Sep 4 15:42:26.385817 kubelet[2721]: E0904 15:42:26.385767 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-58786fb5c7-glf82_calico-system(96b23ce9-24eb-409d-ab7e-65c59ba9d2a2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-58786fb5c7-glf82_calico-system(96b23ce9-24eb-409d-ab7e-65c59ba9d2a2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8155c4ddeaeec3192beb0b76bc4893ddfa802e4636163c08b5946048caf52b40\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-58786fb5c7-glf82" podUID="96b23ce9-24eb-409d-ab7e-65c59ba9d2a2" Sep 4 15:42:26.387326 containerd[1599]: time="2025-09-04T15:42:26.387289882Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-586bb8599c-jft4n,Uid:6ef84674-6ef9-43c1-a60c-d61b98c3020c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fa28f2b396ea60f1f3c261807294131e1e2d06b8ac7994857f481ea961dc647\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:26.387656 kubelet[2721]: E0904 15:42:26.387615 2721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fa28f2b396ea60f1f3c261807294131e1e2d06b8ac7994857f481ea961dc647\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:26.387803 kubelet[2721]: E0904 15:42:26.387776 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fa28f2b396ea60f1f3c261807294131e1e2d06b8ac7994857f481ea961dc647\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-586bb8599c-jft4n" Sep 4 15:42:26.387897 kubelet[2721]: E0904 15:42:26.387878 2721 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fa28f2b396ea60f1f3c261807294131e1e2d06b8ac7994857f481ea961dc647\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-586bb8599c-jft4n" Sep 4 15:42:26.388029 kubelet[2721]: E0904 15:42:26.388002 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-586bb8599c-jft4n_calico-apiserver(6ef84674-6ef9-43c1-a60c-d61b98c3020c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-586bb8599c-jft4n_calico-apiserver(6ef84674-6ef9-43c1-a60c-d61b98c3020c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8fa28f2b396ea60f1f3c261807294131e1e2d06b8ac7994857f481ea961dc647\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-586bb8599c-jft4n" podUID="6ef84674-6ef9-43c1-a60c-d61b98c3020c" Sep 4 
15:42:26.417658 containerd[1599]: time="2025-09-04T15:42:26.417586026Z" level=error msg="Failed to destroy network for sandbox \"b7cff0717e7f77df70450cfda39285ae1d6e36a854fbd3de3448f8635ba04766\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:26.423702 sshd[4137]: Accepted publickey for core from 10.0.0.1 port 44938 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:42:26.427341 sshd-session[4137]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:42:26.439760 systemd-logind[1569]: New session 10 of user core. Sep 4 15:42:26.449244 containerd[1599]: time="2025-09-04T15:42:26.447036029Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-g5qkw,Uid:1972dac2-e9c7-4fa0-a718-6b8d76c15f8c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7cff0717e7f77df70450cfda39285ae1d6e36a854fbd3de3448f8635ba04766\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:26.449442 kubelet[2721]: E0904 15:42:26.447506 2721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7cff0717e7f77df70450cfda39285ae1d6e36a854fbd3de3448f8635ba04766\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 15:42:26.449442 kubelet[2721]: E0904 15:42:26.447590 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"b7cff0717e7f77df70450cfda39285ae1d6e36a854fbd3de3448f8635ba04766\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-g5qkw" Sep 4 15:42:26.449442 kubelet[2721]: E0904 15:42:26.447612 2721 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7cff0717e7f77df70450cfda39285ae1d6e36a854fbd3de3448f8635ba04766\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-g5qkw" Sep 4 15:42:26.447369 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 4 15:42:26.449691 kubelet[2721]: E0904 15:42:26.447725 2721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-g5qkw_calico-system(1972dac2-e9c7-4fa0-a718-6b8d76c15f8c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-g5qkw_calico-system(1972dac2-e9c7-4fa0-a718-6b8d76c15f8c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b7cff0717e7f77df70450cfda39285ae1d6e36a854fbd3de3448f8635ba04766\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-g5qkw" podUID="1972dac2-e9c7-4fa0-a718-6b8d76c15f8c" Sep 4 15:42:26.452872 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 4 15:42:26.453862 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 4 15:42:26.610038 sshd[4167]: Connection closed by 10.0.0.1 port 44938 Sep 4 15:42:26.611313 sshd-session[4137]: pam_unix(sshd:session): session closed for user core Sep 4 15:42:26.617652 systemd[1]: sshd@9-10.0.0.9:22-10.0.0.1:44938.service: Deactivated successfully. Sep 4 15:42:26.620587 systemd[1]: session-10.scope: Deactivated successfully. Sep 4 15:42:26.622990 systemd-logind[1569]: Session 10 logged out. Waiting for processes to exit. Sep 4 15:42:26.624374 systemd-logind[1569]: Removed session 10. Sep 4 15:42:27.182790 containerd[1599]: time="2025-09-04T15:42:27.182740109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-586bb8599c-9bzlz,Uid:68845f40-ebfe-4bf1-9325-43434676a6f4,Namespace:calico-apiserver,Attempt:0,}" Sep 4 15:42:27.385458 kubelet[2721]: I0904 15:42:27.385316 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-5sg7z" podStartSLOduration=2.075118057 podStartE2EDuration="34.38529228s" podCreationTimestamp="2025-09-04 15:41:53 +0000 UTC" firstStartedPulling="2025-09-04 15:41:53.791701638 +0000 UTC m=+16.702174027" lastFinishedPulling="2025-09-04 15:42:26.101875851 +0000 UTC m=+49.012348250" observedRunningTime="2025-09-04 15:42:27.382266749 +0000 UTC m=+50.292739158" watchObservedRunningTime="2025-09-04 15:42:27.38529228 +0000 UTC m=+50.295764679" Sep 4 15:42:27.423927 systemd-networkd[1489]: cali6f9d1882227: Link UP Sep 4 15:42:27.424226 systemd-networkd[1489]: cali6f9d1882227: Gained carrier Sep 4 15:42:27.424671 kubelet[2721]: I0904 15:42:27.424487 2721 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96b23ce9-24eb-409d-ab7e-65c59ba9d2a2-whisker-ca-bundle\") pod \"96b23ce9-24eb-409d-ab7e-65c59ba9d2a2\" (UID: \"96b23ce9-24eb-409d-ab7e-65c59ba9d2a2\") " Sep 4 15:42:27.424671 kubelet[2721]: I0904 15:42:27.424530 2721 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/96b23ce9-24eb-409d-ab7e-65c59ba9d2a2-whisker-backend-key-pair\") pod \"96b23ce9-24eb-409d-ab7e-65c59ba9d2a2\" (UID: \"96b23ce9-24eb-409d-ab7e-65c59ba9d2a2\") " Sep 4 15:42:27.424671 kubelet[2721]: I0904 15:42:27.424559 2721 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x5c7\" (UniqueName: \"kubernetes.io/projected/96b23ce9-24eb-409d-ab7e-65c59ba9d2a2-kube-api-access-4x5c7\") pod \"96b23ce9-24eb-409d-ab7e-65c59ba9d2a2\" (UID: \"96b23ce9-24eb-409d-ab7e-65c59ba9d2a2\") " Sep 4 15:42:27.427290 kubelet[2721]: I0904 15:42:27.425066 2721 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96b23ce9-24eb-409d-ab7e-65c59ba9d2a2-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "96b23ce9-24eb-409d-ab7e-65c59ba9d2a2" (UID: "96b23ce9-24eb-409d-ab7e-65c59ba9d2a2"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 4 15:42:27.434670 systemd[1]: var-lib-kubelet-pods-96b23ce9\x2d24eb\x2d409d\x2dab7e\x2d65c59ba9d2a2-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 4 15:42:27.438057 kubelet[2721]: I0904 15:42:27.437962 2721 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b23ce9-24eb-409d-ab7e-65c59ba9d2a2-kube-api-access-4x5c7" (OuterVolumeSpecName: "kube-api-access-4x5c7") pod "96b23ce9-24eb-409d-ab7e-65c59ba9d2a2" (UID: "96b23ce9-24eb-409d-ab7e-65c59ba9d2a2"). InnerVolumeSpecName "kube-api-access-4x5c7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 4 15:42:27.439548 kubelet[2721]: I0904 15:42:27.439290 2721 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b23ce9-24eb-409d-ab7e-65c59ba9d2a2-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "96b23ce9-24eb-409d-ab7e-65c59ba9d2a2" (UID: "96b23ce9-24eb-409d-ab7e-65c59ba9d2a2"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 4 15:42:27.440741 systemd[1]: var-lib-kubelet-pods-96b23ce9\x2d24eb\x2d409d\x2dab7e\x2d65c59ba9d2a2-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d4x5c7.mount: Deactivated successfully. Sep 4 15:42:27.445453 containerd[1599]: 2025-09-04 15:42:27.208 [INFO][4201] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 4 15:42:27.445453 containerd[1599]: 2025-09-04 15:42:27.231 [INFO][4201] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--586bb8599c--9bzlz-eth0 calico-apiserver-586bb8599c- calico-apiserver 68845f40-ebfe-4bf1-9325-43434676a6f4 878 0 2025-09-04 15:41:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:586bb8599c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-586bb8599c-9bzlz eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali6f9d1882227 [] [] }} ContainerID="96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-9bzlz" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--9bzlz-" Sep 4 15:42:27.445453 containerd[1599]: 2025-09-04 15:42:27.231 [INFO][4201] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-9bzlz" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--9bzlz-eth0" Sep 4 15:42:27.445453 containerd[1599]: 2025-09-04 15:42:27.363 [INFO][4216] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" HandleID="k8s-pod-network.96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" Workload="localhost-k8s-calico--apiserver--586bb8599c--9bzlz-eth0" Sep 4 15:42:27.445887 containerd[1599]: 2025-09-04 15:42:27.363 [INFO][4216] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" HandleID="k8s-pod-network.96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" Workload="localhost-k8s-calico--apiserver--586bb8599c--9bzlz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002872d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-586bb8599c-9bzlz", "timestamp":"2025-09-04 15:42:27.363494096 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:42:27.445887 containerd[1599]: 2025-09-04 15:42:27.364 [INFO][4216] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:42:27.445887 containerd[1599]: 2025-09-04 15:42:27.364 [INFO][4216] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 15:42:27.445887 containerd[1599]: 2025-09-04 15:42:27.364 [INFO][4216] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:42:27.445887 containerd[1599]: 2025-09-04 15:42:27.373 [INFO][4216] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" host="localhost" Sep 4 15:42:27.445887 containerd[1599]: 2025-09-04 15:42:27.378 [INFO][4216] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:42:27.445887 containerd[1599]: 2025-09-04 15:42:27.387 [INFO][4216] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:42:27.445887 containerd[1599]: 2025-09-04 15:42:27.391 [INFO][4216] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:27.445887 containerd[1599]: 2025-09-04 15:42:27.394 [INFO][4216] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:27.445887 containerd[1599]: 2025-09-04 15:42:27.394 [INFO][4216] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" host="localhost" Sep 4 15:42:27.446112 containerd[1599]: 2025-09-04 15:42:27.396 [INFO][4216] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55 Sep 4 15:42:27.446112 containerd[1599]: 2025-09-04 15:42:27.401 [INFO][4216] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" host="localhost" Sep 4 15:42:27.446112 containerd[1599]: 2025-09-04 15:42:27.409 [INFO][4216] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" host="localhost" Sep 4 15:42:27.446112 containerd[1599]: 2025-09-04 15:42:27.409 [INFO][4216] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" host="localhost" Sep 4 15:42:27.446112 containerd[1599]: 2025-09-04 15:42:27.409 [INFO][4216] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 15:42:27.446112 containerd[1599]: 2025-09-04 15:42:27.409 [INFO][4216] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" HandleID="k8s-pod-network.96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" Workload="localhost-k8s-calico--apiserver--586bb8599c--9bzlz-eth0" Sep 4 15:42:27.446275 containerd[1599]: 2025-09-04 15:42:27.413 [INFO][4201] cni-plugin/k8s.go 418: Populated endpoint ContainerID="96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-9bzlz" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--9bzlz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--586bb8599c--9bzlz-eth0", GenerateName:"calico-apiserver-586bb8599c-", Namespace:"calico-apiserver", SelfLink:"", UID:"68845f40-ebfe-4bf1-9325-43434676a6f4", ResourceVersion:"878", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 41, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"586bb8599c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-586bb8599c-9bzlz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6f9d1882227", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:27.446330 containerd[1599]: 2025-09-04 15:42:27.413 [INFO][4201] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-9bzlz" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--9bzlz-eth0" Sep 4 15:42:27.446330 containerd[1599]: 2025-09-04 15:42:27.413 [INFO][4201] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6f9d1882227 ContainerID="96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-9bzlz" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--9bzlz-eth0" Sep 4 15:42:27.446330 containerd[1599]: 2025-09-04 15:42:27.423 [INFO][4201] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-9bzlz" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--9bzlz-eth0" Sep 4 15:42:27.446457 containerd[1599]: 2025-09-04 15:42:27.423 [INFO][4201] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-9bzlz" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--9bzlz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--586bb8599c--9bzlz-eth0", GenerateName:"calico-apiserver-586bb8599c-", Namespace:"calico-apiserver", SelfLink:"", UID:"68845f40-ebfe-4bf1-9325-43434676a6f4", ResourceVersion:"878", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 41, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"586bb8599c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55", Pod:"calico-apiserver-586bb8599c-9bzlz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6f9d1882227", MAC:"7e:52:b3:fe:b1:be", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:27.446508 containerd[1599]: 2025-09-04 15:42:27.442 [INFO][4201] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-9bzlz" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--9bzlz-eth0" Sep 4 15:42:27.525983 kubelet[2721]: I0904 15:42:27.525922 2721 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4x5c7\" (UniqueName: \"kubernetes.io/projected/96b23ce9-24eb-409d-ab7e-65c59ba9d2a2-kube-api-access-4x5c7\") on node \"localhost\" DevicePath \"\"" Sep 4 15:42:27.525983 kubelet[2721]: I0904 15:42:27.525963 2721 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96b23ce9-24eb-409d-ab7e-65c59ba9d2a2-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 4 15:42:27.525983 kubelet[2721]: I0904 15:42:27.525973 2721 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/96b23ce9-24eb-409d-ab7e-65c59ba9d2a2-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 4 15:42:27.529716 containerd[1599]: time="2025-09-04T15:42:27.529672499Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee0644b2c8fcfbc24193a3021d168df3de24e21dbb9f5d14ae2b4c2ff5ebe858\" id:\"1929f3542c37f2a03b577a7627f4cf3251a6a26fcc7583dd45460aef78341224\" pid:4241 exit_status:1 exited_at:{seconds:1757000547 nanos:529246399}" Sep 4 15:42:27.615281 containerd[1599]: time="2025-09-04T15:42:27.615165623Z" level=info msg="connecting to shim 96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55" address="unix:///run/containerd/s/e9c2b8951f4dd209d602e6edea6729a63c68c25c6553a57649391a65ff4283b4" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:42:27.644415 systemd[1]: Started cri-containerd-96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55.scope - libcontainer container 96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55. 
Sep 4 15:42:27.657510 systemd-resolved[1404]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:42:27.688363 containerd[1599]: time="2025-09-04T15:42:27.688184850Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-586bb8599c-9bzlz,Uid:68845f40-ebfe-4bf1-9325-43434676a6f4,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55\"" Sep 4 15:42:27.690038 containerd[1599]: time="2025-09-04T15:42:27.690009465Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 4 15:42:28.332045 systemd-networkd[1489]: vxlan.calico: Link UP Sep 4 15:42:28.332058 systemd-networkd[1489]: vxlan.calico: Gained carrier Sep 4 15:42:28.373601 systemd[1]: Removed slice kubepods-besteffort-pod96b23ce9_24eb_409d_ab7e_65c59ba9d2a2.slice - libcontainer container kubepods-besteffort-pod96b23ce9_24eb_409d_ab7e_65c59ba9d2a2.slice. Sep 4 15:42:28.449073 systemd[1]: Created slice kubepods-besteffort-pod4ee13c31_b75a_46fc_9672_79641ff48fa8.slice - libcontainer container kubepods-besteffort-pod4ee13c31_b75a_46fc_9672_79641ff48fa8.slice. 
Sep 4 15:42:28.509471 containerd[1599]: time="2025-09-04T15:42:28.509398460Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee0644b2c8fcfbc24193a3021d168df3de24e21dbb9f5d14ae2b4c2ff5ebe858\" id:\"9eb9fbbb3970ffa2834c0f396fb745c5c1bc8f1631982aabcfebf5c977d68c14\" pid:4483 exit_status:1 exited_at:{seconds:1757000548 nanos:508439279}" Sep 4 15:42:28.532293 kubelet[2721]: I0904 15:42:28.532200 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4ee13c31-b75a-46fc-9672-79641ff48fa8-whisker-backend-key-pair\") pod \"whisker-655f54c8b4-9dm8z\" (UID: \"4ee13c31-b75a-46fc-9672-79641ff48fa8\") " pod="calico-system/whisker-655f54c8b4-9dm8z" Sep 4 15:42:28.532293 kubelet[2721]: I0904 15:42:28.532277 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ee13c31-b75a-46fc-9672-79641ff48fa8-whisker-ca-bundle\") pod \"whisker-655f54c8b4-9dm8z\" (UID: \"4ee13c31-b75a-46fc-9672-79641ff48fa8\") " pod="calico-system/whisker-655f54c8b4-9dm8z" Sep 4 15:42:28.532293 kubelet[2721]: I0904 15:42:28.532316 2721 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vq2k\" (UniqueName: \"kubernetes.io/projected/4ee13c31-b75a-46fc-9672-79641ff48fa8-kube-api-access-6vq2k\") pod \"whisker-655f54c8b4-9dm8z\" (UID: \"4ee13c31-b75a-46fc-9672-79641ff48fa8\") " pod="calico-system/whisker-655f54c8b4-9dm8z" Sep 4 15:42:28.756299 containerd[1599]: time="2025-09-04T15:42:28.756141097Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-655f54c8b4-9dm8z,Uid:4ee13c31-b75a-46fc-9672-79641ff48fa8,Namespace:calico-system,Attempt:0,}" Sep 4 15:42:28.848420 systemd-networkd[1489]: cali6f9d1882227: Gained IPv6LL Sep 4 15:42:28.866480 systemd-networkd[1489]: calicdbcc6f973c: Link UP Sep 4 
15:42:28.866966 systemd-networkd[1489]: calicdbcc6f973c: Gained carrier Sep 4 15:42:28.880398 containerd[1599]: 2025-09-04 15:42:28.802 [INFO][4532] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--655f54c8b4--9dm8z-eth0 whisker-655f54c8b4- calico-system 4ee13c31-b75a-46fc-9672-79641ff48fa8 1041 0 2025-09-04 15:42:28 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:655f54c8b4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-655f54c8b4-9dm8z eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calicdbcc6f973c [] [] }} ContainerID="f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" Namespace="calico-system" Pod="whisker-655f54c8b4-9dm8z" WorkloadEndpoint="localhost-k8s-whisker--655f54c8b4--9dm8z-" Sep 4 15:42:28.880398 containerd[1599]: 2025-09-04 15:42:28.803 [INFO][4532] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" Namespace="calico-system" Pod="whisker-655f54c8b4-9dm8z" WorkloadEndpoint="localhost-k8s-whisker--655f54c8b4--9dm8z-eth0" Sep 4 15:42:28.880398 containerd[1599]: 2025-09-04 15:42:28.830 [INFO][4548] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" HandleID="k8s-pod-network.f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" Workload="localhost-k8s-whisker--655f54c8b4--9dm8z-eth0" Sep 4 15:42:28.880613 containerd[1599]: 2025-09-04 15:42:28.830 [INFO][4548] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" HandleID="k8s-pod-network.f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" 
Workload="localhost-k8s-whisker--655f54c8b4--9dm8z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004ff40), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-655f54c8b4-9dm8z", "timestamp":"2025-09-04 15:42:28.83078939 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:42:28.880613 containerd[1599]: 2025-09-04 15:42:28.831 [INFO][4548] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:42:28.880613 containerd[1599]: 2025-09-04 15:42:28.831 [INFO][4548] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 15:42:28.880613 containerd[1599]: 2025-09-04 15:42:28.831 [INFO][4548] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:42:28.880613 containerd[1599]: 2025-09-04 15:42:28.838 [INFO][4548] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" host="localhost" Sep 4 15:42:28.880613 containerd[1599]: 2025-09-04 15:42:28.843 [INFO][4548] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:42:28.880613 containerd[1599]: 2025-09-04 15:42:28.846 [INFO][4548] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:42:28.880613 containerd[1599]: 2025-09-04 15:42:28.849 [INFO][4548] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:28.880613 containerd[1599]: 2025-09-04 15:42:28.851 [INFO][4548] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:28.880613 containerd[1599]: 2025-09-04 15:42:28.851 [INFO][4548] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 
handle="k8s-pod-network.f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" host="localhost" Sep 4 15:42:28.880836 containerd[1599]: 2025-09-04 15:42:28.853 [INFO][4548] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83 Sep 4 15:42:28.880836 containerd[1599]: 2025-09-04 15:42:28.856 [INFO][4548] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" host="localhost" Sep 4 15:42:28.880836 containerd[1599]: 2025-09-04 15:42:28.860 [INFO][4548] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" host="localhost" Sep 4 15:42:28.880836 containerd[1599]: 2025-09-04 15:42:28.860 [INFO][4548] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" host="localhost" Sep 4 15:42:28.880836 containerd[1599]: 2025-09-04 15:42:28.860 [INFO][4548] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 15:42:28.880836 containerd[1599]: 2025-09-04 15:42:28.860 [INFO][4548] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" HandleID="k8s-pod-network.f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" Workload="localhost-k8s-whisker--655f54c8b4--9dm8z-eth0" Sep 4 15:42:28.880959 containerd[1599]: 2025-09-04 15:42:28.864 [INFO][4532] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" Namespace="calico-system" Pod="whisker-655f54c8b4-9dm8z" WorkloadEndpoint="localhost-k8s-whisker--655f54c8b4--9dm8z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--655f54c8b4--9dm8z-eth0", GenerateName:"whisker-655f54c8b4-", Namespace:"calico-system", SelfLink:"", UID:"4ee13c31-b75a-46fc-9672-79641ff48fa8", ResourceVersion:"1041", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 42, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"655f54c8b4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-655f54c8b4-9dm8z", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calicdbcc6f973c", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:28.880959 containerd[1599]: 2025-09-04 15:42:28.864 [INFO][4532] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" Namespace="calico-system" Pod="whisker-655f54c8b4-9dm8z" WorkloadEndpoint="localhost-k8s-whisker--655f54c8b4--9dm8z-eth0" Sep 4 15:42:28.881035 containerd[1599]: 2025-09-04 15:42:28.864 [INFO][4532] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicdbcc6f973c ContainerID="f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" Namespace="calico-system" Pod="whisker-655f54c8b4-9dm8z" WorkloadEndpoint="localhost-k8s-whisker--655f54c8b4--9dm8z-eth0" Sep 4 15:42:28.881035 containerd[1599]: 2025-09-04 15:42:28.866 [INFO][4532] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" Namespace="calico-system" Pod="whisker-655f54c8b4-9dm8z" WorkloadEndpoint="localhost-k8s-whisker--655f54c8b4--9dm8z-eth0" Sep 4 15:42:28.881076 containerd[1599]: 2025-09-04 15:42:28.866 [INFO][4532] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" Namespace="calico-system" Pod="whisker-655f54c8b4-9dm8z" WorkloadEndpoint="localhost-k8s-whisker--655f54c8b4--9dm8z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--655f54c8b4--9dm8z-eth0", GenerateName:"whisker-655f54c8b4-", Namespace:"calico-system", SelfLink:"", UID:"4ee13c31-b75a-46fc-9672-79641ff48fa8", ResourceVersion:"1041", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 42, 28, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"655f54c8b4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83", Pod:"whisker-655f54c8b4-9dm8z", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calicdbcc6f973c", MAC:"32:f8:07:d8:eb:2e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:28.881142 containerd[1599]: 2025-09-04 15:42:28.876 [INFO][4532] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" Namespace="calico-system" Pod="whisker-655f54c8b4-9dm8z" WorkloadEndpoint="localhost-k8s-whisker--655f54c8b4--9dm8z-eth0" Sep 4 15:42:28.906138 containerd[1599]: time="2025-09-04T15:42:28.906083597Z" level=info msg="connecting to shim f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83" address="unix:///run/containerd/s/4063f9ce26a77cfa5a6825c641d8cf81c39de5b8fe807422de0bd19271da1c16" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:42:28.934389 systemd[1]: Started cri-containerd-f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83.scope - libcontainer container f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83. 
Sep 4 15:42:28.948238 systemd-resolved[1404]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:42:28.977162 containerd[1599]: time="2025-09-04T15:42:28.977116453Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-655f54c8b4-9dm8z,Uid:4ee13c31-b75a-46fc-9672-79641ff48fa8,Namespace:calico-system,Attempt:0,} returns sandbox id \"f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83\"" Sep 4 15:42:29.182404 kubelet[2721]: I0904 15:42:29.182344 2721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b23ce9-24eb-409d-ab7e-65c59ba9d2a2" path="/var/lib/kubelet/pods/96b23ce9-24eb-409d-ab7e-65c59ba9d2a2/volumes" Sep 4 15:42:29.807409 systemd-networkd[1489]: vxlan.calico: Gained IPv6LL Sep 4 15:42:30.383486 systemd-networkd[1489]: calicdbcc6f973c: Gained IPv6LL Sep 4 15:42:31.622553 systemd[1]: Started sshd@10-10.0.0.9:22-10.0.0.1:46372.service - OpenSSH per-connection server daemon (10.0.0.1:46372). Sep 4 15:42:31.691596 sshd[4613]: Accepted publickey for core from 10.0.0.1 port 46372 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:42:31.693808 sshd-session[4613]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:42:31.699093 systemd-logind[1569]: New session 11 of user core. Sep 4 15:42:31.706340 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 4 15:42:31.816763 sshd[4616]: Connection closed by 10.0.0.1 port 46372 Sep 4 15:42:31.817117 sshd-session[4613]: pam_unix(sshd:session): session closed for user core Sep 4 15:42:31.821491 systemd[1]: sshd@10-10.0.0.9:22-10.0.0.1:46372.service: Deactivated successfully. Sep 4 15:42:31.823624 systemd[1]: session-11.scope: Deactivated successfully. Sep 4 15:42:31.824553 systemd-logind[1569]: Session 11 logged out. Waiting for processes to exit. Sep 4 15:42:31.825800 systemd-logind[1569]: Removed session 11. 
Sep 4 15:42:34.477279 containerd[1599]: time="2025-09-04T15:42:34.477177164Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:34.478023 containerd[1599]: time="2025-09-04T15:42:34.477979740Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 4 15:42:34.479321 containerd[1599]: time="2025-09-04T15:42:34.479153483Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:34.484698 containerd[1599]: time="2025-09-04T15:42:34.484632484Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:34.485349 containerd[1599]: time="2025-09-04T15:42:34.485307041Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 6.795104472s" Sep 4 15:42:34.485394 containerd[1599]: time="2025-09-04T15:42:34.485348619Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 4 15:42:34.486252 containerd[1599]: time="2025-09-04T15:42:34.486201269Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 4 15:42:34.487633 containerd[1599]: time="2025-09-04T15:42:34.487437939Z" level=info msg="CreateContainer within sandbox 
\"96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 4 15:42:34.495401 containerd[1599]: time="2025-09-04T15:42:34.495370104Z" level=info msg="Container 775cd22edcf01e38ae7a1712764e30e38ea0efbfe00947ddf7e3b6bb37eeba03: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:42:34.505945 containerd[1599]: time="2025-09-04T15:42:34.505894874Z" level=info msg="CreateContainer within sandbox \"96da75594a9c5a47af65e182306ddf0225349b113d3551fa1d27f1a9b6958e55\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"775cd22edcf01e38ae7a1712764e30e38ea0efbfe00947ddf7e3b6bb37eeba03\"" Sep 4 15:42:34.506446 containerd[1599]: time="2025-09-04T15:42:34.506402898Z" level=info msg="StartContainer for \"775cd22edcf01e38ae7a1712764e30e38ea0efbfe00947ddf7e3b6bb37eeba03\"" Sep 4 15:42:34.507481 containerd[1599]: time="2025-09-04T15:42:34.507458077Z" level=info msg="connecting to shim 775cd22edcf01e38ae7a1712764e30e38ea0efbfe00947ddf7e3b6bb37eeba03" address="unix:///run/containerd/s/e9c2b8951f4dd209d602e6edea6729a63c68c25c6553a57649391a65ff4283b4" protocol=ttrpc version=3 Sep 4 15:42:34.541416 systemd[1]: Started cri-containerd-775cd22edcf01e38ae7a1712764e30e38ea0efbfe00947ddf7e3b6bb37eeba03.scope - libcontainer container 775cd22edcf01e38ae7a1712764e30e38ea0efbfe00947ddf7e3b6bb37eeba03. 
Sep 4 15:42:34.592168 containerd[1599]: time="2025-09-04T15:42:34.592122342Z" level=info msg="StartContainer for \"775cd22edcf01e38ae7a1712764e30e38ea0efbfe00947ddf7e3b6bb37eeba03\" returns successfully" Sep 4 15:42:35.403438 kubelet[2721]: I0904 15:42:35.403193 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-586bb8599c-9bzlz" podStartSLOduration=37.606802284 podStartE2EDuration="44.403169867s" podCreationTimestamp="2025-09-04 15:41:51 +0000 UTC" firstStartedPulling="2025-09-04 15:42:27.689726445 +0000 UTC m=+50.600198844" lastFinishedPulling="2025-09-04 15:42:34.486094028 +0000 UTC m=+57.396566427" observedRunningTime="2025-09-04 15:42:35.401721409 +0000 UTC m=+58.312193808" watchObservedRunningTime="2025-09-04 15:42:35.403169867 +0000 UTC m=+58.313642266" Sep 4 15:42:36.389660 kubelet[2721]: I0904 15:42:36.389596 2721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 4 15:42:36.833413 systemd[1]: Started sshd@11-10.0.0.9:22-10.0.0.1:46388.service - OpenSSH per-connection server daemon (10.0.0.1:46388). Sep 4 15:42:36.902539 sshd[4688]: Accepted publickey for core from 10.0.0.1 port 46388 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:42:36.904794 sshd-session[4688]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:42:36.909670 systemd-logind[1569]: New session 12 of user core. Sep 4 15:42:36.920484 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 4 15:42:37.044598 sshd[4691]: Connection closed by 10.0.0.1 port 46388 Sep 4 15:42:37.044979 sshd-session[4688]: pam_unix(sshd:session): session closed for user core Sep 4 15:42:37.059423 systemd[1]: sshd@11-10.0.0.9:22-10.0.0.1:46388.service: Deactivated successfully. Sep 4 15:42:37.061600 systemd[1]: session-12.scope: Deactivated successfully. Sep 4 15:42:37.062530 systemd-logind[1569]: Session 12 logged out. Waiting for processes to exit. 
Sep 4 15:42:37.065438 systemd[1]: Started sshd@12-10.0.0.9:22-10.0.0.1:46398.service - OpenSSH per-connection server daemon (10.0.0.1:46398). Sep 4 15:42:37.066520 systemd-logind[1569]: Removed session 12. Sep 4 15:42:37.124389 sshd[4705]: Accepted publickey for core from 10.0.0.1 port 46398 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:42:37.126139 sshd-session[4705]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:42:37.131090 systemd-logind[1569]: New session 13 of user core. Sep 4 15:42:37.142392 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 4 15:42:37.180242 containerd[1599]: time="2025-09-04T15:42:37.179909688Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-7qmzd,Uid:61351a35-3590-4d07-b44e-2b184f6950c6,Namespace:calico-system,Attempt:0,}" Sep 4 15:42:37.312002 systemd-networkd[1489]: cali92951a2a9f9: Link UP Sep 4 15:42:37.313634 systemd-networkd[1489]: cali92951a2a9f9: Gained carrier Sep 4 15:42:37.338533 sshd[4708]: Connection closed by 10.0.0.1 port 46398 Sep 4 15:42:37.336553 sshd-session[4705]: pam_unix(sshd:session): session closed for user core Sep 4 15:42:37.350491 containerd[1599]: 2025-09-04 15:42:37.226 [INFO][4710] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--7qmzd-eth0 goldmane-54d579b49d- calico-system 61351a35-3590-4d07-b44e-2b184f6950c6 873 0 2025-09-04 15:41:52 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-7qmzd eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali92951a2a9f9 [] [] }} ContainerID="e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" Namespace="calico-system" 
Pod="goldmane-54d579b49d-7qmzd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7qmzd-" Sep 4 15:42:37.350491 containerd[1599]: 2025-09-04 15:42:37.228 [INFO][4710] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" Namespace="calico-system" Pod="goldmane-54d579b49d-7qmzd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7qmzd-eth0" Sep 4 15:42:37.350491 containerd[1599]: 2025-09-04 15:42:37.262 [INFO][4732] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" HandleID="k8s-pod-network.e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" Workload="localhost-k8s-goldmane--54d579b49d--7qmzd-eth0" Sep 4 15:42:37.350887 containerd[1599]: 2025-09-04 15:42:37.262 [INFO][4732] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" HandleID="k8s-pod-network.e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" Workload="localhost-k8s-goldmane--54d579b49d--7qmzd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000287690), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-7qmzd", "timestamp":"2025-09-04 15:42:37.262321053 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:42:37.350887 containerd[1599]: 2025-09-04 15:42:37.262 [INFO][4732] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:42:37.350887 containerd[1599]: 2025-09-04 15:42:37.262 [INFO][4732] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 15:42:37.350887 containerd[1599]: 2025-09-04 15:42:37.262 [INFO][4732] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:42:37.350887 containerd[1599]: 2025-09-04 15:42:37.268 [INFO][4732] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" host="localhost" Sep 4 15:42:37.350887 containerd[1599]: 2025-09-04 15:42:37.274 [INFO][4732] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:42:37.350887 containerd[1599]: 2025-09-04 15:42:37.279 [INFO][4732] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:42:37.350887 containerd[1599]: 2025-09-04 15:42:37.281 [INFO][4732] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:37.350887 containerd[1599]: 2025-09-04 15:42:37.282 [INFO][4732] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:37.350887 containerd[1599]: 2025-09-04 15:42:37.283 [INFO][4732] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" host="localhost" Sep 4 15:42:37.350843 systemd[1]: sshd@12-10.0.0.9:22-10.0.0.1:46398.service: Deactivated successfully. 
Sep 4 15:42:37.352563 containerd[1599]: 2025-09-04 15:42:37.286 [INFO][4732] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad Sep 4 15:42:37.352563 containerd[1599]: 2025-09-04 15:42:37.293 [INFO][4732] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" host="localhost" Sep 4 15:42:37.352563 containerd[1599]: 2025-09-04 15:42:37.302 [INFO][4732] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" host="localhost" Sep 4 15:42:37.352563 containerd[1599]: 2025-09-04 15:42:37.303 [INFO][4732] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" host="localhost" Sep 4 15:42:37.352563 containerd[1599]: 2025-09-04 15:42:37.303 [INFO][4732] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 15:42:37.352563 containerd[1599]: 2025-09-04 15:42:37.303 [INFO][4732] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" HandleID="k8s-pod-network.e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" Workload="localhost-k8s-goldmane--54d579b49d--7qmzd-eth0" Sep 4 15:42:37.352779 containerd[1599]: 2025-09-04 15:42:37.306 [INFO][4710] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" Namespace="calico-system" Pod="goldmane-54d579b49d-7qmzd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7qmzd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--7qmzd-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"61351a35-3590-4d07-b44e-2b184f6950c6", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 41, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-7qmzd", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali92951a2a9f9", 
MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:37.352779 containerd[1599]: 2025-09-04 15:42:37.306 [INFO][4710] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" Namespace="calico-system" Pod="goldmane-54d579b49d-7qmzd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7qmzd-eth0" Sep 4 15:42:37.352941 containerd[1599]: 2025-09-04 15:42:37.306 [INFO][4710] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali92951a2a9f9 ContainerID="e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" Namespace="calico-system" Pod="goldmane-54d579b49d-7qmzd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7qmzd-eth0" Sep 4 15:42:37.352941 containerd[1599]: 2025-09-04 15:42:37.313 [INFO][4710] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" Namespace="calico-system" Pod="goldmane-54d579b49d-7qmzd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7qmzd-eth0" Sep 4 15:42:37.353009 containerd[1599]: 2025-09-04 15:42:37.314 [INFO][4710] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" Namespace="calico-system" Pod="goldmane-54d579b49d-7qmzd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7qmzd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--7qmzd-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"61351a35-3590-4d07-b44e-2b184f6950c6", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 41, 52, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad", Pod:"goldmane-54d579b49d-7qmzd", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali92951a2a9f9", MAC:"ca:3b:c5:84:d9:b0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:37.353092 containerd[1599]: 2025-09-04 15:42:37.334 [INFO][4710] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" Namespace="calico-system" Pod="goldmane-54d579b49d-7qmzd" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7qmzd-eth0" Sep 4 15:42:37.354199 systemd[1]: session-13.scope: Deactivated successfully. Sep 4 15:42:37.360346 systemd-logind[1569]: Session 13 logged out. Waiting for processes to exit. Sep 4 15:42:37.365634 systemd-logind[1569]: Removed session 13. Sep 4 15:42:37.368489 systemd[1]: Started sshd@13-10.0.0.9:22-10.0.0.1:46412.service - OpenSSH per-connection server daemon (10.0.0.1:46412). 
Sep 4 15:42:37.430176 sshd[4754]: Accepted publickey for core from 10.0.0.1 port 46412 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:42:37.432681 sshd-session[4754]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:42:37.440388 systemd-logind[1569]: New session 14 of user core. Sep 4 15:42:37.448958 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 4 15:42:37.452641 containerd[1599]: time="2025-09-04T15:42:37.452591179Z" level=info msg="connecting to shim e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad" address="unix:///run/containerd/s/6e952f0475bb301ce72ce02449f759338a69b78cc8e88cf2c3c7c2494868db6f" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:42:37.492574 systemd[1]: Started cri-containerd-e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad.scope - libcontainer container e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad. Sep 4 15:42:37.506834 systemd-resolved[1404]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:42:37.541140 containerd[1599]: time="2025-09-04T15:42:37.541094944Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-7qmzd,Uid:61351a35-3590-4d07-b44e-2b184f6950c6,Namespace:calico-system,Attempt:0,} returns sandbox id \"e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad\"" Sep 4 15:42:37.589160 sshd[4778]: Connection closed by 10.0.0.1 port 46412 Sep 4 15:42:37.589580 sshd-session[4754]: pam_unix(sshd:session): session closed for user core Sep 4 15:42:37.594916 systemd[1]: sshd@13-10.0.0.9:22-10.0.0.1:46412.service: Deactivated successfully. Sep 4 15:42:37.597260 systemd[1]: session-14.scope: Deactivated successfully. Sep 4 15:42:37.598166 systemd-logind[1569]: Session 14 logged out. Waiting for processes to exit. Sep 4 15:42:37.599791 systemd-logind[1569]: Removed session 14. 
Sep 4 15:42:38.180456 containerd[1599]: time="2025-09-04T15:42:38.180345570Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd6bff945-8bhdp,Uid:40ed457a-a70c-44bd-9d09-2cbd7ed7fa67,Namespace:calico-system,Attempt:0,}" Sep 4 15:42:38.284272 systemd-networkd[1489]: cali1be08d86cd9: Link UP Sep 4 15:42:38.284493 systemd-networkd[1489]: cali1be08d86cd9: Gained carrier Sep 4 15:42:38.297094 containerd[1599]: 2025-09-04 15:42:38.216 [INFO][4818] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--cd6bff945--8bhdp-eth0 calico-kube-controllers-cd6bff945- calico-system 40ed457a-a70c-44bd-9d09-2cbd7ed7fa67 877 0 2025-09-04 15:41:53 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:cd6bff945 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-cd6bff945-8bhdp eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali1be08d86cd9 [] [] }} ContainerID="3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" Namespace="calico-system" Pod="calico-kube-controllers-cd6bff945-8bhdp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd6bff945--8bhdp-" Sep 4 15:42:38.297094 containerd[1599]: 2025-09-04 15:42:38.216 [INFO][4818] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" Namespace="calico-system" Pod="calico-kube-controllers-cd6bff945-8bhdp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd6bff945--8bhdp-eth0" Sep 4 15:42:38.297094 containerd[1599]: 2025-09-04 15:42:38.245 [INFO][4832] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" HandleID="k8s-pod-network.3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" Workload="localhost-k8s-calico--kube--controllers--cd6bff945--8bhdp-eth0" Sep 4 15:42:38.297340 containerd[1599]: 2025-09-04 15:42:38.245 [INFO][4832] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" HandleID="k8s-pod-network.3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" Workload="localhost-k8s-calico--kube--controllers--cd6bff945--8bhdp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00018bb10), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-cd6bff945-8bhdp", "timestamp":"2025-09-04 15:42:38.245298373 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:42:38.297340 containerd[1599]: 2025-09-04 15:42:38.245 [INFO][4832] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:42:38.297340 containerd[1599]: 2025-09-04 15:42:38.245 [INFO][4832] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 15:42:38.297340 containerd[1599]: 2025-09-04 15:42:38.245 [INFO][4832] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:42:38.297340 containerd[1599]: 2025-09-04 15:42:38.251 [INFO][4832] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" host="localhost" Sep 4 15:42:38.297340 containerd[1599]: 2025-09-04 15:42:38.256 [INFO][4832] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:42:38.297340 containerd[1599]: 2025-09-04 15:42:38.262 [INFO][4832] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:42:38.297340 containerd[1599]: 2025-09-04 15:42:38.264 [INFO][4832] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:38.297340 containerd[1599]: 2025-09-04 15:42:38.266 [INFO][4832] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:38.297340 containerd[1599]: 2025-09-04 15:42:38.266 [INFO][4832] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" host="localhost" Sep 4 15:42:38.297566 containerd[1599]: 2025-09-04 15:42:38.267 [INFO][4832] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3 Sep 4 15:42:38.297566 containerd[1599]: 2025-09-04 15:42:38.271 [INFO][4832] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" host="localhost" Sep 4 15:42:38.297566 containerd[1599]: 2025-09-04 15:42:38.277 [INFO][4832] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" host="localhost" Sep 4 15:42:38.297566 containerd[1599]: 2025-09-04 15:42:38.278 [INFO][4832] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" host="localhost" Sep 4 15:42:38.297566 containerd[1599]: 2025-09-04 15:42:38.278 [INFO][4832] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 15:42:38.297566 containerd[1599]: 2025-09-04 15:42:38.278 [INFO][4832] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" HandleID="k8s-pod-network.3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" Workload="localhost-k8s-calico--kube--controllers--cd6bff945--8bhdp-eth0" Sep 4 15:42:38.297692 containerd[1599]: 2025-09-04 15:42:38.281 [INFO][4818] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" Namespace="calico-system" Pod="calico-kube-controllers-cd6bff945-8bhdp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd6bff945--8bhdp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--cd6bff945--8bhdp-eth0", GenerateName:"calico-kube-controllers-cd6bff945-", Namespace:"calico-system", SelfLink:"", UID:"40ed457a-a70c-44bd-9d09-2cbd7ed7fa67", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 41, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"cd6bff945", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-cd6bff945-8bhdp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1be08d86cd9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:38.297742 containerd[1599]: 2025-09-04 15:42:38.281 [INFO][4818] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" Namespace="calico-system" Pod="calico-kube-controllers-cd6bff945-8bhdp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd6bff945--8bhdp-eth0" Sep 4 15:42:38.297742 containerd[1599]: 2025-09-04 15:42:38.281 [INFO][4818] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1be08d86cd9 ContainerID="3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" Namespace="calico-system" Pod="calico-kube-controllers-cd6bff945-8bhdp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd6bff945--8bhdp-eth0" Sep 4 15:42:38.297742 containerd[1599]: 2025-09-04 15:42:38.284 [INFO][4818] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" Namespace="calico-system" Pod="calico-kube-controllers-cd6bff945-8bhdp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd6bff945--8bhdp-eth0" Sep 4 15:42:38.297808 containerd[1599]: 2025-09-04 
15:42:38.285 [INFO][4818] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" Namespace="calico-system" Pod="calico-kube-controllers-cd6bff945-8bhdp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd6bff945--8bhdp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--cd6bff945--8bhdp-eth0", GenerateName:"calico-kube-controllers-cd6bff945-", Namespace:"calico-system", SelfLink:"", UID:"40ed457a-a70c-44bd-9d09-2cbd7ed7fa67", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 41, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"cd6bff945", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3", Pod:"calico-kube-controllers-cd6bff945-8bhdp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1be08d86cd9", MAC:"f6:fb:63:8f:05:41", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:38.297859 containerd[1599]: 2025-09-04 
15:42:38.294 [INFO][4818] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" Namespace="calico-system" Pod="calico-kube-controllers-cd6bff945-8bhdp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd6bff945--8bhdp-eth0" Sep 4 15:42:38.324923 containerd[1599]: time="2025-09-04T15:42:38.324848944Z" level=info msg="connecting to shim 3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3" address="unix:///run/containerd/s/dc6477e4a8f80f3d30a24e3eba3af929b9959ad22cf602a5b351445d4def6c79" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:42:38.349397 systemd[1]: Started cri-containerd-3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3.scope - libcontainer container 3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3. Sep 4 15:42:38.365590 systemd-resolved[1404]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:42:38.400081 containerd[1599]: time="2025-09-04T15:42:38.400035808Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd6bff945-8bhdp,Uid:40ed457a-a70c-44bd-9d09-2cbd7ed7fa67,Namespace:calico-system,Attempt:0,} returns sandbox id \"3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3\"" Sep 4 15:42:38.511426 systemd-networkd[1489]: cali92951a2a9f9: Gained IPv6LL Sep 4 15:42:39.919462 systemd-networkd[1489]: cali1be08d86cd9: Gained IPv6LL Sep 4 15:42:40.179998 kubelet[2721]: E0904 15:42:40.179808 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:40.180706 containerd[1599]: time="2025-09-04T15:42:40.180659490Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dzhx5,Uid:1346250c-c62f-497e-b346-045a6d8429cd,Namespace:kube-system,Attempt:0,}" Sep 4 15:42:40.306084 
containerd[1599]: time="2025-09-04T15:42:40.306024118Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:40.307087 containerd[1599]: time="2025-09-04T15:42:40.307041607Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 4 15:42:40.308276 containerd[1599]: time="2025-09-04T15:42:40.308201574Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:40.311349 containerd[1599]: time="2025-09-04T15:42:40.311274619Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:40.312117 containerd[1599]: time="2025-09-04T15:42:40.312040987Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 5.825791578s" Sep 4 15:42:40.312117 containerd[1599]: time="2025-09-04T15:42:40.312090149Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 4 15:42:40.313397 containerd[1599]: time="2025-09-04T15:42:40.313368707Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 4 15:42:40.315352 containerd[1599]: time="2025-09-04T15:42:40.315308556Z" level=info msg="CreateContainer within sandbox \"f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83\" for container 
&ContainerMetadata{Name:whisker,Attempt:0,}" Sep 4 15:42:40.322860 systemd-networkd[1489]: calia929126c681: Link UP Sep 4 15:42:40.323269 systemd-networkd[1489]: calia929126c681: Gained carrier Sep 4 15:42:40.338627 containerd[1599]: time="2025-09-04T15:42:40.338287375Z" level=info msg="Container 14276d281ffde2a6e88b1134f5b0d92ea73087d9ca2bbbd16946f772326ab504: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:42:40.352323 containerd[1599]: 2025-09-04 15:42:40.245 [INFO][4901] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--dzhx5-eth0 coredns-668d6bf9bc- kube-system 1346250c-c62f-497e-b346-045a6d8429cd 870 0 2025-09-04 15:41:42 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-dzhx5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia929126c681 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" Namespace="kube-system" Pod="coredns-668d6bf9bc-dzhx5" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dzhx5-" Sep 4 15:42:40.352323 containerd[1599]: 2025-09-04 15:42:40.245 [INFO][4901] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" Namespace="kube-system" Pod="coredns-668d6bf9bc-dzhx5" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dzhx5-eth0" Sep 4 15:42:40.352323 containerd[1599]: 2025-09-04 15:42:40.281 [INFO][4920] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" HandleID="k8s-pod-network.7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" 
Workload="localhost-k8s-coredns--668d6bf9bc--dzhx5-eth0" Sep 4 15:42:40.354114 containerd[1599]: 2025-09-04 15:42:40.281 [INFO][4920] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" HandleID="k8s-pod-network.7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" Workload="localhost-k8s-coredns--668d6bf9bc--dzhx5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00050af80), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-dzhx5", "timestamp":"2025-09-04 15:42:40.281529696 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:42:40.354114 containerd[1599]: 2025-09-04 15:42:40.281 [INFO][4920] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:42:40.354114 containerd[1599]: 2025-09-04 15:42:40.281 [INFO][4920] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 15:42:40.354114 containerd[1599]: 2025-09-04 15:42:40.281 [INFO][4920] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:42:40.354114 containerd[1599]: 2025-09-04 15:42:40.288 [INFO][4920] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" host="localhost" Sep 4 15:42:40.354114 containerd[1599]: 2025-09-04 15:42:40.293 [INFO][4920] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:42:40.354114 containerd[1599]: 2025-09-04 15:42:40.297 [INFO][4920] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:42:40.354114 containerd[1599]: 2025-09-04 15:42:40.299 [INFO][4920] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:40.354114 containerd[1599]: 2025-09-04 15:42:40.301 [INFO][4920] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:40.354114 containerd[1599]: 2025-09-04 15:42:40.301 [INFO][4920] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" host="localhost" Sep 4 15:42:40.354450 containerd[1599]: 2025-09-04 15:42:40.302 [INFO][4920] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc Sep 4 15:42:40.354450 containerd[1599]: 2025-09-04 15:42:40.306 [INFO][4920] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" host="localhost" Sep 4 15:42:40.354450 containerd[1599]: 2025-09-04 15:42:40.315 [INFO][4920] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" host="localhost" Sep 4 15:42:40.354450 containerd[1599]: 2025-09-04 15:42:40.315 [INFO][4920] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" host="localhost" Sep 4 15:42:40.354450 containerd[1599]: 2025-09-04 15:42:40.315 [INFO][4920] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 15:42:40.354450 containerd[1599]: 2025-09-04 15:42:40.315 [INFO][4920] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" HandleID="k8s-pod-network.7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" Workload="localhost-k8s-coredns--668d6bf9bc--dzhx5-eth0" Sep 4 15:42:40.354588 containerd[1599]: 2025-09-04 15:42:40.320 [INFO][4901] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" Namespace="kube-system" Pod="coredns-668d6bf9bc-dzhx5" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dzhx5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--dzhx5-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"1346250c-c62f-497e-b346-045a6d8429cd", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 41, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-dzhx5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia929126c681", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:40.354656 containerd[1599]: 2025-09-04 15:42:40.320 [INFO][4901] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" Namespace="kube-system" Pod="coredns-668d6bf9bc-dzhx5" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dzhx5-eth0" Sep 4 15:42:40.354656 containerd[1599]: 2025-09-04 15:42:40.320 [INFO][4901] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia929126c681 ContainerID="7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" Namespace="kube-system" Pod="coredns-668d6bf9bc-dzhx5" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dzhx5-eth0" Sep 4 15:42:40.354656 containerd[1599]: 2025-09-04 15:42:40.324 [INFO][4901] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" Namespace="kube-system" Pod="coredns-668d6bf9bc-dzhx5" 
WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dzhx5-eth0" Sep 4 15:42:40.354722 containerd[1599]: 2025-09-04 15:42:40.326 [INFO][4901] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" Namespace="kube-system" Pod="coredns-668d6bf9bc-dzhx5" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dzhx5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--dzhx5-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"1346250c-c62f-497e-b346-045a6d8429cd", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 41, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc", Pod:"coredns-668d6bf9bc-dzhx5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia929126c681", MAC:"7e:3a:48:66:23:05", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:40.354722 containerd[1599]: 2025-09-04 15:42:40.337 [INFO][4901] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" Namespace="kube-system" Pod="coredns-668d6bf9bc-dzhx5" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dzhx5-eth0" Sep 4 15:42:40.355686 containerd[1599]: time="2025-09-04T15:42:40.355653244Z" level=info msg="CreateContainer within sandbox \"f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"14276d281ffde2a6e88b1134f5b0d92ea73087d9ca2bbbd16946f772326ab504\"" Sep 4 15:42:40.357387 containerd[1599]: time="2025-09-04T15:42:40.356536962Z" level=info msg="StartContainer for \"14276d281ffde2a6e88b1134f5b0d92ea73087d9ca2bbbd16946f772326ab504\"" Sep 4 15:42:40.359021 containerd[1599]: time="2025-09-04T15:42:40.357743566Z" level=info msg="connecting to shim 14276d281ffde2a6e88b1134f5b0d92ea73087d9ca2bbbd16946f772326ab504" address="unix:///run/containerd/s/4063f9ce26a77cfa5a6825c641d8cf81c39de5b8fe807422de0bd19271da1c16" protocol=ttrpc version=3 Sep 4 15:42:40.388540 containerd[1599]: time="2025-09-04T15:42:40.388481561Z" level=info msg="connecting to shim 7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc" address="unix:///run/containerd/s/89e15ecb3f43e288493748362a7d5854890d2284014c8b67d9f1b1fd27ed471f" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:42:40.390534 systemd[1]: Started cri-containerd-14276d281ffde2a6e88b1134f5b0d92ea73087d9ca2bbbd16946f772326ab504.scope - libcontainer container 14276d281ffde2a6e88b1134f5b0d92ea73087d9ca2bbbd16946f772326ab504. 
Sep 4 15:42:40.420361 systemd[1]: Started cri-containerd-7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc.scope - libcontainer container 7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc. Sep 4 15:42:40.436513 systemd-resolved[1404]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:42:40.461287 containerd[1599]: time="2025-09-04T15:42:40.461236583Z" level=info msg="StartContainer for \"14276d281ffde2a6e88b1134f5b0d92ea73087d9ca2bbbd16946f772326ab504\" returns successfully" Sep 4 15:42:40.471393 containerd[1599]: time="2025-09-04T15:42:40.471340858Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dzhx5,Uid:1346250c-c62f-497e-b346-045a6d8429cd,Namespace:kube-system,Attempt:0,} returns sandbox id \"7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc\"" Sep 4 15:42:40.472704 kubelet[2721]: E0904 15:42:40.472680 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:40.476599 containerd[1599]: time="2025-09-04T15:42:40.476536256Z" level=info msg="CreateContainer within sandbox \"7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 15:42:40.491488 containerd[1599]: time="2025-09-04T15:42:40.491428564Z" level=info msg="Container 1f787ac55328feab45ef1909c701e9478f1906589a51a820e37f8653f62a23de: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:42:40.498201 containerd[1599]: time="2025-09-04T15:42:40.498106452Z" level=info msg="CreateContainer within sandbox \"7a293ef2364f5deb58c548b5afd17d1b8ebb9c163a86b0c443cc62a9bd1ce0dc\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1f787ac55328feab45ef1909c701e9478f1906589a51a820e37f8653f62a23de\"" Sep 4 15:42:40.499539 containerd[1599]: 
time="2025-09-04T15:42:40.499515024Z" level=info msg="StartContainer for \"1f787ac55328feab45ef1909c701e9478f1906589a51a820e37f8653f62a23de\"" Sep 4 15:42:40.500406 containerd[1599]: time="2025-09-04T15:42:40.500383403Z" level=info msg="connecting to shim 1f787ac55328feab45ef1909c701e9478f1906589a51a820e37f8653f62a23de" address="unix:///run/containerd/s/89e15ecb3f43e288493748362a7d5854890d2284014c8b67d9f1b1fd27ed471f" protocol=ttrpc version=3 Sep 4 15:42:40.525381 systemd[1]: Started cri-containerd-1f787ac55328feab45ef1909c701e9478f1906589a51a820e37f8653f62a23de.scope - libcontainer container 1f787ac55328feab45ef1909c701e9478f1906589a51a820e37f8653f62a23de. Sep 4 15:42:40.570375 containerd[1599]: time="2025-09-04T15:42:40.570325718Z" level=info msg="StartContainer for \"1f787ac55328feab45ef1909c701e9478f1906589a51a820e37f8653f62a23de\" returns successfully" Sep 4 15:42:41.179828 kubelet[2721]: E0904 15:42:41.179575 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:41.180097 containerd[1599]: time="2025-09-04T15:42:41.180048448Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6cbdm,Uid:fe764908-ae86-49c6-b09b-c3ac12014f67,Namespace:kube-system,Attempt:0,}" Sep 4 15:42:41.180223 containerd[1599]: time="2025-09-04T15:42:41.180048327Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-g5qkw,Uid:1972dac2-e9c7-4fa0-a718-6b8d76c15f8c,Namespace:calico-system,Attempt:0,}" Sep 4 15:42:41.180360 containerd[1599]: time="2025-09-04T15:42:41.180049199Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-586bb8599c-jft4n,Uid:6ef84674-6ef9-43c1-a60c-d61b98c3020c,Namespace:calico-apiserver,Attempt:0,}" Sep 4 15:42:41.367656 systemd-networkd[1489]: cali5bd9d2ba9e6: Link UP Sep 4 15:42:41.369625 systemd-networkd[1489]: cali5bd9d2ba9e6: Gained carrier Sep 4 
15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.259 [INFO][5057] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--g5qkw-eth0 csi-node-driver- calico-system 1972dac2-e9c7-4fa0-a718-6b8d76c15f8c 749 0 2025-09-04 15:41:53 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-g5qkw eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali5bd9d2ba9e6 [] [] }} ContainerID="5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" Namespace="calico-system" Pod="csi-node-driver-g5qkw" WorkloadEndpoint="localhost-k8s-csi--node--driver--g5qkw-" Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.259 [INFO][5057] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" Namespace="calico-system" Pod="csi-node-driver-g5qkw" WorkloadEndpoint="localhost-k8s-csi--node--driver--g5qkw-eth0" Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.304 [INFO][5103] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" HandleID="k8s-pod-network.5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" Workload="localhost-k8s-csi--node--driver--g5qkw-eth0" Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.304 [INFO][5103] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" HandleID="k8s-pod-network.5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" 
Workload="localhost-k8s-csi--node--driver--g5qkw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c7320), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-g5qkw", "timestamp":"2025-09-04 15:42:41.303998735 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.304 [INFO][5103] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.304 [INFO][5103] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.304 [INFO][5103] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.331 [INFO][5103] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" host="localhost" Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.336 [INFO][5103] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.340 [INFO][5103] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.343 [INFO][5103] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.345 [INFO][5103] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.345 [INFO][5103] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 
handle="k8s-pod-network.5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" host="localhost" Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.347 [INFO][5103] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55 Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.353 [INFO][5103] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" host="localhost" Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.359 [INFO][5103] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" host="localhost" Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.360 [INFO][5103] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" host="localhost" Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.360 [INFO][5103] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 15:42:41.383324 containerd[1599]: 2025-09-04 15:42:41.360 [INFO][5103] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" HandleID="k8s-pod-network.5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" Workload="localhost-k8s-csi--node--driver--g5qkw-eth0" Sep 4 15:42:41.384470 containerd[1599]: 2025-09-04 15:42:41.363 [INFO][5057] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" Namespace="calico-system" Pod="csi-node-driver-g5qkw" WorkloadEndpoint="localhost-k8s-csi--node--driver--g5qkw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--g5qkw-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"1972dac2-e9c7-4fa0-a718-6b8d76c15f8c", ResourceVersion:"749", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 41, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-g5qkw", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5bd9d2ba9e6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:41.384470 containerd[1599]: 2025-09-04 15:42:41.363 [INFO][5057] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" Namespace="calico-system" Pod="csi-node-driver-g5qkw" WorkloadEndpoint="localhost-k8s-csi--node--driver--g5qkw-eth0" Sep 4 15:42:41.384470 containerd[1599]: 2025-09-04 15:42:41.363 [INFO][5057] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5bd9d2ba9e6 ContainerID="5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" Namespace="calico-system" Pod="csi-node-driver-g5qkw" WorkloadEndpoint="localhost-k8s-csi--node--driver--g5qkw-eth0" Sep 4 15:42:41.384470 containerd[1599]: 2025-09-04 15:42:41.370 [INFO][5057] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" Namespace="calico-system" Pod="csi-node-driver-g5qkw" WorkloadEndpoint="localhost-k8s-csi--node--driver--g5qkw-eth0" Sep 4 15:42:41.384470 containerd[1599]: 2025-09-04 15:42:41.370 [INFO][5057] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" Namespace="calico-system" Pod="csi-node-driver-g5qkw" WorkloadEndpoint="localhost-k8s-csi--node--driver--g5qkw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--g5qkw-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"1972dac2-e9c7-4fa0-a718-6b8d76c15f8c", ResourceVersion:"749", Generation:0, 
CreationTimestamp:time.Date(2025, time.September, 4, 15, 41, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55", Pod:"csi-node-driver-g5qkw", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5bd9d2ba9e6", MAC:"02:b3:51:c4:36:cb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:41.384470 containerd[1599]: 2025-09-04 15:42:41.379 [INFO][5057] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" Namespace="calico-system" Pod="csi-node-driver-g5qkw" WorkloadEndpoint="localhost-k8s-csi--node--driver--g5qkw-eth0" Sep 4 15:42:41.405205 containerd[1599]: time="2025-09-04T15:42:41.405077787Z" level=info msg="connecting to shim 5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55" address="unix:///run/containerd/s/30108420f119f1304121c176512ab363c79313e43da16cb1a30422ada3a8dedc" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:42:41.413338 kubelet[2721]: E0904 15:42:41.413300 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver 
limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:41.430026 kubelet[2721]: I0904 15:42:41.429452 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-dzhx5" podStartSLOduration=59.429432696 podStartE2EDuration="59.429432696s" podCreationTimestamp="2025-09-04 15:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:42:41.428094405 +0000 UTC m=+64.338566804" watchObservedRunningTime="2025-09-04 15:42:41.429432696 +0000 UTC m=+64.339905095" Sep 4 15:42:41.436684 systemd[1]: Started cri-containerd-5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55.scope - libcontainer container 5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55. Sep 4 15:42:41.464492 systemd-resolved[1404]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:42:41.489140 containerd[1599]: time="2025-09-04T15:42:41.489056308Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-g5qkw,Uid:1972dac2-e9c7-4fa0-a718-6b8d76c15f8c,Namespace:calico-system,Attempt:0,} returns sandbox id \"5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55\"" Sep 4 15:42:41.496774 systemd-networkd[1489]: cali7ed7a11e4eb: Link UP Sep 4 15:42:41.498664 systemd-networkd[1489]: cali7ed7a11e4eb: Gained carrier Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.244 [INFO][5052] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--6cbdm-eth0 coredns-668d6bf9bc- kube-system fe764908-ae86-49c6-b09b-c3ac12014f67 867 0 2025-09-04 15:41:42 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-6cbdm eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali7ed7a11e4eb [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" Namespace="kube-system" Pod="coredns-668d6bf9bc-6cbdm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6cbdm-" Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.244 [INFO][5052] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" Namespace="kube-system" Pod="coredns-668d6bf9bc-6cbdm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6cbdm-eth0" Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.345 [INFO][5094] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" HandleID="k8s-pod-network.0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" Workload="localhost-k8s-coredns--668d6bf9bc--6cbdm-eth0" Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.345 [INFO][5094] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" HandleID="k8s-pod-network.0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" Workload="localhost-k8s-coredns--668d6bf9bc--6cbdm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001b1db0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-6cbdm", "timestamp":"2025-09-04 15:42:41.345018217 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:42:41.514514 containerd[1599]: 
2025-09-04 15:42:41.345 [INFO][5094] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.361 [INFO][5094] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.361 [INFO][5094] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.437 [INFO][5094] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" host="localhost" Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.454 [INFO][5094] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.462 [INFO][5094] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.464 [INFO][5094] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.467 [INFO][5094] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.467 [INFO][5094] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" host="localhost" Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.469 [INFO][5094] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.475 [INFO][5094] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" host="localhost" 
Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.485 [INFO][5094] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" host="localhost" Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.485 [INFO][5094] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" host="localhost" Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.485 [INFO][5094] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 15:42:41.514514 containerd[1599]: 2025-09-04 15:42:41.485 [INFO][5094] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" HandleID="k8s-pod-network.0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" Workload="localhost-k8s-coredns--668d6bf9bc--6cbdm-eth0" Sep 4 15:42:41.515121 containerd[1599]: 2025-09-04 15:42:41.492 [INFO][5052] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" Namespace="kube-system" Pod="coredns-668d6bf9bc-6cbdm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6cbdm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--6cbdm-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"fe764908-ae86-49c6-b09b-c3ac12014f67", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 41, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-6cbdm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7ed7a11e4eb", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:41.515121 containerd[1599]: 2025-09-04 15:42:41.493 [INFO][5052] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" Namespace="kube-system" Pod="coredns-668d6bf9bc-6cbdm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6cbdm-eth0" Sep 4 15:42:41.515121 containerd[1599]: 2025-09-04 15:42:41.493 [INFO][5052] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7ed7a11e4eb ContainerID="0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" Namespace="kube-system" Pod="coredns-668d6bf9bc-6cbdm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6cbdm-eth0" Sep 4 15:42:41.515121 containerd[1599]: 2025-09-04 15:42:41.499 [INFO][5052] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" Namespace="kube-system" Pod="coredns-668d6bf9bc-6cbdm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6cbdm-eth0" Sep 4 15:42:41.515121 containerd[1599]: 2025-09-04 15:42:41.499 [INFO][5052] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" Namespace="kube-system" Pod="coredns-668d6bf9bc-6cbdm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6cbdm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--6cbdm-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"fe764908-ae86-49c6-b09b-c3ac12014f67", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 41, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca", Pod:"coredns-668d6bf9bc-6cbdm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7ed7a11e4eb", MAC:"1e:ca:e2:df:a6:29", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:41.515121 containerd[1599]: 2025-09-04 15:42:41.510 [INFO][5052] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" Namespace="kube-system" Pod="coredns-668d6bf9bc-6cbdm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6cbdm-eth0" Sep 4 15:42:41.543762 containerd[1599]: time="2025-09-04T15:42:41.543692805Z" level=info msg="connecting to shim 0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca" address="unix:///run/containerd/s/f93d67286720793fcfff9196a1dab2f3e18e7158a2496c846c17bb3a69aa35bd" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:42:41.578689 systemd[1]: Started cri-containerd-0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca.scope - libcontainer container 0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca. 
Sep 4 15:42:41.595599 systemd-networkd[1489]: cali476f58c6619: Link UP Sep 4 15:42:41.597399 systemd-networkd[1489]: cali476f58c6619: Gained carrier Sep 4 15:42:41.599055 systemd-resolved[1404]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.273 [INFO][5066] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--586bb8599c--jft4n-eth0 calico-apiserver-586bb8599c- calico-apiserver 6ef84674-6ef9-43c1-a60c-d61b98c3020c 876 0 2025-09-04 15:41:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:586bb8599c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-586bb8599c-jft4n eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali476f58c6619 [] [] }} ContainerID="c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-jft4n" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--jft4n-" Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.274 [INFO][5066] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-jft4n" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--jft4n-eth0" Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.357 [INFO][5111] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" HandleID="k8s-pod-network.c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" 
Workload="localhost-k8s-calico--apiserver--586bb8599c--jft4n-eth0" Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.357 [INFO][5111] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" HandleID="k8s-pod-network.c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" Workload="localhost-k8s-calico--apiserver--586bb8599c--jft4n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e4b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-586bb8599c-jft4n", "timestamp":"2025-09-04 15:42:41.357548543 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.357 [INFO][5111] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.485 [INFO][5111] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.485 [INFO][5111] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.533 [INFO][5111] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" host="localhost" Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.554 [INFO][5111] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.565 [INFO][5111] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.567 [INFO][5111] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.570 [INFO][5111] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.571 [INFO][5111] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" host="localhost" Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.573 [INFO][5111] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6 Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.576 [INFO][5111] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" host="localhost" Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.583 [INFO][5111] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" host="localhost" Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.583 [INFO][5111] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" host="localhost" Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.583 [INFO][5111] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 15:42:41.617031 containerd[1599]: 2025-09-04 15:42:41.583 [INFO][5111] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" HandleID="k8s-pod-network.c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" Workload="localhost-k8s-calico--apiserver--586bb8599c--jft4n-eth0" Sep 4 15:42:41.617857 containerd[1599]: 2025-09-04 15:42:41.588 [INFO][5066] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-jft4n" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--jft4n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--586bb8599c--jft4n-eth0", GenerateName:"calico-apiserver-586bb8599c-", Namespace:"calico-apiserver", SelfLink:"", UID:"6ef84674-6ef9-43c1-a60c-d61b98c3020c", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 41, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"586bb8599c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-586bb8599c-jft4n", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali476f58c6619", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:41.617857 containerd[1599]: 2025-09-04 15:42:41.588 [INFO][5066] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-jft4n" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--jft4n-eth0" Sep 4 15:42:41.617857 containerd[1599]: 2025-09-04 15:42:41.588 [INFO][5066] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali476f58c6619 ContainerID="c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-jft4n" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--jft4n-eth0" Sep 4 15:42:41.617857 containerd[1599]: 2025-09-04 15:42:41.598 [INFO][5066] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-jft4n" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--jft4n-eth0" Sep 4 15:42:41.617857 containerd[1599]: 2025-09-04 15:42:41.598 [INFO][5066] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-jft4n" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--jft4n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--586bb8599c--jft4n-eth0", GenerateName:"calico-apiserver-586bb8599c-", Namespace:"calico-apiserver", SelfLink:"", UID:"6ef84674-6ef9-43c1-a60c-d61b98c3020c", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 15, 41, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"586bb8599c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6", Pod:"calico-apiserver-586bb8599c-jft4n", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali476f58c6619", MAC:"1e:c6:ec:ec:68:c2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 15:42:41.617857 containerd[1599]: 2025-09-04 15:42:41.610 [INFO][5066] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" Namespace="calico-apiserver" Pod="calico-apiserver-586bb8599c-jft4n" WorkloadEndpoint="localhost-k8s-calico--apiserver--586bb8599c--jft4n-eth0" Sep 4 15:42:41.646442 containerd[1599]: time="2025-09-04T15:42:41.646355067Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6cbdm,Uid:fe764908-ae86-49c6-b09b-c3ac12014f67,Namespace:kube-system,Attempt:0,} returns sandbox id \"0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca\"" Sep 4 15:42:41.647443 kubelet[2721]: E0904 15:42:41.647381 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:41.650091 containerd[1599]: time="2025-09-04T15:42:41.650037594Z" level=info msg="connecting to shim c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6" address="unix:///run/containerd/s/f51922fb64100a9fa734fb3cb8e92da7a31ae017d95f5163cf901f6c0562530d" namespace=k8s.io protocol=ttrpc version=3 Sep 4 15:42:41.653254 containerd[1599]: time="2025-09-04T15:42:41.652889485Z" level=info msg="CreateContainer within sandbox \"0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 15:42:41.668369 containerd[1599]: time="2025-09-04T15:42:41.668317177Z" level=info msg="Container 4823e9836a6dee19a51827be9bf1ec3ff0c899dbe365866b90cd89fcf378e140: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:42:41.674924 containerd[1599]: time="2025-09-04T15:42:41.674844682Z" level=info msg="CreateContainer within sandbox \"0f35c88e5adc15165e2caa1bbfda71fdda0b18de86ad659da31e8e93b7e09fca\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4823e9836a6dee19a51827be9bf1ec3ff0c899dbe365866b90cd89fcf378e140\"" Sep 4 15:42:41.675776 containerd[1599]: time="2025-09-04T15:42:41.675692823Z" level=info 
msg="StartContainer for \"4823e9836a6dee19a51827be9bf1ec3ff0c899dbe365866b90cd89fcf378e140\"" Sep 4 15:42:41.677010 containerd[1599]: time="2025-09-04T15:42:41.676979877Z" level=info msg="connecting to shim 4823e9836a6dee19a51827be9bf1ec3ff0c899dbe365866b90cd89fcf378e140" address="unix:///run/containerd/s/f93d67286720793fcfff9196a1dab2f3e18e7158a2496c846c17bb3a69aa35bd" protocol=ttrpc version=3 Sep 4 15:42:41.678394 systemd[1]: Started cri-containerd-c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6.scope - libcontainer container c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6. Sep 4 15:42:41.703444 systemd[1]: Started cri-containerd-4823e9836a6dee19a51827be9bf1ec3ff0c899dbe365866b90cd89fcf378e140.scope - libcontainer container 4823e9836a6dee19a51827be9bf1ec3ff0c899dbe365866b90cd89fcf378e140. Sep 4 15:42:41.710261 systemd-resolved[1404]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 15:42:41.711410 systemd-networkd[1489]: calia929126c681: Gained IPv6LL Sep 4 15:42:41.749911 containerd[1599]: time="2025-09-04T15:42:41.749834621Z" level=info msg="StartContainer for \"4823e9836a6dee19a51827be9bf1ec3ff0c899dbe365866b90cd89fcf378e140\" returns successfully" Sep 4 15:42:41.753192 containerd[1599]: time="2025-09-04T15:42:41.753134532Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-586bb8599c-jft4n,Uid:6ef84674-6ef9-43c1-a60c-d61b98c3020c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6\"" Sep 4 15:42:41.757860 containerd[1599]: time="2025-09-04T15:42:41.757823397Z" level=info msg="CreateContainer within sandbox \"c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 4 15:42:41.771742 containerd[1599]: time="2025-09-04T15:42:41.771675033Z" level=info msg="Container 
bd61f3ba041ee173460ed8f8ea1b743407bce98bc054c486d8a419da85af5ea7: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:42:41.781805 containerd[1599]: time="2025-09-04T15:42:41.781736397Z" level=info msg="CreateContainer within sandbox \"c11b40d44ac74bca50869861fa18f4527e734349132ea492a9726c7b5b34a4c6\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"bd61f3ba041ee173460ed8f8ea1b743407bce98bc054c486d8a419da85af5ea7\"" Sep 4 15:42:41.782636 containerd[1599]: time="2025-09-04T15:42:41.782591802Z" level=info msg="StartContainer for \"bd61f3ba041ee173460ed8f8ea1b743407bce98bc054c486d8a419da85af5ea7\"" Sep 4 15:42:41.783900 containerd[1599]: time="2025-09-04T15:42:41.783853580Z" level=info msg="connecting to shim bd61f3ba041ee173460ed8f8ea1b743407bce98bc054c486d8a419da85af5ea7" address="unix:///run/containerd/s/f51922fb64100a9fa734fb3cb8e92da7a31ae017d95f5163cf901f6c0562530d" protocol=ttrpc version=3 Sep 4 15:42:41.812434 systemd[1]: Started cri-containerd-bd61f3ba041ee173460ed8f8ea1b743407bce98bc054c486d8a419da85af5ea7.scope - libcontainer container bd61f3ba041ee173460ed8f8ea1b743407bce98bc054c486d8a419da85af5ea7. 
Sep 4 15:42:41.866679 containerd[1599]: time="2025-09-04T15:42:41.866614797Z" level=info msg="StartContainer for \"bd61f3ba041ee173460ed8f8ea1b743407bce98bc054c486d8a419da85af5ea7\" returns successfully" Sep 4 15:42:42.429275 kubelet[2721]: E0904 15:42:42.429191 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:42.430480 kubelet[2721]: E0904 15:42:42.430454 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:42.461939 kubelet[2721]: I0904 15:42:42.461843 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-586bb8599c-jft4n" podStartSLOduration=51.461816371 podStartE2EDuration="51.461816371s" podCreationTimestamp="2025-09-04 15:41:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:42:42.441665339 +0000 UTC m=+65.352137748" watchObservedRunningTime="2025-09-04 15:42:42.461816371 +0000 UTC m=+65.372288770" Sep 4 15:42:42.462246 kubelet[2721]: I0904 15:42:42.462057 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-6cbdm" podStartSLOduration=60.462050941 podStartE2EDuration="1m0.462050941s" podCreationTimestamp="2025-09-04 15:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 15:42:42.460067309 +0000 UTC m=+65.370539708" watchObservedRunningTime="2025-09-04 15:42:42.462050941 +0000 UTC m=+65.372523340" Sep 4 15:42:42.607633 systemd[1]: Started sshd@14-10.0.0.9:22-10.0.0.1:38814.service - OpenSSH per-connection server daemon (10.0.0.1:38814). 
Sep 4 15:42:42.608637 systemd-networkd[1489]: cali7ed7a11e4eb: Gained IPv6LL Sep 4 15:42:42.669232 sshd[5366]: Accepted publickey for core from 10.0.0.1 port 38814 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:42:42.671462 sshd-session[5366]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:42:42.678971 systemd-logind[1569]: New session 15 of user core. Sep 4 15:42:42.683372 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 4 15:42:42.853307 sshd[5369]: Connection closed by 10.0.0.1 port 38814 Sep 4 15:42:42.853737 sshd-session[5366]: pam_unix(sshd:session): session closed for user core Sep 4 15:42:42.859039 systemd[1]: sshd@14-10.0.0.9:22-10.0.0.1:38814.service: Deactivated successfully. Sep 4 15:42:42.861973 systemd[1]: session-15.scope: Deactivated successfully. Sep 4 15:42:42.862908 systemd-logind[1569]: Session 15 logged out. Waiting for processes to exit. Sep 4 15:42:42.864373 systemd-logind[1569]: Removed session 15. 
Sep 4 15:42:42.927499 systemd-networkd[1489]: cali476f58c6619: Gained IPv6LL Sep 4 15:42:42.927862 systemd-networkd[1489]: cali5bd9d2ba9e6: Gained IPv6LL Sep 4 15:42:43.430731 kubelet[2721]: I0904 15:42:43.430674 2721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 4 15:42:43.431971 kubelet[2721]: E0904 15:42:43.431124 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:43.431971 kubelet[2721]: E0904 15:42:43.431125 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:44.432645 kubelet[2721]: E0904 15:42:44.432592 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:45.434811 kubelet[2721]: E0904 15:42:45.434754 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:47.868697 systemd[1]: Started sshd@15-10.0.0.9:22-10.0.0.1:38818.service - OpenSSH per-connection server daemon (10.0.0.1:38818). Sep 4 15:42:47.940314 sshd[5399]: Accepted publickey for core from 10.0.0.1 port 38818 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:42:47.942302 sshd-session[5399]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:42:47.947058 systemd-logind[1569]: New session 16 of user core. Sep 4 15:42:47.957327 systemd[1]: Started session-16.scope - Session 16 of User core. 
Sep 4 15:42:48.069413 sshd[5402]: Connection closed by 10.0.0.1 port 38818 Sep 4 15:42:48.070434 sshd-session[5399]: pam_unix(sshd:session): session closed for user core Sep 4 15:42:48.074432 systemd[1]: sshd@15-10.0.0.9:22-10.0.0.1:38818.service: Deactivated successfully. Sep 4 15:42:48.076593 systemd[1]: session-16.scope: Deactivated successfully. Sep 4 15:42:48.077483 systemd-logind[1569]: Session 16 logged out. Waiting for processes to exit. Sep 4 15:42:48.078698 systemd-logind[1569]: Removed session 16. Sep 4 15:42:48.717498 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount188284855.mount: Deactivated successfully. Sep 4 15:42:50.179438 kubelet[2721]: E0904 15:42:50.179398 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:51.180826 kubelet[2721]: E0904 15:42:51.180779 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 4 15:42:51.370484 containerd[1599]: time="2025-09-04T15:42:51.370424048Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:51.374323 containerd[1599]: time="2025-09-04T15:42:51.374236503Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 4 15:42:51.374832 containerd[1599]: time="2025-09-04T15:42:51.374786742Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:51.377249 containerd[1599]: time="2025-09-04T15:42:51.377204533Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:51.377782 containerd[1599]: time="2025-09-04T15:42:51.377755543Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 11.06435693s" Sep 4 15:42:51.377836 containerd[1599]: time="2025-09-04T15:42:51.377781202Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 4 15:42:51.379118 containerd[1599]: time="2025-09-04T15:42:51.378931616Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 4 15:42:51.380782 containerd[1599]: time="2025-09-04T15:42:51.380746628Z" level=info msg="CreateContainer within sandbox \"e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 4 15:42:51.390475 containerd[1599]: time="2025-09-04T15:42:51.390422372Z" level=info msg="Container 3645fcca175812905a478de322445e18418dc950cce11f342da4511829fc50a0: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:42:51.399698 containerd[1599]: time="2025-09-04T15:42:51.399643804Z" level=info msg="CreateContainer within sandbox \"e1fde3f232df139dee4c34e1586b3f3446713a9195da1a8d15e686beaac044ad\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"3645fcca175812905a478de322445e18418dc950cce11f342da4511829fc50a0\"" Sep 4 15:42:51.400266 containerd[1599]: time="2025-09-04T15:42:51.400202638Z" level=info msg="StartContainer for 
\"3645fcca175812905a478de322445e18418dc950cce11f342da4511829fc50a0\"" Sep 4 15:42:51.401557 containerd[1599]: time="2025-09-04T15:42:51.401528419Z" level=info msg="connecting to shim 3645fcca175812905a478de322445e18418dc950cce11f342da4511829fc50a0" address="unix:///run/containerd/s/6e952f0475bb301ce72ce02449f759338a69b78cc8e88cf2c3c7c2494868db6f" protocol=ttrpc version=3 Sep 4 15:42:51.473372 systemd[1]: Started cri-containerd-3645fcca175812905a478de322445e18418dc950cce11f342da4511829fc50a0.scope - libcontainer container 3645fcca175812905a478de322445e18418dc950cce11f342da4511829fc50a0. Sep 4 15:42:51.525728 containerd[1599]: time="2025-09-04T15:42:51.525669110Z" level=info msg="StartContainer for \"3645fcca175812905a478de322445e18418dc950cce11f342da4511829fc50a0\" returns successfully" Sep 4 15:42:52.479605 kubelet[2721]: I0904 15:42:52.479519 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-7qmzd" podStartSLOduration=46.645150447 podStartE2EDuration="1m0.479455379s" podCreationTimestamp="2025-09-04 15:41:52 +0000 UTC" firstStartedPulling="2025-09-04 15:42:37.544357836 +0000 UTC m=+60.454830225" lastFinishedPulling="2025-09-04 15:42:51.378662768 +0000 UTC m=+74.289135157" observedRunningTime="2025-09-04 15:42:52.474973884 +0000 UTC m=+75.385446283" watchObservedRunningTime="2025-09-04 15:42:52.479455379 +0000 UTC m=+75.389927778" Sep 4 15:42:52.554047 containerd[1599]: time="2025-09-04T15:42:52.553983252Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3645fcca175812905a478de322445e18418dc950cce11f342da4511829fc50a0\" id:\"410d844cc9006c2f92cae42a5cbcac33dc82a6883b6e38f7840daad8266e1a13\" pid:5479 exit_status:1 exited_at:{seconds:1757000572 nanos:553546634}" Sep 4 15:42:53.082249 systemd[1]: Started sshd@16-10.0.0.9:22-10.0.0.1:34382.service - OpenSSH per-connection server daemon (10.0.0.1:34382). 
Sep 4 15:42:53.151537 sshd[5496]: Accepted publickey for core from 10.0.0.1 port 34382 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:42:53.153284 sshd-session[5496]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:42:53.157953 systemd-logind[1569]: New session 17 of user core. Sep 4 15:42:53.168369 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 4 15:42:53.300160 sshd[5500]: Connection closed by 10.0.0.1 port 34382 Sep 4 15:42:53.300589 sshd-session[5496]: pam_unix(sshd:session): session closed for user core Sep 4 15:42:53.305763 systemd[1]: sshd@16-10.0.0.9:22-10.0.0.1:34382.service: Deactivated successfully. Sep 4 15:42:53.308059 systemd[1]: session-17.scope: Deactivated successfully. Sep 4 15:42:53.309236 systemd-logind[1569]: Session 17 logged out. Waiting for processes to exit. Sep 4 15:42:53.310807 systemd-logind[1569]: Removed session 17. Sep 4 15:42:53.533433 containerd[1599]: time="2025-09-04T15:42:53.533388516Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3645fcca175812905a478de322445e18418dc950cce11f342da4511829fc50a0\" id:\"aa8d2906edbed2532b955f67c01ff08a5967497ffe69f0d92284e3dae0cc7ee0\" pid:5525 exit_status:1 exited_at:{seconds:1757000573 nanos:533071718}" Sep 4 15:42:56.783420 containerd[1599]: time="2025-09-04T15:42:56.783338746Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:56.784077 containerd[1599]: time="2025-09-04T15:42:56.784034250Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 4 15:42:56.785252 containerd[1599]: time="2025-09-04T15:42:56.785206578Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 
15:42:56.787223 containerd[1599]: time="2025-09-04T15:42:56.787188238Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:42:56.787753 containerd[1599]: time="2025-09-04T15:42:56.787711833Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 5.408753845s" Sep 4 15:42:56.787799 containerd[1599]: time="2025-09-04T15:42:56.787752952Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 4 15:42:56.791826 containerd[1599]: time="2025-09-04T15:42:56.791789963Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 4 15:42:56.808229 containerd[1599]: time="2025-09-04T15:42:56.808021979Z" level=info msg="CreateContainer within sandbox \"3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 4 15:42:56.820073 containerd[1599]: time="2025-09-04T15:42:56.820017813Z" level=info msg="Container f5555092ea73dac236e3b85c763cd757a4f5d8c7b3fe803e19cb920a9d690304: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:42:56.897866 containerd[1599]: time="2025-09-04T15:42:56.897803677Z" level=info msg="CreateContainer within sandbox \"3ed03d37cd10d772e65fb492f4759b70b8ebad23199684733a7d7e782df298e3\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id 
\"f5555092ea73dac236e3b85c763cd757a4f5d8c7b3fe803e19cb920a9d690304\"" Sep 4 15:42:56.898603 containerd[1599]: time="2025-09-04T15:42:56.898541070Z" level=info msg="StartContainer for \"f5555092ea73dac236e3b85c763cd757a4f5d8c7b3fe803e19cb920a9d690304\"" Sep 4 15:42:56.900343 containerd[1599]: time="2025-09-04T15:42:56.900299763Z" level=info msg="connecting to shim f5555092ea73dac236e3b85c763cd757a4f5d8c7b3fe803e19cb920a9d690304" address="unix:///run/containerd/s/dc6477e4a8f80f3d30a24e3eba3af929b9959ad22cf602a5b351445d4def6c79" protocol=ttrpc version=3 Sep 4 15:42:56.931390 systemd[1]: Started cri-containerd-f5555092ea73dac236e3b85c763cd757a4f5d8c7b3fe803e19cb920a9d690304.scope - libcontainer container f5555092ea73dac236e3b85c763cd757a4f5d8c7b3fe803e19cb920a9d690304. Sep 4 15:42:56.984116 containerd[1599]: time="2025-09-04T15:42:56.984053540Z" level=info msg="StartContainer for \"f5555092ea73dac236e3b85c763cd757a4f5d8c7b3fe803e19cb920a9d690304\" returns successfully" Sep 4 15:42:57.481522 kubelet[2721]: I0904 15:42:57.481191 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-cd6bff945-8bhdp" podStartSLOduration=46.090735181 podStartE2EDuration="1m4.481172328s" podCreationTimestamp="2025-09-04 15:41:53 +0000 UTC" firstStartedPulling="2025-09-04 15:42:38.401192899 +0000 UTC m=+61.311665298" lastFinishedPulling="2025-09-04 15:42:56.791630016 +0000 UTC m=+79.702102445" observedRunningTime="2025-09-04 15:42:57.48116794 +0000 UTC m=+80.391640339" watchObservedRunningTime="2025-09-04 15:42:57.481172328 +0000 UTC m=+80.391644717" Sep 4 15:42:57.513558 containerd[1599]: time="2025-09-04T15:42:57.513507148Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f5555092ea73dac236e3b85c763cd757a4f5d8c7b3fe803e19cb920a9d690304\" id:\"0a31a45e5e762befceda7f6bdf50922ca52fad7acfedef58572fee57f6f2d952\" pid:5600 exited_at:{seconds:1757000577 nanos:513132339}" Sep 4 15:42:58.317202 systemd[1]: Started 
sshd@17-10.0.0.9:22-10.0.0.1:34390.service - OpenSSH per-connection server daemon (10.0.0.1:34390). Sep 4 15:42:58.395846 sshd[5611]: Accepted publickey for core from 10.0.0.1 port 34390 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:42:58.398088 sshd-session[5611]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:42:58.403768 systemd-logind[1569]: New session 18 of user core. Sep 4 15:42:58.413380 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 4 15:42:58.523822 containerd[1599]: time="2025-09-04T15:42:58.523758699Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee0644b2c8fcfbc24193a3021d168df3de24e21dbb9f5d14ae2b4c2ff5ebe858\" id:\"0b92d96ddcbe5b1fb6ce13c040bd8604aefd26180f139c832ff1f2748c1d0e20\" pid:5627 exited_at:{seconds:1757000578 nanos:523416814}" Sep 4 15:42:58.571279 sshd[5633]: Connection closed by 10.0.0.1 port 34390 Sep 4 15:42:58.571753 sshd-session[5611]: pam_unix(sshd:session): session closed for user core Sep 4 15:42:58.575900 systemd[1]: sshd@17-10.0.0.9:22-10.0.0.1:34390.service: Deactivated successfully. Sep 4 15:42:58.578842 systemd[1]: session-18.scope: Deactivated successfully. Sep 4 15:42:58.581200 systemd-logind[1569]: Session 18 logged out. Waiting for processes to exit. Sep 4 15:42:58.583087 systemd-logind[1569]: Removed session 18. Sep 4 15:43:03.254747 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1618436876.mount: Deactivated successfully. 
Sep 4 15:43:03.274053 containerd[1599]: time="2025-09-04T15:43:03.273999861Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:43:03.274905 containerd[1599]: time="2025-09-04T15:43:03.274873039Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 4 15:43:03.276037 containerd[1599]: time="2025-09-04T15:43:03.275995673Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:43:03.278386 containerd[1599]: time="2025-09-04T15:43:03.278336324Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 15:43:03.278898 containerd[1599]: time="2025-09-04T15:43:03.278852411Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 6.487033823s" Sep 4 15:43:03.278898 containerd[1599]: time="2025-09-04T15:43:03.278885644Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 4 15:43:03.280155 containerd[1599]: time="2025-09-04T15:43:03.280114962Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 4 15:43:03.282381 containerd[1599]: time="2025-09-04T15:43:03.282341987Z" level=info msg="CreateContainer within sandbox 
\"f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 4 15:43:03.392464 containerd[1599]: time="2025-09-04T15:43:03.392396062Z" level=info msg="Container 71bd38a7c672a4e429fe2da53bbd5dc15cdafb6209d5c7a42f01348dc783e9c9: CDI devices from CRI Config.CDIDevices: []" Sep 4 15:43:03.401826 containerd[1599]: time="2025-09-04T15:43:03.401779397Z" level=info msg="CreateContainer within sandbox \"f8e704dcad8a9ed11dc44bc76bc1e1bac813bb2ead356681f3ceb3dd4549af83\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"71bd38a7c672a4e429fe2da53bbd5dc15cdafb6209d5c7a42f01348dc783e9c9\"" Sep 4 15:43:03.402677 containerd[1599]: time="2025-09-04T15:43:03.402640533Z" level=info msg="StartContainer for \"71bd38a7c672a4e429fe2da53bbd5dc15cdafb6209d5c7a42f01348dc783e9c9\"" Sep 4 15:43:03.403666 containerd[1599]: time="2025-09-04T15:43:03.403637367Z" level=info msg="connecting to shim 71bd38a7c672a4e429fe2da53bbd5dc15cdafb6209d5c7a42f01348dc783e9c9" address="unix:///run/containerd/s/4063f9ce26a77cfa5a6825c641d8cf81c39de5b8fe807422de0bd19271da1c16" protocol=ttrpc version=3 Sep 4 15:43:03.429363 systemd[1]: Started cri-containerd-71bd38a7c672a4e429fe2da53bbd5dc15cdafb6209d5c7a42f01348dc783e9c9.scope - libcontainer container 71bd38a7c672a4e429fe2da53bbd5dc15cdafb6209d5c7a42f01348dc783e9c9. Sep 4 15:43:03.477966 containerd[1599]: time="2025-09-04T15:43:03.477916808Z" level=info msg="StartContainer for \"71bd38a7c672a4e429fe2da53bbd5dc15cdafb6209d5c7a42f01348dc783e9c9\" returns successfully" Sep 4 15:43:03.595133 systemd[1]: Started sshd@18-10.0.0.9:22-10.0.0.1:36618.service - OpenSSH per-connection server daemon (10.0.0.1:36618). 
Sep 4 15:43:03.697022 sshd[5695]: Accepted publickey for core from 10.0.0.1 port 36618 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:43:03.699063 sshd-session[5695]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:43:03.703641 systemd-logind[1569]: New session 19 of user core. Sep 4 15:43:03.713349 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 4 15:43:03.905522 sshd[5698]: Connection closed by 10.0.0.1 port 36618 Sep 4 15:43:03.905866 sshd-session[5695]: pam_unix(sshd:session): session closed for user core Sep 4 15:43:03.915458 systemd[1]: sshd@18-10.0.0.9:22-10.0.0.1:36618.service: Deactivated successfully. Sep 4 15:43:03.917710 systemd[1]: session-19.scope: Deactivated successfully. Sep 4 15:43:03.918749 systemd-logind[1569]: Session 19 logged out. Waiting for processes to exit. Sep 4 15:43:03.921958 systemd[1]: Started sshd@19-10.0.0.9:22-10.0.0.1:36628.service - OpenSSH per-connection server daemon (10.0.0.1:36628). Sep 4 15:43:03.923510 systemd-logind[1569]: Removed session 19. Sep 4 15:43:03.980720 sshd[5713]: Accepted publickey for core from 10.0.0.1 port 36628 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8 Sep 4 15:43:03.982462 sshd-session[5713]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 15:43:03.987029 systemd-logind[1569]: New session 20 of user core. Sep 4 15:43:03.994357 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 4 15:43:04.470176 sshd[5716]: Connection closed by 10.0.0.1 port 36628 Sep 4 15:43:04.470607 sshd-session[5713]: pam_unix(sshd:session): session closed for user core Sep 4 15:43:04.483153 systemd[1]: sshd@19-10.0.0.9:22-10.0.0.1:36628.service: Deactivated successfully. Sep 4 15:43:04.486013 systemd[1]: session-20.scope: Deactivated successfully. Sep 4 15:43:04.486901 systemd-logind[1569]: Session 20 logged out. Waiting for processes to exit. 
Sep 4 15:43:04.490286 systemd[1]: Started sshd@20-10.0.0.9:22-10.0.0.1:36630.service - OpenSSH per-connection server daemon (10.0.0.1:36630).
Sep 4 15:43:04.493827 systemd-logind[1569]: Removed session 20.
Sep 4 15:43:04.499910 kubelet[2721]: I0904 15:43:04.499848 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-655f54c8b4-9dm8z" podStartSLOduration=2.19840775 podStartE2EDuration="36.499831238s" podCreationTimestamp="2025-09-04 15:42:28 +0000 UTC" firstStartedPulling="2025-09-04 15:42:28.9785267 +0000 UTC m=+51.888999099" lastFinishedPulling="2025-09-04 15:43:03.279950198 +0000 UTC m=+86.190422587" observedRunningTime="2025-09-04 15:43:04.498341263 +0000 UTC m=+87.408813662" watchObservedRunningTime="2025-09-04 15:43:04.499831238 +0000 UTC m=+87.410303647"
Sep 4 15:43:04.554159 sshd[5728]: Accepted publickey for core from 10.0.0.1 port 36630 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8
Sep 4 15:43:04.555552 sshd-session[5728]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:43:04.560077 systemd-logind[1569]: New session 21 of user core.
Sep 4 15:43:04.570360 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 4 15:43:05.179533 kubelet[2721]: E0904 15:43:05.179413 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:43:05.280636 sshd[5733]: Connection closed by 10.0.0.1 port 36630
Sep 4 15:43:05.281618 sshd-session[5728]: pam_unix(sshd:session): session closed for user core
Sep 4 15:43:05.290353 systemd[1]: sshd@20-10.0.0.9:22-10.0.0.1:36630.service: Deactivated successfully.
Sep 4 15:43:05.293047 systemd[1]: session-21.scope: Deactivated successfully.
Sep 4 15:43:05.293920 systemd-logind[1569]: Session 21 logged out. Waiting for processes to exit.
Sep 4 15:43:05.296728 systemd-logind[1569]: Removed session 21.
Sep 4 15:43:05.298044 systemd[1]: Started sshd@21-10.0.0.9:22-10.0.0.1:36638.service - OpenSSH per-connection server daemon (10.0.0.1:36638).
Sep 4 15:43:05.366248 sshd[5752]: Accepted publickey for core from 10.0.0.1 port 36638 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8
Sep 4 15:43:05.367737 sshd-session[5752]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:43:05.372787 systemd-logind[1569]: New session 22 of user core.
Sep 4 15:43:05.383363 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 4 15:43:05.883813 sshd[5755]: Connection closed by 10.0.0.1 port 36638
Sep 4 15:43:05.884552 sshd-session[5752]: pam_unix(sshd:session): session closed for user core
Sep 4 15:43:05.895956 systemd[1]: sshd@21-10.0.0.9:22-10.0.0.1:36638.service: Deactivated successfully.
Sep 4 15:43:05.898388 systemd[1]: session-22.scope: Deactivated successfully.
Sep 4 15:43:05.899178 systemd-logind[1569]: Session 22 logged out. Waiting for processes to exit.
Sep 4 15:43:05.902955 systemd[1]: Started sshd@22-10.0.0.9:22-10.0.0.1:36644.service - OpenSSH per-connection server daemon (10.0.0.1:36644).
Sep 4 15:43:05.904255 systemd-logind[1569]: Removed session 22.
Sep 4 15:43:05.954994 sshd[5766]: Accepted publickey for core from 10.0.0.1 port 36644 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8
Sep 4 15:43:05.956753 sshd-session[5766]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:43:05.961739 systemd-logind[1569]: New session 23 of user core.
Sep 4 15:43:05.972363 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 4 15:43:06.083651 sshd[5769]: Connection closed by 10.0.0.1 port 36644
Sep 4 15:43:06.084080 sshd-session[5766]: pam_unix(sshd:session): session closed for user core
Sep 4 15:43:06.087587 systemd[1]: sshd@22-10.0.0.9:22-10.0.0.1:36644.service: Deactivated successfully.
Sep 4 15:43:06.089715 systemd[1]: session-23.scope: Deactivated successfully.
Sep 4 15:43:06.091789 systemd-logind[1569]: Session 23 logged out. Waiting for processes to exit.
Sep 4 15:43:06.093108 systemd-logind[1569]: Removed session 23.
Sep 4 15:43:06.768723 kubelet[2721]: I0904 15:43:06.768662 2721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 4 15:43:07.281147 kubelet[2721]: I0904 15:43:07.281087 2721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 4 15:43:08.098165 containerd[1599]: time="2025-09-04T15:43:08.098083881Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:43:08.098839 containerd[1599]: time="2025-09-04T15:43:08.098808392Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527"
Sep 4 15:43:08.102800 containerd[1599]: time="2025-09-04T15:43:08.102770985Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 4.82261779s"
Sep 4 15:43:08.102850 containerd[1599]: time="2025-09-04T15:43:08.102804439Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\""
Sep 4 15:43:08.103704 containerd[1599]: time="2025-09-04T15:43:08.103674297Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:43:08.104443 containerd[1599]: time="2025-09-04T15:43:08.104411181Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:43:08.105153 containerd[1599]: time="2025-09-04T15:43:08.105122147Z" level=info msg="CreateContainer within sandbox \"5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Sep 4 15:43:08.150554 containerd[1599]: time="2025-09-04T15:43:08.150496084Z" level=info msg="Container 74218da16dad91900b37dd439f3a44c5891580f7990bca0cbf7e4546615ee33b: CDI devices from CRI Config.CDIDevices: []"
Sep 4 15:43:08.204559 containerd[1599]: time="2025-09-04T15:43:08.204506774Z" level=info msg="CreateContainer within sandbox \"5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"74218da16dad91900b37dd439f3a44c5891580f7990bca0cbf7e4546615ee33b\""
Sep 4 15:43:08.205446 containerd[1599]: time="2025-09-04T15:43:08.205043436Z" level=info msg="StartContainer for \"74218da16dad91900b37dd439f3a44c5891580f7990bca0cbf7e4546615ee33b\""
Sep 4 15:43:08.206796 containerd[1599]: time="2025-09-04T15:43:08.206769166Z" level=info msg="connecting to shim 74218da16dad91900b37dd439f3a44c5891580f7990bca0cbf7e4546615ee33b" address="unix:///run/containerd/s/30108420f119f1304121c176512ab363c79313e43da16cb1a30422ada3a8dedc" protocol=ttrpc version=3
Sep 4 15:43:08.234345 systemd[1]: Started cri-containerd-74218da16dad91900b37dd439f3a44c5891580f7990bca0cbf7e4546615ee33b.scope - libcontainer container 74218da16dad91900b37dd439f3a44c5891580f7990bca0cbf7e4546615ee33b.
Sep 4 15:43:08.395489 containerd[1599]: time="2025-09-04T15:43:08.395352343Z" level=info msg="StartContainer for \"74218da16dad91900b37dd439f3a44c5891580f7990bca0cbf7e4546615ee33b\" returns successfully"
Sep 4 15:43:08.397110 containerd[1599]: time="2025-09-04T15:43:08.397071340Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 4 15:43:11.101592 systemd[1]: Started sshd@23-10.0.0.9:22-10.0.0.1:40802.service - OpenSSH per-connection server daemon (10.0.0.1:40802).
Sep 4 15:43:11.179663 kubelet[2721]: E0904 15:43:11.179545 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:43:11.189227 sshd[5829]: Accepted publickey for core from 10.0.0.1 port 40802 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8
Sep 4 15:43:11.191384 sshd-session[5829]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:43:11.196592 systemd-logind[1569]: New session 24 of user core.
Sep 4 15:43:11.208510 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 4 15:43:11.453921 sshd[5832]: Connection closed by 10.0.0.1 port 40802
Sep 4 15:43:11.454260 sshd-session[5829]: pam_unix(sshd:session): session closed for user core
Sep 4 15:43:11.459589 systemd[1]: sshd@23-10.0.0.9:22-10.0.0.1:40802.service: Deactivated successfully.
Sep 4 15:43:11.462045 systemd[1]: session-24.scope: Deactivated successfully.
Sep 4 15:43:11.462997 systemd-logind[1569]: Session 24 logged out. Waiting for processes to exit.
Sep 4 15:43:11.464281 systemd-logind[1569]: Removed session 24.
Sep 4 15:43:13.745095 containerd[1599]: time="2025-09-04T15:43:13.745029930Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:43:13.745761 containerd[1599]: time="2025-09-04T15:43:13.745715604Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542"
Sep 4 15:43:13.746869 containerd[1599]: time="2025-09-04T15:43:13.746830926Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:43:13.749418 containerd[1599]: time="2025-09-04T15:43:13.749388001Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 15:43:13.749813 containerd[1599]: time="2025-09-04T15:43:13.749776490Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 5.352669582s"
Sep 4 15:43:13.749855 containerd[1599]: time="2025-09-04T15:43:13.749818570Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\""
Sep 4 15:43:13.753164 containerd[1599]: time="2025-09-04T15:43:13.753102177Z" level=info msg="CreateContainer within sandbox \"5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 4 15:43:13.761044 containerd[1599]: time="2025-09-04T15:43:13.761005652Z" level=info msg="Container 14e7548005ace4798b406dedf699e1ccbc655d6f852d1172247584bb8802b135: CDI devices from CRI Config.CDIDevices: []"
Sep 4 15:43:13.772681 containerd[1599]: time="2025-09-04T15:43:13.772627843Z" level=info msg="CreateContainer within sandbox \"5115f758f47887875c05a62ddf55f2e53eddbd724060bfe61401b3a2b2949a55\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"14e7548005ace4798b406dedf699e1ccbc655d6f852d1172247584bb8802b135\""
Sep 4 15:43:13.773272 containerd[1599]: time="2025-09-04T15:43:13.773228566Z" level=info msg="StartContainer for \"14e7548005ace4798b406dedf699e1ccbc655d6f852d1172247584bb8802b135\""
Sep 4 15:43:13.774788 containerd[1599]: time="2025-09-04T15:43:13.774757345Z" level=info msg="connecting to shim 14e7548005ace4798b406dedf699e1ccbc655d6f852d1172247584bb8802b135" address="unix:///run/containerd/s/30108420f119f1304121c176512ab363c79313e43da16cb1a30422ada3a8dedc" protocol=ttrpc version=3
Sep 4 15:43:13.804372 systemd[1]: Started cri-containerd-14e7548005ace4798b406dedf699e1ccbc655d6f852d1172247584bb8802b135.scope - libcontainer container 14e7548005ace4798b406dedf699e1ccbc655d6f852d1172247584bb8802b135.
Sep 4 15:43:13.865535 containerd[1599]: time="2025-09-04T15:43:13.865474389Z" level=info msg="StartContainer for \"14e7548005ace4798b406dedf699e1ccbc655d6f852d1172247584bb8802b135\" returns successfully"
Sep 4 15:43:14.282548 kubelet[2721]: I0904 15:43:14.282495 2721 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 4 15:43:14.282548 kubelet[2721]: I0904 15:43:14.282547 2721 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 4 15:43:14.542435 kubelet[2721]: I0904 15:43:14.542225 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-g5qkw" podStartSLOduration=49.282503151 podStartE2EDuration="1m21.542186821s" podCreationTimestamp="2025-09-04 15:41:53 +0000 UTC" firstStartedPulling="2025-09-04 15:42:41.490933489 +0000 UTC m=+64.401405889" lastFinishedPulling="2025-09-04 15:43:13.750617159 +0000 UTC m=+96.661089559" observedRunningTime="2025-09-04 15:43:14.542123181 +0000 UTC m=+97.452595580" watchObservedRunningTime="2025-09-04 15:43:14.542186821 +0000 UTC m=+97.452659220"
Sep 4 15:43:16.179722 kubelet[2721]: E0904 15:43:16.179656 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 4 15:43:16.469923 systemd[1]: Started sshd@24-10.0.0.9:22-10.0.0.1:40804.service - OpenSSH per-connection server daemon (10.0.0.1:40804).
Sep 4 15:43:16.554746 sshd[5884]: Accepted publickey for core from 10.0.0.1 port 40804 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8
Sep 4 15:43:16.556709 sshd-session[5884]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:43:16.561435 systemd-logind[1569]: New session 25 of user core.
Sep 4 15:43:16.576504 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 4 15:43:16.905703 sshd[5887]: Connection closed by 10.0.0.1 port 40804
Sep 4 15:43:16.907654 sshd-session[5884]: pam_unix(sshd:session): session closed for user core
Sep 4 15:43:16.912185 systemd[1]: sshd@24-10.0.0.9:22-10.0.0.1:40804.service: Deactivated successfully.
Sep 4 15:43:16.915306 systemd[1]: session-25.scope: Deactivated successfully.
Sep 4 15:43:16.916189 systemd-logind[1569]: Session 25 logged out. Waiting for processes to exit.
Sep 4 15:43:16.918235 systemd-logind[1569]: Removed session 25.
Sep 4 15:43:20.022330 containerd[1599]: time="2025-09-04T15:43:20.022276213Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3645fcca175812905a478de322445e18418dc950cce11f342da4511829fc50a0\" id:\"0065a2376ac4a7d10597bd2ffc601167deafe66bc9b61ae47bc62853b6ebecb4\" pid:5913 exited_at:{seconds:1757000600 nanos:21905008}"
Sep 4 15:43:21.834417 containerd[1599]: time="2025-09-04T15:43:21.834360771Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f5555092ea73dac236e3b85c763cd757a4f5d8c7b3fe803e19cb920a9d690304\" id:\"315ec76f5622210de087967d87107f2f914cf80f636c1f3ce27f8cde3f247148\" pid:5940 exited_at:{seconds:1757000601 nanos:833757987}"
Sep 4 15:43:21.919449 systemd[1]: Started sshd@25-10.0.0.9:22-10.0.0.1:41008.service - OpenSSH per-connection server daemon (10.0.0.1:41008).
Sep 4 15:43:21.987258 sshd[5952]: Accepted publickey for core from 10.0.0.1 port 41008 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8
Sep 4 15:43:21.989310 sshd-session[5952]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:43:21.994167 systemd-logind[1569]: New session 26 of user core.
Sep 4 15:43:21.999422 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 4 15:43:22.139385 sshd[5955]: Connection closed by 10.0.0.1 port 41008
Sep 4 15:43:22.139673 sshd-session[5952]: pam_unix(sshd:session): session closed for user core
Sep 4 15:43:22.144041 systemd[1]: sshd@25-10.0.0.9:22-10.0.0.1:41008.service: Deactivated successfully.
Sep 4 15:43:22.146439 systemd[1]: session-26.scope: Deactivated successfully.
Sep 4 15:43:22.147189 systemd-logind[1569]: Session 26 logged out. Waiting for processes to exit.
Sep 4 15:43:22.148551 systemd-logind[1569]: Removed session 26.
Sep 4 15:43:23.555493 containerd[1599]: time="2025-09-04T15:43:23.555439584Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3645fcca175812905a478de322445e18418dc950cce11f342da4511829fc50a0\" id:\"d2889ef3382b2dadfcc65b92bce9dae584c7a92358d99f0031be71e3b12ab1db\" pid:5980 exited_at:{seconds:1757000603 nanos:555037020}"
Sep 4 15:43:27.157530 systemd[1]: Started sshd@26-10.0.0.9:22-10.0.0.1:41022.service - OpenSSH per-connection server daemon (10.0.0.1:41022).
Sep 4 15:43:27.220235 sshd[5995]: Accepted publickey for core from 10.0.0.1 port 41022 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8
Sep 4 15:43:27.222161 sshd-session[5995]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:43:27.226758 systemd-logind[1569]: New session 27 of user core.
Sep 4 15:43:27.237436 systemd[1]: Started session-27.scope - Session 27 of User core.
Sep 4 15:43:27.370756 sshd[5998]: Connection closed by 10.0.0.1 port 41022
Sep 4 15:43:27.371087 sshd-session[5995]: pam_unix(sshd:session): session closed for user core
Sep 4 15:43:27.377878 systemd[1]: sshd@26-10.0.0.9:22-10.0.0.1:41022.service: Deactivated successfully.
Sep 4 15:43:27.380255 systemd[1]: session-27.scope: Deactivated successfully.
Sep 4 15:43:27.381300 systemd-logind[1569]: Session 27 logged out. Waiting for processes to exit.
Sep 4 15:43:27.382708 systemd-logind[1569]: Removed session 27.
Sep 4 15:43:27.513326 containerd[1599]: time="2025-09-04T15:43:27.513169796Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f5555092ea73dac236e3b85c763cd757a4f5d8c7b3fe803e19cb920a9d690304\" id:\"a986cb8c5b5710d35233b72a1dca39f2640f72db9bb0e10d5dc91239bb419851\" pid:6023 exited_at:{seconds:1757000607 nanos:512900606}"
Sep 4 15:43:28.461692 containerd[1599]: time="2025-09-04T15:43:28.461636493Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee0644b2c8fcfbc24193a3021d168df3de24e21dbb9f5d14ae2b4c2ff5ebe858\" id:\"a38659ebf5f5da753afd8b8c6f1426a547d2c834868363093bb44cd48f43fb6d\" pid:6046 exited_at:{seconds:1757000608 nanos:460099621}"
Sep 4 15:43:32.385235 systemd[1]: Started sshd@27-10.0.0.9:22-10.0.0.1:53420.service - OpenSSH per-connection server daemon (10.0.0.1:53420).
Sep 4 15:43:32.457320 sshd[6059]: Accepted publickey for core from 10.0.0.1 port 53420 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8
Sep 4 15:43:32.459129 sshd-session[6059]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:43:32.463706 systemd-logind[1569]: New session 28 of user core.
Sep 4 15:43:32.470346 systemd[1]: Started session-28.scope - Session 28 of User core.
Sep 4 15:43:32.618252 sshd[6063]: Connection closed by 10.0.0.1 port 53420
Sep 4 15:43:32.618605 sshd-session[6059]: pam_unix(sshd:session): session closed for user core
Sep 4 15:43:32.623702 systemd[1]: sshd@27-10.0.0.9:22-10.0.0.1:53420.service: Deactivated successfully.
Sep 4 15:43:32.626350 systemd[1]: session-28.scope: Deactivated successfully.
Sep 4 15:43:32.627307 systemd-logind[1569]: Session 28 logged out. Waiting for processes to exit.
Sep 4 15:43:32.629325 systemd-logind[1569]: Removed session 28.
Sep 4 15:43:37.632412 systemd[1]: Started sshd@28-10.0.0.9:22-10.0.0.1:53426.service - OpenSSH per-connection server daemon (10.0.0.1:53426).
Sep 4 15:43:37.682895 sshd[6080]: Accepted publickey for core from 10.0.0.1 port 53426 ssh2: RSA SHA256:SszFEPmrTqRUaDyvoUBO6h80kx9iM4Mig8Tr827cHt8
Sep 4 15:43:37.684402 sshd-session[6080]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 15:43:37.689352 systemd-logind[1569]: New session 29 of user core.
Sep 4 15:43:37.700356 systemd[1]: Started session-29.scope - Session 29 of User core.
Sep 4 15:43:37.860762 sshd[6083]: Connection closed by 10.0.0.1 port 53426
Sep 4 15:43:37.861747 sshd-session[6080]: pam_unix(sshd:session): session closed for user core
Sep 4 15:43:37.867221 systemd-logind[1569]: Session 29 logged out. Waiting for processes to exit.
Sep 4 15:43:37.867802 systemd[1]: sshd@28-10.0.0.9:22-10.0.0.1:53426.service: Deactivated successfully.
Sep 4 15:43:37.872889 systemd[1]: session-29.scope: Deactivated successfully.
Sep 4 15:43:37.878939 systemd-logind[1569]: Removed session 29.