Sep 10 05:22:12.811946 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Wed Sep 10 03:32:41 -00 2025
Sep 10 05:22:12.811975 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=cb34a525c000ff57e16870cd9f0af09c033a700c5f8ee35d58f46d8926fcf6e5
Sep 10 05:22:12.811985 kernel: BIOS-provided physical RAM map:
Sep 10 05:22:12.811992 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 10 05:22:12.811998 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 10 05:22:12.812004 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 10 05:22:12.812012 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000009cfdbfff] usable
Sep 10 05:22:12.812019 kernel: BIOS-e820: [mem 0x000000009cfdc000-0x000000009cffffff] reserved
Sep 10 05:22:12.812039 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Sep 10 05:22:12.812045 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Sep 10 05:22:12.812052 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 10 05:22:12.812059 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 10 05:22:12.812065 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Sep 10 05:22:12.812072 kernel: NX (Execute Disable) protection: active
Sep 10 05:22:12.812083 kernel: APIC: Static calls initialized
Sep 10 05:22:12.812090 kernel: SMBIOS 2.8 present.
Sep 10 05:22:12.812101 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 1.16.2-debian-1.16.2-1 04/01/2014
Sep 10 05:22:12.812108 kernel: DMI: Memory slots populated: 1/1
Sep 10 05:22:12.812115 kernel: Hypervisor detected: KVM
Sep 10 05:22:12.812122 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 10 05:22:12.812180 kernel: kvm-clock: using sched offset of 4797022255 cycles
Sep 10 05:22:12.812188 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 10 05:22:12.812195 kernel: tsc: Detected 2794.750 MHz processor
Sep 10 05:22:12.812203 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 10 05:22:12.812237 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 10 05:22:12.812246 kernel: last_pfn = 0x9cfdc max_arch_pfn = 0x400000000
Sep 10 05:22:12.812253 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 10 05:22:12.812261 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 10 05:22:12.812268 kernel: Using GB pages for direct mapping
Sep 10 05:22:12.812276 kernel: ACPI: Early table checksum verification disabled
Sep 10 05:22:12.812283 kernel: ACPI: RSDP 0x00000000000F59D0 000014 (v00 BOCHS )
Sep 10 05:22:12.812291 kernel: ACPI: RSDT 0x000000009CFE241A 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 05:22:12.812307 kernel: ACPI: FACP 0x000000009CFE21FA 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 05:22:12.812315 kernel: ACPI: DSDT 0x000000009CFE0040 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 05:22:12.812322 kernel: ACPI: FACS 0x000000009CFE0000 000040
Sep 10 05:22:12.812330 kernel: ACPI: APIC 0x000000009CFE22EE 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 05:22:12.812337 kernel: ACPI: HPET 0x000000009CFE237E 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 05:22:12.812344 kernel: ACPI: MCFG 0x000000009CFE23B6 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 05:22:12.812352 kernel: ACPI: WAET 0x000000009CFE23F2 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 05:22:12.812359 kernel: ACPI: Reserving FACP table memory at [mem 0x9cfe21fa-0x9cfe22ed]
Sep 10 05:22:12.812373 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cfe0040-0x9cfe21f9]
Sep 10 05:22:12.812380 kernel: ACPI: Reserving FACS table memory at [mem 0x9cfe0000-0x9cfe003f]
Sep 10 05:22:12.812388 kernel: ACPI: Reserving APIC table memory at [mem 0x9cfe22ee-0x9cfe237d]
Sep 10 05:22:12.812395 kernel: ACPI: Reserving HPET table memory at [mem 0x9cfe237e-0x9cfe23b5]
Sep 10 05:22:12.812403 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cfe23b6-0x9cfe23f1]
Sep 10 05:22:12.812410 kernel: ACPI: Reserving WAET table memory at [mem 0x9cfe23f2-0x9cfe2419]
Sep 10 05:22:12.812420 kernel: No NUMA configuration found
Sep 10 05:22:12.812428 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cfdbfff]
Sep 10 05:22:12.812435 kernel: NODE_DATA(0) allocated [mem 0x9cfd4dc0-0x9cfdbfff]
Sep 10 05:22:12.812443 kernel: Zone ranges:
Sep 10 05:22:12.812450 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 10 05:22:12.812458 kernel: DMA32 [mem 0x0000000001000000-0x000000009cfdbfff]
Sep 10 05:22:12.812465 kernel: Normal empty
Sep 10 05:22:12.812473 kernel: Device empty
Sep 10 05:22:12.812480 kernel: Movable zone start for each node
Sep 10 05:22:12.812488 kernel: Early memory node ranges
Sep 10 05:22:12.812498 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Sep 10 05:22:12.812505 kernel: node 0: [mem 0x0000000000100000-0x000000009cfdbfff]
Sep 10 05:22:12.812513 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cfdbfff]
Sep 10 05:22:12.812520 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 10 05:22:12.812539 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 10 05:22:12.812550 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Sep 10 05:22:12.812558 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 10 05:22:12.812570 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 10 05:22:12.812579 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 10 05:22:12.812589 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 10 05:22:12.812597 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 10 05:22:12.812607 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 10 05:22:12.812615 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 10 05:22:12.812622 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 10 05:22:12.812630 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 10 05:22:12.812637 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 10 05:22:12.812645 kernel: TSC deadline timer available
Sep 10 05:22:12.812653 kernel: CPU topo: Max. logical packages: 1
Sep 10 05:22:12.812663 kernel: CPU topo: Max. logical dies: 1
Sep 10 05:22:12.812670 kernel: CPU topo: Max. dies per package: 1
Sep 10 05:22:12.812678 kernel: CPU topo: Max. threads per core: 1
Sep 10 05:22:12.812685 kernel: CPU topo: Num. cores per package: 4
Sep 10 05:22:12.812693 kernel: CPU topo: Num. threads per package: 4
Sep 10 05:22:12.812700 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
Sep 10 05:22:12.812708 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 10 05:22:12.812715 kernel: kvm-guest: KVM setup pv remote TLB flush
Sep 10 05:22:12.812723 kernel: kvm-guest: setup PV sched yield
Sep 10 05:22:12.812730 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Sep 10 05:22:12.812741 kernel: Booting paravirtualized kernel on KVM
Sep 10 05:22:12.812749 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 10 05:22:12.812757 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Sep 10 05:22:12.812764 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288
Sep 10 05:22:12.812772 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152
Sep 10 05:22:12.812779 kernel: pcpu-alloc: [0] 0 1 2 3
Sep 10 05:22:12.812786 kernel: kvm-guest: PV spinlocks enabled
Sep 10 05:22:12.812794 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Sep 10 05:22:12.812803 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=cb34a525c000ff57e16870cd9f0af09c033a700c5f8ee35d58f46d8926fcf6e5
Sep 10 05:22:12.812814 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 10 05:22:12.812868 kernel: random: crng init done
Sep 10 05:22:12.812876 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 10 05:22:12.812884 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 10 05:22:12.812891 kernel: Fallback order for Node 0: 0
Sep 10 05:22:12.812899 kernel: Built 1 zonelists, mobility grouping on. Total pages: 642938
Sep 10 05:22:12.812906 kernel: Policy zone: DMA32
Sep 10 05:22:12.812914 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 10 05:22:12.812924 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Sep 10 05:22:12.812932 kernel: ftrace: allocating 40102 entries in 157 pages
Sep 10 05:22:12.812939 kernel: ftrace: allocated 157 pages with 5 groups
Sep 10 05:22:12.812947 kernel: Dynamic Preempt: voluntary
Sep 10 05:22:12.812954 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 10 05:22:12.812963 kernel: rcu: RCU event tracing is enabled.
Sep 10 05:22:12.812971 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Sep 10 05:22:12.812979 kernel: Trampoline variant of Tasks RCU enabled.
Sep 10 05:22:12.812990 kernel: Rude variant of Tasks RCU enabled.
Sep 10 05:22:12.813000 kernel: Tracing variant of Tasks RCU enabled.
Sep 10 05:22:12.813008 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 10 05:22:12.813016 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Sep 10 05:22:12.813024 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 10 05:22:12.813031 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 10 05:22:12.813039 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 10 05:22:12.813047 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
Sep 10 05:22:12.813054 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 10 05:22:12.813072 kernel: Console: colour VGA+ 80x25
Sep 10 05:22:12.813080 kernel: printk: legacy console [ttyS0] enabled
Sep 10 05:22:12.813087 kernel: ACPI: Core revision 20240827
Sep 10 05:22:12.813095 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Sep 10 05:22:12.813106 kernel: APIC: Switch to symmetric I/O mode setup
Sep 10 05:22:12.813113 kernel: x2apic enabled
Sep 10 05:22:12.813121 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 10 05:22:12.813146 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Sep 10 05:22:12.813155 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Sep 10 05:22:12.813166 kernel: kvm-guest: setup PV IPIs
Sep 10 05:22:12.813173 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 10 05:22:12.813182 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848e100549, max_idle_ns: 440795215505 ns
Sep 10 05:22:12.813190 kernel: Calibrating delay loop (skipped) preset value.. 5589.50 BogoMIPS (lpj=2794750)
Sep 10 05:22:12.813198 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 10 05:22:12.813206 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 10 05:22:12.813213 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 10 05:22:12.813221 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 10 05:22:12.813232 kernel: Spectre V2 : Mitigation: Retpolines
Sep 10 05:22:12.813240 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 10 05:22:12.813248 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 10 05:22:12.813256 kernel: active return thunk: retbleed_return_thunk
Sep 10 05:22:12.813264 kernel: RETBleed: Mitigation: untrained return thunk
Sep 10 05:22:12.813272 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 10 05:22:12.813317 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 10 05:22:12.813326 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Sep 10 05:22:12.813334 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Sep 10 05:22:12.813345 kernel: active return thunk: srso_return_thunk
Sep 10 05:22:12.813353 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Sep 10 05:22:12.813361 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 10 05:22:12.813369 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 10 05:22:12.813377 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 10 05:22:12.813385 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 10 05:22:12.813393 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 10 05:22:12.813401 kernel: Freeing SMP alternatives memory: 32K
Sep 10 05:22:12.813411 kernel: pid_max: default: 32768 minimum: 301
Sep 10 05:22:12.813419 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 10 05:22:12.813427 kernel: landlock: Up and running.
Sep 10 05:22:12.813435 kernel: SELinux: Initializing.
Sep 10 05:22:12.813446 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 10 05:22:12.813455 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 10 05:22:12.813463 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 10 05:22:12.813471 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 10 05:22:12.813479 kernel: ... version: 0
Sep 10 05:22:12.813489 kernel: ... bit width: 48
Sep 10 05:22:12.813497 kernel: ... generic registers: 6
Sep 10 05:22:12.813505 kernel: ... value mask: 0000ffffffffffff
Sep 10 05:22:12.813513 kernel: ... max period: 00007fffffffffff
Sep 10 05:22:12.813521 kernel: ... fixed-purpose events: 0
Sep 10 05:22:12.813537 kernel: ... event mask: 000000000000003f
Sep 10 05:22:12.813544 kernel: signal: max sigframe size: 1776
Sep 10 05:22:12.813552 kernel: rcu: Hierarchical SRCU implementation.
Sep 10 05:22:12.813560 kernel: rcu: Max phase no-delay instances is 400.
Sep 10 05:22:12.813568 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 10 05:22:12.813579 kernel: smp: Bringing up secondary CPUs ...
Sep 10 05:22:12.813587 kernel: smpboot: x86: Booting SMP configuration:
Sep 10 05:22:12.813595 kernel: .... node #0, CPUs: #1 #2 #3
Sep 10 05:22:12.813602 kernel: smp: Brought up 1 node, 4 CPUs
Sep 10 05:22:12.813610 kernel: smpboot: Total of 4 processors activated (22358.00 BogoMIPS)
Sep 10 05:22:12.813618 kernel: Memory: 2428920K/2571752K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54068K init, 2900K bss, 136904K reserved, 0K cma-reserved)
Sep 10 05:22:12.813626 kernel: devtmpfs: initialized
Sep 10 05:22:12.813634 kernel: x86/mm: Memory block size: 128MB
Sep 10 05:22:12.813642 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 10 05:22:12.813653 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Sep 10 05:22:12.813662 kernel: pinctrl core: initialized pinctrl subsystem
Sep 10 05:22:12.813671 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 10 05:22:12.813679 kernel: audit: initializing netlink subsys (disabled)
Sep 10 05:22:12.813689 kernel: audit: type=2000 audit(1757481730.010:1): state=initialized audit_enabled=0 res=1
Sep 10 05:22:12.813697 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 10 05:22:12.813707 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 10 05:22:12.813715 kernel: cpuidle: using governor menu
Sep 10 05:22:12.813723 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 10 05:22:12.813733 kernel: dca service started, version 1.12.1
Sep 10 05:22:12.813741 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Sep 10 05:22:12.813749 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Sep 10 05:22:12.813757 kernel: PCI: Using configuration type 1 for base access
Sep 10 05:22:12.813765 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 10 05:22:12.813773 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 10 05:22:12.813780 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 10 05:22:12.813788 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 10 05:22:12.813799 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 10 05:22:12.813807 kernel: ACPI: Added _OSI(Module Device)
Sep 10 05:22:12.813815 kernel: ACPI: Added _OSI(Processor Device)
Sep 10 05:22:12.813823 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 10 05:22:12.813830 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 10 05:22:12.813838 kernel: ACPI: Interpreter enabled
Sep 10 05:22:12.813846 kernel: ACPI: PM: (supports S0 S3 S5)
Sep 10 05:22:12.813854 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 10 05:22:12.813862 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 10 05:22:12.813869 kernel: PCI: Using E820 reservations for host bridge windows
Sep 10 05:22:12.813880 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 10 05:22:12.813888 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 10 05:22:12.814172 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 10 05:22:12.814331 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Sep 10 05:22:12.814455 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Sep 10 05:22:12.814465 kernel: PCI host bridge to bus 0000:00
Sep 10 05:22:12.814616 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 10 05:22:12.814737 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 10 05:22:12.814848 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 10 05:22:12.814958 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xafffffff window]
Sep 10 05:22:12.815067 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 10 05:22:12.815197 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Sep 10 05:22:12.815309 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 10 05:22:12.815466 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Sep 10 05:22:12.815617 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint
Sep 10 05:22:12.815749 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfd000000-0xfdffffff pref]
Sep 10 05:22:12.815870 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfebd0000-0xfebd0fff]
Sep 10 05:22:12.815989 kernel: pci 0000:00:01.0: ROM [mem 0xfebc0000-0xfebcffff pref]
Sep 10 05:22:12.816109 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 10 05:22:12.816284 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 10 05:22:12.816415 kernel: pci 0000:00:02.0: BAR 0 [io 0xc0c0-0xc0df]
Sep 10 05:22:12.816546 kernel: pci 0000:00:02.0: BAR 1 [mem 0xfebd1000-0xfebd1fff]
Sep 10 05:22:12.816671 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfe000000-0xfe003fff 64bit pref]
Sep 10 05:22:12.816812 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Sep 10 05:22:12.816935 kernel: pci 0000:00:03.0: BAR 0 [io 0xc000-0xc07f]
Sep 10 05:22:12.817056 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfebd2000-0xfebd2fff]
Sep 10 05:22:12.817198 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe004000-0xfe007fff 64bit pref]
Sep 10 05:22:12.817346 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 10 05:22:12.817470 kernel: pci 0000:00:04.0: BAR 0 [io 0xc0e0-0xc0ff]
Sep 10 05:22:12.817601 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfebd3000-0xfebd3fff]
Sep 10 05:22:12.817722 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe008000-0xfe00bfff 64bit pref]
Sep 10 05:22:12.817843 kernel: pci 0000:00:04.0: ROM [mem 0xfeb80000-0xfebbffff pref]
Sep 10 05:22:12.817981 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Sep 10 05:22:12.818110 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 10 05:22:12.818277 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Sep 10 05:22:12.818402 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc100-0xc11f]
Sep 10 05:22:12.818522 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfebd4000-0xfebd4fff]
Sep 10 05:22:12.818674 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Sep 10 05:22:12.818796 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Sep 10 05:22:12.818807 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 10 05:22:12.818820 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 10 05:22:12.818829 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 10 05:22:12.818837 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 10 05:22:12.818845 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 10 05:22:12.818853 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 10 05:22:12.818860 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 10 05:22:12.818869 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 10 05:22:12.818876 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 10 05:22:12.818884 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 10 05:22:12.818895 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 10 05:22:12.818903 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 10 05:22:12.818911 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 10 05:22:12.818919 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 10 05:22:12.818927 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 10 05:22:12.818935 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 10 05:22:12.818943 kernel: iommu: Default domain type: Translated
Sep 10 05:22:12.818951 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 10 05:22:12.818959 kernel: PCI: Using ACPI for IRQ routing
Sep 10 05:22:12.818970 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 10 05:22:12.818978 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 10 05:22:12.818986 kernel: e820: reserve RAM buffer [mem 0x9cfdc000-0x9fffffff]
Sep 10 05:22:12.819107 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 10 05:22:12.819249 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 10 05:22:12.819371 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 10 05:22:12.819382 kernel: vgaarb: loaded
Sep 10 05:22:12.819390 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Sep 10 05:22:12.819403 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Sep 10 05:22:12.819411 kernel: clocksource: Switched to clocksource kvm-clock
Sep 10 05:22:12.819419 kernel: VFS: Disk quotas dquot_6.6.0
Sep 10 05:22:12.819427 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 10 05:22:12.819435 kernel: pnp: PnP ACPI init
Sep 10 05:22:12.819588 kernel: system 00:05: [mem 0xb0000000-0xbfffffff window] has been reserved
Sep 10 05:22:12.819601 kernel: pnp: PnP ACPI: found 6 devices
Sep 10 05:22:12.819609 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 10 05:22:12.819621 kernel: NET: Registered PF_INET protocol family
Sep 10 05:22:12.819629 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 10 05:22:12.819637 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 10 05:22:12.819645 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 10 05:22:12.819653 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 10 05:22:12.819661 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 10 05:22:12.819669 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 10 05:22:12.819677 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 10 05:22:12.819685 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 10 05:22:12.819696 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 10 05:22:12.819704 kernel: NET: Registered PF_XDP protocol family
Sep 10 05:22:12.819816 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 10 05:22:12.819927 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 10 05:22:12.820038 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 10 05:22:12.820172 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xafffffff window]
Sep 10 05:22:12.820305 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Sep 10 05:22:12.820448 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Sep 10 05:22:12.820467 kernel: PCI: CLS 0 bytes, default 64
Sep 10 05:22:12.820475 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848e100549, max_idle_ns: 440795215505 ns
Sep 10 05:22:12.820488 kernel: Initialise system trusted keyrings
Sep 10 05:22:12.820496 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 10 05:22:12.820504 kernel: Key type asymmetric registered
Sep 10 05:22:12.820512 kernel: Asymmetric key parser 'x509' registered
Sep 10 05:22:12.820520 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 10 05:22:12.820536 kernel: io scheduler mq-deadline registered
Sep 10 05:22:12.820546 kernel: io scheduler kyber registered
Sep 10 05:22:12.820554 kernel: io scheduler bfq registered
Sep 10 05:22:12.820568 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 10 05:22:12.820576 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 10 05:22:12.820585 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Sep 10 05:22:12.820593 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Sep 10 05:22:12.820601 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 10 05:22:12.820609 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 10 05:22:12.820617 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 10 05:22:12.820625 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 10 05:22:12.820633 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 10 05:22:12.820790 kernel: rtc_cmos 00:04: RTC can wake from S4
Sep 10 05:22:12.820908 kernel: rtc_cmos 00:04: registered as rtc0
Sep 10 05:22:12.821024 kernel: rtc_cmos 00:04: setting system clock to 2025-09-10T05:22:12 UTC (1757481732)
Sep 10 05:22:12.821154 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram, hpet irqs
Sep 10 05:22:12.821166 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Sep 10 05:22:12.821175 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Sep 10 05:22:12.821183 kernel: NET: Registered PF_INET6 protocol family
Sep 10 05:22:12.821196 kernel: Segment Routing with IPv6
Sep 10 05:22:12.821203 kernel: In-situ OAM (IOAM) with IPv6
Sep 10 05:22:12.821212 kernel: NET: Registered PF_PACKET protocol family
Sep 10 05:22:12.821220 kernel: Key type dns_resolver registered
Sep 10 05:22:12.821228 kernel: IPI shorthand broadcast: enabled
Sep 10 05:22:12.821236 kernel: sched_clock: Marking stable (2806002228, 109918627)->(2934032640, -18111785)
Sep 10 05:22:12.821244 kernel: registered taskstats version 1
Sep 10 05:22:12.821251 kernel: Loading compiled-in X.509 certificates
Sep 10 05:22:12.821259 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: f6c45bc801b894d4dac30a723f1f683ea8f7e3ae'
Sep 10 05:22:12.821270 kernel: Demotion targets for Node 0: null
Sep 10 05:22:12.821278 kernel: Key type .fscrypt registered
Sep 10 05:22:12.821286 kernel: Key type fscrypt-provisioning registered
Sep 10 05:22:12.821294 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 10 05:22:12.821301 kernel: ima: Allocated hash algorithm: sha1
Sep 10 05:22:12.821309 kernel: ima: No architecture policies found
Sep 10 05:22:12.821317 kernel: clk: Disabling unused clocks
Sep 10 05:22:12.821325 kernel: Warning: unable to open an initial console.
Sep 10 05:22:12.821333 kernel: Freeing unused kernel image (initmem) memory: 54068K
Sep 10 05:22:12.821343 kernel: Write protecting the kernel read-only data: 24576k
Sep 10 05:22:12.821352 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K
Sep 10 05:22:12.821359 kernel: Run /init as init process
Sep 10 05:22:12.821367 kernel: with arguments:
Sep 10 05:22:12.821375 kernel: /init
Sep 10 05:22:12.821383 kernel: with environment:
Sep 10 05:22:12.821391 kernel: HOME=/
Sep 10 05:22:12.821398 kernel: TERM=linux
Sep 10 05:22:12.821406 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 10 05:22:12.821418 systemd[1]: Successfully made /usr/ read-only.
Sep 10 05:22:12.821442 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 10 05:22:12.821454 systemd[1]: Detected virtualization kvm.
Sep 10 05:22:12.821462 systemd[1]: Detected architecture x86-64.
Sep 10 05:22:12.821470 systemd[1]: Running in initrd.
Sep 10 05:22:12.821482 systemd[1]: No hostname configured, using default hostname.
Sep 10 05:22:12.821491 systemd[1]: Hostname set to .
Sep 10 05:22:12.821499 systemd[1]: Initializing machine ID from VM UUID.
Sep 10 05:22:12.821508 systemd[1]: Queued start job for default target initrd.target.
Sep 10 05:22:12.821517 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 10 05:22:12.821533 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 10 05:22:12.821542 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 10 05:22:12.821551 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 10 05:22:12.821563 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 10 05:22:12.821574 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 10 05:22:12.821587 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 10 05:22:12.821598 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 10 05:22:12.821608 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 10 05:22:12.821619 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 10 05:22:12.821630 systemd[1]: Reached target paths.target - Path Units.
Sep 10 05:22:12.821644 systemd[1]: Reached target slices.target - Slice Units.
Sep 10 05:22:12.821655 systemd[1]: Reached target swap.target - Swaps.
Sep 10 05:22:12.821665 systemd[1]: Reached target timers.target - Timer Units.
Sep 10 05:22:12.821676 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 10 05:22:12.821687 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 10 05:22:12.821698 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 10 05:22:12.821709 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 10 05:22:12.821723 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 10 05:22:12.821735 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 10 05:22:12.821749 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 10 05:22:12.821759 systemd[1]: Reached target sockets.target - Socket Units.
Sep 10 05:22:12.821770 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 10 05:22:12.821781 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 10 05:22:12.821793 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 10 05:22:12.821805 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 10 05:22:12.821814 systemd[1]: Starting systemd-fsck-usr.service...
Sep 10 05:22:12.821823 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 10 05:22:12.821832 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 10 05:22:12.821840 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 10 05:22:12.821849 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 10 05:22:12.821861 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 10 05:22:12.821870 systemd[1]: Finished systemd-fsck-usr.service.
Sep 10 05:22:12.821879 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 10 05:22:12.821914 systemd-journald[220]: Collecting audit messages is disabled.
Sep 10 05:22:12.821938 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 10 05:22:12.821948 systemd-journald[220]: Journal started
Sep 10 05:22:12.821967 systemd-journald[220]: Runtime Journal (/run/log/journal/6b2286d0a82a4307a0bf087cdb0e4688) is 6M, max 48.6M, 42.5M free.
Sep 10 05:22:12.810182 systemd-modules-load[222]: Inserted module 'overlay'
Sep 10 05:22:12.851833 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 10 05:22:12.851855 kernel: Bridge firewalling registered
Sep 10 05:22:12.837283 systemd-modules-load[222]: Inserted module 'br_netfilter'
Sep 10 05:22:12.853513 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 10 05:22:12.854846 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 10 05:22:12.857202 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 10 05:22:12.862870 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 10 05:22:12.864856 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 10 05:22:12.867927 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 10 05:22:12.874830 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 10 05:22:12.883631 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 10 05:22:12.885192 systemd-tmpfiles[243]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 10 05:22:12.890502 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 10 05:22:12.890864 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 10 05:22:12.893224 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 10 05:22:12.893498 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 10 05:22:12.898567 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 10 05:22:12.921050 dracut-cmdline[261]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=cb34a525c000ff57e16870cd9f0af09c033a700c5f8ee35d58f46d8926fcf6e5
Sep 10 05:22:12.938651 systemd-resolved[260]: Positive Trust Anchors:
Sep 10 05:22:12.938667 systemd-resolved[260]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 10 05:22:12.938696 systemd-resolved[260]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 10 05:22:12.941199 systemd-resolved[260]: Defaulting to hostname 'linux'.
Sep 10 05:22:12.942336 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 10 05:22:12.949341 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 10 05:22:13.034166 kernel: SCSI subsystem initialized
Sep 10 05:22:13.043160 kernel: Loading iSCSI transport class v2.0-870.
Sep 10 05:22:13.053162 kernel: iscsi: registered transport (tcp)
Sep 10 05:22:13.074160 kernel: iscsi: registered transport (qla4xxx)
Sep 10 05:22:13.074197 kernel: QLogic iSCSI HBA Driver
Sep 10 05:22:13.095636 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 10 05:22:13.113829 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 10 05:22:13.117440 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 10 05:22:13.175591 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 10 05:22:13.179394 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 10 05:22:13.241161 kernel: raid6: avx2x4 gen() 30546 MB/s
Sep 10 05:22:13.258155 kernel: raid6: avx2x2 gen() 30924 MB/s
Sep 10 05:22:13.275185 kernel: raid6: avx2x1 gen() 25754 MB/s
Sep 10 05:22:13.275201 kernel: raid6: using algorithm avx2x2 gen() 30924 MB/s
Sep 10 05:22:13.293190 kernel: raid6: .... xor() 19930 MB/s, rmw enabled
Sep 10 05:22:13.293217 kernel: raid6: using avx2x2 recovery algorithm
Sep 10 05:22:13.313158 kernel: xor: automatically using best checksumming function avx
Sep 10 05:22:13.475170 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 10 05:22:13.483353 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 10 05:22:13.487113 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 10 05:22:13.524007 systemd-udevd[471]: Using default interface naming scheme 'v255'.
Sep 10 05:22:13.529690 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 10 05:22:13.532880 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 10 05:22:13.558033 dracut-pre-trigger[479]: rd.md=0: removing MD RAID activation
Sep 10 05:22:13.587492 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 10 05:22:13.591010 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 10 05:22:13.668648 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 10 05:22:13.672171 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 10 05:22:13.721641 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
Sep 10 05:22:13.723164 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
Sep 10 05:22:13.730155 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
Sep 10 05:22:13.733115 kernel: cryptd: max_cpu_qlen set to 1000
Sep 10 05:22:13.736871 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 10 05:22:13.736930 kernel: GPT:9289727 != 19775487
Sep 10 05:22:13.736962 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 10 05:22:13.736980 kernel: GPT:9289727 != 19775487
Sep 10 05:22:13.737001 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 10 05:22:13.737023 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 10 05:22:13.744155 kernel: libata version 3.00 loaded.
Sep 10 05:22:13.745600 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 10 05:22:13.746810 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 10 05:22:13.749774 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 10 05:22:13.752659 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 10 05:22:13.758154 kernel: AES CTR mode by8 optimization enabled
Sep 10 05:22:13.759167 kernel: ahci 0000:00:1f.2: version 3.0
Sep 10 05:22:13.761944 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Sep 10 05:22:13.761970 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Sep 10 05:22:13.762158 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Sep 10 05:22:13.763579 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Sep 10 05:22:13.768170 kernel: scsi host0: ahci
Sep 10 05:22:13.771649 kernel: scsi host1: ahci
Sep 10 05:22:13.772260 kernel: scsi host2: ahci
Sep 10 05:22:13.772455 kernel: scsi host3: ahci
Sep 10 05:22:13.777155 kernel: scsi host4: ahci
Sep 10 05:22:13.794294 kernel: scsi host5: ahci
Sep 10 05:22:13.794594 kernel: ata1: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4100 irq 34 lpm-pol 1
Sep 10 05:22:13.794613 kernel: ata2: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4180 irq 34 lpm-pol 1
Sep 10 05:22:13.794628 kernel: ata3: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4200 irq 34 lpm-pol 1
Sep 10 05:22:13.794642 kernel: ata4: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4280 irq 34 lpm-pol 1
Sep 10 05:22:13.794656 kernel: ata5: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4300 irq 34 lpm-pol 1
Sep 10 05:22:13.794670 kernel: ata6: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4380 irq 34 lpm-pol 1
Sep 10 05:22:13.805227 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Sep 10 05:22:13.838483 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 10 05:22:13.855477 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Sep 10 05:22:13.870702 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Sep 10 05:22:13.873166 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Sep 10 05:22:13.884727 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 10 05:22:13.887933 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 10 05:22:13.922928 disk-uuid[634]: Primary Header is updated.
Sep 10 05:22:13.922928 disk-uuid[634]: Secondary Entries is updated.
Sep 10 05:22:13.922928 disk-uuid[634]: Secondary Header is updated.
Sep 10 05:22:13.926335 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 10 05:22:14.100148 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Sep 10 05:22:14.100207 kernel: ata3.00: LPM support broken, forcing max_power
Sep 10 05:22:14.100220 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Sep 10 05:22:14.100230 kernel: ata3.00: applying bridge limits
Sep 10 05:22:14.101195 kernel: ata1: SATA link down (SStatus 0 SControl 300)
Sep 10 05:22:14.101255 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Sep 10 05:22:14.102169 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Sep 10 05:22:14.103161 kernel: ata3.00: LPM support broken, forcing max_power
Sep 10 05:22:14.104167 kernel: ata3.00: configured for UDMA/100
Sep 10 05:22:14.108157 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Sep 10 05:22:14.108186 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Sep 10 05:22:14.109161 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Sep 10 05:22:14.177716 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Sep 10 05:22:14.177938 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 10 05:22:14.198160 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
Sep 10 05:22:14.555612 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 10 05:22:14.556967 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 10 05:22:14.559373 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 10 05:22:14.559667 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 10 05:22:14.561255 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 10 05:22:14.604084 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 10 05:22:14.934178 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 10 05:22:14.934834 disk-uuid[635]: The operation has completed successfully.
Sep 10 05:22:14.971867 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 10 05:22:14.972052 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 10 05:22:15.018403 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 10 05:22:15.059142 sh[665]: Success
Sep 10 05:22:15.080169 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 10 05:22:15.080226 kernel: device-mapper: uevent: version 1.0.3
Sep 10 05:22:15.082660 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 10 05:22:15.094686 kernel: device-mapper: verity: sha256 using shash "sha256-ni"
Sep 10 05:22:15.131085 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 10 05:22:15.135052 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 10 05:22:15.161009 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 10 05:22:15.166863 kernel: BTRFS: device fsid d8201365-420d-4e6d-a9af-b12a81c8fc98 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (677)
Sep 10 05:22:15.166887 kernel: BTRFS info (device dm-0): first mount of filesystem d8201365-420d-4e6d-a9af-b12a81c8fc98
Sep 10 05:22:15.166901 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Sep 10 05:22:15.171529 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 10 05:22:15.171600 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 10 05:22:15.173063 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 10 05:22:15.175202 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 10 05:22:15.177782 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 10 05:22:15.180221 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 10 05:22:15.182644 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 10 05:22:15.209178 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (710)
Sep 10 05:22:15.211176 kernel: BTRFS info (device vda6): first mount of filesystem 44235b0d-89ef-44b4-a2ec-00ee2c04a5f6
Sep 10 05:22:15.211235 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 10 05:22:15.214165 kernel: BTRFS info (device vda6): turning on async discard
Sep 10 05:22:15.214194 kernel: BTRFS info (device vda6): enabling free space tree
Sep 10 05:22:15.219164 kernel: BTRFS info (device vda6): last unmount of filesystem 44235b0d-89ef-44b4-a2ec-00ee2c04a5f6
Sep 10 05:22:15.219728 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 10 05:22:15.223571 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 10 05:22:15.337383 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 10 05:22:15.342260 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 10 05:22:15.462620 systemd-networkd[846]: lo: Link UP
Sep 10 05:22:15.462631 systemd-networkd[846]: lo: Gained carrier
Sep 10 05:22:15.465935 systemd-networkd[846]: Enumeration completed
Sep 10 05:22:15.466077 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 10 05:22:15.467302 systemd[1]: Reached target network.target - Network.
Sep 10 05:22:15.470580 systemd-networkd[846]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 10 05:22:15.470597 systemd-networkd[846]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 10 05:22:15.483276 systemd-networkd[846]: eth0: Link UP
Sep 10 05:22:15.483698 systemd-networkd[846]: eth0: Gained carrier
Sep 10 05:22:15.483730 systemd-networkd[846]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 10 05:22:15.531540 systemd-networkd[846]: eth0: DHCPv4 address 10.0.0.54/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 10 05:22:15.557801 ignition[751]: Ignition 2.22.0
Sep 10 05:22:15.557817 ignition[751]: Stage: fetch-offline
Sep 10 05:22:15.557862 ignition[751]: no configs at "/usr/lib/ignition/base.d"
Sep 10 05:22:15.557872 ignition[751]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 10 05:22:15.557966 ignition[751]: parsed url from cmdline: ""
Sep 10 05:22:15.557971 ignition[751]: no config URL provided
Sep 10 05:22:15.557976 ignition[751]: reading system config file "/usr/lib/ignition/user.ign"
Sep 10 05:22:15.557985 ignition[751]: no config at "/usr/lib/ignition/user.ign"
Sep 10 05:22:15.558014 ignition[751]: op(1): [started] loading QEMU firmware config module
Sep 10 05:22:15.558020 ignition[751]: op(1): executing: "modprobe" "qemu_fw_cfg"
Sep 10 05:22:15.568651 ignition[751]: op(1): [finished] loading QEMU firmware config module
Sep 10 05:22:15.610966 ignition[751]: parsing config with SHA512: fdf53638d3f913f6c52e6b8f60acabea5e945df125d13461e5410098c910f6f9a5dc68a231e6a9536ce46389cf76f54afa90c90b9813be153a86bebb825e4d6f
Sep 10 05:22:15.620552 unknown[751]: fetched base config from "system"
Sep 10 05:22:15.620566 unknown[751]: fetched user config from "qemu"
Sep 10 05:22:15.621148 ignition[751]: fetch-offline: fetch-offline passed
Sep 10 05:22:15.621258 ignition[751]: Ignition finished successfully
Sep 10 05:22:15.624993 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 10 05:22:15.625298 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Sep 10 05:22:15.629063 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 10 05:22:15.687803 ignition[859]: Ignition 2.22.0
Sep 10 05:22:15.687817 ignition[859]: Stage: kargs
Sep 10 05:22:15.687961 ignition[859]: no configs at "/usr/lib/ignition/base.d"
Sep 10 05:22:15.687972 ignition[859]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 10 05:22:15.691544 ignition[859]: kargs: kargs passed
Sep 10 05:22:15.691601 ignition[859]: Ignition finished successfully
Sep 10 05:22:15.696621 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 10 05:22:15.699627 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 10 05:22:15.755927 ignition[867]: Ignition 2.22.0
Sep 10 05:22:15.755941 ignition[867]: Stage: disks
Sep 10 05:22:15.756084 ignition[867]: no configs at "/usr/lib/ignition/base.d"
Sep 10 05:22:15.756096 ignition[867]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 10 05:22:15.756846 ignition[867]: disks: disks passed
Sep 10 05:22:15.756897 ignition[867]: Ignition finished successfully
Sep 10 05:22:15.763379 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 10 05:22:15.763674 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 10 05:22:15.766271 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 10 05:22:15.768357 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 10 05:22:15.768574 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 10 05:22:15.768878 systemd[1]: Reached target basic.target - Basic System.
Sep 10 05:22:15.770442 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 10 05:22:15.798548 systemd-fsck[877]: ROOT: clean, 15/553520 files, 52789/553472 blocks
Sep 10 05:22:15.806233 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 10 05:22:15.808694 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 10 05:22:15.974164 kernel: EXT4-fs (vda9): mounted filesystem 8812db3a-0650-4908-b2d8-56c2f0883ee2 r/w with ordered data mode. Quota mode: none.
Sep 10 05:22:15.974837 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 10 05:22:15.976166 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 10 05:22:15.978654 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 10 05:22:15.980718 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 10 05:22:15.981774 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 10 05:22:15.981814 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 10 05:22:15.981836 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 10 05:22:15.998521 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 10 05:22:16.000877 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 10 05:22:16.005776 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (885)
Sep 10 05:22:16.005799 kernel: BTRFS info (device vda6): first mount of filesystem 44235b0d-89ef-44b4-a2ec-00ee2c04a5f6
Sep 10 05:22:16.005810 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 10 05:22:16.008828 kernel: BTRFS info (device vda6): turning on async discard
Sep 10 05:22:16.008860 kernel: BTRFS info (device vda6): enabling free space tree
Sep 10 05:22:16.011077 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 10 05:22:16.050960 initrd-setup-root[909]: cut: /sysroot/etc/passwd: No such file or directory
Sep 10 05:22:16.055523 initrd-setup-root[916]: cut: /sysroot/etc/group: No such file or directory
Sep 10 05:22:16.060032 initrd-setup-root[923]: cut: /sysroot/etc/shadow: No such file or directory
Sep 10 05:22:16.065319 initrd-setup-root[930]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 10 05:22:16.160705 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 10 05:22:16.162802 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 10 05:22:16.164835 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 10 05:22:16.183102 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 10 05:22:16.185159 kernel: BTRFS info (device vda6): last unmount of filesystem 44235b0d-89ef-44b4-a2ec-00ee2c04a5f6
Sep 10 05:22:16.204263 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 10 05:22:16.224045 ignition[999]: INFO : Ignition 2.22.0
Sep 10 05:22:16.224045 ignition[999]: INFO : Stage: mount
Sep 10 05:22:16.225979 ignition[999]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 10 05:22:16.225979 ignition[999]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 10 05:22:16.228123 ignition[999]: INFO : mount: mount passed
Sep 10 05:22:16.228895 ignition[999]: INFO : Ignition finished successfully
Sep 10 05:22:16.232353 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 10 05:22:16.234433 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 10 05:22:16.260912 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 10 05:22:16.286163 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1012)
Sep 10 05:22:16.288159 kernel: BTRFS info (device vda6): first mount of filesystem 44235b0d-89ef-44b4-a2ec-00ee2c04a5f6
Sep 10 05:22:16.288184 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 10 05:22:16.291338 kernel: BTRFS info (device vda6): turning on async discard
Sep 10 05:22:16.291360 kernel: BTRFS info (device vda6): enabling free space tree
Sep 10 05:22:16.293351 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 10 05:22:16.348424 ignition[1029]: INFO : Ignition 2.22.0
Sep 10 05:22:16.348424 ignition[1029]: INFO : Stage: files
Sep 10 05:22:16.350239 ignition[1029]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 10 05:22:16.350239 ignition[1029]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 10 05:22:16.352613 ignition[1029]: DEBUG : files: compiled without relabeling support, skipping
Sep 10 05:22:16.354656 ignition[1029]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 10 05:22:16.354656 ignition[1029]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 10 05:22:16.358032 ignition[1029]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 10 05:22:16.359468 ignition[1029]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 10 05:22:16.361224 unknown[1029]: wrote ssh authorized keys file for user: core
Sep 10 05:22:16.362286 ignition[1029]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 10 05:22:16.364973 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Sep 10 05:22:16.367187 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Sep 10 05:22:16.411281 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 10 05:22:16.538504 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Sep 10 05:22:16.538504 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 10 05:22:16.542215 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 10 05:22:16.543847 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 10 05:22:16.545610 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 10 05:22:16.547248 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 10 05:22:16.548930 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 10 05:22:16.550567 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 10 05:22:16.552491 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 10 05:22:16.554854 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 10 05:22:16.556729 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 10 05:22:16.558601 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Sep 10 05:22:16.561278 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Sep 10 05:22:16.561278 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Sep 10 05:22:16.561278 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1
Sep 10 05:22:16.904589 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 10 05:22:17.516351 systemd-networkd[846]: eth0: Gained IPv6LL
Sep 10 05:22:17.652392 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Sep 10 05:22:17.652392 ignition[1029]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 10 05:22:17.656172 ignition[1029]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 10 05:22:17.662598 ignition[1029]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 10 05:22:17.662598 ignition[1029]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 10 05:22:17.662598 ignition[1029]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 10 05:22:17.666876 ignition[1029]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 10 05:22:17.666876 ignition[1029]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 10 05:22:17.666876 ignition[1029]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 10 05:22:17.666876 ignition[1029]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Sep 10 05:22:17.691884 ignition[1029]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Sep 10 05:22:17.696267 ignition[1029]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Sep 10 05:22:17.697795 ignition[1029]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 10 05:22:17.697795 ignition[1029]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Sep 10 05:22:17.697795 ignition[1029]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Sep 10 05:22:17.697795 ignition[1029]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 10 05:22:17.697795 ignition[1029]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 10 05:22:17.697795 ignition[1029]: INFO : files: files passed
Sep 10 05:22:17.697795 ignition[1029]: INFO : Ignition finished successfully
Sep 10 05:22:17.706678 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 10 05:22:17.709017 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 10 05:22:17.710010 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 10 05:22:17.737720 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 10 05:22:17.737850 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 10 05:22:17.741056 initrd-setup-root-after-ignition[1058]: grep: /sysroot/oem/oem-release: No such file or directory Sep 10 05:22:17.744826 initrd-setup-root-after-ignition[1060]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 10 05:22:17.744826 initrd-setup-root-after-ignition[1060]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 10 05:22:17.747954 initrd-setup-root-after-ignition[1064]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 10 05:22:17.750548 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 10 05:22:17.751947 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 10 05:22:17.755203 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 10 05:22:17.808963 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 10 05:22:17.809103 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 10 05:22:17.811434 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 10 05:22:17.812425 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 10 05:22:17.812773 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 10 05:22:17.813670 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 10 05:22:17.851822 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 10 05:22:17.856182 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 10 05:22:17.882095 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 10 05:22:17.884395 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 10 05:22:17.884652 systemd[1]: Stopped target timers.target - Timer Units. 
Sep 10 05:22:17.888293 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 10 05:22:17.888503 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 10 05:22:17.891735 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 10 05:22:17.891929 systemd[1]: Stopped target basic.target - Basic System. Sep 10 05:22:17.895317 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 10 05:22:17.895531 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 10 05:22:17.898543 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 10 05:22:17.899687 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 10 05:22:17.899998 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 10 05:22:17.900673 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 10 05:22:17.901001 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 10 05:22:17.901479 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 10 05:22:17.901785 systemd[1]: Stopped target swap.target - Swaps. Sep 10 05:22:17.902079 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 10 05:22:17.902242 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 10 05:22:17.902962 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 10 05:22:17.903483 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 10 05:22:17.903758 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 10 05:22:17.904078 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 10 05:22:17.918539 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 10 05:22:17.918650 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. 
Sep 10 05:22:17.920877 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 10 05:22:17.920985 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 10 05:22:17.923670 systemd[1]: Stopped target paths.target - Path Units. Sep 10 05:22:17.923892 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 10 05:22:17.931238 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 10 05:22:17.932568 systemd[1]: Stopped target slices.target - Slice Units. Sep 10 05:22:17.934865 systemd[1]: Stopped target sockets.target - Socket Units. Sep 10 05:22:17.936607 systemd[1]: iscsid.socket: Deactivated successfully. Sep 10 05:22:17.936707 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 10 05:22:17.938403 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 10 05:22:17.938493 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 10 05:22:17.939448 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 10 05:22:17.939565 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 10 05:22:17.942151 systemd[1]: ignition-files.service: Deactivated successfully. Sep 10 05:22:17.942258 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 10 05:22:17.944846 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 10 05:22:17.946656 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 10 05:22:17.946773 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 10 05:22:17.950370 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 10 05:22:17.955329 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 10 05:22:17.956392 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. 
Sep 10 05:22:17.958717 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 10 05:22:17.959778 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 10 05:22:17.967078 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 10 05:22:17.967305 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 10 05:22:17.982943 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 10 05:22:17.990970 ignition[1085]: INFO : Ignition 2.22.0 Sep 10 05:22:17.990970 ignition[1085]: INFO : Stage: umount Sep 10 05:22:17.992842 ignition[1085]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 10 05:22:17.992842 ignition[1085]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 10 05:22:17.992842 ignition[1085]: INFO : umount: umount passed Sep 10 05:22:17.992842 ignition[1085]: INFO : Ignition finished successfully Sep 10 05:22:17.995151 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 10 05:22:17.995277 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 10 05:22:17.996842 systemd[1]: Stopped target network.target - Network. Sep 10 05:22:17.998416 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 10 05:22:17.998468 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 10 05:22:18.000274 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 10 05:22:18.000323 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 10 05:22:18.002195 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 10 05:22:18.002250 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 10 05:22:18.004072 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 10 05:22:18.004147 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 10 05:22:18.005158 systemd[1]: Stopping systemd-networkd.service - Network Configuration... 
Sep 10 05:22:18.005635 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 10 05:22:18.017208 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 10 05:22:18.017350 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 10 05:22:18.021979 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 10 05:22:18.022274 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 10 05:22:18.022323 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 10 05:22:18.025259 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 10 05:22:18.026695 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 10 05:22:18.026842 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 10 05:22:18.030527 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 10 05:22:18.030757 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 10 05:22:18.031501 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 10 05:22:18.031542 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 10 05:22:18.035618 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 10 05:22:18.037580 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 10 05:22:18.037637 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 10 05:22:18.037926 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 10 05:22:18.037969 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 10 05:22:18.043458 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 10 05:22:18.043509 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. 
Sep 10 05:22:18.044588 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 10 05:22:18.045748 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 10 05:22:18.066888 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 10 05:22:18.067081 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 10 05:22:18.069373 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 10 05:22:18.069502 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 10 05:22:18.071886 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 10 05:22:18.071971 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 10 05:22:18.073104 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 10 05:22:18.073157 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 10 05:22:18.075049 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 10 05:22:18.075098 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 10 05:22:18.075693 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 10 05:22:18.075742 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 10 05:22:18.076446 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 10 05:22:18.076500 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 10 05:22:18.113863 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 10 05:22:18.114925 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 10 05:22:18.115000 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 10 05:22:18.122117 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. 
Sep 10 05:22:18.122268 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 10 05:22:18.125596 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 10 05:22:18.125665 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 10 05:22:18.134578 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 10 05:22:18.135290 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 10 05:22:18.136739 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 10 05:22:18.136789 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 10 05:22:18.143140 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 10 05:22:18.143262 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 10 05:22:18.144553 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 10 05:22:18.148565 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 10 05:22:18.177677 systemd[1]: Switching root. Sep 10 05:22:18.217051 systemd-journald[220]: Journal stopped Sep 10 05:22:19.597768 systemd-journald[220]: Received SIGTERM from PID 1 (systemd). 
Sep 10 05:22:19.597838 kernel: SELinux: policy capability network_peer_controls=1 Sep 10 05:22:19.597859 kernel: SELinux: policy capability open_perms=1 Sep 10 05:22:19.597870 kernel: SELinux: policy capability extended_socket_class=1 Sep 10 05:22:19.597889 kernel: SELinux: policy capability always_check_network=0 Sep 10 05:22:19.597901 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 10 05:22:19.597912 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 10 05:22:19.597925 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 10 05:22:19.597941 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 10 05:22:19.597952 kernel: SELinux: policy capability userspace_initial_context=0 Sep 10 05:22:19.597964 kernel: audit: type=1403 audit(1757481738.821:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 10 05:22:19.597983 systemd[1]: Successfully loaded SELinux policy in 68.186ms. Sep 10 05:22:19.598004 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 8.481ms. Sep 10 05:22:19.598017 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 10 05:22:19.598030 systemd[1]: Detected virtualization kvm. Sep 10 05:22:19.598043 systemd[1]: Detected architecture x86-64. Sep 10 05:22:19.598055 systemd[1]: Detected first boot. Sep 10 05:22:19.598074 systemd[1]: Initializing machine ID from VM UUID. 
Sep 10 05:22:19.598086 kernel: Guest personality initialized and is inactive Sep 10 05:22:19.598097 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 10 05:22:19.598120 kernel: Initialized host personality Sep 10 05:22:19.599030 kernel: NET: Registered PF_VSOCK protocol family Sep 10 05:22:19.599048 zram_generator::config[1132]: No configuration found. Sep 10 05:22:19.599068 systemd[1]: Populated /etc with preset unit settings. Sep 10 05:22:19.599085 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 10 05:22:19.599098 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 10 05:22:19.599114 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 10 05:22:19.599141 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 10 05:22:19.599161 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 10 05:22:19.599174 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 10 05:22:19.599186 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 10 05:22:19.599198 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 10 05:22:19.599211 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 10 05:22:19.599223 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 10 05:22:19.599235 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 10 05:22:19.599253 systemd[1]: Created slice user.slice - User and Session Slice. Sep 10 05:22:19.599271 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 10 05:22:19.599283 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Sep 10 05:22:19.599296 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 10 05:22:19.599308 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 10 05:22:19.599321 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 10 05:22:19.599334 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 10 05:22:19.599346 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 10 05:22:19.599366 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 10 05:22:19.599384 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 10 05:22:19.599397 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 10 05:22:19.599409 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 10 05:22:19.599425 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 10 05:22:19.599437 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 10 05:22:19.599452 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 10 05:22:19.599465 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 10 05:22:19.599479 systemd[1]: Reached target slices.target - Slice Units. Sep 10 05:22:19.599496 systemd[1]: Reached target swap.target - Swaps. Sep 10 05:22:19.599509 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 10 05:22:19.599526 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 10 05:22:19.599539 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 10 05:22:19.599551 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Sep 10 05:22:19.599563 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 10 05:22:19.599575 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 10 05:22:19.599587 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 10 05:22:19.599599 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 10 05:22:19.599611 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 10 05:22:19.599624 systemd[1]: Mounting media.mount - External Media Directory... Sep 10 05:22:19.599641 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 10 05:22:19.599653 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 10 05:22:19.599665 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 10 05:22:19.599677 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 10 05:22:19.599690 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 10 05:22:19.599703 systemd[1]: Reached target machines.target - Containers. Sep 10 05:22:19.599716 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 10 05:22:19.599728 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 10 05:22:19.599745 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 10 05:22:19.599757 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 10 05:22:19.599769 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 10 05:22:19.599781 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... 
Sep 10 05:22:19.599793 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 10 05:22:19.599805 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 10 05:22:19.599817 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 10 05:22:19.599830 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 10 05:22:19.599847 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 10 05:22:19.599860 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 10 05:22:19.599871 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 10 05:22:19.599884 systemd[1]: Stopped systemd-fsck-usr.service. Sep 10 05:22:19.599897 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 10 05:22:19.599909 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 10 05:22:19.599922 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 10 05:22:19.599933 kernel: loop: module loaded Sep 10 05:22:19.599945 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 10 05:22:19.599963 kernel: fuse: init (API version 7.41) Sep 10 05:22:19.599976 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 10 05:22:19.599988 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 10 05:22:19.600000 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 10 05:22:19.600013 systemd[1]: verity-setup.service: Deactivated successfully. Sep 10 05:22:19.600030 systemd[1]: Stopped verity-setup.service. 
Sep 10 05:22:19.600043 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 10 05:22:19.600080 systemd-journald[1201]: Collecting audit messages is disabled. Sep 10 05:22:19.600103 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 10 05:22:19.600116 systemd-journald[1201]: Journal started Sep 10 05:22:19.600160 systemd-journald[1201]: Runtime Journal (/run/log/journal/6b2286d0a82a4307a0bf087cdb0e4688) is 6M, max 48.6M, 42.5M free. Sep 10 05:22:19.373335 systemd[1]: Queued start job for default target multi-user.target. Sep 10 05:22:19.386249 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 10 05:22:19.386755 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 10 05:22:19.602147 systemd[1]: Started systemd-journald.service - Journal Service. Sep 10 05:22:19.605978 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 10 05:22:19.606153 kernel: ACPI: bus type drm_connector registered Sep 10 05:22:19.607493 systemd[1]: Mounted media.mount - External Media Directory. Sep 10 05:22:19.608699 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 10 05:22:19.609898 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 10 05:22:19.611106 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 10 05:22:19.612585 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 10 05:22:19.614618 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 10 05:22:19.614984 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 10 05:22:19.616825 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 10 05:22:19.617096 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. 
Sep 10 05:22:19.618556 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 10 05:22:19.618821 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 10 05:22:19.620535 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 10 05:22:19.622005 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 10 05:22:19.622273 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 10 05:22:19.623814 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 10 05:22:19.624066 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 10 05:22:19.625794 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 10 05:22:19.626063 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 10 05:22:19.627854 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 10 05:22:19.629404 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 10 05:22:19.631025 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 10 05:22:19.632553 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 10 05:22:19.651942 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 10 05:22:19.654709 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 10 05:22:19.657159 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 10 05:22:19.658397 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 10 05:22:19.658430 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 10 05:22:19.660495 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. 
Sep 10 05:22:19.670243 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 10 05:22:19.671519 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 10 05:22:19.673307 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 10 05:22:19.677273 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 10 05:22:19.678525 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 10 05:22:19.679547 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 10 05:22:19.680725 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 10 05:22:19.681944 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 10 05:22:19.691994 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 10 05:22:19.710238 kernel: loop0: detected capacity change from 0 to 110984 Sep 10 05:22:19.710339 systemd-journald[1201]: Time spent on flushing to /var/log/journal/6b2286d0a82a4307a0bf087cdb0e4688 is 19.858ms for 978 entries. Sep 10 05:22:19.710339 systemd-journald[1201]: System Journal (/var/log/journal/6b2286d0a82a4307a0bf087cdb0e4688) is 8M, max 195.6M, 187.6M free. Sep 10 05:22:19.738218 systemd-journald[1201]: Received client request to flush runtime journal. Sep 10 05:22:19.738272 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 10 05:22:19.699322 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 10 05:22:19.702809 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 10 05:22:19.704078 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. 
Sep 10 05:22:19.720679 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 10 05:22:19.723733 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 10 05:22:19.726330 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 10 05:22:19.730865 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 10 05:22:19.732613 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 10 05:22:19.741392 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 10 05:22:19.764001 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 10 05:22:19.764282 kernel: loop1: detected capacity change from 0 to 229808 Sep 10 05:22:19.776111 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 10 05:22:19.779452 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 10 05:22:19.802155 kernel: loop2: detected capacity change from 0 to 128016 Sep 10 05:22:19.806205 systemd-tmpfiles[1269]: ACLs are not supported, ignoring. Sep 10 05:22:19.806226 systemd-tmpfiles[1269]: ACLs are not supported, ignoring. Sep 10 05:22:19.812408 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 10 05:22:19.839193 kernel: loop3: detected capacity change from 0 to 110984 Sep 10 05:22:19.849166 kernel: loop4: detected capacity change from 0 to 229808 Sep 10 05:22:19.859191 kernel: loop5: detected capacity change from 0 to 128016 Sep 10 05:22:19.866975 (sd-merge)[1273]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Sep 10 05:22:19.867621 (sd-merge)[1273]: Merged extensions into '/usr'. Sep 10 05:22:19.872059 systemd[1]: Reload requested from client PID 1250 ('systemd-sysext') (unit systemd-sysext.service)... Sep 10 05:22:19.872228 systemd[1]: Reloading... 
Sep 10 05:22:19.938170 zram_generator::config[1299]: No configuration found. Sep 10 05:22:20.032231 ldconfig[1245]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 10 05:22:20.158066 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 10 05:22:20.158507 systemd[1]: Reloading finished in 285 ms. Sep 10 05:22:20.189067 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 10 05:22:20.190793 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 10 05:22:20.205544 systemd[1]: Starting ensure-sysext.service... Sep 10 05:22:20.208000 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 10 05:22:20.225358 systemd[1]: Reload requested from client PID 1336 ('systemctl') (unit ensure-sysext.service)... Sep 10 05:22:20.225379 systemd[1]: Reloading... Sep 10 05:22:20.228953 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 10 05:22:20.229374 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 10 05:22:20.229844 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 10 05:22:20.230214 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 10 05:22:20.231423 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 10 05:22:20.231829 systemd-tmpfiles[1337]: ACLs are not supported, ignoring. Sep 10 05:22:20.231923 systemd-tmpfiles[1337]: ACLs are not supported, ignoring. Sep 10 05:22:20.237938 systemd-tmpfiles[1337]: Detected autofs mount point /boot during canonicalization of boot. 
Sep 10 05:22:20.237955 systemd-tmpfiles[1337]: Skipping /boot
Sep 10 05:22:20.250912 systemd-tmpfiles[1337]: Detected autofs mount point /boot during canonicalization of boot.
Sep 10 05:22:20.250928 systemd-tmpfiles[1337]: Skipping /boot
Sep 10 05:22:20.282209 zram_generator::config[1364]: No configuration found.
Sep 10 05:22:20.463561 systemd[1]: Reloading finished in 237 ms.
Sep 10 05:22:20.484073 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 10 05:22:20.500678 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 10 05:22:20.512691 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 10 05:22:20.515679 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 10 05:22:20.518472 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 10 05:22:20.527252 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 10 05:22:20.530464 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 10 05:22:20.534340 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 10 05:22:20.538411 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 10 05:22:20.538594 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 10 05:22:20.539789 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 10 05:22:20.550689 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 10 05:22:20.554256 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 10 05:22:20.555523 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 10 05:22:20.555689 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 10 05:22:20.558366 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 10 05:22:20.559564 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 10 05:22:20.561812 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 10 05:22:20.563648 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 10 05:22:20.564182 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 10 05:22:20.565787 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 10 05:22:20.565996 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 10 05:22:20.570222 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 10 05:22:20.570579 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 10 05:22:20.579067 systemd-udevd[1407]: Using default interface naming scheme 'v255'.
Sep 10 05:22:20.580543 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 10 05:22:20.580765 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 10 05:22:20.583723 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 10 05:22:20.586441 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 10 05:22:20.589575 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 10 05:22:20.590841 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 10 05:22:20.591024 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 10 05:22:20.597162 augenrules[1439]: No rules
Sep 10 05:22:20.600402 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 10 05:22:20.601646 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 10 05:22:20.603263 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 10 05:22:20.603581 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 10 05:22:20.605429 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 10 05:22:20.607386 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 10 05:22:20.607611 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 10 05:22:20.609360 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 10 05:22:20.609581 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 10 05:22:20.611372 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 10 05:22:20.611592 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 10 05:22:20.613544 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 10 05:22:20.616928 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 10 05:22:20.627184 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 10 05:22:20.631910 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 10 05:22:20.643210 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 10 05:22:20.648302 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 10 05:22:20.649406 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 10 05:22:20.654374 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 10 05:22:20.658384 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 10 05:22:20.660615 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 10 05:22:20.667292 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 10 05:22:20.669209 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 10 05:22:20.669257 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 10 05:22:20.674687 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 10 05:22:20.675979 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 10 05:22:20.676015 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 10 05:22:20.683348 systemd[1]: Finished ensure-sysext.service.
Sep 10 05:22:20.685412 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 10 05:22:20.685939 augenrules[1483]: /sbin/augenrules: No change
Sep 10 05:22:20.686217 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 10 05:22:20.687795 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 10 05:22:20.689351 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 10 05:22:20.690716 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 10 05:22:20.690992 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 10 05:22:20.692547 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 10 05:22:20.692758 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 10 05:22:20.708617 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 10 05:22:20.708691 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 10 05:22:20.712643 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 10 05:22:20.714572 augenrules[1511]: No rules
Sep 10 05:22:20.717269 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 10 05:22:20.717550 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 10 05:22:20.731953 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Sep 10 05:22:20.814894 systemd-resolved[1406]: Positive Trust Anchors:
Sep 10 05:22:20.814910 systemd-resolved[1406]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 10 05:22:20.814940 systemd-resolved[1406]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 10 05:22:20.819174 kernel: mousedev: PS/2 mouse device common for all mice
Sep 10 05:22:20.820222 systemd-resolved[1406]: Defaulting to hostname 'linux'.
Sep 10 05:22:20.821909 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 10 05:22:20.823153 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
Sep 10 05:22:20.824661 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 10 05:22:20.828171 kernel: ACPI: button: Power Button [PWRF]
Sep 10 05:22:20.829519 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 10 05:22:20.833253 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 10 05:22:20.862018 systemd-networkd[1495]: lo: Link UP
Sep 10 05:22:20.862464 systemd-networkd[1495]: lo: Gained carrier
Sep 10 05:22:20.865572 systemd-networkd[1495]: Enumeration completed
Sep 10 05:22:20.866023 systemd-networkd[1495]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 10 05:22:20.866033 systemd-networkd[1495]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 10 05:22:20.866306 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 10 05:22:20.867692 systemd-networkd[1495]: eth0: Link UP
Sep 10 05:22:20.868307 systemd-networkd[1495]: eth0: Gained carrier
Sep 10 05:22:20.868338 systemd-networkd[1495]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 10 05:22:20.868436 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 10 05:22:20.870396 systemd[1]: Reached target network.target - Network.
Sep 10 05:22:20.873658 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 10 05:22:20.876528 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 10 05:22:20.879289 systemd-networkd[1495]: eth0: DHCPv4 address 10.0.0.54/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 10 05:22:20.885402 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 10 05:22:20.886951 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 10 05:22:20.888114 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 10 05:22:20.889442 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 10 05:22:20.890881 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Sep 10 05:22:20.892777 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Sep 10 05:22:20.893101 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Sep 10 05:22:20.893527 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 10 05:22:20.894207 systemd-timesyncd[1514]: Contacted time server 10.0.0.1:123 (10.0.0.1).
Sep 10 05:22:20.894251 systemd-timesyncd[1514]: Initial clock synchronization to Wed 2025-09-10 05:22:21.212267 UTC.
Sep 10 05:22:20.894959 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 10 05:22:20.894990 systemd[1]: Reached target paths.target - Path Units.
Sep 10 05:22:20.895923 systemd[1]: Reached target time-set.target - System Time Set.
Sep 10 05:22:20.897217 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 10 05:22:20.898418 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 10 05:22:20.899670 systemd[1]: Reached target timers.target - Timer Units.
Sep 10 05:22:20.901352 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 10 05:22:20.904166 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 10 05:22:20.909779 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 10 05:22:20.913359 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 10 05:22:20.914623 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 10 05:22:20.919765 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 10 05:22:20.921459 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 10 05:22:20.924524 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 10 05:22:20.925938 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 10 05:22:20.928620 systemd[1]: Reached target sockets.target - Socket Units.
Sep 10 05:22:20.931266 systemd[1]: Reached target basic.target - Basic System.
Sep 10 05:22:20.932409 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 10 05:22:20.932440 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 10 05:22:20.936284 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 10 05:22:20.939303 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 10 05:22:20.942399 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 10 05:22:20.953287 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 10 05:22:20.955855 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 10 05:22:20.956943 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 10 05:22:20.958245 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Sep 10 05:22:20.962017 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 10 05:22:20.970794 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 10 05:22:20.976957 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 10 05:22:20.977766 google_oslogin_nss_cache[1556]: oslogin_cache_refresh[1556]: Refreshing passwd entry cache
Sep 10 05:22:20.978229 oslogin_cache_refresh[1556]: Refreshing passwd entry cache
Sep 10 05:22:20.980909 jq[1552]: false
Sep 10 05:22:20.980222 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 10 05:22:20.987748 google_oslogin_nss_cache[1556]: oslogin_cache_refresh[1556]: Failure getting users, quitting
Sep 10 05:22:20.987748 google_oslogin_nss_cache[1556]: oslogin_cache_refresh[1556]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 10 05:22:20.987748 google_oslogin_nss_cache[1556]: oslogin_cache_refresh[1556]: Refreshing group entry cache
Sep 10 05:22:20.986859 oslogin_cache_refresh[1556]: Failure getting users, quitting
Sep 10 05:22:20.986884 oslogin_cache_refresh[1556]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 10 05:22:20.986952 oslogin_cache_refresh[1556]: Refreshing group entry cache
Sep 10 05:22:20.991863 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 10 05:22:20.995048 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 10 05:22:20.995567 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 10 05:22:20.996067 google_oslogin_nss_cache[1556]: oslogin_cache_refresh[1556]: Failure getting groups, quitting
Sep 10 05:22:20.996067 google_oslogin_nss_cache[1556]: oslogin_cache_refresh[1556]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 10 05:22:20.996058 oslogin_cache_refresh[1556]: Failure getting groups, quitting
Sep 10 05:22:20.996070 oslogin_cache_refresh[1556]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 10 05:22:20.996258 extend-filesystems[1554]: Found /dev/vda6
Sep 10 05:22:20.996839 systemd[1]: Starting update-engine.service - Update Engine...
Sep 10 05:22:21.001378 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 10 05:22:21.008785 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 10 05:22:21.011030 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 10 05:22:21.011359 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 10 05:22:21.011719 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Sep 10 05:22:21.011983 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Sep 10 05:22:21.012540 extend-filesystems[1554]: Found /dev/vda9
Sep 10 05:22:21.013495 systemd[1]: motdgen.service: Deactivated successfully.
Sep 10 05:22:21.013766 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 10 05:22:21.019607 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 10 05:22:21.019930 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 10 05:22:21.021670 jq[1573]: true
Sep 10 05:22:21.038188 extend-filesystems[1554]: Checking size of /dev/vda9
Sep 10 05:22:21.041976 (ntainerd)[1588]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 10 05:22:21.044936 jq[1581]: true
Sep 10 05:22:21.066756 update_engine[1571]: I20250910 05:22:21.053143 1571 main.cc:92] Flatcar Update Engine starting
Sep 10 05:22:21.077724 extend-filesystems[1554]: Resized partition /dev/vda9
Sep 10 05:22:21.086042 tar[1579]: linux-amd64/LICENSE
Sep 10 05:22:21.086042 tar[1579]: linux-amd64/helm
Sep 10 05:22:21.091673 extend-filesystems[1599]: resize2fs 1.47.3 (8-Jul-2025)
Sep 10 05:22:21.099199 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
Sep 10 05:22:21.104693 dbus-daemon[1550]: [system] SELinux support is enabled
Sep 10 05:22:21.104931 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 10 05:22:21.105945 sshd_keygen[1577]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 10 05:22:21.108806 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 10 05:22:21.109762 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 10 05:22:21.111486 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 10 05:22:21.111509 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 10 05:22:21.124785 update_engine[1571]: I20250910 05:22:21.123072 1571 update_check_scheduler.cc:74] Next update check in 7m13s
Sep 10 05:22:21.125060 systemd-logind[1570]: New seat seat0.
Sep 10 05:22:21.127960 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 10 05:22:21.130552 systemd[1]: Started update-engine.service - Update Engine.
Sep 10 05:22:21.135730 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 10 05:22:21.139704 kernel: EXT4-fs (vda9): resized filesystem to 1864699
Sep 10 05:22:21.139459 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 10 05:22:21.161829 extend-filesystems[1599]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Sep 10 05:22:21.161829 extend-filesystems[1599]: old_desc_blocks = 1, new_desc_blocks = 1
Sep 10 05:22:21.161829 extend-filesystems[1599]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
Sep 10 05:22:21.167955 extend-filesystems[1554]: Resized filesystem in /dev/vda9
Sep 10 05:22:21.164090 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 10 05:22:21.169254 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 10 05:22:21.176595 bash[1625]: Updated "/home/core/.ssh/authorized_keys"
Sep 10 05:22:21.181833 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 10 05:22:21.190137 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 10 05:22:21.204226 kernel: kvm_amd: TSC scaling supported
Sep 10 05:22:21.204260 kernel: kvm_amd: Nested Virtualization enabled
Sep 10 05:22:21.204273 kernel: kvm_amd: Nested Paging enabled
Sep 10 05:22:21.205210 kernel: kvm_amd: LBR virtualization supported
Sep 10 05:22:21.210648 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 10 05:22:21.218199 systemd-logind[1570]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Sep 10 05:22:21.220197 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
Sep 10 05:22:21.220246 kernel: kvm_amd: Virtual GIF supported
Sep 10 05:22:21.223393 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 10 05:22:21.234373 systemd-logind[1570]: Watching system buttons on /dev/input/event2 (Power Button)
Sep 10 05:22:21.271259 systemd[1]: issuegen.service: Deactivated successfully.
Sep 10 05:22:21.271556 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 10 05:22:21.273612 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 10 05:22:21.282710 locksmithd[1619]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 10 05:22:21.288832 kernel: EDAC MC: Ver: 3.0.0
Sep 10 05:22:21.294036 containerd[1588]: time="2025-09-10T05:22:21Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 10 05:22:21.294852 containerd[1588]: time="2025-09-10T05:22:21.294815306Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 10 05:22:21.308197 containerd[1588]: time="2025-09-10T05:22:21.307688077Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.209µs"
Sep 10 05:22:21.308197 containerd[1588]: time="2025-09-10T05:22:21.307720371Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 10 05:22:21.308197 containerd[1588]: time="2025-09-10T05:22:21.307737028Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 10 05:22:21.308197 containerd[1588]: time="2025-09-10T05:22:21.307914360Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 10 05:22:21.308197 containerd[1588]: time="2025-09-10T05:22:21.307927945Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 10 05:22:21.308197 containerd[1588]: time="2025-09-10T05:22:21.307951519Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 10 05:22:21.308197 containerd[1588]: time="2025-09-10T05:22:21.308013772Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 10 05:22:21.308197 containerd[1588]: time="2025-09-10T05:22:21.308025482Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 10 05:22:21.308444 containerd[1588]: time="2025-09-10T05:22:21.308314622Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 10 05:22:21.308444 containerd[1588]: time="2025-09-10T05:22:21.308329747Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 10 05:22:21.308444 containerd[1588]: time="2025-09-10T05:22:21.308339613Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 10 05:22:21.308444 containerd[1588]: time="2025-09-10T05:22:21.308347415Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 10 05:22:21.308532 containerd[1588]: time="2025-09-10T05:22:21.308473735Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 10 05:22:21.308737 containerd[1588]: time="2025-09-10T05:22:21.308705799Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 10 05:22:21.308769 containerd[1588]: time="2025-09-10T05:22:21.308746562Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 10 05:22:21.308769 containerd[1588]: time="2025-09-10T05:22:21.308757875Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 10 05:22:21.308932 containerd[1588]: time="2025-09-10T05:22:21.308887883Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 10 05:22:21.309243 containerd[1588]: time="2025-09-10T05:22:21.309197128Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 10 05:22:21.309326 containerd[1588]: time="2025-09-10T05:22:21.309294185Z" level=info msg="metadata content store policy set" policy=shared
Sep 10 05:22:21.318220 containerd[1588]: time="2025-09-10T05:22:21.315780725Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 10 05:22:21.318220 containerd[1588]: time="2025-09-10T05:22:21.315822321Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 10 05:22:21.318220 containerd[1588]: time="2025-09-10T05:22:21.315848666Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 10 05:22:21.318220 containerd[1588]: time="2025-09-10T05:22:21.315860000Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 10 05:22:21.318220 containerd[1588]: time="2025-09-10T05:22:21.315870886Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 10 05:22:21.318220 containerd[1588]: time="2025-09-10T05:22:21.315880887Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 10 05:22:21.318220 containerd[1588]: time="2025-09-10T05:22:21.315893179Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 10 05:22:21.318220 containerd[1588]: time="2025-09-10T05:22:21.315915733Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 10 05:22:21.318220 containerd[1588]: time="2025-09-10T05:22:21.315928004Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 10 05:22:21.318220 containerd[1588]: time="2025-09-10T05:22:21.315938098Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 10 05:22:21.318220 containerd[1588]: time="2025-09-10T05:22:21.315946245Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 10 05:22:21.318220 containerd[1588]: time="2025-09-10T05:22:21.315958630Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 10 05:22:21.318220 containerd[1588]: time="2025-09-10T05:22:21.316076606Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 10 05:22:21.318220 containerd[1588]: time="2025-09-10T05:22:21.316098118Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 10 05:22:21.318541 containerd[1588]: time="2025-09-10T05:22:21.316112305Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 10 05:22:21.318541 containerd[1588]: time="2025-09-10T05:22:21.316121921Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 10 05:22:21.318541 containerd[1588]: time="2025-09-10T05:22:21.316136005Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 10 05:22:21.318541 containerd[1588]: time="2025-09-10T05:22:21.316175944Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 10 05:22:21.318541 containerd[1588]: time="2025-09-10T05:22:21.316188487Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 10 05:22:21.318541 containerd[1588]: time="2025-09-10T05:22:21.316197780Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 10 05:22:21.318541 containerd[1588]: time="2025-09-10T05:22:21.316218759Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 10 05:22:21.318541 containerd[1588]: time="2025-09-10T05:22:21.316230374Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 10 05:22:21.318541 containerd[1588]: time="2025-09-10T05:22:21.316240209Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 10 05:22:21.318541 containerd[1588]: time="2025-09-10T05:22:21.316300400Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 10 05:22:21.318541 containerd[1588]: time="2025-09-10T05:22:21.316313618Z" level=info msg="Start snapshots syncer"
Sep 10 05:22:21.318541 containerd[1588]: time="2025-09-10T05:22:21.316334151Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 10 05:22:21.318782 containerd[1588]: time="2025-09-10T05:22:21.316543110Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 10 05:22:21.318782 containerd[1588]: time="2025-09-10T05:22:21.316584363Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 10 05:22:21.322305 containerd[1588]: time="2025-09-10T05:22:21.320357047Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 10 05:22:21.322305 containerd[1588]: time="2025-09-10T05:22:21.320535368Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 10 05:22:21.322305 containerd[1588]: time="2025-09-10T05:22:21.320557974Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 10 05:22:21.322305 containerd[1588]: time="2025-09-10T05:22:21.320596602Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 10 05:22:21.322305 containerd[1588]: time="2025-09-10T05:22:21.320606404Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 10 05:22:21.322305 containerd[1588]: time="2025-09-10T05:22:21.320616623Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 10 05:22:21.322305 containerd[1588]: time="2025-09-10T05:22:21.320626321Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 10 05:22:21.322305 containerd[1588]: time="2025-09-10T05:22:21.320636791Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 10 05:22:21.322305 containerd[1588]: time="2025-09-10T05:22:21.320685836Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 10 05:22:21.322305 containerd[1588]: time="2025-09-10T05:22:21.320703431Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 10 05:22:21.322305 containerd[1588]: time="2025-09-10T05:22:21.320715264Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 10 05:22:21.322305 containerd[1588]: time="2025-09-10T05:22:21.320770163Z" level=info msg="loading plugin"
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 10 05:22:21.322305 containerd[1588]: time="2025-09-10T05:22:21.320785299Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 10 05:22:21.322305 containerd[1588]: time="2025-09-10T05:22:21.320795092Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 10 05:22:21.322599 containerd[1588]: time="2025-09-10T05:22:21.320805446Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 10 05:22:21.322599 containerd[1588]: time="2025-09-10T05:22:21.320814175Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 10 05:22:21.322599 containerd[1588]: time="2025-09-10T05:22:21.320833989Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 10 05:22:21.328449 containerd[1588]: time="2025-09-10T05:22:21.324393203Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 10 05:22:21.328449 containerd[1588]: time="2025-09-10T05:22:21.324442383Z" level=info msg="runtime interface created" Sep 10 05:22:21.328449 containerd[1588]: time="2025-09-10T05:22:21.324449727Z" level=info msg="created NRI interface" Sep 10 05:22:21.328449 containerd[1588]: time="2025-09-10T05:22:21.324463947Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 10 05:22:21.328449 containerd[1588]: time="2025-09-10T05:22:21.324488792Z" level=info msg="Connect containerd service" Sep 10 05:22:21.328449 containerd[1588]: time="2025-09-10T05:22:21.324530773Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 10 05:22:21.328449 
containerd[1588]: time="2025-09-10T05:22:21.325374559Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 10 05:22:21.334232 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 10 05:22:21.389303 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 10 05:22:21.407526 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 10 05:22:21.410332 systemd[1]: Reached target getty.target - Login Prompts. Sep 10 05:22:21.446538 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 10 05:22:22.206197 containerd[1588]: time="2025-09-10T05:22:22.206054829Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 10 05:22:22.206197 containerd[1588]: time="2025-09-10T05:22:22.206139208Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 10 05:22:22.206914 containerd[1588]: time="2025-09-10T05:22:22.206506537Z" level=info msg="Start subscribing containerd event" Sep 10 05:22:22.206914 containerd[1588]: time="2025-09-10T05:22:22.206554196Z" level=info msg="Start recovering state" Sep 10 05:22:22.206914 containerd[1588]: time="2025-09-10T05:22:22.206684846Z" level=info msg="Start event monitor" Sep 10 05:22:22.206914 containerd[1588]: time="2025-09-10T05:22:22.206703363Z" level=info msg="Start cni network conf syncer for default" Sep 10 05:22:22.206914 containerd[1588]: time="2025-09-10T05:22:22.206717493Z" level=info msg="Start streaming server" Sep 10 05:22:22.206914 containerd[1588]: time="2025-09-10T05:22:22.206728483Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 10 05:22:22.206914 containerd[1588]: time="2025-09-10T05:22:22.206735336Z" level=info msg="runtime interface starting up..." 
Sep 10 05:22:22.206914 containerd[1588]: time="2025-09-10T05:22:22.206749394Z" level=info msg="starting plugins..." Sep 10 05:22:22.206914 containerd[1588]: time="2025-09-10T05:22:22.206772171Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 10 05:22:22.207308 containerd[1588]: time="2025-09-10T05:22:22.207265059Z" level=info msg="containerd successfully booted in 0.913734s" Sep 10 05:22:22.207509 systemd[1]: Started containerd.service - containerd container runtime. Sep 10 05:22:22.432359 tar[1579]: linux-amd64/README.md Sep 10 05:22:22.444397 systemd-networkd[1495]: eth0: Gained IPv6LL Sep 10 05:22:22.460652 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 10 05:22:22.465080 systemd[1]: Reached target network-online.target - Network is Online. Sep 10 05:22:22.472089 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Sep 10 05:22:22.477054 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 10 05:22:22.480290 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 10 05:22:22.499369 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 10 05:22:22.538480 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 10 05:22:22.542387 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 10 05:22:22.542792 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Sep 10 05:22:22.544846 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 10 05:22:23.786132 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 10 05:22:23.788374 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 10 05:22:23.790362 systemd[1]: Startup finished in 2.861s (kernel) + 6.187s (initrd) + 5.035s (userspace) = 14.084s. 
Sep 10 05:22:23.790890 (kubelet)[1695]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 10 05:22:24.429983 kubelet[1695]: E0910 05:22:24.429896 1695 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 10 05:22:24.435056 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 10 05:22:24.435324 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 10 05:22:24.435918 systemd[1]: kubelet.service: Consumed 1.707s CPU time, 267.5M memory peak. Sep 10 05:22:26.212941 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 10 05:22:26.214696 systemd[1]: Started sshd@0-10.0.0.54:22-10.0.0.1:54328.service - OpenSSH per-connection server daemon (10.0.0.1:54328). Sep 10 05:22:26.399503 sshd[1709]: Accepted publickey for core from 10.0.0.1 port 54328 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:22:26.401476 sshd-session[1709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:22:26.408871 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 10 05:22:26.410221 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 10 05:22:26.417444 systemd-logind[1570]: New session 1 of user core. Sep 10 05:22:26.435451 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 10 05:22:26.439514 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Sep 10 05:22:26.456651 (systemd)[1714]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 10 05:22:26.459360 systemd-logind[1570]: New session c1 of user core. Sep 10 05:22:26.618445 systemd[1714]: Queued start job for default target default.target. Sep 10 05:22:26.630899 systemd[1714]: Created slice app.slice - User Application Slice. Sep 10 05:22:26.630935 systemd[1714]: Reached target paths.target - Paths. Sep 10 05:22:26.630984 systemd[1714]: Reached target timers.target - Timers. Sep 10 05:22:26.632985 systemd[1714]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 10 05:22:26.645246 systemd[1714]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 10 05:22:26.645467 systemd[1714]: Reached target sockets.target - Sockets. Sep 10 05:22:26.645541 systemd[1714]: Reached target basic.target - Basic System. Sep 10 05:22:26.645599 systemd[1714]: Reached target default.target - Main User Target. Sep 10 05:22:26.645650 systemd[1714]: Startup finished in 179ms. Sep 10 05:22:26.645783 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 10 05:22:26.647598 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 10 05:22:26.721960 systemd[1]: Started sshd@1-10.0.0.54:22-10.0.0.1:54336.service - OpenSSH per-connection server daemon (10.0.0.1:54336). Sep 10 05:22:26.782062 sshd[1725]: Accepted publickey for core from 10.0.0.1 port 54336 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:22:26.783789 sshd-session[1725]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:22:26.788978 systemd-logind[1570]: New session 2 of user core. Sep 10 05:22:26.802269 systemd[1]: Started session-2.scope - Session 2 of User core. 
Sep 10 05:22:26.858980 sshd[1728]: Connection closed by 10.0.0.1 port 54336 Sep 10 05:22:26.859281 sshd-session[1725]: pam_unix(sshd:session): session closed for user core Sep 10 05:22:26.874925 systemd[1]: sshd@1-10.0.0.54:22-10.0.0.1:54336.service: Deactivated successfully. Sep 10 05:22:26.876804 systemd[1]: session-2.scope: Deactivated successfully. Sep 10 05:22:26.877722 systemd-logind[1570]: Session 2 logged out. Waiting for processes to exit. Sep 10 05:22:26.880530 systemd[1]: Started sshd@2-10.0.0.54:22-10.0.0.1:54340.service - OpenSSH per-connection server daemon (10.0.0.1:54340). Sep 10 05:22:26.881353 systemd-logind[1570]: Removed session 2. Sep 10 05:22:26.938161 sshd[1734]: Accepted publickey for core from 10.0.0.1 port 54340 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:22:26.939417 sshd-session[1734]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:22:26.944187 systemd-logind[1570]: New session 3 of user core. Sep 10 05:22:26.961322 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 10 05:22:27.012492 sshd[1737]: Connection closed by 10.0.0.1 port 54340 Sep 10 05:22:27.012874 sshd-session[1734]: pam_unix(sshd:session): session closed for user core Sep 10 05:22:27.022401 systemd[1]: sshd@2-10.0.0.54:22-10.0.0.1:54340.service: Deactivated successfully. Sep 10 05:22:27.024646 systemd[1]: session-3.scope: Deactivated successfully. Sep 10 05:22:27.025459 systemd-logind[1570]: Session 3 logged out. Waiting for processes to exit. Sep 10 05:22:27.029102 systemd[1]: Started sshd@3-10.0.0.54:22-10.0.0.1:54354.service - OpenSSH per-connection server daemon (10.0.0.1:54354). Sep 10 05:22:27.029727 systemd-logind[1570]: Removed session 3. 
Sep 10 05:22:27.090013 sshd[1743]: Accepted publickey for core from 10.0.0.1 port 54354 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:22:27.091572 sshd-session[1743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:22:27.096434 systemd-logind[1570]: New session 4 of user core. Sep 10 05:22:27.114323 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 10 05:22:27.170231 sshd[1746]: Connection closed by 10.0.0.1 port 54354 Sep 10 05:22:27.170217 sshd-session[1743]: pam_unix(sshd:session): session closed for user core Sep 10 05:22:27.179928 systemd[1]: sshd@3-10.0.0.54:22-10.0.0.1:54354.service: Deactivated successfully. Sep 10 05:22:27.181906 systemd[1]: session-4.scope: Deactivated successfully. Sep 10 05:22:27.182830 systemd-logind[1570]: Session 4 logged out. Waiting for processes to exit. Sep 10 05:22:27.186088 systemd[1]: Started sshd@4-10.0.0.54:22-10.0.0.1:54364.service - OpenSSH per-connection server daemon (10.0.0.1:54364). Sep 10 05:22:27.186669 systemd-logind[1570]: Removed session 4. Sep 10 05:22:27.252722 sshd[1752]: Accepted publickey for core from 10.0.0.1 port 54364 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:22:27.254238 sshd-session[1752]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:22:27.259067 systemd-logind[1570]: New session 5 of user core. Sep 10 05:22:27.269333 systemd[1]: Started session-5.scope - Session 5 of User core. 
Sep 10 05:22:27.330587 sudo[1756]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 10 05:22:27.330928 sudo[1756]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 10 05:22:27.354233 sudo[1756]: pam_unix(sudo:session): session closed for user root Sep 10 05:22:27.355976 sshd[1755]: Connection closed by 10.0.0.1 port 54364 Sep 10 05:22:27.356538 sshd-session[1752]: pam_unix(sshd:session): session closed for user core Sep 10 05:22:27.366094 systemd[1]: sshd@4-10.0.0.54:22-10.0.0.1:54364.service: Deactivated successfully. Sep 10 05:22:27.368070 systemd[1]: session-5.scope: Deactivated successfully. Sep 10 05:22:27.368913 systemd-logind[1570]: Session 5 logged out. Waiting for processes to exit. Sep 10 05:22:27.372092 systemd[1]: Started sshd@5-10.0.0.54:22-10.0.0.1:54376.service - OpenSSH per-connection server daemon (10.0.0.1:54376). Sep 10 05:22:27.372999 systemd-logind[1570]: Removed session 5. Sep 10 05:22:27.424101 sshd[1762]: Accepted publickey for core from 10.0.0.1 port 54376 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:22:27.425407 sshd-session[1762]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:22:27.429945 systemd-logind[1570]: New session 6 of user core. Sep 10 05:22:27.441287 systemd[1]: Started session-6.scope - Session 6 of User core. 
Sep 10 05:22:27.496431 sudo[1767]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 10 05:22:27.496768 sudo[1767]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 10 05:22:27.504110 sudo[1767]: pam_unix(sudo:session): session closed for user root Sep 10 05:22:27.511343 sudo[1766]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 10 05:22:27.511668 sudo[1766]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 10 05:22:27.521780 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 10 05:22:27.575567 augenrules[1789]: No rules Sep 10 05:22:27.577311 systemd[1]: audit-rules.service: Deactivated successfully. Sep 10 05:22:27.577607 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 10 05:22:27.578821 sudo[1766]: pam_unix(sudo:session): session closed for user root Sep 10 05:22:27.580260 sshd[1765]: Connection closed by 10.0.0.1 port 54376 Sep 10 05:22:27.580664 sshd-session[1762]: pam_unix(sshd:session): session closed for user core Sep 10 05:22:27.587853 systemd[1]: sshd@5-10.0.0.54:22-10.0.0.1:54376.service: Deactivated successfully. Sep 10 05:22:27.589816 systemd[1]: session-6.scope: Deactivated successfully. Sep 10 05:22:27.590548 systemd-logind[1570]: Session 6 logged out. Waiting for processes to exit. Sep 10 05:22:27.593668 systemd[1]: Started sshd@6-10.0.0.54:22-10.0.0.1:54388.service - OpenSSH per-connection server daemon (10.0.0.1:54388). Sep 10 05:22:27.594333 systemd-logind[1570]: Removed session 6. Sep 10 05:22:27.647218 sshd[1798]: Accepted publickey for core from 10.0.0.1 port 54388 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:22:27.648355 sshd-session[1798]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:22:27.652810 systemd-logind[1570]: New session 7 of user core. 
Sep 10 05:22:27.662374 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 10 05:22:27.718796 sudo[1802]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 10 05:22:27.719167 sudo[1802]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 10 05:22:28.679942 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 10 05:22:28.704688 (dockerd)[1822]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 10 05:22:29.225151 dockerd[1822]: time="2025-09-10T05:22:29.225048545Z" level=info msg="Starting up" Sep 10 05:22:29.226534 dockerd[1822]: time="2025-09-10T05:22:29.226498799Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 10 05:22:29.254762 dockerd[1822]: time="2025-09-10T05:22:29.254673826Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 10 05:22:29.531281 dockerd[1822]: time="2025-09-10T05:22:29.531103401Z" level=info msg="Loading containers: start." Sep 10 05:22:29.542971 kernel: Initializing XFRM netlink socket Sep 10 05:22:29.820130 systemd-networkd[1495]: docker0: Link UP Sep 10 05:22:29.829181 dockerd[1822]: time="2025-09-10T05:22:29.829114009Z" level=info msg="Loading containers: done." Sep 10 05:22:29.843407 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck4057403013-merged.mount: Deactivated successfully. 
Sep 10 05:22:29.844523 dockerd[1822]: time="2025-09-10T05:22:29.844484562Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 10 05:22:29.844632 dockerd[1822]: time="2025-09-10T05:22:29.844607994Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 10 05:22:29.844748 dockerd[1822]: time="2025-09-10T05:22:29.844725841Z" level=info msg="Initializing buildkit" Sep 10 05:22:29.879016 dockerd[1822]: time="2025-09-10T05:22:29.878972104Z" level=info msg="Completed buildkit initialization" Sep 10 05:22:29.883635 dockerd[1822]: time="2025-09-10T05:22:29.883607286Z" level=info msg="Daemon has completed initialization" Sep 10 05:22:29.883766 dockerd[1822]: time="2025-09-10T05:22:29.883701906Z" level=info msg="API listen on /run/docker.sock" Sep 10 05:22:29.883860 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 10 05:22:30.873385 containerd[1588]: time="2025-09-10T05:22:30.873333287Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\"" Sep 10 05:22:31.541436 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3554402045.mount: Deactivated successfully. 
Sep 10 05:22:33.488240 containerd[1588]: time="2025-09-10T05:22:33.488169832Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:22:33.488745 containerd[1588]: time="2025-09-10T05:22:33.488701838Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.5: active requests=0, bytes read=30114893" Sep 10 05:22:33.489900 containerd[1588]: time="2025-09-10T05:22:33.489868469Z" level=info msg="ImageCreate event name:\"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:22:33.492406 containerd[1588]: time="2025-09-10T05:22:33.492349299Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:22:33.493574 containerd[1588]: time="2025-09-10T05:22:33.493538946Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.5\" with image id \"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\", size \"30111492\" in 2.620163898s" Sep 10 05:22:33.493618 containerd[1588]: time="2025-09-10T05:22:33.493578645Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\" returns image reference \"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\"" Sep 10 05:22:33.494754 containerd[1588]: time="2025-09-10T05:22:33.494723977Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\"" Sep 10 05:22:34.685842 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
Sep 10 05:22:34.688403 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 10 05:22:34.935756 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 10 05:22:34.944051 (kubelet)[2110]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 10 05:22:34.987594 kubelet[2110]: E0910 05:22:34.987531 2110 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 10 05:22:34.994369 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 10 05:22:34.994582 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 10 05:22:34.994976 systemd[1]: kubelet.service: Consumed 274ms CPU time, 110.8M memory peak. 
Sep 10 05:22:35.270203 containerd[1588]: time="2025-09-10T05:22:35.269972954Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:22:35.271005 containerd[1588]: time="2025-09-10T05:22:35.270934648Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.5: active requests=0, bytes read=26020844" Sep 10 05:22:35.274203 containerd[1588]: time="2025-09-10T05:22:35.274089649Z" level=info msg="ImageCreate event name:\"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:22:35.278848 containerd[1588]: time="2025-09-10T05:22:35.278781744Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:22:35.280012 containerd[1588]: time="2025-09-10T05:22:35.279949519Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.5\" with image id \"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\", size \"27681301\" in 1.785192663s" Sep 10 05:22:35.280012 containerd[1588]: time="2025-09-10T05:22:35.279997571Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\" returns image reference \"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\"" Sep 10 05:22:35.280824 containerd[1588]: time="2025-09-10T05:22:35.280774928Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\"" Sep 10 05:22:36.804883 containerd[1588]: time="2025-09-10T05:22:36.804801258Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-scheduler:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:22:36.805790 containerd[1588]: time="2025-09-10T05:22:36.805730330Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.5: active requests=0, bytes read=20155568" Sep 10 05:22:36.807062 containerd[1588]: time="2025-09-10T05:22:36.807027785Z" level=info msg="ImageCreate event name:\"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:22:36.812170 containerd[1588]: time="2025-09-10T05:22:36.812064190Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:22:36.812924 containerd[1588]: time="2025-09-10T05:22:36.812882021Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.5\" with image id \"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\", size \"21816043\" in 1.532077735s" Sep 10 05:22:36.812924 containerd[1588]: time="2025-09-10T05:22:36.812923137Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\" returns image reference \"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\"" Sep 10 05:22:36.813845 containerd[1588]: time="2025-09-10T05:22:36.813809763Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\"" Sep 10 05:22:37.775582 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1488846978.mount: Deactivated successfully. 
Sep 10 05:22:38.482047 containerd[1588]: time="2025-09-10T05:22:38.481978324Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:22:38.482827 containerd[1588]: time="2025-09-10T05:22:38.482763050Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.5: active requests=0, bytes read=31929469" Sep 10 05:22:38.484031 containerd[1588]: time="2025-09-10T05:22:38.483955798Z" level=info msg="ImageCreate event name:\"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:22:38.485786 containerd[1588]: time="2025-09-10T05:22:38.485728331Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:22:38.486385 containerd[1588]: time="2025-09-10T05:22:38.486331939Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.5\" with image id \"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\", repo tag \"registry.k8s.io/kube-proxy:v1.33.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\", size \"31928488\" in 1.672486839s" Sep 10 05:22:38.486385 containerd[1588]: time="2025-09-10T05:22:38.486378104Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\" returns image reference \"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\"" Sep 10 05:22:38.486917 containerd[1588]: time="2025-09-10T05:22:38.486891674Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Sep 10 05:22:39.311485 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2716598808.mount: Deactivated successfully. 
Sep 10 05:22:40.597021 containerd[1588]: time="2025-09-10T05:22:40.596952537Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:22:40.597796 containerd[1588]: time="2025-09-10T05:22:40.597745325Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942238" Sep 10 05:22:40.598835 containerd[1588]: time="2025-09-10T05:22:40.598801973Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:22:40.601327 containerd[1588]: time="2025-09-10T05:22:40.601293492Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:22:40.602313 containerd[1588]: time="2025-09-10T05:22:40.602284733Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 2.115365645s" Sep 10 05:22:40.602359 containerd[1588]: time="2025-09-10T05:22:40.602320080Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Sep 10 05:22:40.603503 containerd[1588]: time="2025-09-10T05:22:40.603447031Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 10 05:22:41.121119 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3582449492.mount: Deactivated successfully. 
Sep 10 05:22:41.233544 containerd[1588]: time="2025-09-10T05:22:41.233459117Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 10 05:22:41.276885 containerd[1588]: time="2025-09-10T05:22:41.276844249Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Sep 10 05:22:41.302717 containerd[1588]: time="2025-09-10T05:22:41.302673916Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 10 05:22:41.332630 containerd[1588]: time="2025-09-10T05:22:41.332551974Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 10 05:22:41.333284 containerd[1588]: time="2025-09-10T05:22:41.333254034Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 729.764102ms"
Sep 10 05:22:41.333284 containerd[1588]: time="2025-09-10T05:22:41.333284354Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 10 05:22:41.333870 containerd[1588]: time="2025-09-10T05:22:41.333837105Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\""
Sep 10 05:22:42.142746 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount276516158.mount: Deactivated successfully.
Sep 10 05:22:45.014234 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 10 05:22:45.016260 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 10 05:22:45.049337 containerd[1588]: time="2025-09-10T05:22:45.049287888Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 05:22:45.050185 containerd[1588]: time="2025-09-10T05:22:45.050150332Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58378433"
Sep 10 05:22:45.051330 containerd[1588]: time="2025-09-10T05:22:45.051285034Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 05:22:45.054691 containerd[1588]: time="2025-09-10T05:22:45.054602166Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 05:22:45.055574 containerd[1588]: time="2025-09-10T05:22:45.055526566Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 3.721508925s"
Sep 10 05:22:45.055574 containerd[1588]: time="2025-09-10T05:22:45.055560374Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\""
Sep 10 05:22:45.284601 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 05:22:45.300669 (kubelet)[2257]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 10 05:22:45.577098 kubelet[2257]: E0910 05:22:45.576908 2257 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 10 05:22:45.581270 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 10 05:22:45.581454 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 10 05:22:45.581844 systemd[1]: kubelet.service: Consumed 433ms CPU time, 112.9M memory peak.
Sep 10 05:22:48.511747 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 05:22:48.511983 systemd[1]: kubelet.service: Consumed 433ms CPU time, 112.9M memory peak.
Sep 10 05:22:48.514928 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 10 05:22:48.541610 systemd[1]: Reload requested from client PID 2289 ('systemctl') (unit session-7.scope)...
Sep 10 05:22:48.541635 systemd[1]: Reloading...
Sep 10 05:22:48.640169 zram_generator::config[2338]: No configuration found.
Sep 10 05:22:48.939929 systemd[1]: Reloading finished in 397 ms.
Sep 10 05:22:49.024352 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 10 05:22:49.024476 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 10 05:22:49.024960 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 05:22:49.025065 systemd[1]: kubelet.service: Consumed 169ms CPU time, 98.3M memory peak.
Sep 10 05:22:49.028374 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 10 05:22:49.253925 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 05:22:49.266546 (kubelet)[2380]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 10 05:22:49.317922 kubelet[2380]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 10 05:22:49.317922 kubelet[2380]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 10 05:22:49.317922 kubelet[2380]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 10 05:22:49.318372 kubelet[2380]: I0910 05:22:49.317980 2380 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 10 05:22:49.898668 kubelet[2380]: I0910 05:22:49.898606 2380 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Sep 10 05:22:49.898668 kubelet[2380]: I0910 05:22:49.898653 2380 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 10 05:22:49.899003 kubelet[2380]: I0910 05:22:49.898982 2380 server.go:956] "Client rotation is on, will bootstrap in background"
Sep 10 05:22:49.948330 kubelet[2380]: E0910 05:22:49.948261 2380 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.54:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Sep 10 05:22:49.949470 kubelet[2380]: I0910 05:22:49.949424 2380 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 10 05:22:49.960197 kubelet[2380]: I0910 05:22:49.960159 2380 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 10 05:22:49.966692 kubelet[2380]: I0910 05:22:49.966658 2380 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 10 05:22:49.966967 kubelet[2380]: I0910 05:22:49.966926 2380 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 10 05:22:49.967194 kubelet[2380]: I0910 05:22:49.966951 2380 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 10 05:22:49.967665 kubelet[2380]: I0910 05:22:49.967200 2380 topology_manager.go:138] "Creating topology manager with none policy"
Sep 10 05:22:49.967665 kubelet[2380]: I0910 05:22:49.967209 2380 container_manager_linux.go:303] "Creating device plugin manager"
Sep 10 05:22:49.967665 kubelet[2380]: I0910 05:22:49.967370 2380 state_mem.go:36] "Initialized new in-memory state store"
Sep 10 05:22:49.970108 kubelet[2380]: I0910 05:22:49.970060 2380 kubelet.go:480] "Attempting to sync node with API server"
Sep 10 05:22:49.970108 kubelet[2380]: I0910 05:22:49.970085 2380 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 10 05:22:49.970304 kubelet[2380]: I0910 05:22:49.970124 2380 kubelet.go:386] "Adding apiserver pod source"
Sep 10 05:22:49.970304 kubelet[2380]: I0910 05:22:49.970161 2380 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 10 05:22:49.980724 kubelet[2380]: E0910 05:22:49.980398 2380 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.54:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Sep 10 05:22:49.980793 kubelet[2380]: E0910 05:22:49.980654 2380 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.54:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Sep 10 05:22:49.981753 kubelet[2380]: I0910 05:22:49.981717 2380 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 10 05:22:49.982813 kubelet[2380]: I0910 05:22:49.982763 2380 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Sep 10 05:22:49.984984 kubelet[2380]: W0910 05:22:49.984932 2380 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 10 05:22:49.989850 kubelet[2380]: I0910 05:22:49.989814 2380 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 10 05:22:49.989971 kubelet[2380]: I0910 05:22:49.989882 2380 server.go:1289] "Started kubelet"
Sep 10 05:22:49.991870 kubelet[2380]: I0910 05:22:49.991787 2380 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 10 05:22:49.994765 kubelet[2380]: I0910 05:22:49.993499 2380 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 10 05:22:49.994765 kubelet[2380]: I0910 05:22:49.993511 2380 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 10 05:22:49.994765 kubelet[2380]: I0910 05:22:49.993664 2380 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 10 05:22:49.996207 kubelet[2380]: I0910 05:22:49.996165 2380 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Sep 10 05:22:49.997613 kubelet[2380]: I0910 05:22:49.997585 2380 factory.go:223] Registration of the systemd container factory successfully
Sep 10 05:22:49.997731 kubelet[2380]: I0910 05:22:49.997708 2380 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 10 05:22:49.997924 kubelet[2380]: I0910 05:22:49.997896 2380 server.go:317] "Adding debug handlers to kubelet server"
Sep 10 05:22:49.998813 kubelet[2380]: E0910 05:22:49.996708 2380 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.54:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.54:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1863d45bb7781f6c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-10 05:22:49.989840748 +0000 UTC m=+0.712090108,LastTimestamp:2025-09-10 05:22:49.989840748 +0000 UTC m=+0.712090108,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 10 05:22:49.998941 kubelet[2380]: I0910 05:22:49.998916 2380 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 10 05:22:49.999075 kubelet[2380]: I0910 05:22:49.999041 2380 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 10 05:22:49.999121 kubelet[2380]: I0910 05:22:49.999115 2380 reconciler.go:26] "Reconciler: start to sync state"
Sep 10 05:22:49.999512 kubelet[2380]: I0910 05:22:49.999494 2380 factory.go:223] Registration of the containerd container factory successfully
Sep 10 05:22:49.999559 kubelet[2380]: E0910 05:22:49.999537 2380 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.54:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Sep 10 05:22:50.003832 kubelet[2380]: E0910 05:22:50.002834 2380 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 10 05:22:50.003832 kubelet[2380]: E0910 05:22:50.003458 2380 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.54:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.54:6443: connect: connection refused" interval="200ms"
Sep 10 05:22:50.011101 kubelet[2380]: E0910 05:22:50.011061 2380 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 10 05:22:50.016753 kubelet[2380]: I0910 05:22:50.016725 2380 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 10 05:22:50.016753 kubelet[2380]: I0910 05:22:50.016749 2380 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 10 05:22:50.016887 kubelet[2380]: I0910 05:22:50.016771 2380 state_mem.go:36] "Initialized new in-memory state store"
Sep 10 05:22:50.104238 kubelet[2380]: E0910 05:22:50.104177 2380 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 10 05:22:50.204451 kubelet[2380]: E0910 05:22:50.204307 2380 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 10 05:22:50.204579 kubelet[2380]: E0910 05:22:50.204490 2380 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.54:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.54:6443: connect: connection refused" interval="400ms"
Sep 10 05:22:50.304766 kubelet[2380]: E0910 05:22:50.304723 2380 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 10 05:22:50.405644 kubelet[2380]: E0910 05:22:50.405607 2380 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 10 05:22:50.470562 kubelet[2380]: I0910 05:22:50.470442 2380 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Sep 10 05:22:50.472427 kubelet[2380]: I0910 05:22:50.472377 2380 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Sep 10 05:22:50.472427 kubelet[2380]: I0910 05:22:50.472428 2380 status_manager.go:230] "Starting to sync pod status with apiserver"
Sep 10 05:22:50.472506 kubelet[2380]: I0910 05:22:50.472459 2380 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 10 05:22:50.472506 kubelet[2380]: I0910 05:22:50.472474 2380 kubelet.go:2436] "Starting kubelet main sync loop"
Sep 10 05:22:50.472549 kubelet[2380]: E0910 05:22:50.472534 2380 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 10 05:22:50.473520 kubelet[2380]: E0910 05:22:50.473403 2380 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.54:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Sep 10 05:22:50.506549 kubelet[2380]: E0910 05:22:50.506519 2380 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 10 05:22:50.572691 kubelet[2380]: E0910 05:22:50.572656 2380 kubelet.go:2460] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 10 05:22:50.579996 kubelet[2380]: I0910 05:22:50.579944 2380 policy_none.go:49] "None policy: Start"
Sep 10 05:22:50.580056 kubelet[2380]: I0910 05:22:50.580011 2380 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 10 05:22:50.580056 kubelet[2380]: I0910 05:22:50.580054 2380 state_mem.go:35] "Initializing new in-memory state store"
Sep 10 05:22:50.605237 kubelet[2380]: E0910 05:22:50.605170 2380 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.54:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.54:6443: connect: connection refused" interval="800ms"
Sep 10 05:22:50.607245 kubelet[2380]: E0910 05:22:50.607194 2380 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 10 05:22:50.708087 kubelet[2380]: E0910 05:22:50.708002 2380 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 10 05:22:50.773310 kubelet[2380]: E0910 05:22:50.773256 2380 kubelet.go:2460] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 10 05:22:50.793806 kubelet[2380]: E0910 05:22:50.793764 2380 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.54:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Sep 10 05:22:50.794107 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 10 05:22:50.808400 kubelet[2380]: E0910 05:22:50.808337 2380 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 10 05:22:50.814627 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 10 05:22:50.818094 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 10 05:22:50.844490 kubelet[2380]: E0910 05:22:50.844456 2380 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Sep 10 05:22:50.844941 kubelet[2380]: I0910 05:22:50.844897 2380 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 10 05:22:50.844941 kubelet[2380]: I0910 05:22:50.844914 2380 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 10 05:22:50.845259 kubelet[2380]: I0910 05:22:50.845227 2380 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 10 05:22:50.846199 kubelet[2380]: E0910 05:22:50.846171 2380 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 10 05:22:50.846282 kubelet[2380]: E0910 05:22:50.846242 2380 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Sep 10 05:22:50.947805 kubelet[2380]: I0910 05:22:50.947704 2380 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 10 05:22:50.950775 kubelet[2380]: E0910 05:22:50.950687 2380 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.54:6443/api/v1/nodes\": dial tcp 10.0.0.54:6443: connect: connection refused" node="localhost"
Sep 10 05:22:51.153011 kubelet[2380]: I0910 05:22:51.152851 2380 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 10 05:22:51.153486 kubelet[2380]: E0910 05:22:51.153418 2380 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.54:6443/api/v1/nodes\": dial tcp 10.0.0.54:6443: connect: connection refused" node="localhost"
Sep 10 05:22:51.179707 kubelet[2380]: E0910 05:22:51.179639 2380 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.54:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Sep 10 05:22:51.206525 kubelet[2380]: I0910 05:22:51.206488 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7b968cf906b2d9d713a362c43868bef2-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"7b968cf906b2d9d713a362c43868bef2\") " pod="kube-system/kube-scheduler-localhost"
Sep 10 05:22:51.301985 systemd[1]: Created slice kubepods-burstable-pod7b968cf906b2d9d713a362c43868bef2.slice - libcontainer container kubepods-burstable-pod7b968cf906b2d9d713a362c43868bef2.slice.
Sep 10 05:22:51.307392 kubelet[2380]: I0910 05:22:51.307355 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/38b62cd258c48d8489facccbd808b9bf-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"38b62cd258c48d8489facccbd808b9bf\") " pod="kube-system/kube-apiserver-localhost"
Sep 10 05:22:51.307392 kubelet[2380]: I0910 05:22:51.307389 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost"
Sep 10 05:22:51.307482 kubelet[2380]: I0910 05:22:51.307407 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost"
Sep 10 05:22:51.307482 kubelet[2380]: I0910 05:22:51.307448 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/38b62cd258c48d8489facccbd808b9bf-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"38b62cd258c48d8489facccbd808b9bf\") " pod="kube-system/kube-apiserver-localhost"
Sep 10 05:22:51.307532 kubelet[2380]: I0910 05:22:51.307504 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/38b62cd258c48d8489facccbd808b9bf-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"38b62cd258c48d8489facccbd808b9bf\") " pod="kube-system/kube-apiserver-localhost"
Sep 10 05:22:51.307563 kubelet[2380]: I0910 05:22:51.307544 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost"
Sep 10 05:22:51.307605 kubelet[2380]: I0910 05:22:51.307582 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost"
Sep 10 05:22:51.307631 kubelet[2380]: I0910 05:22:51.307611 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost"
Sep 10 05:22:51.319895 kubelet[2380]: E0910 05:22:51.319834 2380 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 10 05:22:51.321222 containerd[1588]: time="2025-09-10T05:22:51.321119416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:7b968cf906b2d9d713a362c43868bef2,Namespace:kube-system,Attempt:0,}"
Sep 10 05:22:51.323736 systemd[1]: Created slice kubepods-burstable-pod38b62cd258c48d8489facccbd808b9bf.slice - libcontainer container kubepods-burstable-pod38b62cd258c48d8489facccbd808b9bf.slice.
Sep 10 05:22:51.325905 kubelet[2380]: E0910 05:22:51.325873 2380 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 10 05:22:51.327790 systemd[1]: Created slice kubepods-burstable-podb678d5c6713e936e66aa5bb73166297e.slice - libcontainer container kubepods-burstable-podb678d5c6713e936e66aa5bb73166297e.slice.
Sep 10 05:22:51.329670 kubelet[2380]: E0910 05:22:51.329646 2380 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 10 05:22:51.347486 containerd[1588]: time="2025-09-10T05:22:51.347427202Z" level=info msg="connecting to shim 4bbc1dcee8b7208f6db4206efc91659b0221d9c6c6d1b3274d73adc625383b75" address="unix:///run/containerd/s/bd1fe5b15694a51006e017051dcf4d34d9dc9f34f236ff853ad77ca04ece8494" namespace=k8s.io protocol=ttrpc version=3
Sep 10 05:22:51.385345 systemd[1]: Started cri-containerd-4bbc1dcee8b7208f6db4206efc91659b0221d9c6c6d1b3274d73adc625383b75.scope - libcontainer container 4bbc1dcee8b7208f6db4206efc91659b0221d9c6c6d1b3274d73adc625383b75.
Sep 10 05:22:51.406406 kubelet[2380]: E0910 05:22:51.406283 2380 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.54:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.54:6443: connect: connection refused" interval="1.6s"
Sep 10 05:22:51.430992 containerd[1588]: time="2025-09-10T05:22:51.430917264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:7b968cf906b2d9d713a362c43868bef2,Namespace:kube-system,Attempt:0,} returns sandbox id \"4bbc1dcee8b7208f6db4206efc91659b0221d9c6c6d1b3274d73adc625383b75\""
Sep 10 05:22:51.437611 containerd[1588]: time="2025-09-10T05:22:51.437557309Z" level=info msg="CreateContainer within sandbox \"4bbc1dcee8b7208f6db4206efc91659b0221d9c6c6d1b3274d73adc625383b75\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Sep 10 05:22:51.447591 containerd[1588]: time="2025-09-10T05:22:51.447547586Z" level=info msg="Container e92dbc49c994966657653857b4c8b47234933e8b1a415d7ba5dbbe340fca6988: CDI devices from CRI Config.CDIDevices: []"
Sep 10 05:22:51.456099 containerd[1588]: time="2025-09-10T05:22:51.456038539Z" level=info msg="CreateContainer within sandbox \"4bbc1dcee8b7208f6db4206efc91659b0221d9c6c6d1b3274d73adc625383b75\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"e92dbc49c994966657653857b4c8b47234933e8b1a415d7ba5dbbe340fca6988\""
Sep 10 05:22:51.456833 containerd[1588]: time="2025-09-10T05:22:51.456800357Z" level=info msg="StartContainer for \"e92dbc49c994966657653857b4c8b47234933e8b1a415d7ba5dbbe340fca6988\""
Sep 10 05:22:51.457935 containerd[1588]: time="2025-09-10T05:22:51.457906460Z" level=info msg="connecting to shim e92dbc49c994966657653857b4c8b47234933e8b1a415d7ba5dbbe340fca6988" address="unix:///run/containerd/s/bd1fe5b15694a51006e017051dcf4d34d9dc9f34f236ff853ad77ca04ece8494" protocol=ttrpc version=3
Sep 10 05:22:51.493925 kubelet[2380]: E0910 05:22:51.493862 2380 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.54:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Sep 10 05:22:51.495265 systemd[1]: Started cri-containerd-e92dbc49c994966657653857b4c8b47234933e8b1a415d7ba5dbbe340fca6988.scope - libcontainer container e92dbc49c994966657653857b4c8b47234933e8b1a415d7ba5dbbe340fca6988.
Sep 10 05:22:51.556015 kubelet[2380]: I0910 05:22:51.555963 2380 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 10 05:22:51.556412 kubelet[2380]: E0910 05:22:51.556388 2380 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.54:6443/api/v1/nodes\": dial tcp 10.0.0.54:6443: connect: connection refused" node="localhost"
Sep 10 05:22:51.559977 containerd[1588]: time="2025-09-10T05:22:51.559922749Z" level=info msg="StartContainer for \"e92dbc49c994966657653857b4c8b47234933e8b1a415d7ba5dbbe340fca6988\" returns successfully"
Sep 10 05:22:51.627746 containerd[1588]: time="2025-09-10T05:22:51.627683979Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:38b62cd258c48d8489facccbd808b9bf,Namespace:kube-system,Attempt:0,}"
Sep 10 05:22:51.631438 containerd[1588]: time="2025-09-10T05:22:51.631379898Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:b678d5c6713e936e66aa5bb73166297e,Namespace:kube-system,Attempt:0,}"
Sep 10 05:22:51.664757 containerd[1588]: time="2025-09-10T05:22:51.663173543Z" level=info msg="connecting to shim 894ecf603342eba28d0be5ca638c7a2856197496f6710386e23ad80bb61378fe" address="unix:///run/containerd/s/7ca77b95bd9d71f612df955216d34f5fbb5ce089e437bbf9fcff51d3ae4b3534" namespace=k8s.io protocol=ttrpc version=3
Sep 10 05:22:51.671778 containerd[1588]: time="2025-09-10T05:22:51.671724582Z" level=info msg="connecting to shim ee6cfb382a9c68650b8ce8f60e4069b07e93081ee45ba053b3c8ce9c28c3f2a3" address="unix:///run/containerd/s/b7fd9f300309c75b0399ce43d8622202c26200fb540483c00dcb8b5a76be4828" namespace=k8s.io protocol=ttrpc version=3
Sep 10 05:22:51.707928 systemd[1]: Started cri-containerd-894ecf603342eba28d0be5ca638c7a2856197496f6710386e23ad80bb61378fe.scope - libcontainer container 894ecf603342eba28d0be5ca638c7a2856197496f6710386e23ad80bb61378fe.
Sep 10 05:22:51.712960 systemd[1]: Started cri-containerd-ee6cfb382a9c68650b8ce8f60e4069b07e93081ee45ba053b3c8ce9c28c3f2a3.scope - libcontainer container ee6cfb382a9c68650b8ce8f60e4069b07e93081ee45ba053b3c8ce9c28c3f2a3.
Sep 10 05:22:51.770923 containerd[1588]: time="2025-09-10T05:22:51.770866305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:b678d5c6713e936e66aa5bb73166297e,Namespace:kube-system,Attempt:0,} returns sandbox id \"ee6cfb382a9c68650b8ce8f60e4069b07e93081ee45ba053b3c8ce9c28c3f2a3\""
Sep 10 05:22:51.773996 containerd[1588]: time="2025-09-10T05:22:51.773892093Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:38b62cd258c48d8489facccbd808b9bf,Namespace:kube-system,Attempt:0,} returns sandbox id \"894ecf603342eba28d0be5ca638c7a2856197496f6710386e23ad80bb61378fe\""
Sep 10 05:22:51.777101 containerd[1588]: time="2025-09-10T05:22:51.776699043Z" level=info msg="CreateContainer within sandbox \"ee6cfb382a9c68650b8ce8f60e4069b07e93081ee45ba053b3c8ce9c28c3f2a3\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Sep 10 05:22:51.779900 containerd[1588]: time="2025-09-10T05:22:51.779673750Z" level=info msg="CreateContainer within sandbox \"894ecf603342eba28d0be5ca638c7a2856197496f6710386e23ad80bb61378fe\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Sep 10 05:22:51.791372 containerd[1588]: time="2025-09-10T05:22:51.791312447Z" level=info msg="Container 56a2158c69975dbbcac068deb44ef60850864d37f7b7c3f4ee479c491e5a07f4: CDI devices from CRI Config.CDIDevices: []"
Sep 10 05:22:51.794212 containerd[1588]: time="2025-09-10T05:22:51.794161598Z" level=info msg="Container 98d7e7272ae31162ddf30662645c9a50ccc3e329a67310bf8cefeba0959d6461: CDI devices from CRI Config.CDIDevices: []"
Sep 10 05:22:51.799752 containerd[1588]: time="2025-09-10T05:22:51.799707732Z" level=info msg="CreateContainer within sandbox \"ee6cfb382a9c68650b8ce8f60e4069b07e93081ee45ba053b3c8ce9c28c3f2a3\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"56a2158c69975dbbcac068deb44ef60850864d37f7b7c3f4ee479c491e5a07f4\""
Sep 10 05:22:51.800629 containerd[1588]: time="2025-09-10T05:22:51.800552005Z" level=info msg="StartContainer for \"56a2158c69975dbbcac068deb44ef60850864d37f7b7c3f4ee479c491e5a07f4\""
Sep 10 05:22:51.801428 containerd[1588]: time="2025-09-10T05:22:51.801391004Z" level=info msg="CreateContainer within sandbox \"894ecf603342eba28d0be5ca638c7a2856197496f6710386e23ad80bb61378fe\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"98d7e7272ae31162ddf30662645c9a50ccc3e329a67310bf8cefeba0959d6461\""
Sep 10 05:22:51.801748 containerd[1588]: time="2025-09-10T05:22:51.801711005Z" level=info msg="StartContainer for \"98d7e7272ae31162ddf30662645c9a50ccc3e329a67310bf8cefeba0959d6461\""
Sep 10 05:22:51.802732 containerd[1588]: time="2025-09-10T05:22:51.802684884Z" level=info msg="connecting to shim 98d7e7272ae31162ddf30662645c9a50ccc3e329a67310bf8cefeba0959d6461" address="unix:///run/containerd/s/7ca77b95bd9d71f612df955216d34f5fbb5ce089e437bbf9fcff51d3ae4b3534" protocol=ttrpc version=3
Sep 10 05:22:51.803931 containerd[1588]: time="2025-09-10T05:22:51.803343686Z" level=info msg="connecting to shim 56a2158c69975dbbcac068deb44ef60850864d37f7b7c3f4ee479c491e5a07f4"
address="unix:///run/containerd/s/b7fd9f300309c75b0399ce43d8622202c26200fb540483c00dcb8b5a76be4828" protocol=ttrpc version=3 Sep 10 05:22:51.827287 systemd[1]: Started cri-containerd-98d7e7272ae31162ddf30662645c9a50ccc3e329a67310bf8cefeba0959d6461.scope - libcontainer container 98d7e7272ae31162ddf30662645c9a50ccc3e329a67310bf8cefeba0959d6461. Sep 10 05:22:51.838448 systemd[1]: Started cri-containerd-56a2158c69975dbbcac068deb44ef60850864d37f7b7c3f4ee479c491e5a07f4.scope - libcontainer container 56a2158c69975dbbcac068deb44ef60850864d37f7b7c3f4ee479c491e5a07f4. Sep 10 05:22:51.874122 kubelet[2380]: E0910 05:22:51.874080 2380 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.54:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 10 05:22:52.021635 containerd[1588]: time="2025-09-10T05:22:52.021543215Z" level=info msg="StartContainer for \"98d7e7272ae31162ddf30662645c9a50ccc3e329a67310bf8cefeba0959d6461\" returns successfully" Sep 10 05:22:52.021765 kubelet[2380]: E0910 05:22:52.021633 2380 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.54:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Sep 10 05:22:52.034108 containerd[1588]: time="2025-09-10T05:22:52.034057619Z" level=info msg="StartContainer for \"56a2158c69975dbbcac068deb44ef60850864d37f7b7c3f4ee479c491e5a07f4\" returns successfully" Sep 10 05:22:52.360376 kubelet[2380]: I0910 05:22:52.360094 2380 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 10 05:22:52.485860 kubelet[2380]: E0910 05:22:52.485802 2380 
kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 10 05:22:52.486864 kubelet[2380]: E0910 05:22:52.486835 2380 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 10 05:22:52.489217 kubelet[2380]: E0910 05:22:52.489195 2380 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 10 05:22:53.493037 kubelet[2380]: E0910 05:22:53.492974 2380 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 10 05:22:53.495158 kubelet[2380]: E0910 05:22:53.494071 2380 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 10 05:22:53.571672 kubelet[2380]: E0910 05:22:53.571590 2380 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 10 05:22:53.663910 kubelet[2380]: I0910 05:22:53.663842 2380 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 10 05:22:53.703946 kubelet[2380]: I0910 05:22:53.703893 2380 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 10 05:22:53.710890 kubelet[2380]: E0910 05:22:53.710862 2380 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 10 05:22:53.710890 kubelet[2380]: I0910 05:22:53.710882 2380 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 10 05:22:53.712401 kubelet[2380]: E0910 
05:22:53.712347 2380 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 10 05:22:53.712401 kubelet[2380]: I0910 05:22:53.712380 2380 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 10 05:22:53.713954 kubelet[2380]: E0910 05:22:53.713916 2380 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 10 05:22:53.974184 kubelet[2380]: I0910 05:22:53.974112 2380 apiserver.go:52] "Watching apiserver" Sep 10 05:22:54.000026 kubelet[2380]: I0910 05:22:53.999982 2380 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 10 05:22:54.492259 kubelet[2380]: I0910 05:22:54.492208 2380 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 10 05:22:54.494423 kubelet[2380]: E0910 05:22:54.494355 2380 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 10 05:22:55.712527 systemd[1]: Reload requested from client PID 2662 ('systemctl') (unit session-7.scope)... Sep 10 05:22:55.712554 systemd[1]: Reloading... Sep 10 05:22:55.806183 zram_generator::config[2707]: No configuration found. Sep 10 05:22:56.054800 systemd[1]: Reloading finished in 341 ms. Sep 10 05:22:56.088358 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... 
Sep 10 05:22:56.088729 kubelet[2380]: I0910 05:22:56.088389 2380 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 10 05:22:56.107569 systemd[1]: kubelet.service: Deactivated successfully.
Sep 10 05:22:56.107905 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 05:22:56.107962 systemd[1]: kubelet.service: Consumed 1.366s CPU time, 131.8M memory peak.
Sep 10 05:22:56.110149 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 10 05:22:56.332427 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 05:22:56.352658 (kubelet)[2750]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 10 05:22:56.408346 kubelet[2750]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 10 05:22:56.408346 kubelet[2750]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 10 05:22:56.408346 kubelet[2750]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 10 05:22:56.408774 kubelet[2750]: I0910 05:22:56.408381 2750 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 10 05:22:56.415720 kubelet[2750]: I0910 05:22:56.415683 2750 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Sep 10 05:22:56.415720 kubelet[2750]: I0910 05:22:56.415706 2750 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 10 05:22:56.415901 kubelet[2750]: I0910 05:22:56.415889 2750 server.go:956] "Client rotation is on, will bootstrap in background"
Sep 10 05:22:56.417053 kubelet[2750]: I0910 05:22:56.417037 2750 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Sep 10 05:22:56.419249 kubelet[2750]: I0910 05:22:56.419210 2750 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 10 05:22:56.423603 kubelet[2750]: I0910 05:22:56.423574 2750 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 10 05:22:56.433483 kubelet[2750]: I0910 05:22:56.433445 2750 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 10 05:22:56.433799 kubelet[2750]: I0910 05:22:56.433757 2750 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 10 05:22:56.433951 kubelet[2750]: I0910 05:22:56.433788 2750 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 10 05:22:56.434064 kubelet[2750]: I0910 05:22:56.433953 2750 topology_manager.go:138] "Creating topology manager with none policy"
Sep 10 05:22:56.434064 kubelet[2750]: I0910 05:22:56.433961 2750 container_manager_linux.go:303] "Creating device plugin manager"
Sep 10 05:22:56.434064 kubelet[2750]: I0910 05:22:56.434011 2750 state_mem.go:36] "Initialized new in-memory state store"
Sep 10 05:22:56.434229 kubelet[2750]: I0910 05:22:56.434210 2750 kubelet.go:480] "Attempting to sync node with API server"
Sep 10 05:22:56.434229 kubelet[2750]: I0910 05:22:56.434225 2750 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 10 05:22:56.434287 kubelet[2750]: I0910 05:22:56.434250 2750 kubelet.go:386] "Adding apiserver pod source"
Sep 10 05:22:56.434287 kubelet[2750]: I0910 05:22:56.434266 2750 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 10 05:22:56.440205 kubelet[2750]: I0910 05:22:56.439547 2750 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 10 05:22:56.442158 kubelet[2750]: I0910 05:22:56.441230 2750 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Sep 10 05:22:56.445062 kubelet[2750]: I0910 05:22:56.444742 2750 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 10 05:22:56.447330 kubelet[2750]: I0910 05:22:56.447277 2750 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Sep 10 05:22:56.449153 kubelet[2750]: I0910 05:22:56.449124 2750 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 10 05:22:56.449400 kubelet[2750]: I0910 05:22:56.449341 2750 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 10 05:22:56.449726 kubelet[2750]: I0910 05:22:56.449704 2750 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 10 05:22:56.450702 kubelet[2750]: I0910 05:22:56.450372 2750 server.go:317] "Adding debug handlers to kubelet server"
Sep 10 05:22:56.450702 kubelet[2750]: I0910 05:22:56.450468 2750 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 10 05:22:56.451260 kubelet[2750]: I0910 05:22:56.451241 2750 server.go:1289] "Started kubelet"
Sep 10 05:22:56.454282 kubelet[2750]: I0910 05:22:56.454254 2750 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 10 05:22:56.454282 kubelet[2750]: E0910 05:22:56.454256 2750 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 10 05:22:56.454613 kubelet[2750]: I0910 05:22:56.454395 2750 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 10 05:22:56.454613 kubelet[2750]: I0910 05:22:56.454599 2750 reconciler.go:26] "Reconciler: start to sync state"
Sep 10 05:22:56.455286 kubelet[2750]: I0910 05:22:56.455262 2750 factory.go:223] Registration of the systemd container factory successfully
Sep 10 05:22:56.455494 kubelet[2750]: I0910 05:22:56.455445 2750 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 10 05:22:56.458067 kubelet[2750]: I0910 05:22:56.457180 2750 factory.go:223] Registration of the containerd container factory successfully
Sep 10 05:22:56.465626 kubelet[2750]: I0910 05:22:56.465593 2750 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Sep 10 05:22:56.466932 kubelet[2750]: I0910 05:22:56.466917 2750 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Sep 10 05:22:56.467021 kubelet[2750]: I0910 05:22:56.467010 2750 status_manager.go:230] "Starting to sync pod status with apiserver"
Sep 10 05:22:56.467094 kubelet[2750]: I0910 05:22:56.467083 2750 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 10 05:22:56.467162 kubelet[2750]: I0910 05:22:56.467153 2750 kubelet.go:2436] "Starting kubelet main sync loop"
Sep 10 05:22:56.467264 kubelet[2750]: E0910 05:22:56.467248 2750 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 10 05:22:56.506022 kubelet[2750]: I0910 05:22:56.505983 2750 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 10 05:22:56.506022 kubelet[2750]: I0910 05:22:56.506002 2750 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 10 05:22:56.506022 kubelet[2750]: I0910 05:22:56.506022 2750 state_mem.go:36] "Initialized new in-memory state store"
Sep 10 05:22:56.506245 kubelet[2750]: I0910 05:22:56.506178 2750 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 10 05:22:56.506245 kubelet[2750]: I0910 05:22:56.506191 2750 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 10 05:22:56.506245 kubelet[2750]: I0910 05:22:56.506209 2750 policy_none.go:49] "None policy: Start"
Sep 10 05:22:56.506245 kubelet[2750]: I0910 05:22:56.506219 2750 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 10 05:22:56.506245 kubelet[2750]: I0910 05:22:56.506229 2750 state_mem.go:35] "Initializing new in-memory state store"
Sep 10 05:22:56.506381 kubelet[2750]: I0910 05:22:56.506316 2750 state_mem.go:75] "Updated machine memory state"
Sep 10 05:22:56.510845 kubelet[2750]: E0910 05:22:56.510380 2750 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Sep 10 05:22:56.510845 kubelet[2750]: I0910 05:22:56.510567 2750 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 10 05:22:56.510845 kubelet[2750]: I0910 05:22:56.510587 2750 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 10 05:22:56.510845 kubelet[2750]: I0910 05:22:56.510782 2750 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 10 05:22:56.511706 kubelet[2750]: E0910 05:22:56.511667 2750 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 10 05:22:56.568312 kubelet[2750]: I0910 05:22:56.568267 2750 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Sep 10 05:22:56.568682 kubelet[2750]: I0910 05:22:56.568513 2750 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
Sep 10 05:22:56.568788 kubelet[2750]: I0910 05:22:56.568757 2750 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Sep 10 05:22:56.613746 kubelet[2750]: I0910 05:22:56.613409 2750 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 10 05:22:56.620878 kubelet[2750]: I0910 05:22:56.620849 2750 kubelet_node_status.go:124] "Node was previously registered" node="localhost"
Sep 10 05:22:56.620946 kubelet[2750]: I0910 05:22:56.620928 2750 kubelet_node_status.go:78] "Successfully registered node" node="localhost"
Sep 10 05:22:56.656453 kubelet[2750]: I0910 05:22:56.656385 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/38b62cd258c48d8489facccbd808b9bf-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"38b62cd258c48d8489facccbd808b9bf\") " pod="kube-system/kube-apiserver-localhost"
Sep 10 05:22:56.657470 kubelet[2750]: I0910 05:22:56.656445 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/38b62cd258c48d8489facccbd808b9bf-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"38b62cd258c48d8489facccbd808b9bf\") " pod="kube-system/kube-apiserver-localhost"
Sep 10 05:22:56.657470 kubelet[2750]: I0910 05:22:56.657013 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost"
Sep 10 05:22:56.657470 kubelet[2750]: I0910 05:22:56.657054 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost"
Sep 10 05:22:56.657470 kubelet[2750]: I0910 05:22:56.657181 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost"
Sep 10 05:22:56.657470 kubelet[2750]: I0910 05:22:56.657228 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost"
Sep 10 05:22:56.657731 kubelet[2750]: I0910 05:22:56.657316 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7b968cf906b2d9d713a362c43868bef2-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"7b968cf906b2d9d713a362c43868bef2\") " pod="kube-system/kube-scheduler-localhost"
Sep 10 05:22:56.657731 kubelet[2750]: I0910 05:22:56.657561 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost"
Sep 10 05:22:56.657731 kubelet[2750]: I0910 05:22:56.657597 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/38b62cd258c48d8489facccbd808b9bf-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"38b62cd258c48d8489facccbd808b9bf\") " pod="kube-system/kube-apiserver-localhost"
Sep 10 05:22:57.435871 kubelet[2750]: I0910 05:22:57.435795 2750 apiserver.go:52] "Watching apiserver"
Sep 10 05:22:57.454753 kubelet[2750]: I0910 05:22:57.454713 2750 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 10 05:22:57.484480 kubelet[2750]: I0910 05:22:57.484413 2750 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Sep 10 05:22:57.484655 kubelet[2750]: I0910 05:22:57.484631 2750 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Sep 10 05:22:57.494156 kubelet[2750]: E0910 05:22:57.492815 2750 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Sep 10 05:22:57.494156 kubelet[2750]: E0910 05:22:57.493518 2750 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost"
Sep 10 05:22:57.505975 kubelet[2750]: I0910 05:22:57.505906 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.5058867679999999 podStartE2EDuration="1.505886768s" podCreationTimestamp="2025-09-10 05:22:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-10 05:22:57.505797983 +0000 UTC m=+1.145238931" watchObservedRunningTime="2025-09-10 05:22:57.505886768 +0000 UTC m=+1.145327706"
Sep 10 05:22:57.521226 kubelet[2750]: I0910 05:22:57.521170 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.521120215 podStartE2EDuration="1.521120215s" podCreationTimestamp="2025-09-10 05:22:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-10 05:22:57.520954502 +0000 UTC m=+1.160395440" watchObservedRunningTime="2025-09-10 05:22:57.521120215 +0000 UTC m=+1.160561153"
Sep 10 05:22:57.521437 kubelet[2750]: I0910 05:22:57.521271 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.5212670780000002 podStartE2EDuration="1.521267078s" podCreationTimestamp="2025-09-10 05:22:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-10 05:22:57.514517463 +0000 UTC m=+1.153958401" watchObservedRunningTime="2025-09-10 05:22:57.521267078 +0000 UTC m=+1.160708016"
Sep 10 05:23:00.516842 kubelet[2750]: I0910 05:23:00.516799 2750 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 10 05:23:00.517377 containerd[1588]: time="2025-09-10T05:23:00.517198738Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 10 05:23:00.517619 kubelet[2750]: I0910 05:23:00.517410 2750 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 10 05:23:01.608286 systemd[1]: Created slice kubepods-besteffort-podf94c3c86_0aca_48a3_863a_02f1aa7ed94b.slice - libcontainer container kubepods-besteffort-podf94c3c86_0aca_48a3_863a_02f1aa7ed94b.slice.
Sep 10 05:23:01.637073 systemd[1]: Created slice kubepods-besteffort-pod0561de30_7423_4e30_9754_3311e5cfd756.slice - libcontainer container kubepods-besteffort-pod0561de30_7423_4e30_9754_3311e5cfd756.slice.
Sep 10 05:23:01.690845 kubelet[2750]: I0910 05:23:01.690797 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f94c3c86-0aca-48a3-863a-02f1aa7ed94b-xtables-lock\") pod \"kube-proxy-gjkqx\" (UID: \"f94c3c86-0aca-48a3-863a-02f1aa7ed94b\") " pod="kube-system/kube-proxy-gjkqx"
Sep 10 05:23:01.690845 kubelet[2750]: I0910 05:23:01.690831 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f94c3c86-0aca-48a3-863a-02f1aa7ed94b-lib-modules\") pod \"kube-proxy-gjkqx\" (UID: \"f94c3c86-0aca-48a3-863a-02f1aa7ed94b\") " pod="kube-system/kube-proxy-gjkqx"
Sep 10 05:23:01.691310 kubelet[2750]: I0910 05:23:01.690880 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0561de30-7423-4e30-9754-3311e5cfd756-var-lib-calico\") pod \"tigera-operator-755d956888-ghdxz\" (UID: \"0561de30-7423-4e30-9754-3311e5cfd756\") " pod="tigera-operator/tigera-operator-755d956888-ghdxz"
Sep 10 05:23:01.691310 kubelet[2750]: I0910 05:23:01.690921 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkj7k\" (UniqueName: \"kubernetes.io/projected/0561de30-7423-4e30-9754-3311e5cfd756-kube-api-access-zkj7k\") pod \"tigera-operator-755d956888-ghdxz\" (UID: \"0561de30-7423-4e30-9754-3311e5cfd756\") " pod="tigera-operator/tigera-operator-755d956888-ghdxz"
Sep 10 05:23:01.691310 kubelet[2750]: I0910 05:23:01.690945 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2c28\" (UniqueName: \"kubernetes.io/projected/f94c3c86-0aca-48a3-863a-02f1aa7ed94b-kube-api-access-b2c28\") pod \"kube-proxy-gjkqx\" (UID: \"f94c3c86-0aca-48a3-863a-02f1aa7ed94b\") " pod="kube-system/kube-proxy-gjkqx"
Sep 10 05:23:01.691310 kubelet[2750]: I0910 05:23:01.690967 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/f94c3c86-0aca-48a3-863a-02f1aa7ed94b-kube-proxy\") pod \"kube-proxy-gjkqx\" (UID: \"f94c3c86-0aca-48a3-863a-02f1aa7ed94b\") " pod="kube-system/kube-proxy-gjkqx"
Sep 10 05:23:01.920125 containerd[1588]: time="2025-09-10T05:23:01.920017308Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gjkqx,Uid:f94c3c86-0aca-48a3-863a-02f1aa7ed94b,Namespace:kube-system,Attempt:0,}"
Sep 10 05:23:01.938107 containerd[1588]: time="2025-09-10T05:23:01.938057310Z" level=info msg="connecting to shim 5eff2c80c44aa1d04c8a04571a94a16a6e334f9db7b2ceaa53a8e9801ec40b8c" address="unix:///run/containerd/s/3a0cf4486edc1fb0810365cfd95653132cedb84f988eac5fa4c132374fdad2b1" namespace=k8s.io protocol=ttrpc version=3
Sep 10 05:23:01.941850 containerd[1588]: time="2025-09-10T05:23:01.941516108Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-ghdxz,Uid:0561de30-7423-4e30-9754-3311e5cfd756,Namespace:tigera-operator,Attempt:0,}"
Sep 10 05:23:01.961533 systemd[1]: Started cri-containerd-5eff2c80c44aa1d04c8a04571a94a16a6e334f9db7b2ceaa53a8e9801ec40b8c.scope - libcontainer container 5eff2c80c44aa1d04c8a04571a94a16a6e334f9db7b2ceaa53a8e9801ec40b8c.
Sep 10 05:23:01.969755 containerd[1588]: time="2025-09-10T05:23:01.969657449Z" level=info msg="connecting to shim 982236a3dcf008b5790a177d8510bbfa763f3fd9721ce85dadaf0169d4db0abe" address="unix:///run/containerd/s/64c7bab1e294ea47c7a9e93bbbef8b15602410388e032e6d4e3b61ac7d8b67af" namespace=k8s.io protocol=ttrpc version=3
Sep 10 05:23:01.997458 containerd[1588]: time="2025-09-10T05:23:01.997410002Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gjkqx,Uid:f94c3c86-0aca-48a3-863a-02f1aa7ed94b,Namespace:kube-system,Attempt:0,} returns sandbox id \"5eff2c80c44aa1d04c8a04571a94a16a6e334f9db7b2ceaa53a8e9801ec40b8c\""
Sep 10 05:23:02.001517 systemd[1]: Started cri-containerd-982236a3dcf008b5790a177d8510bbfa763f3fd9721ce85dadaf0169d4db0abe.scope - libcontainer container 982236a3dcf008b5790a177d8510bbfa763f3fd9721ce85dadaf0169d4db0abe.
Sep 10 05:23:02.004023 containerd[1588]: time="2025-09-10T05:23:02.003986656Z" level=info msg="CreateContainer within sandbox \"5eff2c80c44aa1d04c8a04571a94a16a6e334f9db7b2ceaa53a8e9801ec40b8c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 10 05:23:02.017034 containerd[1588]: time="2025-09-10T05:23:02.016964905Z" level=info msg="Container 49d7f190b5f9f36fc87473c258bac010f3cedd0df8b3e30666b198c9da0c3ea6: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:02.025339 containerd[1588]: time="2025-09-10T05:23:02.025283659Z" level=info msg="CreateContainer within sandbox \"5eff2c80c44aa1d04c8a04571a94a16a6e334f9db7b2ceaa53a8e9801ec40b8c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"49d7f190b5f9f36fc87473c258bac010f3cedd0df8b3e30666b198c9da0c3ea6\"" Sep 10 05:23:02.026652 containerd[1588]: time="2025-09-10T05:23:02.025957336Z" level=info msg="StartContainer for \"49d7f190b5f9f36fc87473c258bac010f3cedd0df8b3e30666b198c9da0c3ea6\"" Sep 10 05:23:02.027195 containerd[1588]: time="2025-09-10T05:23:02.027157018Z" level=info msg="connecting to shim 49d7f190b5f9f36fc87473c258bac010f3cedd0df8b3e30666b198c9da0c3ea6" address="unix:///run/containerd/s/3a0cf4486edc1fb0810365cfd95653132cedb84f988eac5fa4c132374fdad2b1" protocol=ttrpc version=3 Sep 10 05:23:02.044493 containerd[1588]: time="2025-09-10T05:23:02.044409683Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-ghdxz,Uid:0561de30-7423-4e30-9754-3311e5cfd756,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"982236a3dcf008b5790a177d8510bbfa763f3fd9721ce85dadaf0169d4db0abe\"" Sep 10 05:23:02.046236 containerd[1588]: time="2025-09-10T05:23:02.046163557Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 10 05:23:02.049372 systemd[1]: Started cri-containerd-49d7f190b5f9f36fc87473c258bac010f3cedd0df8b3e30666b198c9da0c3ea6.scope - libcontainer container 49d7f190b5f9f36fc87473c258bac010f3cedd0df8b3e30666b198c9da0c3ea6. 
Sep 10 05:23:02.091535 containerd[1588]: time="2025-09-10T05:23:02.091490694Z" level=info msg="StartContainer for \"49d7f190b5f9f36fc87473c258bac010f3cedd0df8b3e30666b198c9da0c3ea6\" returns successfully" Sep 10 05:23:03.329812 kubelet[2750]: I0910 05:23:03.329747 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-gjkqx" podStartSLOduration=2.329715735 podStartE2EDuration="2.329715735s" podCreationTimestamp="2025-09-10 05:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-10 05:23:02.504859902 +0000 UTC m=+6.144300850" watchObservedRunningTime="2025-09-10 05:23:03.329715735 +0000 UTC m=+6.969156673" Sep 10 05:23:04.041885 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount737322924.mount: Deactivated successfully. Sep 10 05:23:06.707662 update_engine[1571]: I20250910 05:23:06.707571 1571 update_attempter.cc:509] Updating boot flags... 
Sep 10 05:23:09.077338 containerd[1588]: time="2025-09-10T05:23:09.077262880Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:09.078318 containerd[1588]: time="2025-09-10T05:23:09.078295104Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 10 05:23:09.079423 containerd[1588]: time="2025-09-10T05:23:09.079368931Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:09.081667 containerd[1588]: time="2025-09-10T05:23:09.081621469Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:09.082205 containerd[1588]: time="2025-09-10T05:23:09.082176107Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 7.035980585s" Sep 10 05:23:09.082249 containerd[1588]: time="2025-09-10T05:23:09.082206454Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 10 05:23:09.086988 containerd[1588]: time="2025-09-10T05:23:09.086957890Z" level=info msg="CreateContainer within sandbox \"982236a3dcf008b5790a177d8510bbfa763f3fd9721ce85dadaf0169d4db0abe\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 10 05:23:09.095927 containerd[1588]: time="2025-09-10T05:23:09.095886189Z" level=info msg="Container 
17bc49ea3669bd4540cff3b7dd85118e3496cec50835bd9a59d5df8029083521: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:09.102943 containerd[1588]: time="2025-09-10T05:23:09.102910217Z" level=info msg="CreateContainer within sandbox \"982236a3dcf008b5790a177d8510bbfa763f3fd9721ce85dadaf0169d4db0abe\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"17bc49ea3669bd4540cff3b7dd85118e3496cec50835bd9a59d5df8029083521\"" Sep 10 05:23:09.104244 containerd[1588]: time="2025-09-10T05:23:09.103312044Z" level=info msg="StartContainer for \"17bc49ea3669bd4540cff3b7dd85118e3496cec50835bd9a59d5df8029083521\"" Sep 10 05:23:09.104244 containerd[1588]: time="2025-09-10T05:23:09.104007727Z" level=info msg="connecting to shim 17bc49ea3669bd4540cff3b7dd85118e3496cec50835bd9a59d5df8029083521" address="unix:///run/containerd/s/64c7bab1e294ea47c7a9e93bbbef8b15602410388e032e6d4e3b61ac7d8b67af" protocol=ttrpc version=3 Sep 10 05:23:09.157275 systemd[1]: Started cri-containerd-17bc49ea3669bd4540cff3b7dd85118e3496cec50835bd9a59d5df8029083521.scope - libcontainer container 17bc49ea3669bd4540cff3b7dd85118e3496cec50835bd9a59d5df8029083521. Sep 10 05:23:09.188257 containerd[1588]: time="2025-09-10T05:23:09.188215437Z" level=info msg="StartContainer for \"17bc49ea3669bd4540cff3b7dd85118e3496cec50835bd9a59d5df8029083521\" returns successfully" Sep 10 05:23:14.351472 sudo[1802]: pam_unix(sudo:session): session closed for user root Sep 10 05:23:14.353944 sshd[1801]: Connection closed by 10.0.0.1 port 54388 Sep 10 05:23:14.354961 sshd-session[1798]: pam_unix(sshd:session): session closed for user core Sep 10 05:23:14.361376 systemd-logind[1570]: Session 7 logged out. Waiting for processes to exit. Sep 10 05:23:14.361641 systemd[1]: sshd@6-10.0.0.54:22-10.0.0.1:54388.service: Deactivated successfully. Sep 10 05:23:14.365352 systemd[1]: session-7.scope: Deactivated successfully. 
Sep 10 05:23:14.365797 systemd[1]: session-7.scope: Consumed 6.278s CPU time, 228M memory peak. Sep 10 05:23:14.368835 systemd-logind[1570]: Removed session 7. Sep 10 05:23:16.833551 kubelet[2750]: I0910 05:23:16.833237 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-ghdxz" podStartSLOduration=8.795899358 podStartE2EDuration="15.833218261s" podCreationTimestamp="2025-09-10 05:23:01 +0000 UTC" firstStartedPulling="2025-09-10 05:23:02.045785355 +0000 UTC m=+5.685226293" lastFinishedPulling="2025-09-10 05:23:09.083104258 +0000 UTC m=+12.722545196" observedRunningTime="2025-09-10 05:23:09.513775933 +0000 UTC m=+13.153216871" watchObservedRunningTime="2025-09-10 05:23:16.833218261 +0000 UTC m=+20.472659199" Sep 10 05:23:16.944834 systemd[1]: Created slice kubepods-besteffort-pod913a6669_9dce_4f4b_a14a_95164b29ad85.slice - libcontainer container kubepods-besteffort-pod913a6669_9dce_4f4b_a14a_95164b29ad85.slice. Sep 10 05:23:16.981874 kubelet[2750]: I0910 05:23:16.981810 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/913a6669-9dce-4f4b-a14a-95164b29ad85-tigera-ca-bundle\") pod \"calico-typha-7576c645f4-rphtg\" (UID: \"913a6669-9dce-4f4b-a14a-95164b29ad85\") " pod="calico-system/calico-typha-7576c645f4-rphtg" Sep 10 05:23:16.981874 kubelet[2750]: I0910 05:23:16.981857 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f89t7\" (UniqueName: \"kubernetes.io/projected/913a6669-9dce-4f4b-a14a-95164b29ad85-kube-api-access-f89t7\") pod \"calico-typha-7576c645f4-rphtg\" (UID: \"913a6669-9dce-4f4b-a14a-95164b29ad85\") " pod="calico-system/calico-typha-7576c645f4-rphtg" Sep 10 05:23:16.981874 kubelet[2750]: I0910 05:23:16.981877 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"typha-certs\" (UniqueName: \"kubernetes.io/secret/913a6669-9dce-4f4b-a14a-95164b29ad85-typha-certs\") pod \"calico-typha-7576c645f4-rphtg\" (UID: \"913a6669-9dce-4f4b-a14a-95164b29ad85\") " pod="calico-system/calico-typha-7576c645f4-rphtg" Sep 10 05:23:17.242397 systemd[1]: Created slice kubepods-besteffort-pod44be1c01_4d72_4619_ba0b_90f6b1f4ab7c.slice - libcontainer container kubepods-besteffort-pod44be1c01_4d72_4619_ba0b_90f6b1f4ab7c.slice. Sep 10 05:23:17.259183 containerd[1588]: time="2025-09-10T05:23:17.255923677Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7576c645f4-rphtg,Uid:913a6669-9dce-4f4b-a14a-95164b29ad85,Namespace:calico-system,Attempt:0,}" Sep 10 05:23:17.284594 kubelet[2750]: I0910 05:23:17.284519 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/44be1c01-4d72-4619-ba0b-90f6b1f4ab7c-cni-net-dir\") pod \"calico-node-wdstg\" (UID: \"44be1c01-4d72-4619-ba0b-90f6b1f4ab7c\") " pod="calico-system/calico-node-wdstg" Sep 10 05:23:17.284594 kubelet[2750]: I0910 05:23:17.284579 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/44be1c01-4d72-4619-ba0b-90f6b1f4ab7c-xtables-lock\") pod \"calico-node-wdstg\" (UID: \"44be1c01-4d72-4619-ba0b-90f6b1f4ab7c\") " pod="calico-system/calico-node-wdstg" Sep 10 05:23:17.284594 kubelet[2750]: I0910 05:23:17.284597 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/44be1c01-4d72-4619-ba0b-90f6b1f4ab7c-var-lib-calico\") pod \"calico-node-wdstg\" (UID: \"44be1c01-4d72-4619-ba0b-90f6b1f4ab7c\") " pod="calico-system/calico-node-wdstg" Sep 10 05:23:17.284594 kubelet[2750]: I0910 05:23:17.284615 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/44be1c01-4d72-4619-ba0b-90f6b1f4ab7c-var-run-calico\") pod \"calico-node-wdstg\" (UID: \"44be1c01-4d72-4619-ba0b-90f6b1f4ab7c\") " pod="calico-system/calico-node-wdstg" Sep 10 05:23:17.284926 kubelet[2750]: I0910 05:23:17.284631 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/44be1c01-4d72-4619-ba0b-90f6b1f4ab7c-cni-bin-dir\") pod \"calico-node-wdstg\" (UID: \"44be1c01-4d72-4619-ba0b-90f6b1f4ab7c\") " pod="calico-system/calico-node-wdstg" Sep 10 05:23:17.284926 kubelet[2750]: I0910 05:23:17.284645 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/44be1c01-4d72-4619-ba0b-90f6b1f4ab7c-lib-modules\") pod \"calico-node-wdstg\" (UID: \"44be1c01-4d72-4619-ba0b-90f6b1f4ab7c\") " pod="calico-system/calico-node-wdstg" Sep 10 05:23:17.284926 kubelet[2750]: I0910 05:23:17.284659 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/44be1c01-4d72-4619-ba0b-90f6b1f4ab7c-policysync\") pod \"calico-node-wdstg\" (UID: \"44be1c01-4d72-4619-ba0b-90f6b1f4ab7c\") " pod="calico-system/calico-node-wdstg" Sep 10 05:23:17.284926 kubelet[2750]: I0910 05:23:17.284674 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/44be1c01-4d72-4619-ba0b-90f6b1f4ab7c-flexvol-driver-host\") pod \"calico-node-wdstg\" (UID: \"44be1c01-4d72-4619-ba0b-90f6b1f4ab7c\") " pod="calico-system/calico-node-wdstg" Sep 10 05:23:17.284926 kubelet[2750]: I0910 05:23:17.284692 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv6s7\" (UniqueName: 
\"kubernetes.io/projected/44be1c01-4d72-4619-ba0b-90f6b1f4ab7c-kube-api-access-gv6s7\") pod \"calico-node-wdstg\" (UID: \"44be1c01-4d72-4619-ba0b-90f6b1f4ab7c\") " pod="calico-system/calico-node-wdstg" Sep 10 05:23:17.285059 kubelet[2750]: I0910 05:23:17.284707 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/44be1c01-4d72-4619-ba0b-90f6b1f4ab7c-node-certs\") pod \"calico-node-wdstg\" (UID: \"44be1c01-4d72-4619-ba0b-90f6b1f4ab7c\") " pod="calico-system/calico-node-wdstg" Sep 10 05:23:17.285059 kubelet[2750]: I0910 05:23:17.284730 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44be1c01-4d72-4619-ba0b-90f6b1f4ab7c-tigera-ca-bundle\") pod \"calico-node-wdstg\" (UID: \"44be1c01-4d72-4619-ba0b-90f6b1f4ab7c\") " pod="calico-system/calico-node-wdstg" Sep 10 05:23:17.285059 kubelet[2750]: I0910 05:23:17.284747 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/44be1c01-4d72-4619-ba0b-90f6b1f4ab7c-cni-log-dir\") pod \"calico-node-wdstg\" (UID: \"44be1c01-4d72-4619-ba0b-90f6b1f4ab7c\") " pod="calico-system/calico-node-wdstg" Sep 10 05:23:17.338612 containerd[1588]: time="2025-09-10T05:23:17.338552311Z" level=info msg="connecting to shim c3e8de724eba0cde9e715e34cb3efd5d2ee6eefbb59b3ced22b3fa7c29f8ae6f" address="unix:///run/containerd/s/fab9b5fa5be58efb62012db9e71a066d9ea3b74ff885cd8e0e509c5031ab26e5" namespace=k8s.io protocol=ttrpc version=3 Sep 10 05:23:17.369534 systemd[1]: Started cri-containerd-c3e8de724eba0cde9e715e34cb3efd5d2ee6eefbb59b3ced22b3fa7c29f8ae6f.scope - libcontainer container c3e8de724eba0cde9e715e34cb3efd5d2ee6eefbb59b3ced22b3fa7c29f8ae6f. 
Sep 10 05:23:17.397629 kubelet[2750]: E0910 05:23:17.397291 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.397629 kubelet[2750]: W0910 05:23:17.397321 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.398575 kubelet[2750]: E0910 05:23:17.398531 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:17.398917 kubelet[2750]: E0910 05:23:17.398898 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.398917 kubelet[2750]: W0910 05:23:17.398912 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.398996 kubelet[2750]: E0910 05:23:17.398923 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:17.429646 containerd[1588]: time="2025-09-10T05:23:17.429603775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7576c645f4-rphtg,Uid:913a6669-9dce-4f4b-a14a-95164b29ad85,Namespace:calico-system,Attempt:0,} returns sandbox id \"c3e8de724eba0cde9e715e34cb3efd5d2ee6eefbb59b3ced22b3fa7c29f8ae6f\"" Sep 10 05:23:17.431401 containerd[1588]: time="2025-09-10T05:23:17.431360907Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 10 05:23:17.490026 kubelet[2750]: E0910 05:23:17.489298 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5hfbv" podUID="18af5fbc-bc21-4384-998a-43c1dd346c16" Sep 10 05:23:17.545825 containerd[1588]: time="2025-09-10T05:23:17.545771640Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wdstg,Uid:44be1c01-4d72-4619-ba0b-90f6b1f4ab7c,Namespace:calico-system,Attempt:0,}" Sep 10 05:23:17.567344 kubelet[2750]: E0910 05:23:17.567311 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.567344 kubelet[2750]: W0910 05:23:17.567335 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.567464 kubelet[2750]: E0910 05:23:17.567356 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:17.567657 kubelet[2750]: E0910 05:23:17.567641 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.567657 kubelet[2750]: W0910 05:23:17.567652 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.567719 kubelet[2750]: E0910 05:23:17.567662 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:17.567877 kubelet[2750]: E0910 05:23:17.567861 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.567877 kubelet[2750]: W0910 05:23:17.567873 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.567928 kubelet[2750]: E0910 05:23:17.567883 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:17.568182 kubelet[2750]: E0910 05:23:17.568151 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.568182 kubelet[2750]: W0910 05:23:17.568164 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.568182 kubelet[2750]: E0910 05:23:17.568174 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:17.568605 kubelet[2750]: E0910 05:23:17.568563 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.568605 kubelet[2750]: W0910 05:23:17.568577 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.568605 kubelet[2750]: E0910 05:23:17.568586 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:17.569006 kubelet[2750]: E0910 05:23:17.568987 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.569006 kubelet[2750]: W0910 05:23:17.569001 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.569104 kubelet[2750]: E0910 05:23:17.569011 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:17.569288 kubelet[2750]: E0910 05:23:17.569270 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.569288 kubelet[2750]: W0910 05:23:17.569279 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.569288 kubelet[2750]: E0910 05:23:17.569289 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:17.569691 kubelet[2750]: E0910 05:23:17.569672 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.569691 kubelet[2750]: W0910 05:23:17.569686 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.569765 kubelet[2750]: E0910 05:23:17.569696 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:17.569933 kubelet[2750]: E0910 05:23:17.569916 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.569933 kubelet[2750]: W0910 05:23:17.569929 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.570007 kubelet[2750]: E0910 05:23:17.569938 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:17.570228 kubelet[2750]: E0910 05:23:17.570208 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.570228 kubelet[2750]: W0910 05:23:17.570221 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.570228 kubelet[2750]: E0910 05:23:17.570231 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:17.570495 kubelet[2750]: E0910 05:23:17.570465 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.570495 kubelet[2750]: W0910 05:23:17.570477 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.570495 kubelet[2750]: E0910 05:23:17.570487 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:17.570791 kubelet[2750]: E0910 05:23:17.570760 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.570791 kubelet[2750]: W0910 05:23:17.570777 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.570791 kubelet[2750]: E0910 05:23:17.570786 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:17.571120 kubelet[2750]: E0910 05:23:17.571056 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.571120 kubelet[2750]: W0910 05:23:17.571078 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.571120 kubelet[2750]: E0910 05:23:17.571087 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:17.571435 kubelet[2750]: E0910 05:23:17.571409 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.571435 kubelet[2750]: W0910 05:23:17.571422 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.571435 kubelet[2750]: E0910 05:23:17.571431 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:17.571814 kubelet[2750]: E0910 05:23:17.571795 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.571814 kubelet[2750]: W0910 05:23:17.571808 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.571871 kubelet[2750]: E0910 05:23:17.571818 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:17.572056 kubelet[2750]: E0910 05:23:17.572037 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.572056 kubelet[2750]: W0910 05:23:17.572050 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.572056 kubelet[2750]: E0910 05:23:17.572059 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:17.572476 kubelet[2750]: E0910 05:23:17.572456 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:17.572476 kubelet[2750]: W0910 05:23:17.572470 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:17.572476 kubelet[2750]: E0910 05:23:17.572480 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 10 05:23:17.572834 kubelet[2750]: E0910 05:23:17.572810 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 05:23:17.572834 kubelet[2750]: W0910 05:23:17.572825 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 05:23:17.572834 kubelet[2750]: E0910 05:23:17.572834 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 05:23:17.574407 containerd[1588]: time="2025-09-10T05:23:17.574318928Z" level=info msg="connecting to shim 9b071a964bd5ad9e807df42595b4fc218740ee1f903544d10672dcd1c70c529c" address="unix:///run/containerd/s/3ef62adf8bae4b45346f25eb68a307874b9ae26fe5ac14362a9b813db4f25d6b" namespace=k8s.io protocol=ttrpc version=3
Sep 10 05:23:17.587806 kubelet[2750]: I0910 05:23:17.587801 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18af5fbc-bc21-4384-998a-43c1dd346c16-kubelet-dir\") pod \"csi-node-driver-5hfbv\" (UID: \"18af5fbc-bc21-4384-998a-43c1dd346c16\") " pod="calico-system/csi-node-driver-5hfbv"
Sep 10 05:23:17.588384 kubelet[2750]: I0910 05:23:17.588366 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/18af5fbc-bc21-4384-998a-43c1dd346c16-varrun\") pod \"csi-node-driver-5hfbv\" (UID: \"18af5fbc-bc21-4384-998a-43c1dd346c16\") " pod="calico-system/csi-node-driver-5hfbv"
Sep 10 05:23:17.590035 kubelet[2750]: I0910 05:23:17.590018 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wxl7\" (UniqueName: \"kubernetes.io/projected/18af5fbc-bc21-4384-998a-43c1dd346c16-kube-api-access-2wxl7\") pod \"csi-node-driver-5hfbv\" (UID: \"18af5fbc-bc21-4384-998a-43c1dd346c16\") " pod="calico-system/csi-node-driver-5hfbv"
Sep 10 05:23:17.590795 kubelet[2750]: I0910 05:23:17.590771 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/18af5fbc-bc21-4384-998a-43c1dd346c16-socket-dir\") pod \"csi-node-driver-5hfbv\" (UID: \"18af5fbc-bc21-4384-998a-43c1dd346c16\") " pod="calico-system/csi-node-driver-5hfbv"
Sep 10 05:23:17.591211 kubelet[2750]: I0910 05:23:17.591118 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/18af5fbc-bc21-4384-998a-43c1dd346c16-registration-dir\") pod \"csi-node-driver-5hfbv\" (UID: \"18af5fbc-bc21-4384-998a-43c1dd346c16\") " pod="calico-system/csi-node-driver-5hfbv"
Sep 10 05:23:17.609364 systemd[1]: Started cri-containerd-9b071a964bd5ad9e807df42595b4fc218740ee1f903544d10672dcd1c70c529c.scope - libcontainer container 9b071a964bd5ad9e807df42595b4fc218740ee1f903544d10672dcd1c70c529c.
Sep 10 05:23:17.678576 containerd[1588]: time="2025-09-10T05:23:17.678514436Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wdstg,Uid:44be1c01-4d72-4619-ba0b-90f6b1f4ab7c,Namespace:calico-system,Attempt:0,} returns sandbox id \"9b071a964bd5ad9e807df42595b4fc218740ee1f903544d10672dcd1c70c529c\""
Sep 10 05:23:18.975204 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2284377398.mount: Deactivated successfully.
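The repeated kubelet errors in this log all trace to one cause: the FlexVolume plugin prober execs `<plugin-dir>/nodeagent~uds/uds init` and expects a JSON status object on stdout, but the binary is absent, so the call produces empty output and JSON decoding fails with "unexpected end of JSON input". A minimal shell sketch of that probe, using only the path shown in the log (the `probe_driver` helper is hypothetical, not kubelet code):

```shell
#!/bin/sh
# Sketch of the FlexVolume "init" probe the kubelet log shows failing.
# A working driver binary prints a JSON status (e.g. {"status":"Success",...});
# a missing or non-executable binary yields empty output, which is exactly
# the "unexpected end of JSON input" unmarshal error in the log.
probe_driver() {
  driver="$1"
  if [ -x "$driver" ]; then
    "$driver" init                       # working driver: JSON status on stdout
  else
    echo "FlexVolume driver missing or not executable: $driver"
    return 1
  fi
}

# Path taken verbatim from the log entries above.
probe_driver "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds" || true
```

On a node where the driver binary is actually installed, the same check would exec it and print its JSON status instead of the "missing" message.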
Sep 10 05:23:19.318834 containerd[1588]: time="2025-09-10T05:23:19.318769749Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 05:23:19.319603 containerd[1588]: time="2025-09-10T05:23:19.319546242Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 10 05:23:19.320728 containerd[1588]: time="2025-09-10T05:23:19.320680509Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 05:23:19.322597 containerd[1588]: time="2025-09-10T05:23:19.322557740Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 05:23:19.323119 containerd[1588]: time="2025-09-10T05:23:19.323063394Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 1.891663033s"
Sep 10 05:23:19.323119 containerd[1588]: time="2025-09-10T05:23:19.323108469Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 10 05:23:19.324156 containerd[1588]: time="2025-09-10T05:23:19.324118544Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 10 05:23:19.335733 containerd[1588]: time="2025-09-10T05:23:19.335691257Z" level=info msg="CreateContainer within sandbox \"c3e8de724eba0cde9e715e34cb3efd5d2ee6eefbb59b3ced22b3fa7c29f8ae6f\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 10 05:23:19.344576 containerd[1588]: time="2025-09-10T05:23:19.344521460Z" level=info msg="Container a3adc234338f3587d49d2dd09cadc4b7d322366f8925b9b906a05c98458e866d: CDI devices from CRI Config.CDIDevices: []"
Sep 10 05:23:19.352229 containerd[1588]: time="2025-09-10T05:23:19.352188183Z" level=info msg="CreateContainer within sandbox \"c3e8de724eba0cde9e715e34cb3efd5d2ee6eefbb59b3ced22b3fa7c29f8ae6f\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"a3adc234338f3587d49d2dd09cadc4b7d322366f8925b9b906a05c98458e866d\""
Sep 10 05:23:19.353270 containerd[1588]: time="2025-09-10T05:23:19.353117569Z" level=info msg="StartContainer for \"a3adc234338f3587d49d2dd09cadc4b7d322366f8925b9b906a05c98458e866d\""
Sep 10 05:23:19.354407 containerd[1588]: time="2025-09-10T05:23:19.354372239Z" level=info msg="connecting to shim a3adc234338f3587d49d2dd09cadc4b7d322366f8925b9b906a05c98458e866d" address="unix:///run/containerd/s/fab9b5fa5be58efb62012db9e71a066d9ea3b74ff885cd8e0e509c5031ab26e5" protocol=ttrpc version=3
Sep 10 05:23:19.375264 systemd[1]: Started cri-containerd-a3adc234338f3587d49d2dd09cadc4b7d322366f8925b9b906a05c98458e866d.scope - libcontainer container a3adc234338f3587d49d2dd09cadc4b7d322366f8925b9b906a05c98458e866d.
Sep 10 05:23:19.433619 containerd[1588]: time="2025-09-10T05:23:19.433557143Z" level=info msg="StartContainer for \"a3adc234338f3587d49d2dd09cadc4b7d322366f8925b9b906a05c98458e866d\" returns successfully" Sep 10 05:23:19.468575 kubelet[2750]: E0910 05:23:19.468451 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5hfbv" podUID="18af5fbc-bc21-4384-998a-43c1dd346c16" Sep 10 05:23:19.538659 kubelet[2750]: I0910 05:23:19.538572 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7576c645f4-rphtg" podStartSLOduration=1.6454120890000001 podStartE2EDuration="3.538550808s" podCreationTimestamp="2025-09-10 05:23:16 +0000 UTC" firstStartedPulling="2025-09-10 05:23:17.430856636 +0000 UTC m=+21.070297574" lastFinishedPulling="2025-09-10 05:23:19.323995355 +0000 UTC m=+22.963436293" observedRunningTime="2025-09-10 05:23:19.537865767 +0000 UTC m=+23.177306705" watchObservedRunningTime="2025-09-10 05:23:19.538550808 +0000 UTC m=+23.177991746" Sep 10 05:23:19.589532 kubelet[2750]: E0910 05:23:19.589252 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.589532 kubelet[2750]: W0910 05:23:19.589283 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.589532 kubelet[2750]: E0910 05:23:19.589308 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.591319 kubelet[2750]: E0910 05:23:19.591174 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.591319 kubelet[2750]: W0910 05:23:19.591190 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.591319 kubelet[2750]: E0910 05:23:19.591201 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.592155 kubelet[2750]: E0910 05:23:19.591623 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.592155 kubelet[2750]: W0910 05:23:19.591637 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.592155 kubelet[2750]: E0910 05:23:19.591647 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.592155 kubelet[2750]: E0910 05:23:19.591904 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.592155 kubelet[2750]: W0910 05:23:19.591913 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.592155 kubelet[2750]: E0910 05:23:19.591922 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.592155 kubelet[2750]: E0910 05:23:19.592159 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.592334 kubelet[2750]: W0910 05:23:19.592167 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.592334 kubelet[2750]: E0910 05:23:19.592177 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.592382 kubelet[2750]: E0910 05:23:19.592360 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.592382 kubelet[2750]: W0910 05:23:19.592369 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.592382 kubelet[2750]: E0910 05:23:19.592377 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.592630 kubelet[2750]: E0910 05:23:19.592607 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.592630 kubelet[2750]: W0910 05:23:19.592622 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.592630 kubelet[2750]: E0910 05:23:19.592632 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.594392 kubelet[2750]: E0910 05:23:19.594365 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.594392 kubelet[2750]: W0910 05:23:19.594383 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.594392 kubelet[2750]: E0910 05:23:19.594393 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.594636 kubelet[2750]: E0910 05:23:19.594613 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.594636 kubelet[2750]: W0910 05:23:19.594627 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.594636 kubelet[2750]: E0910 05:23:19.594636 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.594865 kubelet[2750]: E0910 05:23:19.594841 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.594865 kubelet[2750]: W0910 05:23:19.594856 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.594865 kubelet[2750]: E0910 05:23:19.594865 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.596156 kubelet[2750]: E0910 05:23:19.596001 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.596156 kubelet[2750]: W0910 05:23:19.596036 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.596156 kubelet[2750]: E0910 05:23:19.596068 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.596664 kubelet[2750]: E0910 05:23:19.596636 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.596664 kubelet[2750]: W0910 05:23:19.596654 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.596664 kubelet[2750]: E0910 05:23:19.596665 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.597398 kubelet[2750]: E0910 05:23:19.597370 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.597398 kubelet[2750]: W0910 05:23:19.597387 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.597398 kubelet[2750]: E0910 05:23:19.597397 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.598205 kubelet[2750]: E0910 05:23:19.598175 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.598205 kubelet[2750]: W0910 05:23:19.598193 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.598205 kubelet[2750]: E0910 05:23:19.598203 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.598542 kubelet[2750]: E0910 05:23:19.598515 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.598542 kubelet[2750]: W0910 05:23:19.598531 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.598542 kubelet[2750]: E0910 05:23:19.598543 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.609886 kubelet[2750]: E0910 05:23:19.609847 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.609886 kubelet[2750]: W0910 05:23:19.609872 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.609886 kubelet[2750]: E0910 05:23:19.609893 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.611262 kubelet[2750]: E0910 05:23:19.611236 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.611262 kubelet[2750]: W0910 05:23:19.611255 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.611367 kubelet[2750]: E0910 05:23:19.611266 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.611586 kubelet[2750]: E0910 05:23:19.611562 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.611586 kubelet[2750]: W0910 05:23:19.611577 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.611586 kubelet[2750]: E0910 05:23:19.611587 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.611878 kubelet[2750]: E0910 05:23:19.611854 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.611878 kubelet[2750]: W0910 05:23:19.611870 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.611878 kubelet[2750]: E0910 05:23:19.611880 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.612142 kubelet[2750]: E0910 05:23:19.612105 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.612142 kubelet[2750]: W0910 05:23:19.612119 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.613184 kubelet[2750]: E0910 05:23:19.613156 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.615529 kubelet[2750]: E0910 05:23:19.615491 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.615529 kubelet[2750]: W0910 05:23:19.615521 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.615529 kubelet[2750]: E0910 05:23:19.615533 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.615863 kubelet[2750]: E0910 05:23:19.615841 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.615863 kubelet[2750]: W0910 05:23:19.615855 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.615935 kubelet[2750]: E0910 05:23:19.615865 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.616163 kubelet[2750]: E0910 05:23:19.616112 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.617193 kubelet[2750]: W0910 05:23:19.616126 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.617238 kubelet[2750]: E0910 05:23:19.617189 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.617498 kubelet[2750]: E0910 05:23:19.617471 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.617498 kubelet[2750]: W0910 05:23:19.617490 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.617498 kubelet[2750]: E0910 05:23:19.617512 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.617801 kubelet[2750]: E0910 05:23:19.617777 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.617801 kubelet[2750]: W0910 05:23:19.617794 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.617857 kubelet[2750]: E0910 05:23:19.617804 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.618359 kubelet[2750]: E0910 05:23:19.618334 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.618359 kubelet[2750]: W0910 05:23:19.618351 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.618429 kubelet[2750]: E0910 05:23:19.618362 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.618857 kubelet[2750]: E0910 05:23:19.618829 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.618857 kubelet[2750]: W0910 05:23:19.618845 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.619008 kubelet[2750]: E0910 05:23:19.618855 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.619439 kubelet[2750]: E0910 05:23:19.619400 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.619439 kubelet[2750]: W0910 05:23:19.619415 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.619439 kubelet[2750]: E0910 05:23:19.619424 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.620075 kubelet[2750]: E0910 05:23:19.620050 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.620075 kubelet[2750]: W0910 05:23:19.620067 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.620075 kubelet[2750]: E0910 05:23:19.620076 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.620817 kubelet[2750]: E0910 05:23:19.620786 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.620817 kubelet[2750]: W0910 05:23:19.620803 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.620817 kubelet[2750]: E0910 05:23:19.620812 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.622173 kubelet[2750]: E0910 05:23:19.621294 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.622173 kubelet[2750]: W0910 05:23:19.621310 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.622173 kubelet[2750]: E0910 05:23:19.621320 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:19.622173 kubelet[2750]: E0910 05:23:19.621997 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.622173 kubelet[2750]: W0910 05:23:19.622006 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.622173 kubelet[2750]: E0910 05:23:19.622016 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:19.622394 kubelet[2750]: E0910 05:23:19.622235 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:19.622394 kubelet[2750]: W0910 05:23:19.622242 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:19.622394 kubelet[2750]: E0910 05:23:19.622250 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:20.528060 kubelet[2750]: I0910 05:23:20.528009 2750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 10 05:23:20.605630 kubelet[2750]: E0910 05:23:20.605570 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:20.605630 kubelet[2750]: W0910 05:23:20.605618 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:20.605856 kubelet[2750]: E0910 05:23:20.605657 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:20.606150 kubelet[2750]: E0910 05:23:20.606087 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:20.606150 kubelet[2750]: W0910 05:23:20.606103 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:20.606150 kubelet[2750]: E0910 05:23:20.606115 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:20.606484 kubelet[2750]: E0910 05:23:20.606460 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:20.606484 kubelet[2750]: W0910 05:23:20.606475 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:20.606556 kubelet[2750]: E0910 05:23:20.606487 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:20.606754 kubelet[2750]: E0910 05:23:20.606735 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:20.606754 kubelet[2750]: W0910 05:23:20.606750 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:20.606819 kubelet[2750]: E0910 05:23:20.606764 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:20.607044 kubelet[2750]: E0910 05:23:20.607026 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:20.607044 kubelet[2750]: W0910 05:23:20.607041 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:20.607226 kubelet[2750]: E0910 05:23:20.607077 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:20.607449 kubelet[2750]: E0910 05:23:20.607419 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:20.607487 kubelet[2750]: W0910 05:23:20.607449 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:20.607487 kubelet[2750]: E0910 05:23:20.607480 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:20.607806 kubelet[2750]: E0910 05:23:20.607774 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:20.607806 kubelet[2750]: W0910 05:23:20.607792 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:20.607806 kubelet[2750]: E0910 05:23:20.607805 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:20.608157 kubelet[2750]: E0910 05:23:20.608055 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:20.608157 kubelet[2750]: W0910 05:23:20.608078 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:20.608157 kubelet[2750]: E0910 05:23:20.608094 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:20.608337 kubelet[2750]: E0910 05:23:20.608310 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:20.608337 kubelet[2750]: W0910 05:23:20.608319 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:20.608337 kubelet[2750]: E0910 05:23:20.608327 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:20.608570 kubelet[2750]: E0910 05:23:20.608545 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:20.608570 kubelet[2750]: W0910 05:23:20.608565 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:20.608637 kubelet[2750]: E0910 05:23:20.608582 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:20.608854 kubelet[2750]: E0910 05:23:20.608835 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:20.608854 kubelet[2750]: W0910 05:23:20.608850 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:20.608915 kubelet[2750]: E0910 05:23:20.608863 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:20.609226 kubelet[2750]: E0910 05:23:20.609195 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:20.609305 kubelet[2750]: W0910 05:23:20.609224 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:20.609305 kubelet[2750]: E0910 05:23:20.609256 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 05:23:20.629684 kubelet[2750]: E0910 05:23:20.629662 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 05:23:20.629684 kubelet[2750]: W0910 05:23:20.629680 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 05:23:20.629789 kubelet[2750]: E0910 05:23:20.629693 2750 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 05:23:20.698303 containerd[1588]: time="2025-09-10T05:23:20.698254911Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:20.699297 containerd[1588]: time="2025-09-10T05:23:20.699253524Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 10 05:23:20.700435 containerd[1588]: time="2025-09-10T05:23:20.700390516Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:20.702584 containerd[1588]: time="2025-09-10T05:23:20.702539259Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:20.703041 containerd[1588]: time="2025-09-10T05:23:20.703005156Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.378774826s" Sep 10 05:23:20.703041 containerd[1588]: time="2025-09-10T05:23:20.703039066Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 10 05:23:20.707487 containerd[1588]: time="2025-09-10T05:23:20.707451162Z" level=info msg="CreateContainer within sandbox \"9b071a964bd5ad9e807df42595b4fc218740ee1f903544d10672dcd1c70c529c\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 10 05:23:20.716278 containerd[1588]: time="2025-09-10T05:23:20.716241132Z" level=info msg="Container 02626547f0d9dc3daac9dd4183e0a15746c7c6ac78bce7303ed2d5299ea9f638: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:20.725360 containerd[1588]: time="2025-09-10T05:23:20.725320519Z" level=info msg="CreateContainer within sandbox \"9b071a964bd5ad9e807df42595b4fc218740ee1f903544d10672dcd1c70c529c\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"02626547f0d9dc3daac9dd4183e0a15746c7c6ac78bce7303ed2d5299ea9f638\"" Sep 10 05:23:20.725846 containerd[1588]: time="2025-09-10T05:23:20.725807639Z" level=info msg="StartContainer for \"02626547f0d9dc3daac9dd4183e0a15746c7c6ac78bce7303ed2d5299ea9f638\"" Sep 10 05:23:20.727282 containerd[1588]: time="2025-09-10T05:23:20.727257478Z" level=info msg="connecting to shim 02626547f0d9dc3daac9dd4183e0a15746c7c6ac78bce7303ed2d5299ea9f638" address="unix:///run/containerd/s/3ef62adf8bae4b45346f25eb68a307874b9ae26fe5ac14362a9b813db4f25d6b" protocol=ttrpc version=3 Sep 10 05:23:20.755445 systemd[1]: Started cri-containerd-02626547f0d9dc3daac9dd4183e0a15746c7c6ac78bce7303ed2d5299ea9f638.scope - libcontainer container 
02626547f0d9dc3daac9dd4183e0a15746c7c6ac78bce7303ed2d5299ea9f638. Sep 10 05:23:20.811439 systemd[1]: cri-containerd-02626547f0d9dc3daac9dd4183e0a15746c7c6ac78bce7303ed2d5299ea9f638.scope: Deactivated successfully. Sep 10 05:23:20.814534 containerd[1588]: time="2025-09-10T05:23:20.814495720Z" level=info msg="TaskExit event in podsandbox handler container_id:\"02626547f0d9dc3daac9dd4183e0a15746c7c6ac78bce7303ed2d5299ea9f638\" id:\"02626547f0d9dc3daac9dd4183e0a15746c7c6ac78bce7303ed2d5299ea9f638\" pid:3489 exited_at:{seconds:1757481800 nanos:813984639}" Sep 10 05:23:20.851297 containerd[1588]: time="2025-09-10T05:23:20.851246193Z" level=info msg="received exit event container_id:\"02626547f0d9dc3daac9dd4183e0a15746c7c6ac78bce7303ed2d5299ea9f638\" id:\"02626547f0d9dc3daac9dd4183e0a15746c7c6ac78bce7303ed2d5299ea9f638\" pid:3489 exited_at:{seconds:1757481800 nanos:813984639}" Sep 10 05:23:20.863543 containerd[1588]: time="2025-09-10T05:23:20.863487295Z" level=info msg="StartContainer for \"02626547f0d9dc3daac9dd4183e0a15746c7c6ac78bce7303ed2d5299ea9f638\" returns successfully" Sep 10 05:23:20.878974 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-02626547f0d9dc3daac9dd4183e0a15746c7c6ac78bce7303ed2d5299ea9f638-rootfs.mount: Deactivated successfully. 
Sep 10 05:23:21.467846 kubelet[2750]: E0910 05:23:21.467786 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5hfbv" podUID="18af5fbc-bc21-4384-998a-43c1dd346c16" Sep 10 05:23:21.533091 containerd[1588]: time="2025-09-10T05:23:21.532947519Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 10 05:23:23.468682 kubelet[2750]: E0910 05:23:23.468600 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5hfbv" podUID="18af5fbc-bc21-4384-998a-43c1dd346c16" Sep 10 05:23:24.068498 containerd[1588]: time="2025-09-10T05:23:24.068439357Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:24.069367 containerd[1588]: time="2025-09-10T05:23:24.069310035Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 10 05:23:24.070674 containerd[1588]: time="2025-09-10T05:23:24.070643570Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:24.072757 containerd[1588]: time="2025-09-10T05:23:24.072717664Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:24.073459 containerd[1588]: time="2025-09-10T05:23:24.073418382Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" 
with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 2.540433234s" Sep 10 05:23:24.073459 containerd[1588]: time="2025-09-10T05:23:24.073448493Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 10 05:23:24.078062 containerd[1588]: time="2025-09-10T05:23:24.078020307Z" level=info msg="CreateContainer within sandbox \"9b071a964bd5ad9e807df42595b4fc218740ee1f903544d10672dcd1c70c529c\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 10 05:23:24.088320 containerd[1588]: time="2025-09-10T05:23:24.088267021Z" level=info msg="Container 09b44b840691a4c15b5566c188b3bfe0017f38d7bcfd400bd23eb25c91e1e6d4: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:24.099490 containerd[1588]: time="2025-09-10T05:23:24.099426040Z" level=info msg="CreateContainer within sandbox \"9b071a964bd5ad9e807df42595b4fc218740ee1f903544d10672dcd1c70c529c\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"09b44b840691a4c15b5566c188b3bfe0017f38d7bcfd400bd23eb25c91e1e6d4\"" Sep 10 05:23:24.100445 containerd[1588]: time="2025-09-10T05:23:24.100385773Z" level=info msg="StartContainer for \"09b44b840691a4c15b5566c188b3bfe0017f38d7bcfd400bd23eb25c91e1e6d4\"" Sep 10 05:23:24.101777 containerd[1588]: time="2025-09-10T05:23:24.101748237Z" level=info msg="connecting to shim 09b44b840691a4c15b5566c188b3bfe0017f38d7bcfd400bd23eb25c91e1e6d4" address="unix:///run/containerd/s/3ef62adf8bae4b45346f25eb68a307874b9ae26fe5ac14362a9b813db4f25d6b" protocol=ttrpc version=3 Sep 10 05:23:24.125284 systemd[1]: Started cri-containerd-09b44b840691a4c15b5566c188b3bfe0017f38d7bcfd400bd23eb25c91e1e6d4.scope - libcontainer container 
09b44b840691a4c15b5566c188b3bfe0017f38d7bcfd400bd23eb25c91e1e6d4. Sep 10 05:23:24.406487 containerd[1588]: time="2025-09-10T05:23:24.406336946Z" level=info msg="StartContainer for \"09b44b840691a4c15b5566c188b3bfe0017f38d7bcfd400bd23eb25c91e1e6d4\" returns successfully" Sep 10 05:23:25.293202 systemd[1]: cri-containerd-09b44b840691a4c15b5566c188b3bfe0017f38d7bcfd400bd23eb25c91e1e6d4.scope: Deactivated successfully. Sep 10 05:23:25.293557 systemd[1]: cri-containerd-09b44b840691a4c15b5566c188b3bfe0017f38d7bcfd400bd23eb25c91e1e6d4.scope: Consumed 662ms CPU time, 179M memory peak, 3.4M read from disk, 171.3M written to disk. Sep 10 05:23:25.295964 containerd[1588]: time="2025-09-10T05:23:25.295736816Z" level=info msg="received exit event container_id:\"09b44b840691a4c15b5566c188b3bfe0017f38d7bcfd400bd23eb25c91e1e6d4\" id:\"09b44b840691a4c15b5566c188b3bfe0017f38d7bcfd400bd23eb25c91e1e6d4\" pid:3550 exited_at:{seconds:1757481805 nanos:295294215}" Sep 10 05:23:25.295964 containerd[1588]: time="2025-09-10T05:23:25.295876594Z" level=info msg="TaskExit event in podsandbox handler container_id:\"09b44b840691a4c15b5566c188b3bfe0017f38d7bcfd400bd23eb25c91e1e6d4\" id:\"09b44b840691a4c15b5566c188b3bfe0017f38d7bcfd400bd23eb25c91e1e6d4\" pid:3550 exited_at:{seconds:1757481805 nanos:295294215}" Sep 10 05:23:25.319310 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-09b44b840691a4c15b5566c188b3bfe0017f38d7bcfd400bd23eb25c91e1e6d4-rootfs.mount: Deactivated successfully. Sep 10 05:23:25.358315 kubelet[2750]: I0910 05:23:25.358270 2750 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 10 05:23:25.473884 systemd[1]: Created slice kubepods-besteffort-pod18af5fbc_bc21_4384_998a_43c1dd346c16.slice - libcontainer container kubepods-besteffort-pod18af5fbc_bc21_4384_998a_43c1dd346c16.slice. 
Sep 10 05:23:25.477718 containerd[1588]: time="2025-09-10T05:23:25.476693659Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5hfbv,Uid:18af5fbc-bc21-4384-998a-43c1dd346c16,Namespace:calico-system,Attempt:0,}" Sep 10 05:23:25.504117 systemd[1]: Created slice kubepods-burstable-podc92159af_8c46_4dd3_b395_df88a54bacc1.slice - libcontainer container kubepods-burstable-podc92159af_8c46_4dd3_b395_df88a54bacc1.slice. Sep 10 05:23:25.517367 systemd[1]: Created slice kubepods-burstable-pod4adba1b2_a4b4_48be_93dc_ee175c1a700e.slice - libcontainer container kubepods-burstable-pod4adba1b2_a4b4_48be_93dc_ee175c1a700e.slice. Sep 10 05:23:25.528155 systemd[1]: Created slice kubepods-besteffort-podf8de9b92_ac09_4e62_9005_74b36ff41e7b.slice - libcontainer container kubepods-besteffort-podf8de9b92_ac09_4e62_9005_74b36ff41e7b.slice. Sep 10 05:23:25.541199 systemd[1]: Created slice kubepods-besteffort-pod072923f0_8dfc_44bd_a039_db997b1a5ec5.slice - libcontainer container kubepods-besteffort-pod072923f0_8dfc_44bd_a039_db997b1a5ec5.slice. Sep 10 05:23:25.547836 systemd[1]: Created slice kubepods-besteffort-podc8a6cabe_8b7b_46a3_b357_1c595cd08f45.slice - libcontainer container kubepods-besteffort-podc8a6cabe_8b7b_46a3_b357_1c595cd08f45.slice. 
Sep 10 05:23:25.550058 kubelet[2750]: I0910 05:23:25.550021 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km6vp\" (UniqueName: \"kubernetes.io/projected/4adba1b2-a4b4-48be-93dc-ee175c1a700e-kube-api-access-km6vp\") pod \"coredns-674b8bbfcf-jpkqd\" (UID: \"4adba1b2-a4b4-48be-93dc-ee175c1a700e\") " pod="kube-system/coredns-674b8bbfcf-jpkqd" Sep 10 05:23:25.550160 kubelet[2750]: I0910 05:23:25.550067 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4adba1b2-a4b4-48be-93dc-ee175c1a700e-config-volume\") pod \"coredns-674b8bbfcf-jpkqd\" (UID: \"4adba1b2-a4b4-48be-93dc-ee175c1a700e\") " pod="kube-system/coredns-674b8bbfcf-jpkqd" Sep 10 05:23:25.550160 kubelet[2750]: I0910 05:23:25.550094 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0b5e651b-cbd7-4abc-b1e2-a6418c29d0af-calico-apiserver-certs\") pod \"calico-apiserver-54575dc547-kddv5\" (UID: \"0b5e651b-cbd7-4abc-b1e2-a6418c29d0af\") " pod="calico-apiserver/calico-apiserver-54575dc547-kddv5" Sep 10 05:23:25.550160 kubelet[2750]: I0910 05:23:25.550118 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/072923f0-8dfc-44bd-a039-db997b1a5ec5-whisker-backend-key-pair\") pod \"whisker-86db9f9ff-q4d9h\" (UID: \"072923f0-8dfc-44bd-a039-db997b1a5ec5\") " pod="calico-system/whisker-86db9f9ff-q4d9h" Sep 10 05:23:25.550230 kubelet[2750]: I0910 05:23:25.550196 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnf84\" (UniqueName: \"kubernetes.io/projected/f8de9b92-ac09-4e62-9005-74b36ff41e7b-kube-api-access-hnf84\") pod \"calico-apiserver-55b79d7cf6-vzlbk\" (UID: 
\"f8de9b92-ac09-4e62-9005-74b36ff41e7b\") " pod="calico-apiserver/calico-apiserver-55b79d7cf6-vzlbk" Sep 10 05:23:25.550230 kubelet[2750]: I0910 05:23:25.550219 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/436090e8-7271-45d4-a202-a938b0eefe53-goldmane-key-pair\") pod \"goldmane-54d579b49d-kdrts\" (UID: \"436090e8-7271-45d4-a202-a938b0eefe53\") " pod="calico-system/goldmane-54d579b49d-kdrts" Sep 10 05:23:25.550282 kubelet[2750]: I0910 05:23:25.550239 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p79kl\" (UniqueName: \"kubernetes.io/projected/436090e8-7271-45d4-a202-a938b0eefe53-kube-api-access-p79kl\") pod \"goldmane-54d579b49d-kdrts\" (UID: \"436090e8-7271-45d4-a202-a938b0eefe53\") " pod="calico-system/goldmane-54d579b49d-kdrts" Sep 10 05:23:25.550282 kubelet[2750]: I0910 05:23:25.550259 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgtw4\" (UniqueName: \"kubernetes.io/projected/0b5e651b-cbd7-4abc-b1e2-a6418c29d0af-kube-api-access-tgtw4\") pod \"calico-apiserver-54575dc547-kddv5\" (UID: \"0b5e651b-cbd7-4abc-b1e2-a6418c29d0af\") " pod="calico-apiserver/calico-apiserver-54575dc547-kddv5" Sep 10 05:23:25.550329 kubelet[2750]: I0910 05:23:25.550284 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f8de9b92-ac09-4e62-9005-74b36ff41e7b-calico-apiserver-certs\") pod \"calico-apiserver-55b79d7cf6-vzlbk\" (UID: \"f8de9b92-ac09-4e62-9005-74b36ff41e7b\") " pod="calico-apiserver/calico-apiserver-55b79d7cf6-vzlbk" Sep 10 05:23:25.550329 kubelet[2750]: I0910 05:23:25.550303 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7s8j\" (UniqueName: 
\"kubernetes.io/projected/c8a6cabe-8b7b-46a3-b357-1c595cd08f45-kube-api-access-r7s8j\") pod \"calico-apiserver-54575dc547-rbmsw\" (UID: \"c8a6cabe-8b7b-46a3-b357-1c595cd08f45\") " pod="calico-apiserver/calico-apiserver-54575dc547-rbmsw" Sep 10 05:23:25.550428 kubelet[2750]: I0910 05:23:25.550375 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/436090e8-7271-45d4-a202-a938b0eefe53-config\") pod \"goldmane-54d579b49d-kdrts\" (UID: \"436090e8-7271-45d4-a202-a938b0eefe53\") " pod="calico-system/goldmane-54d579b49d-kdrts" Sep 10 05:23:25.550428 kubelet[2750]: I0910 05:23:25.550403 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btj9c\" (UniqueName: \"kubernetes.io/projected/c92159af-8c46-4dd3-b395-df88a54bacc1-kube-api-access-btj9c\") pod \"coredns-674b8bbfcf-bxpzt\" (UID: \"c92159af-8c46-4dd3-b395-df88a54bacc1\") " pod="kube-system/coredns-674b8bbfcf-bxpzt" Sep 10 05:23:25.550473 kubelet[2750]: I0910 05:23:25.550449 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c2sj\" (UniqueName: \"kubernetes.io/projected/072923f0-8dfc-44bd-a039-db997b1a5ec5-kube-api-access-8c2sj\") pod \"whisker-86db9f9ff-q4d9h\" (UID: \"072923f0-8dfc-44bd-a039-db997b1a5ec5\") " pod="calico-system/whisker-86db9f9ff-q4d9h" Sep 10 05:23:25.551309 kubelet[2750]: I0910 05:23:25.550495 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c8a6cabe-8b7b-46a3-b357-1c595cd08f45-calico-apiserver-certs\") pod \"calico-apiserver-54575dc547-rbmsw\" (UID: \"c8a6cabe-8b7b-46a3-b357-1c595cd08f45\") " pod="calico-apiserver/calico-apiserver-54575dc547-rbmsw" Sep 10 05:23:25.551309 kubelet[2750]: I0910 05:23:25.550558 2750 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/436090e8-7271-45d4-a202-a938b0eefe53-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-kdrts\" (UID: \"436090e8-7271-45d4-a202-a938b0eefe53\") " pod="calico-system/goldmane-54d579b49d-kdrts" Sep 10 05:23:25.551309 kubelet[2750]: I0910 05:23:25.550593 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85794923-79eb-4508-871e-5224c14933a7-tigera-ca-bundle\") pod \"calico-kube-controllers-67598f4498-tl9wq\" (UID: \"85794923-79eb-4508-871e-5224c14933a7\") " pod="calico-system/calico-kube-controllers-67598f4498-tl9wq" Sep 10 05:23:25.551309 kubelet[2750]: I0910 05:23:25.550615 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c92159af-8c46-4dd3-b395-df88a54bacc1-config-volume\") pod \"coredns-674b8bbfcf-bxpzt\" (UID: \"c92159af-8c46-4dd3-b395-df88a54bacc1\") " pod="kube-system/coredns-674b8bbfcf-bxpzt" Sep 10 05:23:25.551309 kubelet[2750]: I0910 05:23:25.550636 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqz8m\" (UniqueName: \"kubernetes.io/projected/85794923-79eb-4508-871e-5224c14933a7-kube-api-access-wqz8m\") pod \"calico-kube-controllers-67598f4498-tl9wq\" (UID: \"85794923-79eb-4508-871e-5224c14933a7\") " pod="calico-system/calico-kube-controllers-67598f4498-tl9wq" Sep 10 05:23:25.551452 kubelet[2750]: I0910 05:23:25.550664 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/072923f0-8dfc-44bd-a039-db997b1a5ec5-whisker-ca-bundle\") pod \"whisker-86db9f9ff-q4d9h\" (UID: \"072923f0-8dfc-44bd-a039-db997b1a5ec5\") " 
pod="calico-system/whisker-86db9f9ff-q4d9h" Sep 10 05:23:25.555361 containerd[1588]: time="2025-09-10T05:23:25.554825622Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 10 05:23:25.557145 systemd[1]: Created slice kubepods-besteffort-pod436090e8_7271_45d4_a202_a938b0eefe53.slice - libcontainer container kubepods-besteffort-pod436090e8_7271_45d4_a202_a938b0eefe53.slice. Sep 10 05:23:25.567731 systemd[1]: Created slice kubepods-besteffort-pod85794923_79eb_4508_871e_5224c14933a7.slice - libcontainer container kubepods-besteffort-pod85794923_79eb_4508_871e_5224c14933a7.slice. Sep 10 05:23:25.575000 systemd[1]: Created slice kubepods-besteffort-pod0b5e651b_cbd7_4abc_b1e2_a6418c29d0af.slice - libcontainer container kubepods-besteffort-pod0b5e651b_cbd7_4abc_b1e2_a6418c29d0af.slice. Sep 10 05:23:25.606260 containerd[1588]: time="2025-09-10T05:23:25.606154665Z" level=error msg="Failed to destroy network for sandbox \"3f905d740ac688873324d6e2b11fd5bff6de985eeb24e3e8dad772410788a809\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.609314 systemd[1]: run-netns-cni\x2d6d32565d\x2d0b57\x2d3962\x2d2b72\x2df807444459fe.mount: Deactivated successfully. 
Sep 10 05:23:25.624464 containerd[1588]: time="2025-09-10T05:23:25.624384984Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5hfbv,Uid:18af5fbc-bc21-4384-998a-43c1dd346c16,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f905d740ac688873324d6e2b11fd5bff6de985eeb24e3e8dad772410788a809\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.631098 kubelet[2750]: E0910 05:23:25.631006 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f905d740ac688873324d6e2b11fd5bff6de985eeb24e3e8dad772410788a809\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.631098 kubelet[2750]: E0910 05:23:25.631113 2750 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f905d740ac688873324d6e2b11fd5bff6de985eeb24e3e8dad772410788a809\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5hfbv" Sep 10 05:23:25.631098 kubelet[2750]: E0910 05:23:25.631171 2750 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f905d740ac688873324d6e2b11fd5bff6de985eeb24e3e8dad772410788a809\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5hfbv" 
Sep 10 05:23:25.631403 kubelet[2750]: E0910 05:23:25.631303 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-5hfbv_calico-system(18af5fbc-bc21-4384-998a-43c1dd346c16)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-5hfbv_calico-system(18af5fbc-bc21-4384-998a-43c1dd346c16)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3f905d740ac688873324d6e2b11fd5bff6de985eeb24e3e8dad772410788a809\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-5hfbv" podUID="18af5fbc-bc21-4384-998a-43c1dd346c16" Sep 10 05:23:25.810786 containerd[1588]: time="2025-09-10T05:23:25.810621810Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-bxpzt,Uid:c92159af-8c46-4dd3-b395-df88a54bacc1,Namespace:kube-system,Attempt:0,}" Sep 10 05:23:25.822569 containerd[1588]: time="2025-09-10T05:23:25.822516672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-jpkqd,Uid:4adba1b2-a4b4-48be-93dc-ee175c1a700e,Namespace:kube-system,Attempt:0,}" Sep 10 05:23:25.843440 containerd[1588]: time="2025-09-10T05:23:25.843393471Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55b79d7cf6-vzlbk,Uid:f8de9b92-ac09-4e62-9005-74b36ff41e7b,Namespace:calico-apiserver,Attempt:0,}" Sep 10 05:23:25.847489 containerd[1588]: time="2025-09-10T05:23:25.847370971Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-86db9f9ff-q4d9h,Uid:072923f0-8dfc-44bd-a039-db997b1a5ec5,Namespace:calico-system,Attempt:0,}" Sep 10 05:23:25.858178 containerd[1588]: time="2025-09-10T05:23:25.858046133Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-54575dc547-rbmsw,Uid:c8a6cabe-8b7b-46a3-b357-1c595cd08f45,Namespace:calico-apiserver,Attempt:0,}" Sep 10 05:23:25.866410 containerd[1588]: time="2025-09-10T05:23:25.866352807Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-kdrts,Uid:436090e8-7271-45d4-a202-a938b0eefe53,Namespace:calico-system,Attempt:0,}" Sep 10 05:23:25.873582 containerd[1588]: time="2025-09-10T05:23:25.873416021Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-67598f4498-tl9wq,Uid:85794923-79eb-4508-871e-5224c14933a7,Namespace:calico-system,Attempt:0,}" Sep 10 05:23:25.877202 containerd[1588]: time="2025-09-10T05:23:25.877164619Z" level=error msg="Failed to destroy network for sandbox \"d797b037d14a0cba2434f4eb7ab3f38cd0857d1a9bbf9f3140b3c0edd95f50c8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.878306 containerd[1588]: time="2025-09-10T05:23:25.878268643Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54575dc547-kddv5,Uid:0b5e651b-cbd7-4abc-b1e2-a6418c29d0af,Namespace:calico-apiserver,Attempt:0,}" Sep 10 05:23:25.888322 containerd[1588]: time="2025-09-10T05:23:25.888201205Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-bxpzt,Uid:c92159af-8c46-4dd3-b395-df88a54bacc1,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d797b037d14a0cba2434f4eb7ab3f38cd0857d1a9bbf9f3140b3c0edd95f50c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.888603 kubelet[2750]: E0910 05:23:25.888543 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"d797b037d14a0cba2434f4eb7ab3f38cd0857d1a9bbf9f3140b3c0edd95f50c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.888603 kubelet[2750]: E0910 05:23:25.888612 2750 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d797b037d14a0cba2434f4eb7ab3f38cd0857d1a9bbf9f3140b3c0edd95f50c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-bxpzt" Sep 10 05:23:25.888782 kubelet[2750]: E0910 05:23:25.888643 2750 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d797b037d14a0cba2434f4eb7ab3f38cd0857d1a9bbf9f3140b3c0edd95f50c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-bxpzt" Sep 10 05:23:25.888782 kubelet[2750]: E0910 05:23:25.888694 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-bxpzt_kube-system(c92159af-8c46-4dd3-b395-df88a54bacc1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-bxpzt_kube-system(c92159af-8c46-4dd3-b395-df88a54bacc1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d797b037d14a0cba2434f4eb7ab3f38cd0857d1a9bbf9f3140b3c0edd95f50c8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-674b8bbfcf-bxpzt" podUID="c92159af-8c46-4dd3-b395-df88a54bacc1" Sep 10 05:23:25.966721 containerd[1588]: time="2025-09-10T05:23:25.966621243Z" level=error msg="Failed to destroy network for sandbox \"a585d8031e7a5d1528293a0050f7a6fb70b64e25dd705e36b7fff1f4f083ecdc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.967365 containerd[1588]: time="2025-09-10T05:23:25.967333369Z" level=error msg="Failed to destroy network for sandbox \"f58a8e2e3ce2f14955887b879eed6fa7057a7dfe34fe278ca7b5e225da9a6410\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.971873 containerd[1588]: time="2025-09-10T05:23:25.971835690Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55b79d7cf6-vzlbk,Uid:f8de9b92-ac09-4e62-9005-74b36ff41e7b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a585d8031e7a5d1528293a0050f7a6fb70b64e25dd705e36b7fff1f4f083ecdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.972270 kubelet[2750]: E0910 05:23:25.972214 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a585d8031e7a5d1528293a0050f7a6fb70b64e25dd705e36b7fff1f4f083ecdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.972359 kubelet[2750]: E0910 05:23:25.972290 2750 kuberuntime_sandbox.go:70] 
"Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a585d8031e7a5d1528293a0050f7a6fb70b64e25dd705e36b7fff1f4f083ecdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-55b79d7cf6-vzlbk" Sep 10 05:23:25.972359 kubelet[2750]: E0910 05:23:25.972319 2750 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a585d8031e7a5d1528293a0050f7a6fb70b64e25dd705e36b7fff1f4f083ecdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-55b79d7cf6-vzlbk" Sep 10 05:23:25.972443 kubelet[2750]: E0910 05:23:25.972380 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-55b79d7cf6-vzlbk_calico-apiserver(f8de9b92-ac09-4e62-9005-74b36ff41e7b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-55b79d7cf6-vzlbk_calico-apiserver(f8de9b92-ac09-4e62-9005-74b36ff41e7b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a585d8031e7a5d1528293a0050f7a6fb70b64e25dd705e36b7fff1f4f083ecdc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-55b79d7cf6-vzlbk" podUID="f8de9b92-ac09-4e62-9005-74b36ff41e7b" Sep 10 05:23:25.976149 containerd[1588]: time="2025-09-10T05:23:25.975425251Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-jpkqd,Uid:4adba1b2-a4b4-48be-93dc-ee175c1a700e,Namespace:kube-system,Attempt:0,} 
failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f58a8e2e3ce2f14955887b879eed6fa7057a7dfe34fe278ca7b5e225da9a6410\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.976242 kubelet[2750]: E0910 05:23:25.975627 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f58a8e2e3ce2f14955887b879eed6fa7057a7dfe34fe278ca7b5e225da9a6410\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.976242 kubelet[2750]: E0910 05:23:25.975698 2750 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f58a8e2e3ce2f14955887b879eed6fa7057a7dfe34fe278ca7b5e225da9a6410\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-jpkqd" Sep 10 05:23:25.976242 kubelet[2750]: E0910 05:23:25.975716 2750 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f58a8e2e3ce2f14955887b879eed6fa7057a7dfe34fe278ca7b5e225da9a6410\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-jpkqd" Sep 10 05:23:25.976325 kubelet[2750]: E0910 05:23:25.975959 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-jpkqd_kube-system(4adba1b2-a4b4-48be-93dc-ee175c1a700e)\" 
with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-jpkqd_kube-system(4adba1b2-a4b4-48be-93dc-ee175c1a700e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f58a8e2e3ce2f14955887b879eed6fa7057a7dfe34fe278ca7b5e225da9a6410\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-jpkqd" podUID="4adba1b2-a4b4-48be-93dc-ee175c1a700e" Sep 10 05:23:25.986947 containerd[1588]: time="2025-09-10T05:23:25.986902694Z" level=error msg="Failed to destroy network for sandbox \"30a887479eb80b10aa59a0857aacf7d93cacf698d2cbf4caee36439fd38c3cc7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.987718 containerd[1588]: time="2025-09-10T05:23:25.987574117Z" level=error msg="Failed to destroy network for sandbox \"518b843bc917094a3f604f49c1d121977b2f0743d63829755050e6779d0a90db\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.988788 containerd[1588]: time="2025-09-10T05:23:25.988762774Z" level=error msg="Failed to destroy network for sandbox \"b3fded31b6fa0a16f6d1ebe95a933a5decd83187d8868ba9f7e54fe282b10aa4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.989225 containerd[1588]: time="2025-09-10T05:23:25.988930790Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54575dc547-rbmsw,Uid:c8a6cabe-8b7b-46a3-b357-1c595cd08f45,Namespace:calico-apiserver,Attempt:0,} failed, 
error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"30a887479eb80b10aa59a0857aacf7d93cacf698d2cbf4caee36439fd38c3cc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.989517 kubelet[2750]: E0910 05:23:25.989383 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"30a887479eb80b10aa59a0857aacf7d93cacf698d2cbf4caee36439fd38c3cc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.989517 kubelet[2750]: E0910 05:23:25.989427 2750 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"30a887479eb80b10aa59a0857aacf7d93cacf698d2cbf4caee36439fd38c3cc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54575dc547-rbmsw" Sep 10 05:23:25.989517 kubelet[2750]: E0910 05:23:25.989446 2750 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"30a887479eb80b10aa59a0857aacf7d93cacf698d2cbf4caee36439fd38c3cc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54575dc547-rbmsw" Sep 10 05:23:25.989818 kubelet[2750]: E0910 05:23:25.989500 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-54575dc547-rbmsw_calico-apiserver(c8a6cabe-8b7b-46a3-b357-1c595cd08f45)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-54575dc547-rbmsw_calico-apiserver(c8a6cabe-8b7b-46a3-b357-1c595cd08f45)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"30a887479eb80b10aa59a0857aacf7d93cacf698d2cbf4caee36439fd38c3cc7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-54575dc547-rbmsw" podUID="c8a6cabe-8b7b-46a3-b357-1c595cd08f45" Sep 10 05:23:25.990717 containerd[1588]: time="2025-09-10T05:23:25.990665683Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-kdrts,Uid:436090e8-7271-45d4-a202-a938b0eefe53,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"518b843bc917094a3f604f49c1d121977b2f0743d63829755050e6779d0a90db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.991465 kubelet[2750]: E0910 05:23:25.990886 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"518b843bc917094a3f604f49c1d121977b2f0743d63829755050e6779d0a90db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.991465 kubelet[2750]: E0910 05:23:25.990958 2750 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"518b843bc917094a3f604f49c1d121977b2f0743d63829755050e6779d0a90db\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-kdrts" Sep 10 05:23:25.991465 kubelet[2750]: E0910 05:23:25.990981 2750 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"518b843bc917094a3f604f49c1d121977b2f0743d63829755050e6779d0a90db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-kdrts" Sep 10 05:23:25.991558 kubelet[2750]: E0910 05:23:25.991031 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-kdrts_calico-system(436090e8-7271-45d4-a202-a938b0eefe53)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-kdrts_calico-system(436090e8-7271-45d4-a202-a938b0eefe53)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"518b843bc917094a3f604f49c1d121977b2f0743d63829755050e6779d0a90db\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-kdrts" podUID="436090e8-7271-45d4-a202-a938b0eefe53" Sep 10 05:23:25.991993 containerd[1588]: time="2025-09-10T05:23:25.991874993Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-86db9f9ff-q4d9h,Uid:072923f0-8dfc-44bd-a039-db997b1a5ec5,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b3fded31b6fa0a16f6d1ebe95a933a5decd83187d8868ba9f7e54fe282b10aa4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.992330 kubelet[2750]: E0910 05:23:25.992218 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b3fded31b6fa0a16f6d1ebe95a933a5decd83187d8868ba9f7e54fe282b10aa4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:25.992330 kubelet[2750]: E0910 05:23:25.992291 2750 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b3fded31b6fa0a16f6d1ebe95a933a5decd83187d8868ba9f7e54fe282b10aa4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-86db9f9ff-q4d9h" Sep 10 05:23:25.992330 kubelet[2750]: E0910 05:23:25.992306 2750 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b3fded31b6fa0a16f6d1ebe95a933a5decd83187d8868ba9f7e54fe282b10aa4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-86db9f9ff-q4d9h" Sep 10 05:23:25.992479 kubelet[2750]: E0910 05:23:25.992455 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-86db9f9ff-q4d9h_calico-system(072923f0-8dfc-44bd-a039-db997b1a5ec5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-86db9f9ff-q4d9h_calico-system(072923f0-8dfc-44bd-a039-db997b1a5ec5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"b3fded31b6fa0a16f6d1ebe95a933a5decd83187d8868ba9f7e54fe282b10aa4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-86db9f9ff-q4d9h" podUID="072923f0-8dfc-44bd-a039-db997b1a5ec5" Sep 10 05:23:26.003626 containerd[1588]: time="2025-09-10T05:23:26.003572495Z" level=error msg="Failed to destroy network for sandbox \"5d1a3f5ca51388c795a06fb4fe9b832b77ac340913392ec42a9e6a8d405effaf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:26.006167 containerd[1588]: time="2025-09-10T05:23:26.005364073Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-67598f4498-tl9wq,Uid:85794923-79eb-4508-871e-5224c14933a7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d1a3f5ca51388c795a06fb4fe9b832b77ac340913392ec42a9e6a8d405effaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:26.006255 kubelet[2750]: E0910 05:23:26.005681 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d1a3f5ca51388c795a06fb4fe9b832b77ac340913392ec42a9e6a8d405effaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:26.006255 kubelet[2750]: E0910 05:23:26.005748 2750 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"5d1a3f5ca51388c795a06fb4fe9b832b77ac340913392ec42a9e6a8d405effaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-67598f4498-tl9wq" Sep 10 05:23:26.006255 kubelet[2750]: E0910 05:23:26.005769 2750 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d1a3f5ca51388c795a06fb4fe9b832b77ac340913392ec42a9e6a8d405effaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-67598f4498-tl9wq" Sep 10 05:23:26.006349 kubelet[2750]: E0910 05:23:26.005831 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-67598f4498-tl9wq_calico-system(85794923-79eb-4508-871e-5224c14933a7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-67598f4498-tl9wq_calico-system(85794923-79eb-4508-871e-5224c14933a7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5d1a3f5ca51388c795a06fb4fe9b832b77ac340913392ec42a9e6a8d405effaf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-67598f4498-tl9wq" podUID="85794923-79eb-4508-871e-5224c14933a7" Sep 10 05:23:26.010543 containerd[1588]: time="2025-09-10T05:23:26.010496350Z" level=error msg="Failed to destroy network for sandbox \"e856ee0c701295687b1ece196b68b75e49b8d4ed60646f393d6d52970a4ab5be\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:26.011690 containerd[1588]: time="2025-09-10T05:23:26.011650871Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54575dc547-kddv5,Uid:0b5e651b-cbd7-4abc-b1e2-a6418c29d0af,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e856ee0c701295687b1ece196b68b75e49b8d4ed60646f393d6d52970a4ab5be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:26.011865 kubelet[2750]: E0910 05:23:26.011817 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e856ee0c701295687b1ece196b68b75e49b8d4ed60646f393d6d52970a4ab5be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 05:23:26.012011 kubelet[2750]: E0910 05:23:26.011877 2750 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e856ee0c701295687b1ece196b68b75e49b8d4ed60646f393d6d52970a4ab5be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54575dc547-kddv5" Sep 10 05:23:26.012011 kubelet[2750]: E0910 05:23:26.011902 2750 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e856ee0c701295687b1ece196b68b75e49b8d4ed60646f393d6d52970a4ab5be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54575dc547-kddv5" Sep 10 05:23:26.012011 kubelet[2750]: E0910 05:23:26.011960 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-54575dc547-kddv5_calico-apiserver(0b5e651b-cbd7-4abc-b1e2-a6418c29d0af)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-54575dc547-kddv5_calico-apiserver(0b5e651b-cbd7-4abc-b1e2-a6418c29d0af)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e856ee0c701295687b1ece196b68b75e49b8d4ed60646f393d6d52970a4ab5be\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-54575dc547-kddv5" podUID="0b5e651b-cbd7-4abc-b1e2-a6418c29d0af" Sep 10 05:23:33.316638 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1991864234.mount: Deactivated successfully. 
Sep 10 05:23:34.935674 containerd[1588]: time="2025-09-10T05:23:34.935594929Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:34.936385 containerd[1588]: time="2025-09-10T05:23:34.936352908Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 10 05:23:34.938743 containerd[1588]: time="2025-09-10T05:23:34.938699100Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:34.940946 containerd[1588]: time="2025-09-10T05:23:34.940882524Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:34.941495 containerd[1588]: time="2025-09-10T05:23:34.941459448Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 9.386143416s" Sep 10 05:23:34.941549 containerd[1588]: time="2025-09-10T05:23:34.941499939Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 10 05:23:34.964737 containerd[1588]: time="2025-09-10T05:23:34.964677738Z" level=info msg="CreateContainer within sandbox \"9b071a964bd5ad9e807df42595b4fc218740ee1f903544d10672dcd1c70c529c\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 10 05:23:34.975895 containerd[1588]: time="2025-09-10T05:23:34.975828139Z" level=info msg="Container 
ab63bdf76fdcbfab7c60d2d2110dc4443c580e425fc361a2bb4da72b6a894f3c: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:34.986880 containerd[1588]: time="2025-09-10T05:23:34.986818836Z" level=info msg="CreateContainer within sandbox \"9b071a964bd5ad9e807df42595b4fc218740ee1f903544d10672dcd1c70c529c\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ab63bdf76fdcbfab7c60d2d2110dc4443c580e425fc361a2bb4da72b6a894f3c\"" Sep 10 05:23:34.987553 containerd[1588]: time="2025-09-10T05:23:34.987523537Z" level=info msg="StartContainer for \"ab63bdf76fdcbfab7c60d2d2110dc4443c580e425fc361a2bb4da72b6a894f3c\"" Sep 10 05:23:34.989469 containerd[1588]: time="2025-09-10T05:23:34.989426807Z" level=info msg="connecting to shim ab63bdf76fdcbfab7c60d2d2110dc4443c580e425fc361a2bb4da72b6a894f3c" address="unix:///run/containerd/s/3ef62adf8bae4b45346f25eb68a307874b9ae26fe5ac14362a9b813db4f25d6b" protocol=ttrpc version=3 Sep 10 05:23:35.018408 systemd[1]: Started cri-containerd-ab63bdf76fdcbfab7c60d2d2110dc4443c580e425fc361a2bb4da72b6a894f3c.scope - libcontainer container ab63bdf76fdcbfab7c60d2d2110dc4443c580e425fc361a2bb4da72b6a894f3c. Sep 10 05:23:35.099688 containerd[1588]: time="2025-09-10T05:23:35.099639423Z" level=info msg="StartContainer for \"ab63bdf76fdcbfab7c60d2d2110dc4443c580e425fc361a2bb4da72b6a894f3c\" returns successfully" Sep 10 05:23:35.187488 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 10 05:23:35.187613 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 10 05:23:35.312019 kubelet[2750]: I0910 05:23:35.311482 2750 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/072923f0-8dfc-44bd-a039-db997b1a5ec5-whisker-backend-key-pair\") pod \"072923f0-8dfc-44bd-a039-db997b1a5ec5\" (UID: \"072923f0-8dfc-44bd-a039-db997b1a5ec5\") " Sep 10 05:23:35.313089 kubelet[2750]: I0910 05:23:35.312789 2750 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/072923f0-8dfc-44bd-a039-db997b1a5ec5-whisker-ca-bundle\") pod \"072923f0-8dfc-44bd-a039-db997b1a5ec5\" (UID: \"072923f0-8dfc-44bd-a039-db997b1a5ec5\") " Sep 10 05:23:35.313089 kubelet[2750]: I0910 05:23:35.312835 2750 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c2sj\" (UniqueName: \"kubernetes.io/projected/072923f0-8dfc-44bd-a039-db997b1a5ec5-kube-api-access-8c2sj\") pod \"072923f0-8dfc-44bd-a039-db997b1a5ec5\" (UID: \"072923f0-8dfc-44bd-a039-db997b1a5ec5\") " Sep 10 05:23:35.313692 kubelet[2750]: I0910 05:23:35.313549 2750 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/072923f0-8dfc-44bd-a039-db997b1a5ec5-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "072923f0-8dfc-44bd-a039-db997b1a5ec5" (UID: "072923f0-8dfc-44bd-a039-db997b1a5ec5"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 10 05:23:35.319429 kubelet[2750]: I0910 05:23:35.319353 2750 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/072923f0-8dfc-44bd-a039-db997b1a5ec5-kube-api-access-8c2sj" (OuterVolumeSpecName: "kube-api-access-8c2sj") pod "072923f0-8dfc-44bd-a039-db997b1a5ec5" (UID: "072923f0-8dfc-44bd-a039-db997b1a5ec5"). InnerVolumeSpecName "kube-api-access-8c2sj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 10 05:23:35.320626 kubelet[2750]: I0910 05:23:35.320577 2750 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/072923f0-8dfc-44bd-a039-db997b1a5ec5-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "072923f0-8dfc-44bd-a039-db997b1a5ec5" (UID: "072923f0-8dfc-44bd-a039-db997b1a5ec5"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 10 05:23:35.414078 kubelet[2750]: I0910 05:23:35.414016 2750 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/072923f0-8dfc-44bd-a039-db997b1a5ec5-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 10 05:23:35.414078 kubelet[2750]: I0910 05:23:35.414053 2750 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/072923f0-8dfc-44bd-a039-db997b1a5ec5-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 10 05:23:35.414078 kubelet[2750]: I0910 05:23:35.414061 2750 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8c2sj\" (UniqueName: \"kubernetes.io/projected/072923f0-8dfc-44bd-a039-db997b1a5ec5-kube-api-access-8c2sj\") on node \"localhost\" DevicePath \"\"" Sep 10 05:23:35.579804 systemd[1]: Removed slice kubepods-besteffort-pod072923f0_8dfc_44bd_a039_db997b1a5ec5.slice - libcontainer container kubepods-besteffort-pod072923f0_8dfc_44bd_a039_db997b1a5ec5.slice. 
Sep 10 05:23:35.593432 kubelet[2750]: I0910 05:23:35.592640 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-wdstg" podStartSLOduration=1.33040546 podStartE2EDuration="18.592621687s" podCreationTimestamp="2025-09-10 05:23:17 +0000 UTC" firstStartedPulling="2025-09-10 05:23:17.679954415 +0000 UTC m=+21.319395353" lastFinishedPulling="2025-09-10 05:23:34.942170642 +0000 UTC m=+38.581611580" observedRunningTime="2025-09-10 05:23:35.592532998 +0000 UTC m=+39.231973936" watchObservedRunningTime="2025-09-10 05:23:35.592621687 +0000 UTC m=+39.232062625" Sep 10 05:23:35.712160 containerd[1588]: time="2025-09-10T05:23:35.712095333Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ab63bdf76fdcbfab7c60d2d2110dc4443c580e425fc361a2bb4da72b6a894f3c\" id:\"ee23f2da7181ec18ca56b8538379b060b6eb64701360d9cccacb8a5a618ced66\" pid:3976 exit_status:1 exited_at:{seconds:1757481815 nanos:711768304}" Sep 10 05:23:35.737352 systemd[1]: Created slice kubepods-besteffort-pod7a9fe9f4_d553_4a85_976a_9b021803df03.slice - libcontainer container kubepods-besteffort-pod7a9fe9f4_d553_4a85_976a_9b021803df03.slice. 
Sep 10 05:23:35.818102 kubelet[2750]: I0910 05:23:35.818019 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7a9fe9f4-d553-4a85-976a-9b021803df03-whisker-backend-key-pair\") pod \"whisker-555947ccd6-9ggdl\" (UID: \"7a9fe9f4-d553-4a85-976a-9b021803df03\") " pod="calico-system/whisker-555947ccd6-9ggdl" Sep 10 05:23:35.818102 kubelet[2750]: I0910 05:23:35.818083 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a9fe9f4-d553-4a85-976a-9b021803df03-whisker-ca-bundle\") pod \"whisker-555947ccd6-9ggdl\" (UID: \"7a9fe9f4-d553-4a85-976a-9b021803df03\") " pod="calico-system/whisker-555947ccd6-9ggdl" Sep 10 05:23:35.818102 kubelet[2750]: I0910 05:23:35.818105 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqp4p\" (UniqueName: \"kubernetes.io/projected/7a9fe9f4-d553-4a85-976a-9b021803df03-kube-api-access-lqp4p\") pod \"whisker-555947ccd6-9ggdl\" (UID: \"7a9fe9f4-d553-4a85-976a-9b021803df03\") " pod="calico-system/whisker-555947ccd6-9ggdl" Sep 10 05:23:35.950920 systemd[1]: var-lib-kubelet-pods-072923f0\x2d8dfc\x2d44bd\x2da039\x2ddb997b1a5ec5-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d8c2sj.mount: Deactivated successfully. Sep 10 05:23:35.951065 systemd[1]: var-lib-kubelet-pods-072923f0\x2d8dfc\x2d44bd\x2da039\x2ddb997b1a5ec5-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Sep 10 05:23:36.042035 containerd[1588]: time="2025-09-10T05:23:36.041980892Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-555947ccd6-9ggdl,Uid:7a9fe9f4-d553-4a85-976a-9b021803df03,Namespace:calico-system,Attempt:0,}" Sep 10 05:23:36.312592 systemd-networkd[1495]: cali4acbc1a0333: Link UP Sep 10 05:23:36.312887 systemd-networkd[1495]: cali4acbc1a0333: Gained carrier Sep 10 05:23:36.334953 containerd[1588]: 2025-09-10 05:23:36.144 [INFO][3991] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 10 05:23:36.334953 containerd[1588]: 2025-09-10 05:23:36.164 [INFO][3991] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--555947ccd6--9ggdl-eth0 whisker-555947ccd6- calico-system 7a9fe9f4-d553-4a85-976a-9b021803df03 926 0 2025-09-10 05:23:35 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:555947ccd6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-555947ccd6-9ggdl eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali4acbc1a0333 [] [] }} ContainerID="c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" Namespace="calico-system" Pod="whisker-555947ccd6-9ggdl" WorkloadEndpoint="localhost-k8s-whisker--555947ccd6--9ggdl-" Sep 10 05:23:36.334953 containerd[1588]: 2025-09-10 05:23:36.164 [INFO][3991] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" Namespace="calico-system" Pod="whisker-555947ccd6-9ggdl" WorkloadEndpoint="localhost-k8s-whisker--555947ccd6--9ggdl-eth0" Sep 10 05:23:36.334953 containerd[1588]: 2025-09-10 05:23:36.226 [INFO][4006] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" 
HandleID="k8s-pod-network.c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" Workload="localhost-k8s-whisker--555947ccd6--9ggdl-eth0" Sep 10 05:23:36.335200 containerd[1588]: 2025-09-10 05:23:36.227 [INFO][4006] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" HandleID="k8s-pod-network.c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" Workload="localhost-k8s-whisker--555947ccd6--9ggdl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001357f0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-555947ccd6-9ggdl", "timestamp":"2025-09-10 05:23:36.226346408 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 05:23:36.335200 containerd[1588]: 2025-09-10 05:23:36.227 [INFO][4006] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 05:23:36.335200 containerd[1588]: 2025-09-10 05:23:36.227 [INFO][4006] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 10 05:23:36.335200 containerd[1588]: 2025-09-10 05:23:36.227 [INFO][4006] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 10 05:23:36.335200 containerd[1588]: 2025-09-10 05:23:36.235 [INFO][4006] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" host="localhost" Sep 10 05:23:36.335200 containerd[1588]: 2025-09-10 05:23:36.241 [INFO][4006] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 10 05:23:36.335200 containerd[1588]: 2025-09-10 05:23:36.245 [INFO][4006] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 10 05:23:36.335200 containerd[1588]: 2025-09-10 05:23:36.247 [INFO][4006] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:36.335200 containerd[1588]: 2025-09-10 05:23:36.249 [INFO][4006] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:36.335200 containerd[1588]: 2025-09-10 05:23:36.249 [INFO][4006] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" host="localhost" Sep 10 05:23:36.335418 containerd[1588]: 2025-09-10 05:23:36.250 [INFO][4006] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41 Sep 10 05:23:36.335418 containerd[1588]: 2025-09-10 05:23:36.287 [INFO][4006] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" host="localhost" Sep 10 05:23:36.335418 containerd[1588]: 2025-09-10 05:23:36.301 [INFO][4006] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" host="localhost" Sep 10 05:23:36.335418 containerd[1588]: 2025-09-10 05:23:36.301 [INFO][4006] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" host="localhost" Sep 10 05:23:36.335418 containerd[1588]: 2025-09-10 05:23:36.301 [INFO][4006] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 10 05:23:36.335418 containerd[1588]: 2025-09-10 05:23:36.301 [INFO][4006] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" HandleID="k8s-pod-network.c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" Workload="localhost-k8s-whisker--555947ccd6--9ggdl-eth0" Sep 10 05:23:36.335540 containerd[1588]: 2025-09-10 05:23:36.305 [INFO][3991] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" Namespace="calico-system" Pod="whisker-555947ccd6-9ggdl" WorkloadEndpoint="localhost-k8s-whisker--555947ccd6--9ggdl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--555947ccd6--9ggdl-eth0", GenerateName:"whisker-555947ccd6-", Namespace:"calico-system", SelfLink:"", UID:"7a9fe9f4-d553-4a85-976a-9b021803df03", ResourceVersion:"926", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"555947ccd6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-555947ccd6-9ggdl", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali4acbc1a0333", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:36.335540 containerd[1588]: 2025-09-10 05:23:36.305 [INFO][3991] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" Namespace="calico-system" Pod="whisker-555947ccd6-9ggdl" WorkloadEndpoint="localhost-k8s-whisker--555947ccd6--9ggdl-eth0" Sep 10 05:23:36.335685 containerd[1588]: 2025-09-10 05:23:36.305 [INFO][3991] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4acbc1a0333 ContainerID="c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" Namespace="calico-system" Pod="whisker-555947ccd6-9ggdl" WorkloadEndpoint="localhost-k8s-whisker--555947ccd6--9ggdl-eth0" Sep 10 05:23:36.335685 containerd[1588]: 2025-09-10 05:23:36.313 [INFO][3991] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" Namespace="calico-system" Pod="whisker-555947ccd6-9ggdl" WorkloadEndpoint="localhost-k8s-whisker--555947ccd6--9ggdl-eth0" Sep 10 05:23:36.335740 containerd[1588]: 2025-09-10 05:23:36.313 [INFO][3991] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" Namespace="calico-system" Pod="whisker-555947ccd6-9ggdl" 
WorkloadEndpoint="localhost-k8s-whisker--555947ccd6--9ggdl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--555947ccd6--9ggdl-eth0", GenerateName:"whisker-555947ccd6-", Namespace:"calico-system", SelfLink:"", UID:"7a9fe9f4-d553-4a85-976a-9b021803df03", ResourceVersion:"926", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"555947ccd6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41", Pod:"whisker-555947ccd6-9ggdl", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali4acbc1a0333", MAC:"76:67:de:16:85:cf", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:36.335800 containerd[1588]: 2025-09-10 05:23:36.330 [INFO][3991] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" Namespace="calico-system" Pod="whisker-555947ccd6-9ggdl" WorkloadEndpoint="localhost-k8s-whisker--555947ccd6--9ggdl-eth0" Sep 10 05:23:36.469316 containerd[1588]: time="2025-09-10T05:23:36.469259193Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-67598f4498-tl9wq,Uid:85794923-79eb-4508-871e-5224c14933a7,Namespace:calico-system,Attempt:0,}" Sep 10 05:23:36.474705 kubelet[2750]: I0910 05:23:36.474661 2750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="072923f0-8dfc-44bd-a039-db997b1a5ec5" path="/var/lib/kubelet/pods/072923f0-8dfc-44bd-a039-db997b1a5ec5/volumes" Sep 10 05:23:36.719732 containerd[1588]: time="2025-09-10T05:23:36.719573969Z" level=info msg="connecting to shim c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41" address="unix:///run/containerd/s/e4513f20431767dd8c92c6e7811b4965c553bfa8939b3b8c5a80e097b7888674" namespace=k8s.io protocol=ttrpc version=3 Sep 10 05:23:36.757594 systemd[1]: Started cri-containerd-c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41.scope - libcontainer container c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41. Sep 10 05:23:36.770198 containerd[1588]: time="2025-09-10T05:23:36.770090592Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ab63bdf76fdcbfab7c60d2d2110dc4443c580e425fc361a2bb4da72b6a894f3c\" id:\"d1c71d51aa71cfc881c255eccf9eb5a4c1cff1abe74c38d6f70f161fbca4edfc\" pid:4143 exit_status:1 exited_at:{seconds:1757481816 nanos:769347278}" Sep 10 05:23:36.778192 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 10 05:23:36.796642 systemd-networkd[1495]: cali8680b975f3d: Link UP Sep 10 05:23:36.798395 systemd-networkd[1495]: cali8680b975f3d: Gained carrier Sep 10 05:23:36.817441 containerd[1588]: 2025-09-10 05:23:36.684 [INFO][4124] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 10 05:23:36.817441 containerd[1588]: 2025-09-10 05:23:36.700 [INFO][4124] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--67598f4498--tl9wq-eth0 
calico-kube-controllers-67598f4498- calico-system 85794923-79eb-4508-871e-5224c14933a7 857 0 2025-09-10 05:23:17 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:67598f4498 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-67598f4498-tl9wq eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali8680b975f3d [] [] }} ContainerID="3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" Namespace="calico-system" Pod="calico-kube-controllers-67598f4498-tl9wq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--67598f4498--tl9wq-" Sep 10 05:23:36.817441 containerd[1588]: 2025-09-10 05:23:36.700 [INFO][4124] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" Namespace="calico-system" Pod="calico-kube-controllers-67598f4498-tl9wq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--67598f4498--tl9wq-eth0" Sep 10 05:23:36.817441 containerd[1588]: 2025-09-10 05:23:36.740 [INFO][4166] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" HandleID="k8s-pod-network.3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" Workload="localhost-k8s-calico--kube--controllers--67598f4498--tl9wq-eth0" Sep 10 05:23:36.817687 containerd[1588]: 2025-09-10 05:23:36.740 [INFO][4166] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" HandleID="k8s-pod-network.3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" Workload="localhost-k8s-calico--kube--controllers--67598f4498--tl9wq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0xc0002a7370), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-67598f4498-tl9wq", "timestamp":"2025-09-10 05:23:36.740089074 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 05:23:36.817687 containerd[1588]: 2025-09-10 05:23:36.740 [INFO][4166] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 05:23:36.817687 containerd[1588]: 2025-09-10 05:23:36.740 [INFO][4166] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 10 05:23:36.817687 containerd[1588]: 2025-09-10 05:23:36.740 [INFO][4166] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 10 05:23:36.817687 containerd[1588]: 2025-09-10 05:23:36.755 [INFO][4166] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" host="localhost" Sep 10 05:23:36.817687 containerd[1588]: 2025-09-10 05:23:36.761 [INFO][4166] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 10 05:23:36.817687 containerd[1588]: 2025-09-10 05:23:36.766 [INFO][4166] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 10 05:23:36.817687 containerd[1588]: 2025-09-10 05:23:36.768 [INFO][4166] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:36.817687 containerd[1588]: 2025-09-10 05:23:36.772 [INFO][4166] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:36.817687 containerd[1588]: 2025-09-10 05:23:36.772 [INFO][4166] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 
handle="k8s-pod-network.3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" host="localhost" Sep 10 05:23:36.817961 containerd[1588]: 2025-09-10 05:23:36.776 [INFO][4166] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b Sep 10 05:23:36.817961 containerd[1588]: 2025-09-10 05:23:36.782 [INFO][4166] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" host="localhost" Sep 10 05:23:36.817961 containerd[1588]: 2025-09-10 05:23:36.787 [INFO][4166] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" host="localhost" Sep 10 05:23:36.817961 containerd[1588]: 2025-09-10 05:23:36.787 [INFO][4166] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" host="localhost" Sep 10 05:23:36.817961 containerd[1588]: 2025-09-10 05:23:36.787 [INFO][4166] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 10 05:23:36.817961 containerd[1588]: 2025-09-10 05:23:36.787 [INFO][4166] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" HandleID="k8s-pod-network.3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" Workload="localhost-k8s-calico--kube--controllers--67598f4498--tl9wq-eth0" Sep 10 05:23:36.818168 containerd[1588]: 2025-09-10 05:23:36.792 [INFO][4124] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" Namespace="calico-system" Pod="calico-kube-controllers-67598f4498-tl9wq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--67598f4498--tl9wq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--67598f4498--tl9wq-eth0", GenerateName:"calico-kube-controllers-67598f4498-", Namespace:"calico-system", SelfLink:"", UID:"85794923-79eb-4508-871e-5224c14933a7", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"67598f4498", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-67598f4498-tl9wq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8680b975f3d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:36.818232 containerd[1588]: 2025-09-10 05:23:36.792 [INFO][4124] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" Namespace="calico-system" Pod="calico-kube-controllers-67598f4498-tl9wq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--67598f4498--tl9wq-eth0" Sep 10 05:23:36.818232 containerd[1588]: 2025-09-10 05:23:36.792 [INFO][4124] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8680b975f3d ContainerID="3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" Namespace="calico-system" Pod="calico-kube-controllers-67598f4498-tl9wq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--67598f4498--tl9wq-eth0" Sep 10 05:23:36.818232 containerd[1588]: 2025-09-10 05:23:36.799 [INFO][4124] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" Namespace="calico-system" Pod="calico-kube-controllers-67598f4498-tl9wq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--67598f4498--tl9wq-eth0" Sep 10 05:23:36.818307 containerd[1588]: 2025-09-10 05:23:36.799 [INFO][4124] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" Namespace="calico-system" Pod="calico-kube-controllers-67598f4498-tl9wq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--67598f4498--tl9wq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--67598f4498--tl9wq-eth0", GenerateName:"calico-kube-controllers-67598f4498-", Namespace:"calico-system", SelfLink:"", UID:"85794923-79eb-4508-871e-5224c14933a7", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"67598f4498", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b", Pod:"calico-kube-controllers-67598f4498-tl9wq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8680b975f3d", MAC:"be:07:dc:cf:69:61", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:36.818680 containerd[1588]: 2025-09-10 05:23:36.810 [INFO][4124] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" Namespace="calico-system" Pod="calico-kube-controllers-67598f4498-tl9wq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--67598f4498--tl9wq-eth0" Sep 10 05:23:36.823531 containerd[1588]: time="2025-09-10T05:23:36.823484550Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-555947ccd6-9ggdl,Uid:7a9fe9f4-d553-4a85-976a-9b021803df03,Namespace:calico-system,Attempt:0,} returns sandbox id \"c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41\"" Sep 10 05:23:36.824994 containerd[1588]: time="2025-09-10T05:23:36.824969266Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 10 05:23:36.845687 containerd[1588]: time="2025-09-10T05:23:36.845627048Z" level=info msg="connecting to shim 3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b" address="unix:///run/containerd/s/b6fc55422edef77a9256752b4853489d3de2b1eab957c6ef455f604024efa776" namespace=k8s.io protocol=ttrpc version=3 Sep 10 05:23:36.884258 systemd[1]: Started cri-containerd-3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b.scope - libcontainer container 3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b. Sep 10 05:23:36.898251 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 10 05:23:36.980474 containerd[1588]: time="2025-09-10T05:23:36.980329928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-67598f4498-tl9wq,Uid:85794923-79eb-4508-871e-5224c14933a7,Namespace:calico-system,Attempt:0,} returns sandbox id \"3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b\"" Sep 10 05:23:37.468405 containerd[1588]: time="2025-09-10T05:23:37.468339143Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54575dc547-rbmsw,Uid:c8a6cabe-8b7b-46a3-b357-1c595cd08f45,Namespace:calico-apiserver,Attempt:0,}" Sep 10 05:23:37.468405 containerd[1588]: time="2025-09-10T05:23:37.468395416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54575dc547-kddv5,Uid:0b5e651b-cbd7-4abc-b1e2-a6418c29d0af,Namespace:calico-apiserver,Attempt:0,}" Sep 10 05:23:37.468909 containerd[1588]: time="2025-09-10T05:23:37.468412521Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55b79d7cf6-vzlbk,Uid:f8de9b92-ac09-4e62-9005-74b36ff41e7b,Namespace:calico-apiserver,Attempt:0,}" Sep 10 05:23:37.468909 containerd[1588]: time="2025-09-10T05:23:37.468358913Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-jpkqd,Uid:4adba1b2-a4b4-48be-93dc-ee175c1a700e,Namespace:kube-system,Attempt:0,}" Sep 10 05:23:37.644325 systemd-networkd[1495]: cali4acbc1a0333: Gained IPv6LL Sep 10 05:23:38.120420 systemd-networkd[1495]: calie930cb6935d: Link UP Sep 10 05:23:38.121018 systemd-networkd[1495]: calie930cb6935d: Gained carrier Sep 10 05:23:38.223612 containerd[1588]: 2025-09-10 05:23:37.858 [INFO][4295] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 10 05:23:38.223612 containerd[1588]: 2025-09-10 05:23:37.912 [INFO][4295] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0 calico-apiserver-54575dc547- calico-apiserver c8a6cabe-8b7b-46a3-b357-1c595cd08f45 852 0 2025-09-10 05:23:14 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:54575dc547 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-54575dc547-rbmsw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie930cb6935d [] [] }} ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-rbmsw" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--rbmsw-" Sep 10 05:23:38.223612 containerd[1588]: 2025-09-10 05:23:37.913 [INFO][4295] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" 
Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-rbmsw" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0" Sep 10 05:23:38.223612 containerd[1588]: 2025-09-10 05:23:37.940 [INFO][4309] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" HandleID="k8s-pod-network.e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" Workload="localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0" Sep 10 05:23:38.223961 containerd[1588]: 2025-09-10 05:23:37.941 [INFO][4309] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" HandleID="k8s-pod-network.e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" Workload="localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c6050), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-54575dc547-rbmsw", "timestamp":"2025-09-10 05:23:37.940791557 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 05:23:38.223961 containerd[1588]: 2025-09-10 05:23:37.941 [INFO][4309] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 05:23:38.223961 containerd[1588]: 2025-09-10 05:23:37.941 [INFO][4309] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 10 05:23:38.223961 containerd[1588]: 2025-09-10 05:23:37.941 [INFO][4309] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 10 05:23:38.223961 containerd[1588]: 2025-09-10 05:23:37.947 [INFO][4309] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" host="localhost" Sep 10 05:23:38.223961 containerd[1588]: 2025-09-10 05:23:37.951 [INFO][4309] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 10 05:23:38.223961 containerd[1588]: 2025-09-10 05:23:37.955 [INFO][4309] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 10 05:23:38.223961 containerd[1588]: 2025-09-10 05:23:37.957 [INFO][4309] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:38.223961 containerd[1588]: 2025-09-10 05:23:37.959 [INFO][4309] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:38.223961 containerd[1588]: 2025-09-10 05:23:37.959 [INFO][4309] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" host="localhost" Sep 10 05:23:38.224637 containerd[1588]: 2025-09-10 05:23:37.960 [INFO][4309] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30 Sep 10 05:23:38.224637 containerd[1588]: 2025-09-10 05:23:38.041 [INFO][4309] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" host="localhost" Sep 10 05:23:38.224637 containerd[1588]: 2025-09-10 05:23:38.111 [INFO][4309] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" host="localhost" Sep 10 05:23:38.224637 containerd[1588]: 2025-09-10 05:23:38.111 [INFO][4309] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" host="localhost" Sep 10 05:23:38.224637 containerd[1588]: 2025-09-10 05:23:38.111 [INFO][4309] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 10 05:23:38.224637 containerd[1588]: 2025-09-10 05:23:38.111 [INFO][4309] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" HandleID="k8s-pod-network.e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" Workload="localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0" Sep 10 05:23:38.224762 containerd[1588]: 2025-09-10 05:23:38.117 [INFO][4295] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-rbmsw" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0", GenerateName:"calico-apiserver-54575dc547-", Namespace:"calico-apiserver", SelfLink:"", UID:"c8a6cabe-8b7b-46a3-b357-1c595cd08f45", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54575dc547", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-54575dc547-rbmsw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie930cb6935d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:38.224820 containerd[1588]: 2025-09-10 05:23:38.117 [INFO][4295] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-rbmsw" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0" Sep 10 05:23:38.224820 containerd[1588]: 2025-09-10 05:23:38.117 [INFO][4295] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie930cb6935d ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-rbmsw" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0" Sep 10 05:23:38.224820 containerd[1588]: 2025-09-10 05:23:38.122 [INFO][4295] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-rbmsw" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0" Sep 10 05:23:38.224953 containerd[1588]: 2025-09-10 05:23:38.123 [INFO][4295] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-rbmsw" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0", GenerateName:"calico-apiserver-54575dc547-", Namespace:"calico-apiserver", SelfLink:"", UID:"c8a6cabe-8b7b-46a3-b357-1c595cd08f45", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54575dc547", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30", Pod:"calico-apiserver-54575dc547-rbmsw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie930cb6935d", MAC:"5a:2b:0b:ed:c6:2d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:38.225077 containerd[1588]: 2025-09-10 05:23:38.220 [INFO][4295] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-rbmsw" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0" Sep 10 05:23:38.285856 systemd-networkd[1495]: calib27e21b1546: Link UP Sep 10 05:23:38.286794 systemd-networkd[1495]: calib27e21b1546: Gained carrier Sep 10 05:23:38.303684 containerd[1588]: 2025-09-10 05:23:38.021 [INFO][4319] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 10 05:23:38.303684 containerd[1588]: 2025-09-10 05:23:38.066 [INFO][4319] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--55b79d7cf6--vzlbk-eth0 calico-apiserver-55b79d7cf6- calico-apiserver f8de9b92-ac09-4e62-9005-74b36ff41e7b 851 0 2025-09-10 05:23:15 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:55b79d7cf6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-55b79d7cf6-vzlbk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib27e21b1546 [] [] }} ContainerID="cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" Namespace="calico-apiserver" Pod="calico-apiserver-55b79d7cf6-vzlbk" WorkloadEndpoint="localhost-k8s-calico--apiserver--55b79d7cf6--vzlbk-" Sep 10 05:23:38.303684 containerd[1588]: 2025-09-10 05:23:38.066 [INFO][4319] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" Namespace="calico-apiserver" Pod="calico-apiserver-55b79d7cf6-vzlbk" WorkloadEndpoint="localhost-k8s-calico--apiserver--55b79d7cf6--vzlbk-eth0" Sep 10 05:23:38.303684 containerd[1588]: 2025-09-10 05:23:38.140 [INFO][4334] ipam/ipam_plugin.go 225: Calico CNI IPAM 
request count IPv4=1 IPv6=0 ContainerID="cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" HandleID="k8s-pod-network.cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" Workload="localhost-k8s-calico--apiserver--55b79d7cf6--vzlbk-eth0" Sep 10 05:23:38.303986 containerd[1588]: 2025-09-10 05:23:38.140 [INFO][4334] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" HandleID="k8s-pod-network.cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" Workload="localhost-k8s-calico--apiserver--55b79d7cf6--vzlbk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000138550), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-55b79d7cf6-vzlbk", "timestamp":"2025-09-10 05:23:38.140351941 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 05:23:38.303986 containerd[1588]: 2025-09-10 05:23:38.140 [INFO][4334] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 05:23:38.303986 containerd[1588]: 2025-09-10 05:23:38.140 [INFO][4334] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 10 05:23:38.303986 containerd[1588]: 2025-09-10 05:23:38.140 [INFO][4334] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 10 05:23:38.303986 containerd[1588]: 2025-09-10 05:23:38.218 [INFO][4334] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" host="localhost" Sep 10 05:23:38.303986 containerd[1588]: 2025-09-10 05:23:38.229 [INFO][4334] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 10 05:23:38.303986 containerd[1588]: 2025-09-10 05:23:38.237 [INFO][4334] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 10 05:23:38.303986 containerd[1588]: 2025-09-10 05:23:38.239 [INFO][4334] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:38.303986 containerd[1588]: 2025-09-10 05:23:38.241 [INFO][4334] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:38.303986 containerd[1588]: 2025-09-10 05:23:38.241 [INFO][4334] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" host="localhost" Sep 10 05:23:38.304429 containerd[1588]: 2025-09-10 05:23:38.242 [INFO][4334] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68 Sep 10 05:23:38.304429 containerd[1588]: 2025-09-10 05:23:38.267 [INFO][4334] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" host="localhost" Sep 10 05:23:38.304429 containerd[1588]: 2025-09-10 05:23:38.275 [INFO][4334] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" host="localhost" Sep 10 05:23:38.304429 containerd[1588]: 2025-09-10 05:23:38.275 [INFO][4334] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" host="localhost" Sep 10 05:23:38.304429 containerd[1588]: 2025-09-10 05:23:38.275 [INFO][4334] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 10 05:23:38.304429 containerd[1588]: 2025-09-10 05:23:38.275 [INFO][4334] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" HandleID="k8s-pod-network.cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" Workload="localhost-k8s-calico--apiserver--55b79d7cf6--vzlbk-eth0" Sep 10 05:23:38.304603 containerd[1588]: 2025-09-10 05:23:38.281 [INFO][4319] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" Namespace="calico-apiserver" Pod="calico-apiserver-55b79d7cf6-vzlbk" WorkloadEndpoint="localhost-k8s-calico--apiserver--55b79d7cf6--vzlbk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--55b79d7cf6--vzlbk-eth0", GenerateName:"calico-apiserver-55b79d7cf6-", Namespace:"calico-apiserver", SelfLink:"", UID:"f8de9b92-ac09-4e62-9005-74b36ff41e7b", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55b79d7cf6", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-55b79d7cf6-vzlbk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib27e21b1546", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:38.304748 containerd[1588]: 2025-09-10 05:23:38.281 [INFO][4319] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" Namespace="calico-apiserver" Pod="calico-apiserver-55b79d7cf6-vzlbk" WorkloadEndpoint="localhost-k8s-calico--apiserver--55b79d7cf6--vzlbk-eth0" Sep 10 05:23:38.304748 containerd[1588]: 2025-09-10 05:23:38.281 [INFO][4319] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib27e21b1546 ContainerID="cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" Namespace="calico-apiserver" Pod="calico-apiserver-55b79d7cf6-vzlbk" WorkloadEndpoint="localhost-k8s-calico--apiserver--55b79d7cf6--vzlbk-eth0" Sep 10 05:23:38.304748 containerd[1588]: 2025-09-10 05:23:38.287 [INFO][4319] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" Namespace="calico-apiserver" Pod="calico-apiserver-55b79d7cf6-vzlbk" WorkloadEndpoint="localhost-k8s-calico--apiserver--55b79d7cf6--vzlbk-eth0" Sep 10 05:23:38.304827 containerd[1588]: 2025-09-10 05:23:38.287 [INFO][4319] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" Namespace="calico-apiserver" Pod="calico-apiserver-55b79d7cf6-vzlbk" WorkloadEndpoint="localhost-k8s-calico--apiserver--55b79d7cf6--vzlbk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--55b79d7cf6--vzlbk-eth0", GenerateName:"calico-apiserver-55b79d7cf6-", Namespace:"calico-apiserver", SelfLink:"", UID:"f8de9b92-ac09-4e62-9005-74b36ff41e7b", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55b79d7cf6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68", Pod:"calico-apiserver-55b79d7cf6-vzlbk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib27e21b1546", MAC:"3e:09:1b:b3:61:1e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:38.304905 containerd[1588]: 2025-09-10 05:23:38.299 [INFO][4319] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" Namespace="calico-apiserver" Pod="calico-apiserver-55b79d7cf6-vzlbk" WorkloadEndpoint="localhost-k8s-calico--apiserver--55b79d7cf6--vzlbk-eth0" Sep 10 05:23:38.330355 containerd[1588]: time="2025-09-10T05:23:38.329892431Z" level=info msg="connecting to shim e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" address="unix:///run/containerd/s/ca1bccbfde4ff45d6ef3d0b8a858d44141e5444f3d0ef71bfd9d3b5ec919ef4e" namespace=k8s.io protocol=ttrpc version=3 Sep 10 05:23:38.338403 containerd[1588]: time="2025-09-10T05:23:38.338340247Z" level=info msg="connecting to shim cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68" address="unix:///run/containerd/s/d32d425538de70f1c433a9eec24a12ae25b053d7ef4f01087b29f905dc9bbd78" namespace=k8s.io protocol=ttrpc version=3 Sep 10 05:23:38.376483 systemd[1]: Started cri-containerd-e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30.scope - libcontainer container e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30. Sep 10 05:23:38.380503 systemd[1]: Started cri-containerd-cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68.scope - libcontainer container cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68. 
Sep 10 05:23:38.392828 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 10 05:23:38.398573 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 10 05:23:38.468439 containerd[1588]: time="2025-09-10T05:23:38.468384008Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-bxpzt,Uid:c92159af-8c46-4dd3-b395-df88a54bacc1,Namespace:kube-system,Attempt:0,}" Sep 10 05:23:38.570798 containerd[1588]: time="2025-09-10T05:23:38.569631249Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55b79d7cf6-vzlbk,Uid:f8de9b92-ac09-4e62-9005-74b36ff41e7b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68\"" Sep 10 05:23:38.570798 containerd[1588]: time="2025-09-10T05:23:38.569654466Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54575dc547-rbmsw,Uid:c8a6cabe-8b7b-46a3-b357-1c595cd08f45,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30\"" Sep 10 05:23:38.570389 systemd-networkd[1495]: cali93c6026d85e: Link UP Sep 10 05:23:38.572640 systemd-networkd[1495]: cali93c6026d85e: Gained carrier Sep 10 05:23:38.595238 containerd[1588]: 2025-09-10 05:23:38.184 [INFO][4359] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 10 05:23:38.595238 containerd[1588]: 2025-09-10 05:23:38.224 [INFO][4359] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--jpkqd-eth0 coredns-674b8bbfcf- kube-system 4adba1b2-a4b4-48be-93dc-ee175c1a700e 848 0 2025-09-10 05:23:01 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-jpkqd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali93c6026d85e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" Namespace="kube-system" Pod="coredns-674b8bbfcf-jpkqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--jpkqd-" Sep 10 05:23:38.595238 containerd[1588]: 2025-09-10 05:23:38.225 [INFO][4359] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" Namespace="kube-system" Pod="coredns-674b8bbfcf-jpkqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--jpkqd-eth0" Sep 10 05:23:38.595238 containerd[1588]: 2025-09-10 05:23:38.262 [INFO][4381] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" HandleID="k8s-pod-network.b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" Workload="localhost-k8s-coredns--674b8bbfcf--jpkqd-eth0" Sep 10 05:23:38.595579 containerd[1588]: 2025-09-10 05:23:38.262 [INFO][4381] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" HandleID="k8s-pod-network.b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" Workload="localhost-k8s-coredns--674b8bbfcf--jpkqd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f930), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-jpkqd", "timestamp":"2025-09-10 05:23:38.262365222 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 05:23:38.595579 containerd[1588]: 
2025-09-10 05:23:38.262 [INFO][4381] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 05:23:38.595579 containerd[1588]: 2025-09-10 05:23:38.275 [INFO][4381] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 10 05:23:38.595579 containerd[1588]: 2025-09-10 05:23:38.275 [INFO][4381] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 10 05:23:38.595579 containerd[1588]: 2025-09-10 05:23:38.321 [INFO][4381] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" host="localhost" Sep 10 05:23:38.595579 containerd[1588]: 2025-09-10 05:23:38.328 [INFO][4381] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 10 05:23:38.595579 containerd[1588]: 2025-09-10 05:23:38.339 [INFO][4381] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 10 05:23:38.595579 containerd[1588]: 2025-09-10 05:23:38.350 [INFO][4381] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:38.595579 containerd[1588]: 2025-09-10 05:23:38.354 [INFO][4381] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:38.595579 containerd[1588]: 2025-09-10 05:23:38.354 [INFO][4381] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" host="localhost" Sep 10 05:23:38.595898 containerd[1588]: 2025-09-10 05:23:38.356 [INFO][4381] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877 Sep 10 05:23:38.595898 containerd[1588]: 2025-09-10 05:23:38.375 [INFO][4381] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" 
host="localhost" Sep 10 05:23:38.595898 containerd[1588]: 2025-09-10 05:23:38.559 [INFO][4381] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" host="localhost" Sep 10 05:23:38.595898 containerd[1588]: 2025-09-10 05:23:38.559 [INFO][4381] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" host="localhost" Sep 10 05:23:38.595898 containerd[1588]: 2025-09-10 05:23:38.559 [INFO][4381] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 10 05:23:38.595898 containerd[1588]: 2025-09-10 05:23:38.559 [INFO][4381] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" HandleID="k8s-pod-network.b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" Workload="localhost-k8s-coredns--674b8bbfcf--jpkqd-eth0" Sep 10 05:23:38.596077 containerd[1588]: 2025-09-10 05:23:38.563 [INFO][4359] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" Namespace="kube-system" Pod="coredns-674b8bbfcf-jpkqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--jpkqd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--jpkqd-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4adba1b2-a4b4-48be-93dc-ee175c1a700e", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-jpkqd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali93c6026d85e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:38.596261 containerd[1588]: 2025-09-10 05:23:38.563 [INFO][4359] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" Namespace="kube-system" Pod="coredns-674b8bbfcf-jpkqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--jpkqd-eth0" Sep 10 05:23:38.596261 containerd[1588]: 2025-09-10 05:23:38.563 [INFO][4359] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali93c6026d85e ContainerID="b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" Namespace="kube-system" Pod="coredns-674b8bbfcf-jpkqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--jpkqd-eth0" Sep 10 05:23:38.596261 containerd[1588]: 2025-09-10 05:23:38.572 [INFO][4359] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" Namespace="kube-system" Pod="coredns-674b8bbfcf-jpkqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--jpkqd-eth0" Sep 10 05:23:38.596369 containerd[1588]: 2025-09-10 05:23:38.574 [INFO][4359] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" Namespace="kube-system" Pod="coredns-674b8bbfcf-jpkqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--jpkqd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--jpkqd-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4adba1b2-a4b4-48be-93dc-ee175c1a700e", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877", Pod:"coredns-674b8bbfcf-jpkqd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali93c6026d85e", MAC:"8e:18:e4:60:39:f0", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:38.596369 containerd[1588]: 2025-09-10 05:23:38.588 [INFO][4359] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" Namespace="kube-system" Pod="coredns-674b8bbfcf-jpkqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--jpkqd-eth0" Sep 10 05:23:38.604360 systemd-networkd[1495]: cali8680b975f3d: Gained IPv6LL Sep 10 05:23:38.625200 systemd-networkd[1495]: calia4f5eeb5f70: Link UP Sep 10 05:23:38.627347 systemd-networkd[1495]: calia4f5eeb5f70: Gained carrier Sep 10 05:23:38.635320 containerd[1588]: time="2025-09-10T05:23:38.635270372Z" level=info msg="connecting to shim b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877" address="unix:///run/containerd/s/637645bf1a47c2f5bbe69992b96741e3994279b1663f6798cfb69bbd8d966e4a" namespace=k8s.io protocol=ttrpc version=3 Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.149 [INFO][4342] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.222 [INFO][4342] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0 calico-apiserver-54575dc547- calico-apiserver 0b5e651b-cbd7-4abc-b1e2-a6418c29d0af 855 0 2025-09-10 05:23:14 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver 
k8s-app:calico-apiserver pod-template-hash:54575dc547 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-54575dc547-kddv5 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia4f5eeb5f70 [] [] }} ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-kddv5" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--kddv5-" Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.222 [INFO][4342] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-kddv5" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0" Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.282 [INFO][4378] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" HandleID="k8s-pod-network.60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Workload="localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0" Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.283 [INFO][4378] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" HandleID="k8s-pod-network.60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Workload="localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000429e10), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-54575dc547-kddv5", "timestamp":"2025-09-10 05:23:38.282868057 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.283 [INFO][4378] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.559 [INFO][4378] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.560 [INFO][4378] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.570 [INFO][4378] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" host="localhost" Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.580 [INFO][4378] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.591 [INFO][4378] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.594 [INFO][4378] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.599 [INFO][4378] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.599 [INFO][4378] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" host="localhost" Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.601 [INFO][4378] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6 Sep 10 05:23:38.655531 
containerd[1588]: 2025-09-10 05:23:38.607 [INFO][4378] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" host="localhost" Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.616 [INFO][4378] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" host="localhost" Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.616 [INFO][4378] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" host="localhost" Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.616 [INFO][4378] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 10 05:23:38.655531 containerd[1588]: 2025-09-10 05:23:38.616 [INFO][4378] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" HandleID="k8s-pod-network.60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Workload="localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0" Sep 10 05:23:38.656209 containerd[1588]: 2025-09-10 05:23:38.620 [INFO][4342] cni-plugin/k8s.go 418: Populated endpoint ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-kddv5" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0", GenerateName:"calico-apiserver-54575dc547-", Namespace:"calico-apiserver", SelfLink:"", UID:"0b5e651b-cbd7-4abc-b1e2-a6418c29d0af", 
ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54575dc547", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-54575dc547-kddv5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia4f5eeb5f70", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:38.656209 containerd[1588]: 2025-09-10 05:23:38.620 [INFO][4342] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-kddv5" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0" Sep 10 05:23:38.656209 containerd[1588]: 2025-09-10 05:23:38.620 [INFO][4342] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia4f5eeb5f70 ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-kddv5" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0" Sep 10 05:23:38.656209 containerd[1588]: 2025-09-10 05:23:38.628 [INFO][4342] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-kddv5" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0" Sep 10 05:23:38.656209 containerd[1588]: 2025-09-10 05:23:38.629 [INFO][4342] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-kddv5" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0", GenerateName:"calico-apiserver-54575dc547-", Namespace:"calico-apiserver", SelfLink:"", UID:"0b5e651b-cbd7-4abc-b1e2-a6418c29d0af", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54575dc547", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6", Pod:"calico-apiserver-54575dc547-kddv5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia4f5eeb5f70", MAC:"22:e6:d6:cd:5c:7a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:38.656209 containerd[1588]: 2025-09-10 05:23:38.647 [INFO][4342] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Namespace="calico-apiserver" Pod="calico-apiserver-54575dc547-kddv5" WorkloadEndpoint="localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0" Sep 10 05:23:38.679590 systemd[1]: Started cri-containerd-b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877.scope - libcontainer container b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877. Sep 10 05:23:38.682748 containerd[1588]: time="2025-09-10T05:23:38.682693676Z" level=info msg="connecting to shim 60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" address="unix:///run/containerd/s/c41f4ae66989235fe657f0b8a142be7a9d68c709f98a44e324c89f8c340ab8a8" namespace=k8s.io protocol=ttrpc version=3 Sep 10 05:23:38.704443 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 10 05:23:38.717932 systemd[1]: Started cri-containerd-60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6.scope - libcontainer container 60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6. 
Sep 10 05:23:38.731467 systemd-networkd[1495]: calida39a66bdc9: Link UP Sep 10 05:23:38.733302 systemd-networkd[1495]: calida39a66bdc9: Gained carrier Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.614 [INFO][4496] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.628 [INFO][4496] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--bxpzt-eth0 coredns-674b8bbfcf- kube-system c92159af-8c46-4dd3-b395-df88a54bacc1 844 0 2025-09-10 05:23:01 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-bxpzt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calida39a66bdc9 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" Namespace="kube-system" Pod="coredns-674b8bbfcf-bxpzt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--bxpzt-" Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.628 [INFO][4496] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" Namespace="kube-system" Pod="coredns-674b8bbfcf-bxpzt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--bxpzt-eth0" Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.675 [INFO][4543] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" HandleID="k8s-pod-network.80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" Workload="localhost-k8s-coredns--674b8bbfcf--bxpzt-eth0" Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.675 [INFO][4543] ipam/ipam_plugin.go 
265: Auto assigning IP ContainerID="80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" HandleID="k8s-pod-network.80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" Workload="localhost-k8s-coredns--674b8bbfcf--bxpzt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000135520), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-bxpzt", "timestamp":"2025-09-10 05:23:38.675636428 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.676 [INFO][4543] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.676 [INFO][4543] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.676 [INFO][4543] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.684 [INFO][4543] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" host="localhost" Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.692 [INFO][4543] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.697 [INFO][4543] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.701 [INFO][4543] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.703 [INFO][4543] ipam/ipam.go 235: Affinity is confirmed and block has been loaded 
cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.703 [INFO][4543] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" host="localhost" Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.705 [INFO][4543] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.709 [INFO][4543] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" host="localhost" Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.717 [INFO][4543] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" host="localhost" Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.717 [INFO][4543] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" host="localhost" Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.717 [INFO][4543] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 10 05:23:38.754075 containerd[1588]: 2025-09-10 05:23:38.717 [INFO][4543] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" HandleID="k8s-pod-network.80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" Workload="localhost-k8s-coredns--674b8bbfcf--bxpzt-eth0" Sep 10 05:23:38.754802 containerd[1588]: 2025-09-10 05:23:38.725 [INFO][4496] cni-plugin/k8s.go 418: Populated endpoint ContainerID="80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" Namespace="kube-system" Pod="coredns-674b8bbfcf-bxpzt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--bxpzt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--bxpzt-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"c92159af-8c46-4dd3-b395-df88a54bacc1", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-bxpzt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calida39a66bdc9", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:38.754802 containerd[1588]: 2025-09-10 05:23:38.727 [INFO][4496] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" Namespace="kube-system" Pod="coredns-674b8bbfcf-bxpzt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--bxpzt-eth0" Sep 10 05:23:38.754802 containerd[1588]: 2025-09-10 05:23:38.727 [INFO][4496] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calida39a66bdc9 ContainerID="80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" Namespace="kube-system" Pod="coredns-674b8bbfcf-bxpzt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--bxpzt-eth0" Sep 10 05:23:38.754802 containerd[1588]: 2025-09-10 05:23:38.732 [INFO][4496] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" Namespace="kube-system" Pod="coredns-674b8bbfcf-bxpzt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--bxpzt-eth0" Sep 10 05:23:38.754802 containerd[1588]: 2025-09-10 05:23:38.733 [INFO][4496] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" Namespace="kube-system" Pod="coredns-674b8bbfcf-bxpzt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--bxpzt-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--bxpzt-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"c92159af-8c46-4dd3-b395-df88a54bacc1", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e", Pod:"coredns-674b8bbfcf-bxpzt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calida39a66bdc9", MAC:"f6:89:46:9c:93:a9", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:38.754802 containerd[1588]: 2025-09-10 05:23:38.749 [INFO][4496] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" Namespace="kube-system" Pod="coredns-674b8bbfcf-bxpzt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--bxpzt-eth0" Sep 10 05:23:38.761611 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 10 05:23:38.763779 containerd[1588]: time="2025-09-10T05:23:38.763731711Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-jpkqd,Uid:4adba1b2-a4b4-48be-93dc-ee175c1a700e,Namespace:kube-system,Attempt:0,} returns sandbox id \"b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877\"" Sep 10 05:23:38.772469 containerd[1588]: time="2025-09-10T05:23:38.772397555Z" level=info msg="CreateContainer within sandbox \"b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 10 05:23:38.790359 containerd[1588]: time="2025-09-10T05:23:38.790305415Z" level=info msg="Container 7c99bce495f0d70221219967c6389f123a953328dafd14b5a65ed06a0403d817: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:38.796375 containerd[1588]: time="2025-09-10T05:23:38.796164382Z" level=info msg="connecting to shim 80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e" address="unix:///run/containerd/s/49a8ff36029b146afe7891145d371087a345f2b30ccd7458cd5119c616feb7f3" namespace=k8s.io protocol=ttrpc version=3 Sep 10 05:23:38.816097 containerd[1588]: time="2025-09-10T05:23:38.816051401Z" level=info msg="CreateContainer within sandbox \"b9787ebbef9c73a927455b4211d48da83cfdb111456b1f53b34f60b509781877\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7c99bce495f0d70221219967c6389f123a953328dafd14b5a65ed06a0403d817\"" Sep 10 05:23:38.819788 containerd[1588]: time="2025-09-10T05:23:38.819339433Z" level=info msg="StartContainer for \"7c99bce495f0d70221219967c6389f123a953328dafd14b5a65ed06a0403d817\"" Sep 10 
05:23:38.824155 containerd[1588]: time="2025-09-10T05:23:38.823647139Z" level=info msg="connecting to shim 7c99bce495f0d70221219967c6389f123a953328dafd14b5a65ed06a0403d817" address="unix:///run/containerd/s/637645bf1a47c2f5bbe69992b96741e3994279b1663f6798cfb69bbd8d966e4a" protocol=ttrpc version=3 Sep 10 05:23:38.840668 containerd[1588]: time="2025-09-10T05:23:38.840631099Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54575dc547-kddv5,Uid:0b5e651b-cbd7-4abc-b1e2-a6418c29d0af,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6\"" Sep 10 05:23:38.852726 systemd[1]: Started cri-containerd-7c99bce495f0d70221219967c6389f123a953328dafd14b5a65ed06a0403d817.scope - libcontainer container 7c99bce495f0d70221219967c6389f123a953328dafd14b5a65ed06a0403d817. Sep 10 05:23:38.870290 systemd[1]: Started cri-containerd-80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e.scope - libcontainer container 80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e. 
Sep 10 05:23:38.898454 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 10 05:23:38.915899 containerd[1588]: time="2025-09-10T05:23:38.915695137Z" level=info msg="StartContainer for \"7c99bce495f0d70221219967c6389f123a953328dafd14b5a65ed06a0403d817\" returns successfully" Sep 10 05:23:38.947375 containerd[1588]: time="2025-09-10T05:23:38.947324737Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-bxpzt,Uid:c92159af-8c46-4dd3-b395-df88a54bacc1,Namespace:kube-system,Attempt:0,} returns sandbox id \"80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e\"" Sep 10 05:23:38.955156 containerd[1588]: time="2025-09-10T05:23:38.954680134Z" level=info msg="CreateContainer within sandbox \"80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 10 05:23:38.966849 containerd[1588]: time="2025-09-10T05:23:38.966804671Z" level=info msg="Container 6d186e79cc1debbfc440d0a1b307783b4822341e0c5ed2e0494f8b31886c231e: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:38.974323 containerd[1588]: time="2025-09-10T05:23:38.974293955Z" level=info msg="CreateContainer within sandbox \"80c7429d40d97958fe44098091f851eb876e647c0df11291703633763fc6234e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"6d186e79cc1debbfc440d0a1b307783b4822341e0c5ed2e0494f8b31886c231e\"" Sep 10 05:23:38.976003 containerd[1588]: time="2025-09-10T05:23:38.974920973Z" level=info msg="StartContainer for \"6d186e79cc1debbfc440d0a1b307783b4822341e0c5ed2e0494f8b31886c231e\"" Sep 10 05:23:38.979466 containerd[1588]: time="2025-09-10T05:23:38.979443128Z" level=info msg="connecting to shim 6d186e79cc1debbfc440d0a1b307783b4822341e0c5ed2e0494f8b31886c231e" address="unix:///run/containerd/s/49a8ff36029b146afe7891145d371087a345f2b30ccd7458cd5119c616feb7f3" protocol=ttrpc version=3 Sep 10 05:23:39.006326 
systemd[1]: Started cri-containerd-6d186e79cc1debbfc440d0a1b307783b4822341e0c5ed2e0494f8b31886c231e.scope - libcontainer container 6d186e79cc1debbfc440d0a1b307783b4822341e0c5ed2e0494f8b31886c231e. Sep 10 05:23:39.037929 containerd[1588]: time="2025-09-10T05:23:39.037875770Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:39.038566 containerd[1588]: time="2025-09-10T05:23:39.038535260Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 10 05:23:39.039902 containerd[1588]: time="2025-09-10T05:23:39.039853059Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:39.044410 containerd[1588]: time="2025-09-10T05:23:39.044341836Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:39.046384 containerd[1588]: time="2025-09-10T05:23:39.046355758Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 2.221358726s" Sep 10 05:23:39.046450 containerd[1588]: time="2025-09-10T05:23:39.046386479Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 10 05:23:39.048497 containerd[1588]: time="2025-09-10T05:23:39.048472887Z" level=info msg="StartContainer for 
\"6d186e79cc1debbfc440d0a1b307783b4822341e0c5ed2e0494f8b31886c231e\" returns successfully" Sep 10 05:23:39.049458 containerd[1588]: time="2025-09-10T05:23:39.049272939Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 10 05:23:39.055042 containerd[1588]: time="2025-09-10T05:23:39.054998843Z" level=info msg="CreateContainer within sandbox \"c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 10 05:23:39.069501 containerd[1588]: time="2025-09-10T05:23:39.068811378Z" level=info msg="Container 2a1797a7c0fe0128172e72bc98f0376476810a68f1e4386dc5f8930cddf86e32: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:39.077167 containerd[1588]: time="2025-09-10T05:23:39.077099133Z" level=info msg="CreateContainer within sandbox \"c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"2a1797a7c0fe0128172e72bc98f0376476810a68f1e4386dc5f8930cddf86e32\"" Sep 10 05:23:39.077945 containerd[1588]: time="2025-09-10T05:23:39.077925157Z" level=info msg="StartContainer for \"2a1797a7c0fe0128172e72bc98f0376476810a68f1e4386dc5f8930cddf86e32\"" Sep 10 05:23:39.079364 containerd[1588]: time="2025-09-10T05:23:39.079319649Z" level=info msg="connecting to shim 2a1797a7c0fe0128172e72bc98f0376476810a68f1e4386dc5f8930cddf86e32" address="unix:///run/containerd/s/e4513f20431767dd8c92c6e7811b4965c553bfa8939b3b8c5a80e097b7888674" protocol=ttrpc version=3 Sep 10 05:23:39.105267 systemd[1]: Started cri-containerd-2a1797a7c0fe0128172e72bc98f0376476810a68f1e4386dc5f8930cddf86e32.scope - libcontainer container 2a1797a7c0fe0128172e72bc98f0376476810a68f1e4386dc5f8930cddf86e32. 
Sep 10 05:23:39.159810 containerd[1588]: time="2025-09-10T05:23:39.159688534Z" level=info msg="StartContainer for \"2a1797a7c0fe0128172e72bc98f0376476810a68f1e4386dc5f8930cddf86e32\" returns successfully" Sep 10 05:23:39.468862 containerd[1588]: time="2025-09-10T05:23:39.468700502Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5hfbv,Uid:18af5fbc-bc21-4384-998a-43c1dd346c16,Namespace:calico-system,Attempt:0,}" Sep 10 05:23:39.576778 systemd-networkd[1495]: cali50a9ac2e0dd: Link UP Sep 10 05:23:39.577022 systemd-networkd[1495]: cali50a9ac2e0dd: Gained carrier Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.496 [INFO][4812] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.509 [INFO][4812] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--5hfbv-eth0 csi-node-driver- calico-system 18af5fbc-bc21-4384-998a-43c1dd346c16 753 0 2025-09-10 05:23:17 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-5hfbv eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali50a9ac2e0dd [] [] }} ContainerID="e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" Namespace="calico-system" Pod="csi-node-driver-5hfbv" WorkloadEndpoint="localhost-k8s-csi--node--driver--5hfbv-" Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.509 [INFO][4812] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" Namespace="calico-system" Pod="csi-node-driver-5hfbv" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--5hfbv-eth0" Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.541 [INFO][4828] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" HandleID="k8s-pod-network.e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" Workload="localhost-k8s-csi--node--driver--5hfbv-eth0" Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.541 [INFO][4828] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" HandleID="k8s-pod-network.e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" Workload="localhost-k8s-csi--node--driver--5hfbv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c6fe0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-5hfbv", "timestamp":"2025-09-10 05:23:39.541181144 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.541 [INFO][4828] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.541 [INFO][4828] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.541 [INFO][4828] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.548 [INFO][4828] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" host="localhost" Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.553 [INFO][4828] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.556 [INFO][4828] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.558 [INFO][4828] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.560 [INFO][4828] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.560 [INFO][4828] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" host="localhost" Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.561 [INFO][4828] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.564 [INFO][4828] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" host="localhost" Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.570 [INFO][4828] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" host="localhost" Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.570 [INFO][4828] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" host="localhost" Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.570 [INFO][4828] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 10 05:23:39.592208 containerd[1588]: 2025-09-10 05:23:39.570 [INFO][4828] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" HandleID="k8s-pod-network.e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" Workload="localhost-k8s-csi--node--driver--5hfbv-eth0" Sep 10 05:23:39.592784 containerd[1588]: 2025-09-10 05:23:39.574 [INFO][4812] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" Namespace="calico-system" Pod="csi-node-driver-5hfbv" WorkloadEndpoint="localhost-k8s-csi--node--driver--5hfbv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--5hfbv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"18af5fbc-bc21-4384-998a-43c1dd346c16", ResourceVersion:"753", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-5hfbv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali50a9ac2e0dd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:39.592784 containerd[1588]: 2025-09-10 05:23:39.574 [INFO][4812] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" Namespace="calico-system" Pod="csi-node-driver-5hfbv" WorkloadEndpoint="localhost-k8s-csi--node--driver--5hfbv-eth0" Sep 10 05:23:39.592784 containerd[1588]: 2025-09-10 05:23:39.574 [INFO][4812] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali50a9ac2e0dd ContainerID="e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" Namespace="calico-system" Pod="csi-node-driver-5hfbv" WorkloadEndpoint="localhost-k8s-csi--node--driver--5hfbv-eth0" Sep 10 05:23:39.592784 containerd[1588]: 2025-09-10 05:23:39.577 [INFO][4812] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" Namespace="calico-system" Pod="csi-node-driver-5hfbv" WorkloadEndpoint="localhost-k8s-csi--node--driver--5hfbv-eth0" Sep 10 05:23:39.592784 containerd[1588]: 2025-09-10 05:23:39.577 [INFO][4812] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" 
Namespace="calico-system" Pod="csi-node-driver-5hfbv" WorkloadEndpoint="localhost-k8s-csi--node--driver--5hfbv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--5hfbv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"18af5fbc-bc21-4384-998a-43c1dd346c16", ResourceVersion:"753", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da", Pod:"csi-node-driver-5hfbv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali50a9ac2e0dd", MAC:"ea:9d:e6:f2:e8:0f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:39.592784 containerd[1588]: 2025-09-10 05:23:39.588 [INFO][4812] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" Namespace="calico-system" Pod="csi-node-driver-5hfbv" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--5hfbv-eth0" Sep 10 05:23:39.619314 kubelet[2750]: I0910 05:23:39.619229 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-bxpzt" podStartSLOduration=38.616109046 podStartE2EDuration="38.616109046s" podCreationTimestamp="2025-09-10 05:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-10 05:23:39.61554148 +0000 UTC m=+43.254982418" watchObservedRunningTime="2025-09-10 05:23:39.616109046 +0000 UTC m=+43.255549994" Sep 10 05:23:39.622242 containerd[1588]: time="2025-09-10T05:23:39.622110171Z" level=info msg="connecting to shim e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da" address="unix:///run/containerd/s/b1ea0750ff8238f09d5c50664b8270bf220c8e1d42342373f6b4097956fcfad4" namespace=k8s.io protocol=ttrpc version=3 Sep 10 05:23:39.671329 systemd[1]: Started cri-containerd-e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da.scope - libcontainer container e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da. 
Sep 10 05:23:39.688208 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 10 05:23:39.757690 containerd[1588]: time="2025-09-10T05:23:39.757581173Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5hfbv,Uid:18af5fbc-bc21-4384-998a-43c1dd346c16,Namespace:calico-system,Attempt:0,} returns sandbox id \"e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da\"" Sep 10 05:23:39.821406 systemd-networkd[1495]: calie930cb6935d: Gained IPv6LL Sep 10 05:23:39.869706 kubelet[2750]: I0910 05:23:39.869640 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-jpkqd" podStartSLOduration=38.869624057 podStartE2EDuration="38.869624057s" podCreationTimestamp="2025-09-10 05:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-10 05:23:39.869314025 +0000 UTC m=+43.508754963" watchObservedRunningTime="2025-09-10 05:23:39.869624057 +0000 UTC m=+43.509064995" Sep 10 05:23:39.884294 systemd-networkd[1495]: calida39a66bdc9: Gained IPv6LL Sep 10 05:23:40.012277 systemd-networkd[1495]: calia4f5eeb5f70: Gained IPv6LL Sep 10 05:23:40.204298 systemd-networkd[1495]: calib27e21b1546: Gained IPv6LL Sep 10 05:23:40.396316 systemd-networkd[1495]: cali93c6026d85e: Gained IPv6LL Sep 10 05:23:40.468949 containerd[1588]: time="2025-09-10T05:23:40.468904998Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-kdrts,Uid:436090e8-7271-45d4-a202-a938b0eefe53,Namespace:calico-system,Attempt:0,}" Sep 10 05:23:40.549562 systemd[1]: Started sshd@7-10.0.0.54:22-10.0.0.1:37812.service - OpenSSH per-connection server daemon (10.0.0.1:37812). 
Sep 10 05:23:40.601577 systemd-networkd[1495]: calia7251b657a6: Link UP Sep 10 05:23:40.601832 systemd-networkd[1495]: calia7251b657a6: Gained carrier Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.490 [INFO][4915] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.507 [INFO][4915] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--kdrts-eth0 goldmane-54d579b49d- calico-system 436090e8-7271-45d4-a202-a938b0eefe53 854 0 2025-09-10 05:23:16 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-kdrts eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calia7251b657a6 [] [] }} ContainerID="591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" Namespace="calico-system" Pod="goldmane-54d579b49d-kdrts" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kdrts-" Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.507 [INFO][4915] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" Namespace="calico-system" Pod="goldmane-54d579b49d-kdrts" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kdrts-eth0" Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.537 [INFO][4933] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" HandleID="k8s-pod-network.591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" Workload="localhost-k8s-goldmane--54d579b49d--kdrts-eth0" Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.537 [INFO][4933] ipam/ipam_plugin.go 265: 
Auto assigning IP ContainerID="591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" HandleID="k8s-pod-network.591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" Workload="localhost-k8s-goldmane--54d579b49d--kdrts-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003ae000), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-kdrts", "timestamp":"2025-09-10 05:23:40.537585465 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.537 [INFO][4933] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.538 [INFO][4933] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.538 [INFO][4933] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.552 [INFO][4933] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" host="localhost" Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.558 [INFO][4933] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.564 [INFO][4933] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.576 [INFO][4933] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.583 [INFO][4933] ipam/ipam.go 235: Affinity is confirmed and block has been loaded 
cidr=192.168.88.128/26 host="localhost" Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.583 [INFO][4933] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" host="localhost" Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.585 [INFO][4933] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28 Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.588 [INFO][4933] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" host="localhost" Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.594 [INFO][4933] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.137/26] block=192.168.88.128/26 handle="k8s-pod-network.591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" host="localhost" Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.594 [INFO][4933] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.137/26] handle="k8s-pod-network.591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" host="localhost" Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.594 [INFO][4933] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 10 05:23:40.617648 containerd[1588]: 2025-09-10 05:23:40.594 [INFO][4933] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.137/26] IPv6=[] ContainerID="591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" HandleID="k8s-pod-network.591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" Workload="localhost-k8s-goldmane--54d579b49d--kdrts-eth0" Sep 10 05:23:40.618260 containerd[1588]: 2025-09-10 05:23:40.598 [INFO][4915] cni-plugin/k8s.go 418: Populated endpoint ContainerID="591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" Namespace="calico-system" Pod="goldmane-54d579b49d-kdrts" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kdrts-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--kdrts-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"436090e8-7271-45d4-a202-a938b0eefe53", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-kdrts", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia7251b657a6", 
MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:40.618260 containerd[1588]: 2025-09-10 05:23:40.598 [INFO][4915] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.137/32] ContainerID="591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" Namespace="calico-system" Pod="goldmane-54d579b49d-kdrts" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kdrts-eth0" Sep 10 05:23:40.618260 containerd[1588]: 2025-09-10 05:23:40.598 [INFO][4915] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia7251b657a6 ContainerID="591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" Namespace="calico-system" Pod="goldmane-54d579b49d-kdrts" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kdrts-eth0" Sep 10 05:23:40.618260 containerd[1588]: 2025-09-10 05:23:40.599 [INFO][4915] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" Namespace="calico-system" Pod="goldmane-54d579b49d-kdrts" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kdrts-eth0" Sep 10 05:23:40.618260 containerd[1588]: 2025-09-10 05:23:40.600 [INFO][4915] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" Namespace="calico-system" Pod="goldmane-54d579b49d-kdrts" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kdrts-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--kdrts-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"436090e8-7271-45d4-a202-a938b0eefe53", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 5, 23, 16, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28", Pod:"goldmane-54d579b49d-kdrts", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia7251b657a6", MAC:"ae:79:f4:a7:4f:c2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 05:23:40.618260 containerd[1588]: 2025-09-10 05:23:40.614 [INFO][4915] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" Namespace="calico-system" Pod="goldmane-54d579b49d-kdrts" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kdrts-eth0" Sep 10 05:23:40.649212 sshd[4941]: Accepted publickey for core from 10.0.0.1 port 37812 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:23:40.651651 sshd-session[4941]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:23:40.656917 systemd-logind[1570]: New session 8 of user core. Sep 10 05:23:40.664289 systemd[1]: Started session-8.scope - Session 8 of User core. 
Sep 10 05:23:40.667191 containerd[1588]: time="2025-09-10T05:23:40.667142241Z" level=info msg="connecting to shim 591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28" address="unix:///run/containerd/s/0562c9809a158b4a381a69bfe22cb7189109b2aacfb3d85df9a958bee46fa366" namespace=k8s.io protocol=ttrpc version=3 Sep 10 05:23:40.706287 systemd[1]: Started cri-containerd-591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28.scope - libcontainer container 591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28. Sep 10 05:23:40.722913 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 10 05:23:40.837303 containerd[1588]: time="2025-09-10T05:23:40.837248588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-kdrts,Uid:436090e8-7271-45d4-a202-a938b0eefe53,Namespace:calico-system,Attempt:0,} returns sandbox id \"591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28\"" Sep 10 05:23:40.915265 sshd[4962]: Connection closed by 10.0.0.1 port 37812 Sep 10 05:23:40.915540 sshd-session[4941]: pam_unix(sshd:session): session closed for user core Sep 10 05:23:40.921330 systemd-logind[1570]: Session 8 logged out. Waiting for processes to exit. Sep 10 05:23:40.921672 systemd[1]: sshd@7-10.0.0.54:22-10.0.0.1:37812.service: Deactivated successfully. Sep 10 05:23:40.924384 systemd[1]: session-8.scope: Deactivated successfully. Sep 10 05:23:40.926184 systemd-logind[1570]: Removed session 8. 
Sep 10 05:23:41.377971 containerd[1588]: time="2025-09-10T05:23:41.377912647Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:41.378708 containerd[1588]: time="2025-09-10T05:23:41.378652755Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 10 05:23:41.379914 containerd[1588]: time="2025-09-10T05:23:41.379854406Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:41.381770 containerd[1588]: time="2025-09-10T05:23:41.381715954Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:41.382297 containerd[1588]: time="2025-09-10T05:23:41.382246273Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 2.33259142s" Sep 10 05:23:41.382297 containerd[1588]: time="2025-09-10T05:23:41.382290452Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 10 05:23:41.383706 containerd[1588]: time="2025-09-10T05:23:41.383472854Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 10 05:23:41.398193 containerd[1588]: time="2025-09-10T05:23:41.398146024Z" level=info msg="CreateContainer within sandbox 
\"3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 10 05:23:41.406699 containerd[1588]: time="2025-09-10T05:23:41.406649019Z" level=info msg="Container 761baf4aad47e7717d0b4cb1eeeb07c1fc4b04b2c9b40b81a8dede1f361e71aa: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:41.417323 containerd[1588]: time="2025-09-10T05:23:41.417265585Z" level=info msg="CreateContainer within sandbox \"3469cef1089d43a60911cd0374a8796b41368dbc9913332386213c3cde8b418b\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"761baf4aad47e7717d0b4cb1eeeb07c1fc4b04b2c9b40b81a8dede1f361e71aa\"" Sep 10 05:23:41.419107 containerd[1588]: time="2025-09-10T05:23:41.417999101Z" level=info msg="StartContainer for \"761baf4aad47e7717d0b4cb1eeeb07c1fc4b04b2c9b40b81a8dede1f361e71aa\"" Sep 10 05:23:41.419356 containerd[1588]: time="2025-09-10T05:23:41.419334819Z" level=info msg="connecting to shim 761baf4aad47e7717d0b4cb1eeeb07c1fc4b04b2c9b40b81a8dede1f361e71aa" address="unix:///run/containerd/s/b6fc55422edef77a9256752b4853489d3de2b1eab957c6ef455f604024efa776" protocol=ttrpc version=3 Sep 10 05:23:41.463266 systemd[1]: Started cri-containerd-761baf4aad47e7717d0b4cb1eeeb07c1fc4b04b2c9b40b81a8dede1f361e71aa.scope - libcontainer container 761baf4aad47e7717d0b4cb1eeeb07c1fc4b04b2c9b40b81a8dede1f361e71aa. 
Sep 10 05:23:41.485304 systemd-networkd[1495]: cali50a9ac2e0dd: Gained IPv6LL Sep 10 05:23:41.539597 containerd[1588]: time="2025-09-10T05:23:41.539545642Z" level=info msg="StartContainer for \"761baf4aad47e7717d0b4cb1eeeb07c1fc4b04b2c9b40b81a8dede1f361e71aa\" returns successfully" Sep 10 05:23:41.636853 kubelet[2750]: I0910 05:23:41.636660 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-67598f4498-tl9wq" podStartSLOduration=20.235273935 podStartE2EDuration="24.636638277s" podCreationTimestamp="2025-09-10 05:23:17 +0000 UTC" firstStartedPulling="2025-09-10 05:23:36.981960485 +0000 UTC m=+40.621401424" lastFinishedPulling="2025-09-10 05:23:41.383324828 +0000 UTC m=+45.022765766" observedRunningTime="2025-09-10 05:23:41.636230532 +0000 UTC m=+45.275671470" watchObservedRunningTime="2025-09-10 05:23:41.636638277 +0000 UTC m=+45.276079215" Sep 10 05:23:42.572325 systemd-networkd[1495]: calia7251b657a6: Gained IPv6LL Sep 10 05:23:42.674342 containerd[1588]: time="2025-09-10T05:23:42.674297937Z" level=info msg="TaskExit event in podsandbox handler container_id:\"761baf4aad47e7717d0b4cb1eeeb07c1fc4b04b2c9b40b81a8dede1f361e71aa\" id:\"f74bbb1768c7873cfa047a7dfbeb2927c0dc205eee37aa30ec3c15873ece9402\" pid:5126 exited_at:{seconds:1757481822 nanos:673975113}" Sep 10 05:23:44.659231 containerd[1588]: time="2025-09-10T05:23:44.659166877Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:44.659881 containerd[1588]: time="2025-09-10T05:23:44.659848063Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 10 05:23:44.661032 containerd[1588]: time="2025-09-10T05:23:44.660998233Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Sep 10 05:23:44.663077 containerd[1588]: time="2025-09-10T05:23:44.663033717Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:44.663600 containerd[1588]: time="2025-09-10T05:23:44.663568021Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 3.280067122s" Sep 10 05:23:44.663600 containerd[1588]: time="2025-09-10T05:23:44.663594524Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 10 05:23:44.664653 containerd[1588]: time="2025-09-10T05:23:44.664467874Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 10 05:23:44.668014 containerd[1588]: time="2025-09-10T05:23:44.667982112Z" level=info msg="CreateContainer within sandbox \"e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 10 05:23:44.676267 containerd[1588]: time="2025-09-10T05:23:44.675614743Z" level=info msg="Container bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:44.683933 containerd[1588]: time="2025-09-10T05:23:44.683891386Z" level=info msg="CreateContainer within sandbox \"e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\"" Sep 10 
05:23:44.684360 containerd[1588]: time="2025-09-10T05:23:44.684308797Z" level=info msg="StartContainer for \"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\"" Sep 10 05:23:44.685387 containerd[1588]: time="2025-09-10T05:23:44.685360873Z" level=info msg="connecting to shim bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963" address="unix:///run/containerd/s/ca1bccbfde4ff45d6ef3d0b8a858d44141e5444f3d0ef71bfd9d3b5ec919ef4e" protocol=ttrpc version=3 Sep 10 05:23:44.705312 systemd[1]: Started cri-containerd-bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963.scope - libcontainer container bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963. Sep 10 05:23:44.753362 containerd[1588]: time="2025-09-10T05:23:44.753256678Z" level=info msg="StartContainer for \"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\" returns successfully" Sep 10 05:23:44.967922 kubelet[2750]: I0910 05:23:44.967793 2750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 10 05:23:45.085086 containerd[1588]: time="2025-09-10T05:23:45.085030075Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:45.085926 containerd[1588]: time="2025-09-10T05:23:45.085864976Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 10 05:23:45.087916 containerd[1588]: time="2025-09-10T05:23:45.087886428Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 423.374738ms" Sep 10 05:23:45.087916 containerd[1588]: time="2025-09-10T05:23:45.087915667Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 10 05:23:45.088883 containerd[1588]: time="2025-09-10T05:23:45.088835968Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 10 05:23:45.099328 containerd[1588]: time="2025-09-10T05:23:45.098509215Z" level=info msg="CreateContainer within sandbox \"cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 10 05:23:45.113507 containerd[1588]: time="2025-09-10T05:23:45.113464723Z" level=info msg="Container ef0c620bfb114774e0ec3df42f2546b2b9d93166051dcf498f1e381edea3a68f: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:45.236592 containerd[1588]: time="2025-09-10T05:23:45.236421669Z" level=info msg="CreateContainer within sandbox \"cb8fc733c1229718e74d01b995cf32bc89da780ba28caf920b59c13bdf0fee68\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ef0c620bfb114774e0ec3df42f2546b2b9d93166051dcf498f1e381edea3a68f\"" Sep 10 05:23:45.238606 containerd[1588]: time="2025-09-10T05:23:45.238565395Z" level=info msg="StartContainer for \"ef0c620bfb114774e0ec3df42f2546b2b9d93166051dcf498f1e381edea3a68f\"" Sep 10 05:23:45.240589 containerd[1588]: time="2025-09-10T05:23:45.240558961Z" level=info msg="connecting to shim ef0c620bfb114774e0ec3df42f2546b2b9d93166051dcf498f1e381edea3a68f" address="unix:///run/containerd/s/d32d425538de70f1c433a9eec24a12ae25b053d7ef4f01087b29f905dc9bbd78" protocol=ttrpc version=3 Sep 10 05:23:45.302354 systemd[1]: Started cri-containerd-ef0c620bfb114774e0ec3df42f2546b2b9d93166051dcf498f1e381edea3a68f.scope - libcontainer container ef0c620bfb114774e0ec3df42f2546b2b9d93166051dcf498f1e381edea3a68f. 
Sep 10 05:23:45.361348 containerd[1588]: time="2025-09-10T05:23:45.361305689Z" level=info msg="StartContainer for \"ef0c620bfb114774e0ec3df42f2546b2b9d93166051dcf498f1e381edea3a68f\" returns successfully" Sep 10 05:23:45.533693 containerd[1588]: time="2025-09-10T05:23:45.533644654Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:45.535569 containerd[1588]: time="2025-09-10T05:23:45.535540175Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 10 05:23:45.536940 containerd[1588]: time="2025-09-10T05:23:45.536897085Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 448.026428ms" Sep 10 05:23:45.536991 containerd[1588]: time="2025-09-10T05:23:45.536955772Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 10 05:23:45.538658 containerd[1588]: time="2025-09-10T05:23:45.538620344Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 10 05:23:45.548393 containerd[1588]: time="2025-09-10T05:23:45.548359012Z" level=info msg="CreateContainer within sandbox \"60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 10 05:23:45.560074 containerd[1588]: time="2025-09-10T05:23:45.560035174Z" level=info msg="Container 309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:45.569572 
containerd[1588]: time="2025-09-10T05:23:45.569516330Z" level=info msg="CreateContainer within sandbox \"60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\"" Sep 10 05:23:45.570862 containerd[1588]: time="2025-09-10T05:23:45.570838902Z" level=info msg="StartContainer for \"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\"" Sep 10 05:23:45.572152 containerd[1588]: time="2025-09-10T05:23:45.572078147Z" level=info msg="connecting to shim 309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386" address="unix:///run/containerd/s/c41f4ae66989235fe657f0b8a142be7a9d68c709f98a44e324c89f8c340ab8a8" protocol=ttrpc version=3 Sep 10 05:23:45.598259 systemd[1]: Started cri-containerd-309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386.scope - libcontainer container 309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386. 
Sep 10 05:23:45.663102 kubelet[2750]: I0910 05:23:45.662996 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-55b79d7cf6-vzlbk" podStartSLOduration=24.147502985 podStartE2EDuration="30.662972116s" podCreationTimestamp="2025-09-10 05:23:15 +0000 UTC" firstStartedPulling="2025-09-10 05:23:38.573252609 +0000 UTC m=+42.212693547" lastFinishedPulling="2025-09-10 05:23:45.08872173 +0000 UTC m=+48.728162678" observedRunningTime="2025-09-10 05:23:45.65326494 +0000 UTC m=+49.292705878" watchObservedRunningTime="2025-09-10 05:23:45.662972116 +0000 UTC m=+49.302413044" Sep 10 05:23:45.670501 containerd[1588]: time="2025-09-10T05:23:45.670461357Z" level=info msg="StartContainer for \"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\" returns successfully" Sep 10 05:23:45.800057 systemd-networkd[1495]: vxlan.calico: Link UP Sep 10 05:23:45.800068 systemd-networkd[1495]: vxlan.calico: Gained carrier Sep 10 05:23:45.929841 systemd[1]: Started sshd@8-10.0.0.54:22-10.0.0.1:37824.service - OpenSSH per-connection server daemon (10.0.0.1:37824). Sep 10 05:23:46.048507 sshd[5370]: Accepted publickey for core from 10.0.0.1 port 37824 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:23:46.050908 sshd-session[5370]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:23:46.056475 systemd-logind[1570]: New session 9 of user core. Sep 10 05:23:46.063553 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 10 05:23:46.262691 sshd[5393]: Connection closed by 10.0.0.1 port 37824 Sep 10 05:23:46.264443 sshd-session[5370]: pam_unix(sshd:session): session closed for user core Sep 10 05:23:46.270565 systemd-logind[1570]: Session 9 logged out. Waiting for processes to exit. Sep 10 05:23:46.272761 systemd[1]: sshd@8-10.0.0.54:22-10.0.0.1:37824.service: Deactivated successfully. Sep 10 05:23:46.275706 systemd[1]: session-9.scope: Deactivated successfully. 
Sep 10 05:23:46.280295 systemd-logind[1570]: Removed session 9. Sep 10 05:23:46.766712 kubelet[2750]: I0910 05:23:46.766261 2750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 10 05:23:46.766712 kubelet[2750]: I0910 05:23:46.766696 2750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 10 05:23:46.840175 kubelet[2750]: I0910 05:23:46.839942 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-54575dc547-kddv5" podStartSLOduration=26.145027771 podStartE2EDuration="32.839919207s" podCreationTimestamp="2025-09-10 05:23:14 +0000 UTC" firstStartedPulling="2025-09-10 05:23:38.843004566 +0000 UTC m=+42.482445505" lastFinishedPulling="2025-09-10 05:23:45.537896003 +0000 UTC m=+49.177336941" observedRunningTime="2025-09-10 05:23:46.839432919 +0000 UTC m=+50.478873857" watchObservedRunningTime="2025-09-10 05:23:46.839919207 +0000 UTC m=+50.479360145" Sep 10 05:23:46.840461 kubelet[2750]: I0910 05:23:46.840411 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-54575dc547-rbmsw" podStartSLOduration=26.747673477 podStartE2EDuration="32.840376646s" podCreationTimestamp="2025-09-10 05:23:14 +0000 UTC" firstStartedPulling="2025-09-10 05:23:38.571639766 +0000 UTC m=+42.211080704" lastFinishedPulling="2025-09-10 05:23:44.664342945 +0000 UTC m=+48.303783873" observedRunningTime="2025-09-10 05:23:45.669378111 +0000 UTC m=+49.308819050" watchObservedRunningTime="2025-09-10 05:23:46.840376646 +0000 UTC m=+50.479817584" Sep 10 05:23:47.628383 systemd-networkd[1495]: vxlan.calico: Gained IPv6LL Sep 10 05:23:47.766160 kubelet[2750]: I0910 05:23:47.765906 2750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 10 05:23:47.885927 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount347477116.mount: Deactivated successfully. 
Sep 10 05:23:47.917545 containerd[1588]: time="2025-09-10T05:23:47.917482243Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:47.918434 containerd[1588]: time="2025-09-10T05:23:47.918383042Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 10 05:23:47.919568 containerd[1588]: time="2025-09-10T05:23:47.919535992Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:47.921671 containerd[1588]: time="2025-09-10T05:23:47.921643620Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:47.922343 containerd[1588]: time="2025-09-10T05:23:47.922315194Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.383652725s" Sep 10 05:23:47.922397 containerd[1588]: time="2025-09-10T05:23:47.922345694Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 10 05:23:47.924412 containerd[1588]: time="2025-09-10T05:23:47.924388693Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 10 05:23:47.928089 containerd[1588]: time="2025-09-10T05:23:47.928058173Z" level=info msg="CreateContainer within sandbox 
\"c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 10 05:23:47.936876 containerd[1588]: time="2025-09-10T05:23:47.936830679Z" level=info msg="Container b6ae4790f5c8882ce4444147a9e6018f6e4c85f8ab4ab865cf5370a13b2d9d0a: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:47.945973 containerd[1588]: time="2025-09-10T05:23:47.945930257Z" level=info msg="CreateContainer within sandbox \"c67970f0870c93ce82f11d96d426537a619007d2368ff6109bb6d76e7e353d41\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"b6ae4790f5c8882ce4444147a9e6018f6e4c85f8ab4ab865cf5370a13b2d9d0a\"" Sep 10 05:23:47.946506 containerd[1588]: time="2025-09-10T05:23:47.946484969Z" level=info msg="StartContainer for \"b6ae4790f5c8882ce4444147a9e6018f6e4c85f8ab4ab865cf5370a13b2d9d0a\"" Sep 10 05:23:47.947892 containerd[1588]: time="2025-09-10T05:23:47.947857265Z" level=info msg="connecting to shim b6ae4790f5c8882ce4444147a9e6018f6e4c85f8ab4ab865cf5370a13b2d9d0a" address="unix:///run/containerd/s/e4513f20431767dd8c92c6e7811b4965c553bfa8939b3b8c5a80e097b7888674" protocol=ttrpc version=3 Sep 10 05:23:47.974322 systemd[1]: Started cri-containerd-b6ae4790f5c8882ce4444147a9e6018f6e4c85f8ab4ab865cf5370a13b2d9d0a.scope - libcontainer container b6ae4790f5c8882ce4444147a9e6018f6e4c85f8ab4ab865cf5370a13b2d9d0a. 
Sep 10 05:23:48.021875 containerd[1588]: time="2025-09-10T05:23:48.021836270Z" level=info msg="StartContainer for \"b6ae4790f5c8882ce4444147a9e6018f6e4c85f8ab4ab865cf5370a13b2d9d0a\" returns successfully" Sep 10 05:23:48.798436 kubelet[2750]: I0910 05:23:48.798366 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-555947ccd6-9ggdl" podStartSLOduration=2.698812697 podStartE2EDuration="13.798345014s" podCreationTimestamp="2025-09-10 05:23:35 +0000 UTC" firstStartedPulling="2025-09-10 05:23:36.82470817 +0000 UTC m=+40.464149109" lastFinishedPulling="2025-09-10 05:23:47.924240478 +0000 UTC m=+51.563681426" observedRunningTime="2025-09-10 05:23:48.796867892 +0000 UTC m=+52.436308850" watchObservedRunningTime="2025-09-10 05:23:48.798345014 +0000 UTC m=+52.437785952" Sep 10 05:23:48.915311 kubelet[2750]: I0910 05:23:48.915265 2750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 10 05:23:49.577335 containerd[1588]: time="2025-09-10T05:23:49.577283219Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:49.578192 containerd[1588]: time="2025-09-10T05:23:49.578152744Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 10 05:23:49.579667 containerd[1588]: time="2025-09-10T05:23:49.579613071Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:49.581575 containerd[1588]: time="2025-09-10T05:23:49.581523862Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:49.582186 containerd[1588]: time="2025-09-10T05:23:49.582153291Z" level=info 
msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.657739317s" Sep 10 05:23:49.582237 containerd[1588]: time="2025-09-10T05:23:49.582187879Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 10 05:23:49.583320 containerd[1588]: time="2025-09-10T05:23:49.583297260Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 10 05:23:49.587256 containerd[1588]: time="2025-09-10T05:23:49.587217387Z" level=info msg="CreateContainer within sandbox \"e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 10 05:23:49.598081 containerd[1588]: time="2025-09-10T05:23:49.598008384Z" level=info msg="Container 5c66f66d7cf692824e080940d91872596b84f1eea13c90f9629825d4254c1dec: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:49.607896 containerd[1588]: time="2025-09-10T05:23:49.607854436Z" level=info msg="CreateContainer within sandbox \"e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"5c66f66d7cf692824e080940d91872596b84f1eea13c90f9629825d4254c1dec\"" Sep 10 05:23:49.608614 containerd[1588]: time="2025-09-10T05:23:49.608539075Z" level=info msg="StartContainer for \"5c66f66d7cf692824e080940d91872596b84f1eea13c90f9629825d4254c1dec\"" Sep 10 05:23:49.610178 containerd[1588]: time="2025-09-10T05:23:49.610124329Z" level=info msg="connecting to shim 5c66f66d7cf692824e080940d91872596b84f1eea13c90f9629825d4254c1dec" 
address="unix:///run/containerd/s/b1ea0750ff8238f09d5c50664b8270bf220c8e1d42342373f6b4097956fcfad4" protocol=ttrpc version=3 Sep 10 05:23:49.630274 systemd[1]: Started cri-containerd-5c66f66d7cf692824e080940d91872596b84f1eea13c90f9629825d4254c1dec.scope - libcontainer container 5c66f66d7cf692824e080940d91872596b84f1eea13c90f9629825d4254c1dec. Sep 10 05:23:49.817797 containerd[1588]: time="2025-09-10T05:23:49.817741365Z" level=info msg="StartContainer for \"5c66f66d7cf692824e080940d91872596b84f1eea13c90f9629825d4254c1dec\" returns successfully" Sep 10 05:23:51.279723 systemd[1]: Started sshd@9-10.0.0.54:22-10.0.0.1:37276.service - OpenSSH per-connection server daemon (10.0.0.1:37276). Sep 10 05:23:51.372229 sshd[5556]: Accepted publickey for core from 10.0.0.1 port 37276 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:23:51.374031 sshd-session[5556]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:23:51.378766 systemd-logind[1570]: New session 10 of user core. Sep 10 05:23:51.389285 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 10 05:23:51.537086 sshd[5559]: Connection closed by 10.0.0.1 port 37276 Sep 10 05:23:51.537435 sshd-session[5556]: pam_unix(sshd:session): session closed for user core Sep 10 05:23:51.541616 systemd[1]: sshd@9-10.0.0.54:22-10.0.0.1:37276.service: Deactivated successfully. Sep 10 05:23:51.544215 systemd[1]: session-10.scope: Deactivated successfully. Sep 10 05:23:51.546243 systemd-logind[1570]: Session 10 logged out. Waiting for processes to exit. Sep 10 05:23:51.548327 systemd-logind[1570]: Removed session 10. Sep 10 05:23:52.402569 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3334949663.mount: Deactivated successfully. 
Sep 10 05:23:53.478026 containerd[1588]: time="2025-09-10T05:23:53.477968250Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:53.478955 containerd[1588]: time="2025-09-10T05:23:53.478895025Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 10 05:23:53.480069 containerd[1588]: time="2025-09-10T05:23:53.480038438Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:53.482453 containerd[1588]: time="2025-09-10T05:23:53.482374101Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:53.482936 containerd[1588]: time="2025-09-10T05:23:53.482893378Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 3.899569705s" Sep 10 05:23:53.482936 containerd[1588]: time="2025-09-10T05:23:53.482924630Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 10 05:23:53.484055 containerd[1588]: time="2025-09-10T05:23:53.484019056Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 10 05:23:53.488711 containerd[1588]: time="2025-09-10T05:23:53.488673868Z" level=info msg="CreateContainer within sandbox 
\"591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 10 05:23:53.497169 containerd[1588]: time="2025-09-10T05:23:53.496496290Z" level=info msg="Container 1a4d31960f52eaa67b2eba54ff94fd802c236984ddab6826f4be53f1da6695f5: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:53.506391 containerd[1588]: time="2025-09-10T05:23:53.506352708Z" level=info msg="CreateContainer within sandbox \"591790f49c2d2b76fbf322747b882edbf682ae553bf0dadb9f608728819cfc28\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"1a4d31960f52eaa67b2eba54ff94fd802c236984ddab6826f4be53f1da6695f5\"" Sep 10 05:23:53.507113 containerd[1588]: time="2025-09-10T05:23:53.507073454Z" level=info msg="StartContainer for \"1a4d31960f52eaa67b2eba54ff94fd802c236984ddab6826f4be53f1da6695f5\"" Sep 10 05:23:53.508537 containerd[1588]: time="2025-09-10T05:23:53.508470118Z" level=info msg="connecting to shim 1a4d31960f52eaa67b2eba54ff94fd802c236984ddab6826f4be53f1da6695f5" address="unix:///run/containerd/s/0562c9809a158b4a381a69bfe22cb7189109b2aacfb3d85df9a958bee46fa366" protocol=ttrpc version=3 Sep 10 05:23:53.535274 systemd[1]: Started cri-containerd-1a4d31960f52eaa67b2eba54ff94fd802c236984ddab6826f4be53f1da6695f5.scope - libcontainer container 1a4d31960f52eaa67b2eba54ff94fd802c236984ddab6826f4be53f1da6695f5. 
Sep 10 05:23:53.586978 containerd[1588]: time="2025-09-10T05:23:53.586935193Z" level=info msg="StartContainer for \"1a4d31960f52eaa67b2eba54ff94fd802c236984ddab6826f4be53f1da6695f5\" returns successfully" Sep 10 05:23:53.850842 kubelet[2750]: I0910 05:23:53.850447 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-kdrts" podStartSLOduration=25.205390027 podStartE2EDuration="37.850429398s" podCreationTimestamp="2025-09-10 05:23:16 +0000 UTC" firstStartedPulling="2025-09-10 05:23:40.838763086 +0000 UTC m=+44.478204024" lastFinishedPulling="2025-09-10 05:23:53.483802457 +0000 UTC m=+57.123243395" observedRunningTime="2025-09-10 05:23:53.850175216 +0000 UTC m=+57.489616154" watchObservedRunningTime="2025-09-10 05:23:53.850429398 +0000 UTC m=+57.489870336" Sep 10 05:23:53.937289 containerd[1588]: time="2025-09-10T05:23:53.937222205Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1a4d31960f52eaa67b2eba54ff94fd802c236984ddab6826f4be53f1da6695f5\" id:\"675d6a46ab810921a4109cc8fe7d04e0a71208dc927e1af7d4c1b56974518888\" pid:5641 exit_status:1 exited_at:{seconds:1757481833 nanos:936716043}" Sep 10 05:23:54.999812 containerd[1588]: time="2025-09-10T05:23:54.999764709Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1a4d31960f52eaa67b2eba54ff94fd802c236984ddab6826f4be53f1da6695f5\" id:\"33ce89cb40565e80600a62524a8fc88bded6481709042016f9b12d098657dd25\" pid:5666 exit_status:1 exited_at:{seconds:1757481834 nanos:999476589}" Sep 10 05:23:55.428284 containerd[1588]: time="2025-09-10T05:23:55.428189364Z" level=info msg="TaskExit event in podsandbox handler container_id:\"761baf4aad47e7717d0b4cb1eeeb07c1fc4b04b2c9b40b81a8dede1f361e71aa\" id:\"52a5e14b1fbfb0ef9f17ed03b3b88c51480a392b21648860274c5686485e000d\" pid:5695 exited_at:{seconds:1757481835 nanos:427669557}" Sep 10 05:23:55.479480 containerd[1588]: time="2025-09-10T05:23:55.479416393Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:55.480114 containerd[1588]: time="2025-09-10T05:23:55.480079424Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 10 05:23:55.481301 containerd[1588]: time="2025-09-10T05:23:55.481261632Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:55.483214 containerd[1588]: time="2025-09-10T05:23:55.483166097Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 05:23:55.483727 containerd[1588]: time="2025-09-10T05:23:55.483687397Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.999641689s" Sep 10 05:23:55.483769 containerd[1588]: time="2025-09-10T05:23:55.483724922Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 10 05:23:55.488040 containerd[1588]: time="2025-09-10T05:23:55.488010125Z" level=info msg="CreateContainer within sandbox \"e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 10 05:23:55.499621 containerd[1588]: time="2025-09-10T05:23:55.498309830Z" level=info 
msg="Container a77542557f08cf314d06af231d56555db0774a13a7f1fbb4a780e78a30fbb16c: CDI devices from CRI Config.CDIDevices: []" Sep 10 05:23:55.508492 containerd[1588]: time="2025-09-10T05:23:55.508448877Z" level=info msg="CreateContainer within sandbox \"e3be6a411a53dbd63814f80d9a8f94361d731ee4582bda9b3906d42d667b42da\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"a77542557f08cf314d06af231d56555db0774a13a7f1fbb4a780e78a30fbb16c\"" Sep 10 05:23:55.509209 containerd[1588]: time="2025-09-10T05:23:55.509186937Z" level=info msg="StartContainer for \"a77542557f08cf314d06af231d56555db0774a13a7f1fbb4a780e78a30fbb16c\"" Sep 10 05:23:55.510569 containerd[1588]: time="2025-09-10T05:23:55.510513470Z" level=info msg="connecting to shim a77542557f08cf314d06af231d56555db0774a13a7f1fbb4a780e78a30fbb16c" address="unix:///run/containerd/s/b1ea0750ff8238f09d5c50664b8270bf220c8e1d42342373f6b4097956fcfad4" protocol=ttrpc version=3 Sep 10 05:23:55.535286 systemd[1]: Started cri-containerd-a77542557f08cf314d06af231d56555db0774a13a7f1fbb4a780e78a30fbb16c.scope - libcontainer container a77542557f08cf314d06af231d56555db0774a13a7f1fbb4a780e78a30fbb16c. 
Sep 10 05:23:55.712215 containerd[1588]: time="2025-09-10T05:23:55.712064066Z" level=info msg="StartContainer for \"a77542557f08cf314d06af231d56555db0774a13a7f1fbb4a780e78a30fbb16c\" returns successfully" Sep 10 05:23:55.870251 kubelet[2750]: I0910 05:23:55.870176 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-5hfbv" podStartSLOduration=23.150911495 podStartE2EDuration="38.870157657s" podCreationTimestamp="2025-09-10 05:23:17 +0000 UTC" firstStartedPulling="2025-09-10 05:23:39.765169947 +0000 UTC m=+43.404610885" lastFinishedPulling="2025-09-10 05:23:55.484416109 +0000 UTC m=+59.123857047" observedRunningTime="2025-09-10 05:23:55.868025461 +0000 UTC m=+59.507466389" watchObservedRunningTime="2025-09-10 05:23:55.870157657 +0000 UTC m=+59.509598595" Sep 10 05:23:55.940495 containerd[1588]: time="2025-09-10T05:23:55.940438810Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1a4d31960f52eaa67b2eba54ff94fd802c236984ddab6826f4be53f1da6695f5\" id:\"73e6a4c34211c981c0c0add5147ee57f1faca2f62e69b81d5b9722254927698c\" pid:5754 exit_status:1 exited_at:{seconds:1757481835 nanos:940188916}" Sep 10 05:23:56.550484 systemd[1]: Started sshd@10-10.0.0.54:22-10.0.0.1:37280.service - OpenSSH per-connection server daemon (10.0.0.1:37280). 
Sep 10 05:23:56.554488 kubelet[2750]: I0910 05:23:56.554392 2750 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 10 05:23:56.555474 kubelet[2750]: I0910 05:23:56.555416 2750 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 10 05:23:56.627934 sshd[5777]: Accepted publickey for core from 10.0.0.1 port 37280 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:23:56.630057 sshd-session[5777]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:23:56.634795 systemd-logind[1570]: New session 11 of user core. Sep 10 05:23:56.642317 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 10 05:23:56.830672 sshd[5780]: Connection closed by 10.0.0.1 port 37280 Sep 10 05:23:56.830984 sshd-session[5777]: pam_unix(sshd:session): session closed for user core Sep 10 05:23:56.840206 systemd[1]: sshd@10-10.0.0.54:22-10.0.0.1:37280.service: Deactivated successfully. Sep 10 05:23:56.842576 systemd[1]: session-11.scope: Deactivated successfully. Sep 10 05:23:56.843422 systemd-logind[1570]: Session 11 logged out. Waiting for processes to exit. Sep 10 05:23:56.847731 systemd[1]: Started sshd@11-10.0.0.54:22-10.0.0.1:37290.service - OpenSSH per-connection server daemon (10.0.0.1:37290). Sep 10 05:23:56.849504 systemd-logind[1570]: Removed session 11. Sep 10 05:23:56.923949 sshd[5797]: Accepted publickey for core from 10.0.0.1 port 37290 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:23:56.926197 sshd-session[5797]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:23:56.934941 systemd-logind[1570]: New session 12 of user core. Sep 10 05:23:56.944395 systemd[1]: Started session-12.scope - Session 12 of User core. 
Sep 10 05:23:57.151975 sshd[5800]: Connection closed by 10.0.0.1 port 37290 Sep 10 05:23:57.152614 sshd-session[5797]: pam_unix(sshd:session): session closed for user core Sep 10 05:23:57.165146 systemd[1]: sshd@11-10.0.0.54:22-10.0.0.1:37290.service: Deactivated successfully. Sep 10 05:23:57.168653 systemd[1]: session-12.scope: Deactivated successfully. Sep 10 05:23:57.170738 systemd-logind[1570]: Session 12 logged out. Waiting for processes to exit. Sep 10 05:23:57.176108 systemd[1]: Started sshd@12-10.0.0.54:22-10.0.0.1:37296.service - OpenSSH per-connection server daemon (10.0.0.1:37296). Sep 10 05:23:57.177050 systemd-logind[1570]: Removed session 12. Sep 10 05:23:57.237240 sshd[5812]: Accepted publickey for core from 10.0.0.1 port 37296 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:23:57.238552 sshd-session[5812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:23:57.243758 systemd-logind[1570]: New session 13 of user core. Sep 10 05:23:57.250252 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 10 05:23:57.368229 sshd[5815]: Connection closed by 10.0.0.1 port 37296 Sep 10 05:23:57.368694 sshd-session[5812]: pam_unix(sshd:session): session closed for user core Sep 10 05:23:57.373737 systemd[1]: sshd@12-10.0.0.54:22-10.0.0.1:37296.service: Deactivated successfully. Sep 10 05:23:57.375838 systemd[1]: session-13.scope: Deactivated successfully. Sep 10 05:23:57.376562 systemd-logind[1570]: Session 13 logged out. Waiting for processes to exit. Sep 10 05:23:57.377943 systemd-logind[1570]: Removed session 13. 
Sep 10 05:23:58.287763 kubelet[2750]: I0910 05:23:58.287710 2750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 10 05:23:58.328658 kubelet[2750]: I0910 05:23:58.328606 2750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 10 05:23:58.364702 containerd[1588]: time="2025-09-10T05:23:58.364630559Z" level=info msg="StopContainer for \"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\" with timeout 30 (s)" Sep 10 05:23:58.369565 containerd[1588]: time="2025-09-10T05:23:58.369514254Z" level=info msg="Stop container \"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\" with signal terminated" Sep 10 05:23:58.390339 systemd[1]: cri-containerd-309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386.scope: Deactivated successfully. Sep 10 05:23:58.392802 containerd[1588]: time="2025-09-10T05:23:58.392762149Z" level=info msg="TaskExit event in podsandbox handler container_id:\"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\" id:\"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\" pid:5301 exit_status:1 exited_at:{seconds:1757481838 nanos:391316166}" Sep 10 05:23:58.392937 containerd[1588]: time="2025-09-10T05:23:58.392817779Z" level=info msg="received exit event container_id:\"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\" id:\"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\" pid:5301 exit_status:1 exited_at:{seconds:1757481838 nanos:391316166}" Sep 10 05:23:58.420122 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386-rootfs.mount: Deactivated successfully. 
Sep 10 05:23:58.440008 containerd[1588]: time="2025-09-10T05:23:58.439960116Z" level=info msg="StopContainer for \"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\" returns successfully" Sep 10 05:23:58.442722 containerd[1588]: time="2025-09-10T05:23:58.442683448Z" level=info msg="StopPodSandbox for \"60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6\"" Sep 10 05:23:58.456241 containerd[1588]: time="2025-09-10T05:23:58.456203822Z" level=info msg="Container to stop \"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Sep 10 05:23:58.470607 systemd[1]: cri-containerd-60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6.scope: Deactivated successfully. Sep 10 05:23:58.477788 containerd[1588]: time="2025-09-10T05:23:58.477733441Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6\" id:\"60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6\" pid:4604 exit_status:137 exited_at:{seconds:1757481838 nanos:477470215}" Sep 10 05:23:58.512736 containerd[1588]: time="2025-09-10T05:23:58.512673976Z" level=info msg="shim disconnected" id=60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6 namespace=k8s.io Sep 10 05:23:58.512736 containerd[1588]: time="2025-09-10T05:23:58.512709992Z" level=warning msg="cleaning up after shim disconnected" id=60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6 namespace=k8s.io Sep 10 05:23:58.512736 containerd[1588]: time="2025-09-10T05:23:58.512717084Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 10 05:23:58.514405 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6-rootfs.mount: Deactivated successfully. 
Sep 10 05:23:58.569307 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6-shm.mount: Deactivated successfully. Sep 10 05:23:58.573462 containerd[1588]: time="2025-09-10T05:23:58.573374756Z" level=info msg="received exit event sandbox_id:\"60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6\" exit_status:137 exited_at:{seconds:1757481838 nanos:477470215}" Sep 10 05:23:58.641733 systemd-networkd[1495]: calia4f5eeb5f70: Link DOWN Sep 10 05:23:58.641744 systemd-networkd[1495]: calia4f5eeb5f70: Lost carrier Sep 10 05:23:58.720746 containerd[1588]: 2025-09-10 05:23:58.640 [INFO][5901] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Sep 10 05:23:58.720746 containerd[1588]: 2025-09-10 05:23:58.640 [INFO][5901] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" iface="eth0" netns="/var/run/netns/cni-1b8dfed5-dd8d-0647-c259-13a9486c28a3" Sep 10 05:23:58.720746 containerd[1588]: 2025-09-10 05:23:58.641 [INFO][5901] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" iface="eth0" netns="/var/run/netns/cni-1b8dfed5-dd8d-0647-c259-13a9486c28a3" Sep 10 05:23:58.720746 containerd[1588]: 2025-09-10 05:23:58.648 [INFO][5901] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" after=7.614892ms iface="eth0" netns="/var/run/netns/cni-1b8dfed5-dd8d-0647-c259-13a9486c28a3" Sep 10 05:23:58.720746 containerd[1588]: 2025-09-10 05:23:58.648 [INFO][5901] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Sep 10 05:23:58.720746 containerd[1588]: 2025-09-10 05:23:58.648 [INFO][5901] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Sep 10 05:23:58.720746 containerd[1588]: 2025-09-10 05:23:58.674 [INFO][5913] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" HandleID="k8s-pod-network.60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Workload="localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0" Sep 10 05:23:58.720746 containerd[1588]: 2025-09-10 05:23:58.674 [INFO][5913] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 05:23:58.720746 containerd[1588]: 2025-09-10 05:23:58.674 [INFO][5913] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 10 05:23:58.720746 containerd[1588]: 2025-09-10 05:23:58.711 [INFO][5913] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" HandleID="k8s-pod-network.60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Workload="localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0" Sep 10 05:23:58.720746 containerd[1588]: 2025-09-10 05:23:58.711 [INFO][5913] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" HandleID="k8s-pod-network.60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Workload="localhost-k8s-calico--apiserver--54575dc547--kddv5-eth0" Sep 10 05:23:58.720746 containerd[1588]: 2025-09-10 05:23:58.713 [INFO][5913] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 10 05:23:58.720746 containerd[1588]: 2025-09-10 05:23:58.716 [INFO][5901] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6" Sep 10 05:23:58.724893 systemd[1]: run-netns-cni\x2d1b8dfed5\x2ddd8d\x2d0647\x2dc259\x2d13a9486c28a3.mount: Deactivated successfully. 
Sep 10 05:23:58.725752 containerd[1588]: time="2025-09-10T05:23:58.725699811Z" level=info msg="TearDown network for sandbox \"60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6\" successfully" Sep 10 05:23:58.725752 containerd[1588]: time="2025-09-10T05:23:58.725744371Z" level=info msg="StopPodSandbox for \"60cdceb349598007e3a4399a1a0988481a0a7256457f710896f3bceae96697c6\" returns successfully" Sep 10 05:23:58.792037 kubelet[2750]: I0910 05:23:58.791963 2750 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgtw4\" (UniqueName: \"kubernetes.io/projected/0b5e651b-cbd7-4abc-b1e2-a6418c29d0af-kube-api-access-tgtw4\") pod \"0b5e651b-cbd7-4abc-b1e2-a6418c29d0af\" (UID: \"0b5e651b-cbd7-4abc-b1e2-a6418c29d0af\") " Sep 10 05:23:58.792037 kubelet[2750]: I0910 05:23:58.792042 2750 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0b5e651b-cbd7-4abc-b1e2-a6418c29d0af-calico-apiserver-certs\") pod \"0b5e651b-cbd7-4abc-b1e2-a6418c29d0af\" (UID: \"0b5e651b-cbd7-4abc-b1e2-a6418c29d0af\") " Sep 10 05:23:58.796879 kubelet[2750]: I0910 05:23:58.796826 2750 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b5e651b-cbd7-4abc-b1e2-a6418c29d0af-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "0b5e651b-cbd7-4abc-b1e2-a6418c29d0af" (UID: "0b5e651b-cbd7-4abc-b1e2-a6418c29d0af"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 10 05:23:58.796956 kubelet[2750]: I0910 05:23:58.796883 2750 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b5e651b-cbd7-4abc-b1e2-a6418c29d0af-kube-api-access-tgtw4" (OuterVolumeSpecName: "kube-api-access-tgtw4") pod "0b5e651b-cbd7-4abc-b1e2-a6418c29d0af" (UID: "0b5e651b-cbd7-4abc-b1e2-a6418c29d0af"). 
InnerVolumeSpecName "kube-api-access-tgtw4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 10 05:23:58.799687 systemd[1]: var-lib-kubelet-pods-0b5e651b\x2dcbd7\x2d4abc\x2db1e2\x2da6418c29d0af-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dtgtw4.mount: Deactivated successfully. Sep 10 05:23:58.799838 systemd[1]: var-lib-kubelet-pods-0b5e651b\x2dcbd7\x2d4abc\x2db1e2\x2da6418c29d0af-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Sep 10 05:23:58.866343 kubelet[2750]: I0910 05:23:58.866227 2750 scope.go:117] "RemoveContainer" containerID="309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386" Sep 10 05:23:58.868660 containerd[1588]: time="2025-09-10T05:23:58.868620220Z" level=info msg="RemoveContainer for \"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\"" Sep 10 05:23:58.870814 systemd[1]: Removed slice kubepods-besteffort-pod0b5e651b_cbd7_4abc_b1e2_a6418c29d0af.slice - libcontainer container kubepods-besteffort-pod0b5e651b_cbd7_4abc_b1e2_a6418c29d0af.slice. 
Sep 10 05:23:58.879388 containerd[1588]: time="2025-09-10T05:23:58.879330676Z" level=info msg="RemoveContainer for \"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\" returns successfully" Sep 10 05:23:58.881053 kubelet[2750]: I0910 05:23:58.881014 2750 scope.go:117] "RemoveContainer" containerID="309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386" Sep 10 05:23:58.881521 containerd[1588]: time="2025-09-10T05:23:58.881329558Z" level=error msg="ContainerStatus for \"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\": not found" Sep 10 05:23:58.881689 kubelet[2750]: E0910 05:23:58.881634 2750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\": not found" containerID="309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386" Sep 10 05:23:58.881742 kubelet[2750]: I0910 05:23:58.881670 2750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386"} err="failed to get container status \"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\": rpc error: code = NotFound desc = an error occurred when try to find container \"309eecb440d3fa3d8490a3d8a4900e2a14a969accfac0f586f0e449c2d9ba386\": not found" Sep 10 05:23:58.892649 kubelet[2750]: I0910 05:23:58.892580 2750 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0b5e651b-cbd7-4abc-b1e2-a6418c29d0af-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\"" Sep 10 05:23:58.892649 kubelet[2750]: I0910 05:23:58.892622 2750 reconciler_common.go:299] "Volume detached 
for volume \"kube-api-access-tgtw4\" (UniqueName: \"kubernetes.io/projected/0b5e651b-cbd7-4abc-b1e2-a6418c29d0af-kube-api-access-tgtw4\") on node \"localhost\" DevicePath \"\"" Sep 10 05:24:00.471473 kubelet[2750]: I0910 05:24:00.471415 2750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b5e651b-cbd7-4abc-b1e2-a6418c29d0af" path="/var/lib/kubelet/pods/0b5e651b-cbd7-4abc-b1e2-a6418c29d0af/volumes" Sep 10 05:24:02.383255 systemd[1]: Started sshd@13-10.0.0.54:22-10.0.0.1:42470.service - OpenSSH per-connection server daemon (10.0.0.1:42470). Sep 10 05:24:02.434818 sshd[5936]: Accepted publickey for core from 10.0.0.1 port 42470 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:24:02.436661 sshd-session[5936]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:24:02.444337 systemd-logind[1570]: New session 14 of user core. Sep 10 05:24:02.447323 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 10 05:24:02.566170 sshd[5939]: Connection closed by 10.0.0.1 port 42470 Sep 10 05:24:02.566688 sshd-session[5936]: pam_unix(sshd:session): session closed for user core Sep 10 05:24:02.570862 systemd[1]: sshd@13-10.0.0.54:22-10.0.0.1:42470.service: Deactivated successfully. Sep 10 05:24:02.572944 systemd[1]: session-14.scope: Deactivated successfully. Sep 10 05:24:02.573898 systemd-logind[1570]: Session 14 logged out. Waiting for processes to exit. Sep 10 05:24:02.575275 systemd-logind[1570]: Removed session 14. 
Sep 10 05:24:06.680604 containerd[1588]: time="2025-09-10T05:24:06.680549568Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ab63bdf76fdcbfab7c60d2d2110dc4443c580e425fc361a2bb4da72b6a894f3c\" id:\"53a76fbd03c7897e9a7698160e2afb436f7be9d7d95da659611c2aaac4db6186\" pid:5972 exited_at:{seconds:1757481846 nanos:680239237}" Sep 10 05:24:07.578240 systemd[1]: Started sshd@14-10.0.0.54:22-10.0.0.1:42476.service - OpenSSH per-connection server daemon (10.0.0.1:42476). Sep 10 05:24:07.650151 sshd[5986]: Accepted publickey for core from 10.0.0.1 port 42476 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:24:07.652290 sshd-session[5986]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:24:07.657746 systemd-logind[1570]: New session 15 of user core. Sep 10 05:24:07.666296 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 10 05:24:07.813014 sshd[5989]: Connection closed by 10.0.0.1 port 42476 Sep 10 05:24:07.813453 sshd-session[5986]: pam_unix(sshd:session): session closed for user core Sep 10 05:24:07.818396 systemd[1]: sshd@14-10.0.0.54:22-10.0.0.1:42476.service: Deactivated successfully. Sep 10 05:24:07.820446 systemd[1]: session-15.scope: Deactivated successfully. Sep 10 05:24:07.821466 systemd-logind[1570]: Session 15 logged out. Waiting for processes to exit. Sep 10 05:24:07.822704 systemd-logind[1570]: Removed session 15. Sep 10 05:24:12.666098 containerd[1588]: time="2025-09-10T05:24:12.666003827Z" level=info msg="TaskExit event in podsandbox handler container_id:\"761baf4aad47e7717d0b4cb1eeeb07c1fc4b04b2c9b40b81a8dede1f361e71aa\" id:\"c7aca7de31150c3dd7287a91febd7d77a1a29fdfe5c24593576e92c8f0f1c8c1\" pid:6016 exited_at:{seconds:1757481852 nanos:665649038}" Sep 10 05:24:12.829925 systemd[1]: Started sshd@15-10.0.0.54:22-10.0.0.1:45990.service - OpenSSH per-connection server daemon (10.0.0.1:45990). 
Sep 10 05:24:12.887087 sshd[6027]: Accepted publickey for core from 10.0.0.1 port 45990 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:24:12.888973 sshd-session[6027]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:24:12.894045 systemd-logind[1570]: New session 16 of user core. Sep 10 05:24:12.903389 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 10 05:24:13.023438 sshd[6030]: Connection closed by 10.0.0.1 port 45990 Sep 10 05:24:13.023817 sshd-session[6027]: pam_unix(sshd:session): session closed for user core Sep 10 05:24:13.029242 systemd[1]: sshd@15-10.0.0.54:22-10.0.0.1:45990.service: Deactivated successfully. Sep 10 05:24:13.031984 systemd[1]: session-16.scope: Deactivated successfully. Sep 10 05:24:13.033033 systemd-logind[1570]: Session 16 logged out. Waiting for processes to exit. Sep 10 05:24:13.035157 systemd-logind[1570]: Removed session 16. Sep 10 05:24:18.037450 systemd[1]: Started sshd@16-10.0.0.54:22-10.0.0.1:45996.service - OpenSSH per-connection server daemon (10.0.0.1:45996). Sep 10 05:24:18.082157 sshd[6044]: Accepted publickey for core from 10.0.0.1 port 45996 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:24:18.084214 sshd-session[6044]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:24:18.089080 systemd-logind[1570]: New session 17 of user core. Sep 10 05:24:18.096398 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 10 05:24:18.279888 sshd[6047]: Connection closed by 10.0.0.1 port 45996 Sep 10 05:24:18.280558 sshd-session[6044]: pam_unix(sshd:session): session closed for user core Sep 10 05:24:18.293824 systemd[1]: sshd@16-10.0.0.54:22-10.0.0.1:45996.service: Deactivated successfully. Sep 10 05:24:18.295900 systemd[1]: session-17.scope: Deactivated successfully. Sep 10 05:24:18.297675 systemd-logind[1570]: Session 17 logged out. Waiting for processes to exit. 
Sep 10 05:24:18.301534 systemd[1]: Started sshd@17-10.0.0.54:22-10.0.0.1:46006.service - OpenSSH per-connection server daemon (10.0.0.1:46006). Sep 10 05:24:18.302781 systemd-logind[1570]: Removed session 17. Sep 10 05:24:18.407662 sshd[6061]: Accepted publickey for core from 10.0.0.1 port 46006 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:24:18.412574 sshd-session[6061]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:24:18.418774 systemd-logind[1570]: New session 18 of user core. Sep 10 05:24:18.431466 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 10 05:24:19.584695 sshd[6064]: Connection closed by 10.0.0.1 port 46006 Sep 10 05:24:19.585172 sshd-session[6061]: pam_unix(sshd:session): session closed for user core Sep 10 05:24:19.599123 systemd[1]: sshd@17-10.0.0.54:22-10.0.0.1:46006.service: Deactivated successfully. Sep 10 05:24:19.601495 systemd[1]: session-18.scope: Deactivated successfully. Sep 10 05:24:19.602576 systemd-logind[1570]: Session 18 logged out. Waiting for processes to exit. Sep 10 05:24:19.605467 systemd-logind[1570]: Removed session 18. Sep 10 05:24:19.606972 systemd[1]: Started sshd@18-10.0.0.54:22-10.0.0.1:46016.service - OpenSSH per-connection server daemon (10.0.0.1:46016). Sep 10 05:24:19.666610 sshd[6077]: Accepted publickey for core from 10.0.0.1 port 46016 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:24:19.668413 sshd-session[6077]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:24:19.672798 systemd-logind[1570]: New session 19 of user core. Sep 10 05:24:19.686249 systemd[1]: Started session-19.scope - Session 19 of User core. 
Sep 10 05:24:20.423651 sshd[6080]: Connection closed by 10.0.0.1 port 46016 Sep 10 05:24:20.424048 sshd-session[6077]: pam_unix(sshd:session): session closed for user core Sep 10 05:24:20.435914 systemd[1]: sshd@18-10.0.0.54:22-10.0.0.1:46016.service: Deactivated successfully. Sep 10 05:24:20.440596 systemd[1]: session-19.scope: Deactivated successfully. Sep 10 05:24:20.442746 systemd-logind[1570]: Session 19 logged out. Waiting for processes to exit. Sep 10 05:24:20.447985 systemd[1]: Started sshd@19-10.0.0.54:22-10.0.0.1:33740.service - OpenSSH per-connection server daemon (10.0.0.1:33740). Sep 10 05:24:20.451222 systemd-logind[1570]: Removed session 19. Sep 10 05:24:20.507565 sshd[6098]: Accepted publickey for core from 10.0.0.1 port 33740 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU Sep 10 05:24:20.509452 sshd-session[6098]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 05:24:20.514278 systemd-logind[1570]: New session 20 of user core. Sep 10 05:24:20.523314 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 10 05:24:21.127265 sshd[6101]: Connection closed by 10.0.0.1 port 33740 Sep 10 05:24:21.128438 sshd-session[6098]: pam_unix(sshd:session): session closed for user core Sep 10 05:24:21.138335 systemd[1]: sshd@19-10.0.0.54:22-10.0.0.1:33740.service: Deactivated successfully. Sep 10 05:24:21.140630 systemd[1]: session-20.scope: Deactivated successfully. Sep 10 05:24:21.141971 systemd-logind[1570]: Session 20 logged out. Waiting for processes to exit. Sep 10 05:24:21.147019 systemd[1]: Started sshd@20-10.0.0.54:22-10.0.0.1:33750.service - OpenSSH per-connection server daemon (10.0.0.1:33750). Sep 10 05:24:21.149261 systemd-logind[1570]: Removed session 20. 
Sep 10 05:24:21.217633 sshd[6113]: Accepted publickey for core from 10.0.0.1 port 33750 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU
Sep 10 05:24:21.219180 sshd-session[6113]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 10 05:24:21.223922 systemd-logind[1570]: New session 21 of user core.
Sep 10 05:24:21.235281 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 10 05:24:21.350264 sshd[6116]: Connection closed by 10.0.0.1 port 33750
Sep 10 05:24:21.351359 sshd-session[6113]: pam_unix(sshd:session): session closed for user core
Sep 10 05:24:21.355679 systemd-logind[1570]: Session 21 logged out. Waiting for processes to exit.
Sep 10 05:24:21.356099 systemd[1]: sshd@20-10.0.0.54:22-10.0.0.1:33750.service: Deactivated successfully.
Sep 10 05:24:21.359162 systemd[1]: session-21.scope: Deactivated successfully.
Sep 10 05:24:21.361627 systemd-logind[1570]: Removed session 21.
Sep 10 05:24:23.732329 containerd[1588]: time="2025-09-10T05:24:23.732275315Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1a4d31960f52eaa67b2eba54ff94fd802c236984ddab6826f4be53f1da6695f5\" id:\"f6c614611914f748220c9b7c20c7550319b366f1c93c2b8ce75bba96681a05f6\" pid:6144 exited_at:{seconds:1757481863 nanos:723549813}"
Sep 10 05:24:23.916253 containerd[1588]: time="2025-09-10T05:24:23.916146961Z" level=info msg="StopContainer for \"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\" with timeout 30 (s)"
Sep 10 05:24:23.918669 containerd[1588]: time="2025-09-10T05:24:23.918628084Z" level=info msg="Stop container \"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\" with signal terminated"
Sep 10 05:24:23.937912 systemd[1]: cri-containerd-bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963.scope: Deactivated successfully.
Sep 10 05:24:23.938473 systemd[1]: cri-containerd-bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963.scope: Consumed 1.508s CPU time, 59.9M memory peak, 820K read from disk.
Sep 10 05:24:23.940233 containerd[1588]: time="2025-09-10T05:24:23.940193829Z" level=info msg="received exit event container_id:\"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\" id:\"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\" pid:5184 exit_status:1 exited_at:{seconds:1757481863 nanos:939628234}"
Sep 10 05:24:23.940944 containerd[1588]: time="2025-09-10T05:24:23.940918835Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\" id:\"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\" pid:5184 exit_status:1 exited_at:{seconds:1757481863 nanos:939628234}"
Sep 10 05:24:23.969062 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963-rootfs.mount: Deactivated successfully.
Sep 10 05:24:23.983554 containerd[1588]: time="2025-09-10T05:24:23.983474861Z" level=info msg="StopContainer for \"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\" returns successfully"
Sep 10 05:24:23.984185 containerd[1588]: time="2025-09-10T05:24:23.984151114Z" level=info msg="StopPodSandbox for \"e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30\""
Sep 10 05:24:23.984261 containerd[1588]: time="2025-09-10T05:24:23.984238569Z" level=info msg="Container to stop \"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Sep 10 05:24:23.991689 systemd[1]: cri-containerd-e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30.scope: Deactivated successfully.
Sep 10 05:24:23.994584 containerd[1588]: time="2025-09-10T05:24:23.994546462Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30\" id:\"e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30\" pid:4468 exit_status:137 exited_at:{seconds:1757481863 nanos:994152058}"
Sep 10 05:24:24.024602 containerd[1588]: time="2025-09-10T05:24:24.024558700Z" level=info msg="shim disconnected" id=e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30 namespace=k8s.io
Sep 10 05:24:24.024743 containerd[1588]: time="2025-09-10T05:24:24.024659470Z" level=warning msg="cleaning up after shim disconnected" id=e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30 namespace=k8s.io
Sep 10 05:24:24.024852 containerd[1588]: time="2025-09-10T05:24:24.024671383Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 10 05:24:24.027444 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30-rootfs.mount: Deactivated successfully.
Sep 10 05:24:24.050551 containerd[1588]: time="2025-09-10T05:24:24.048888218Z" level=info msg="received exit event sandbox_id:\"e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30\" exit_status:137 exited_at:{seconds:1757481863 nanos:994152058}"
Sep 10 05:24:24.052995 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30-shm.mount: Deactivated successfully.
Sep 10 05:24:24.133630 systemd-networkd[1495]: calie930cb6935d: Link DOWN
Sep 10 05:24:24.133640 systemd-networkd[1495]: calie930cb6935d: Lost carrier
Sep 10 05:24:24.211333 containerd[1588]: 2025-09-10 05:24:24.130 [INFO][6229] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30"
Sep 10 05:24:24.211333 containerd[1588]: 2025-09-10 05:24:24.131 [INFO][6229] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" iface="eth0" netns="/var/run/netns/cni-54328825-6436-a3d3-bce2-9c36629f4321"
Sep 10 05:24:24.211333 containerd[1588]: 2025-09-10 05:24:24.131 [INFO][6229] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" iface="eth0" netns="/var/run/netns/cni-54328825-6436-a3d3-bce2-9c36629f4321"
Sep 10 05:24:24.211333 containerd[1588]: 2025-09-10 05:24:24.147 [INFO][6229] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" after=16.701367ms iface="eth0" netns="/var/run/netns/cni-54328825-6436-a3d3-bce2-9c36629f4321"
Sep 10 05:24:24.211333 containerd[1588]: 2025-09-10 05:24:24.147 [INFO][6229] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30"
Sep 10 05:24:24.211333 containerd[1588]: 2025-09-10 05:24:24.147 [INFO][6229] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30"
Sep 10 05:24:24.211333 containerd[1588]: 2025-09-10 05:24:24.176 [INFO][6242] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" HandleID="k8s-pod-network.e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" Workload="localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0"
Sep 10 05:24:24.211333 containerd[1588]: 2025-09-10 05:24:24.176 [INFO][6242] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 10 05:24:24.211333 containerd[1588]: 2025-09-10 05:24:24.176 [INFO][6242] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 10 05:24:24.211333 containerd[1588]: 2025-09-10 05:24:24.202 [INFO][6242] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" HandleID="k8s-pod-network.e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" Workload="localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0"
Sep 10 05:24:24.211333 containerd[1588]: 2025-09-10 05:24:24.202 [INFO][6242] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" HandleID="k8s-pod-network.e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30" Workload="localhost-k8s-calico--apiserver--54575dc547--rbmsw-eth0"
Sep 10 05:24:24.211333 containerd[1588]: 2025-09-10 05:24:24.203 [INFO][6242] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 10 05:24:24.211333 containerd[1588]: 2025-09-10 05:24:24.208 [INFO][6229] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30"
Sep 10 05:24:24.211799 containerd[1588]: time="2025-09-10T05:24:24.211666879Z" level=info msg="TearDown network for sandbox \"e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30\" successfully"
Sep 10 05:24:24.211799 containerd[1588]: time="2025-09-10T05:24:24.211754213Z" level=info msg="StopPodSandbox for \"e6e370ecb667d9e83f2306b31cb7da2f5afd23066cc9644d97388e61a6a00a30\" returns successfully"
Sep 10 05:24:24.214456 systemd[1]: run-netns-cni\x2d54328825\x2d6436\x2da3d3\x2dbce2\x2d9c36629f4321.mount: Deactivated successfully.
Sep 10 05:24:24.357031 kubelet[2750]: I0910 05:24:24.356415 2750 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7s8j\" (UniqueName: \"kubernetes.io/projected/c8a6cabe-8b7b-46a3-b357-1c595cd08f45-kube-api-access-r7s8j\") pod \"c8a6cabe-8b7b-46a3-b357-1c595cd08f45\" (UID: \"c8a6cabe-8b7b-46a3-b357-1c595cd08f45\") "
Sep 10 05:24:24.357031 kubelet[2750]: I0910 05:24:24.356466 2750 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c8a6cabe-8b7b-46a3-b357-1c595cd08f45-calico-apiserver-certs\") pod \"c8a6cabe-8b7b-46a3-b357-1c595cd08f45\" (UID: \"c8a6cabe-8b7b-46a3-b357-1c595cd08f45\") "
Sep 10 05:24:24.362673 kubelet[2750]: I0910 05:24:24.362631 2750 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8a6cabe-8b7b-46a3-b357-1c595cd08f45-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "c8a6cabe-8b7b-46a3-b357-1c595cd08f45" (UID: "c8a6cabe-8b7b-46a3-b357-1c595cd08f45"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Sep 10 05:24:24.363029 kubelet[2750]: I0910 05:24:24.362966 2750 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a6cabe-8b7b-46a3-b357-1c595cd08f45-kube-api-access-r7s8j" (OuterVolumeSpecName: "kube-api-access-r7s8j") pod "c8a6cabe-8b7b-46a3-b357-1c595cd08f45" (UID: "c8a6cabe-8b7b-46a3-b357-1c595cd08f45"). InnerVolumeSpecName "kube-api-access-r7s8j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Sep 10 05:24:24.365232 systemd[1]: var-lib-kubelet-pods-c8a6cabe\x2d8b7b\x2d46a3\x2db357\x2d1c595cd08f45-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dr7s8j.mount: Deactivated successfully.
Sep 10 05:24:24.365341 systemd[1]: var-lib-kubelet-pods-c8a6cabe\x2d8b7b\x2d46a3\x2db357\x2d1c595cd08f45-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully.
Sep 10 05:24:24.457050 kubelet[2750]: I0910 05:24:24.456974 2750 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c8a6cabe-8b7b-46a3-b357-1c595cd08f45-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\""
Sep 10 05:24:24.457050 kubelet[2750]: I0910 05:24:24.457026 2750 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r7s8j\" (UniqueName: \"kubernetes.io/projected/c8a6cabe-8b7b-46a3-b357-1c595cd08f45-kube-api-access-r7s8j\") on node \"localhost\" DevicePath \"\""
Sep 10 05:24:24.476776 systemd[1]: Removed slice kubepods-besteffort-podc8a6cabe_8b7b_46a3_b357_1c595cd08f45.slice - libcontainer container kubepods-besteffort-podc8a6cabe_8b7b_46a3_b357_1c595cd08f45.slice.
Sep 10 05:24:24.476882 systemd[1]: kubepods-besteffort-podc8a6cabe_8b7b_46a3_b357_1c595cd08f45.slice: Consumed 1.535s CPU time, 60.2M memory peak, 820K read from disk.
Sep 10 05:24:24.935330 kubelet[2750]: I0910 05:24:24.935119 2750 scope.go:117] "RemoveContainer" containerID="bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963"
Sep 10 05:24:24.938743 containerd[1588]: time="2025-09-10T05:24:24.938691364Z" level=info msg="RemoveContainer for \"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\""
Sep 10 05:24:24.945630 containerd[1588]: time="2025-09-10T05:24:24.945534817Z" level=info msg="RemoveContainer for \"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\" returns successfully"
Sep 10 05:24:24.946037 containerd[1588]: time="2025-09-10T05:24:24.946007859Z" level=error msg="ContainerStatus for \"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\": not found"
Sep 10 05:24:24.946296 kubelet[2750]: I0910 05:24:24.945791 2750 scope.go:117] "RemoveContainer" containerID="bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963"
Sep 10 05:24:24.946296 kubelet[2750]: E0910 05:24:24.946226 2750 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\": not found" containerID="bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963"
Sep 10 05:24:24.946296 kubelet[2750]: I0910 05:24:24.946252 2750 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963"} err="failed to get container status \"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\": rpc error: code = NotFound desc = an error occurred when try to find container \"bc7fa45a6c558f79cf759715acea34f6f1ae3e8aba9e690e76847e8d79b0e963\": not found"
Sep 10 05:24:25.938707 containerd[1588]: time="2025-09-10T05:24:25.938653123Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1a4d31960f52eaa67b2eba54ff94fd802c236984ddab6826f4be53f1da6695f5\" id:\"a56687c68e6c200bf1ea938bdd868009ba06050a11a9a0389d6bfb222834fc27\" pid:6269 exited_at:{seconds:1757481865 nanos:938394895}"
Sep 10 05:24:26.368960 systemd[1]: Started sshd@21-10.0.0.54:22-10.0.0.1:33764.service - OpenSSH per-connection server daemon (10.0.0.1:33764).
Sep 10 05:24:26.413654 sshd[6284]: Accepted publickey for core from 10.0.0.1 port 33764 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU
Sep 10 05:24:26.415117 sshd-session[6284]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 10 05:24:26.420030 systemd-logind[1570]: New session 22 of user core.
Sep 10 05:24:26.431398 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 10 05:24:26.470719 kubelet[2750]: I0910 05:24:26.470663 2750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a6cabe-8b7b-46a3-b357-1c595cd08f45" path="/var/lib/kubelet/pods/c8a6cabe-8b7b-46a3-b357-1c595cd08f45/volumes"
Sep 10 05:24:26.541148 sshd[6287]: Connection closed by 10.0.0.1 port 33764
Sep 10 05:24:26.541478 sshd-session[6284]: pam_unix(sshd:session): session closed for user core
Sep 10 05:24:26.546238 systemd[1]: sshd@21-10.0.0.54:22-10.0.0.1:33764.service: Deactivated successfully.
Sep 10 05:24:26.548430 systemd[1]: session-22.scope: Deactivated successfully.
Sep 10 05:24:26.549199 systemd-logind[1570]: Session 22 logged out. Waiting for processes to exit.
Sep 10 05:24:26.550624 systemd-logind[1570]: Removed session 22.
Sep 10 05:24:31.558400 systemd[1]: Started sshd@22-10.0.0.54:22-10.0.0.1:51416.service - OpenSSH per-connection server daemon (10.0.0.1:51416).
Sep 10 05:24:31.617530 sshd[6310]: Accepted publickey for core from 10.0.0.1 port 51416 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU
Sep 10 05:24:31.619278 sshd-session[6310]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 10 05:24:31.624063 systemd-logind[1570]: New session 23 of user core.
Sep 10 05:24:31.640292 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 10 05:24:31.772815 sshd[6313]: Connection closed by 10.0.0.1 port 51416
Sep 10 05:24:31.773201 sshd-session[6310]: pam_unix(sshd:session): session closed for user core
Sep 10 05:24:31.776782 systemd[1]: sshd@22-10.0.0.54:22-10.0.0.1:51416.service: Deactivated successfully.
Sep 10 05:24:31.779022 systemd[1]: session-23.scope: Deactivated successfully.
Sep 10 05:24:31.781565 systemd-logind[1570]: Session 23 logged out. Waiting for processes to exit.
Sep 10 05:24:31.782697 systemd-logind[1570]: Removed session 23.
Sep 10 05:24:36.660078 containerd[1588]: time="2025-09-10T05:24:36.660021439Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ab63bdf76fdcbfab7c60d2d2110dc4443c580e425fc361a2bb4da72b6a894f3c\" id:\"8d9a809d9e6922a8347016622b245ee0bed98ecdfa47467cbaf040ea90b54843\" pid:6340 exited_at:{seconds:1757481876 nanos:659645904}"
Sep 10 05:24:36.789959 systemd[1]: Started sshd@23-10.0.0.54:22-10.0.0.1:51420.service - OpenSSH per-connection server daemon (10.0.0.1:51420).
Sep 10 05:24:36.855091 sshd[6354]: Accepted publickey for core from 10.0.0.1 port 51420 ssh2: RSA SHA256:qpDargCduG3u5EHDPY1E75xJ4U0856bd7w1a8ggs8mU
Sep 10 05:24:36.857097 sshd-session[6354]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 10 05:24:36.862011 systemd-logind[1570]: New session 24 of user core.
Sep 10 05:24:36.874412 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 10 05:24:36.999937 sshd[6357]: Connection closed by 10.0.0.1 port 51420
Sep 10 05:24:37.000457 sshd-session[6354]: pam_unix(sshd:session): session closed for user core
Sep 10 05:24:37.005330 systemd[1]: sshd@23-10.0.0.54:22-10.0.0.1:51420.service: Deactivated successfully.
Sep 10 05:24:37.007803 systemd[1]: session-24.scope: Deactivated successfully.
Sep 10 05:24:37.008632 systemd-logind[1570]: Session 24 logged out. Waiting for processes to exit.
Sep 10 05:24:37.009986 systemd-logind[1570]: Removed session 24.