Sep 4 04:18:28.884034 kernel: Linux version 6.12.44-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Sep 4 02:15:54 -00 2025
Sep 4 04:18:28.884070 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=d1884c9a158af3462973a912ddb17d2a643da411fd9cba6f05e0fc855c1b0a44
Sep 4 04:18:28.884080 kernel: BIOS-provided physical RAM map:
Sep 4 04:18:28.884088 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 4 04:18:28.884095 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 4 04:18:28.884103 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 4 04:18:28.884112 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000009cfdbfff] usable
Sep 4 04:18:28.884129 kernel: BIOS-e820: [mem 0x000000009cfdc000-0x000000009cffffff] reserved
Sep 4 04:18:28.884144 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Sep 4 04:18:28.884158 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Sep 4 04:18:28.884166 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 4 04:18:28.884174 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 4 04:18:28.884182 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Sep 4 04:18:28.884189 kernel: NX (Execute Disable) protection: active
Sep 4 04:18:28.884202 kernel: APIC: Static calls initialized
Sep 4 04:18:28.884216 kernel: SMBIOS 2.8 present.
Sep 4 04:18:28.884242 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 1.16.2-debian-1.16.2-1 04/01/2014
Sep 4 04:18:28.884258 kernel: DMI: Memory slots populated: 1/1
Sep 4 04:18:28.884267 kernel: Hypervisor detected: KVM
Sep 4 04:18:28.884275 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 4 04:18:28.884284 kernel: kvm-clock: using sched offset of 5295860258 cycles
Sep 4 04:18:28.884295 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 4 04:18:28.884306 kernel: tsc: Detected 2794.750 MHz processor
Sep 4 04:18:28.884320 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 4 04:18:28.884331 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 4 04:18:28.884342 kernel: last_pfn = 0x9cfdc max_arch_pfn = 0x400000000
Sep 4 04:18:28.884353 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 4 04:18:28.884370 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 4 04:18:28.884381 kernel: Using GB pages for direct mapping
Sep 4 04:18:28.884392 kernel: ACPI: Early table checksum verification disabled
Sep 4 04:18:28.884402 kernel: ACPI: RSDP 0x00000000000F59D0 000014 (v00 BOCHS )
Sep 4 04:18:28.884413 kernel: ACPI: RSDT 0x000000009CFE241A 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 04:18:28.884427 kernel: ACPI: FACP 0x000000009CFE21FA 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 04:18:28.884437 kernel: ACPI: DSDT 0x000000009CFE0040 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 04:18:28.884445 kernel: ACPI: FACS 0x000000009CFE0000 000040
Sep 4 04:18:28.884454 kernel: ACPI: APIC 0x000000009CFE22EE 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 04:18:28.884462 kernel: ACPI: HPET 0x000000009CFE237E 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 04:18:28.884470 kernel: ACPI: MCFG 0x000000009CFE23B6 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 04:18:28.884479 kernel: ACPI: WAET 0x000000009CFE23F2 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 04:18:28.884488 kernel: ACPI: Reserving FACP table memory at [mem 0x9cfe21fa-0x9cfe22ed]
Sep 4 04:18:28.884502 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cfe0040-0x9cfe21f9]
Sep 4 04:18:28.884516 kernel: ACPI: Reserving FACS table memory at [mem 0x9cfe0000-0x9cfe003f]
Sep 4 04:18:28.884525 kernel: ACPI: Reserving APIC table memory at [mem 0x9cfe22ee-0x9cfe237d]
Sep 4 04:18:28.884534 kernel: ACPI: Reserving HPET table memory at [mem 0x9cfe237e-0x9cfe23b5]
Sep 4 04:18:28.884551 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cfe23b6-0x9cfe23f1]
Sep 4 04:18:28.884566 kernel: ACPI: Reserving WAET table memory at [mem 0x9cfe23f2-0x9cfe2419]
Sep 4 04:18:28.884592 kernel: No NUMA configuration found
Sep 4 04:18:28.884607 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cfdbfff]
Sep 4 04:18:28.884616 kernel: NODE_DATA(0) allocated [mem 0x9cfd4dc0-0x9cfdbfff]
Sep 4 04:18:28.884625 kernel: Zone ranges:
Sep 4 04:18:28.884634 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 4 04:18:28.884646 kernel: DMA32 [mem 0x0000000001000000-0x000000009cfdbfff]
Sep 4 04:18:28.884657 kernel: Normal empty
Sep 4 04:18:28.884674 kernel: Device empty
Sep 4 04:18:28.884688 kernel: Movable zone start for each node
Sep 4 04:18:28.884700 kernel: Early memory node ranges
Sep 4 04:18:28.884709 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Sep 4 04:18:28.884718 kernel: node 0: [mem 0x0000000000100000-0x000000009cfdbfff]
Sep 4 04:18:28.884729 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cfdbfff]
Sep 4 04:18:28.884743 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 4 04:18:28.884751 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 4 04:18:28.884760 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Sep 4 04:18:28.884792 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 4 04:18:28.884804 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 4 04:18:28.884813 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 4 04:18:28.884826 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 4 04:18:28.884835 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 4 04:18:28.884846 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 4 04:18:28.884854 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 4 04:18:28.884863 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 4 04:18:28.884872 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 4 04:18:28.884881 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 4 04:18:28.884890 kernel: TSC deadline timer available
Sep 4 04:18:28.884898 kernel: CPU topo: Max. logical packages: 1
Sep 4 04:18:28.884910 kernel: CPU topo: Max. logical dies: 1
Sep 4 04:18:28.884918 kernel: CPU topo: Max. dies per package: 1
Sep 4 04:18:28.884933 kernel: CPU topo: Max. threads per core: 1
Sep 4 04:18:28.884942 kernel: CPU topo: Num. cores per package: 4
Sep 4 04:18:28.884960 kernel: CPU topo: Num. threads per package: 4
Sep 4 04:18:28.884970 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
Sep 4 04:18:28.884978 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 4 04:18:28.884993 kernel: kvm-guest: KVM setup pv remote TLB flush
Sep 4 04:18:28.885004 kernel: kvm-guest: setup PV sched yield
Sep 4 04:18:28.885016 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Sep 4 04:18:28.885025 kernel: Booting paravirtualized kernel on KVM
Sep 4 04:18:28.885034 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 4 04:18:28.885043 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Sep 4 04:18:28.885052 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288
Sep 4 04:18:28.885061 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152
Sep 4 04:18:28.885070 kernel: pcpu-alloc: [0] 0 1 2 3
Sep 4 04:18:28.885078 kernel: kvm-guest: PV spinlocks enabled
Sep 4 04:18:28.885087 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Sep 4 04:18:28.885100 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=d1884c9a158af3462973a912ddb17d2a643da411fd9cba6f05e0fc855c1b0a44
Sep 4 04:18:28.885109 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 4 04:18:28.885118 kernel: random: crng init done
Sep 4 04:18:28.885141 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 4 04:18:28.885159 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 4 04:18:28.885170 kernel: Fallback order for Node 0: 0
Sep 4 04:18:28.885182 kernel: Built 1 zonelists, mobility grouping on. Total pages: 642938
Sep 4 04:18:28.885193 kernel: Policy zone: DMA32
Sep 4 04:18:28.885205 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 4 04:18:28.885213 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Sep 4 04:18:28.885223 kernel: ftrace: allocating 40102 entries in 157 pages
Sep 4 04:18:28.885232 kernel: ftrace: allocated 157 pages with 5 groups
Sep 4 04:18:28.885246 kernel: Dynamic Preempt: voluntary
Sep 4 04:18:28.885255 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 4 04:18:28.885265 kernel: rcu: RCU event tracing is enabled.
Sep 4 04:18:28.885274 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Sep 4 04:18:28.885283 kernel: Trampoline variant of Tasks RCU enabled.
Sep 4 04:18:28.885295 kernel: Rude variant of Tasks RCU enabled.
Sep 4 04:18:28.885307 kernel: Tracing variant of Tasks RCU enabled.
Sep 4 04:18:28.885321 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 4 04:18:28.885331 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Sep 4 04:18:28.885339 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 4 04:18:28.885348 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 4 04:18:28.885358 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 4 04:18:28.885378 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
Sep 4 04:18:28.885390 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 4 04:18:28.885414 kernel: Console: colour VGA+ 80x25
Sep 4 04:18:28.885426 kernel: printk: legacy console [ttyS0] enabled
Sep 4 04:18:28.885437 kernel: ACPI: Core revision 20240827
Sep 4 04:18:28.885452 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Sep 4 04:18:28.885463 kernel: APIC: Switch to symmetric I/O mode setup
Sep 4 04:18:28.885475 kernel: x2apic enabled
Sep 4 04:18:28.885486 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 4 04:18:28.885501 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Sep 4 04:18:28.885513 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Sep 4 04:18:28.885527 kernel: kvm-guest: setup PV IPIs
Sep 4 04:18:28.885539 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 4 04:18:28.885551 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848e100549, max_idle_ns: 440795215505 ns
Sep 4 04:18:28.885572 kernel: Calibrating delay loop (skipped) preset value.. 5589.50 BogoMIPS (lpj=2794750)
Sep 4 04:18:28.885591 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 4 04:18:28.885605 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 4 04:18:28.885615 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 4 04:18:28.885625 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 4 04:18:28.885637 kernel: Spectre V2 : Mitigation: Retpolines
Sep 4 04:18:28.885646 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 4 04:18:28.885658 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 4 04:18:28.885667 kernel: active return thunk: retbleed_return_thunk
Sep 4 04:18:28.885677 kernel: RETBleed: Mitigation: untrained return thunk
Sep 4 04:18:28.885686 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 4 04:18:28.885696 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 4 04:18:28.885705 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Sep 4 04:18:28.885715 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Sep 4 04:18:28.885727 kernel: active return thunk: srso_return_thunk
Sep 4 04:18:28.885736 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Sep 4 04:18:28.885745 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 4 04:18:28.885754 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 4 04:18:28.885764 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 4 04:18:28.885804 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 4 04:18:28.885813 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 4 04:18:28.885822 kernel: Freeing SMP alternatives memory: 32K
Sep 4 04:18:28.885834 kernel: pid_max: default: 32768 minimum: 301
Sep 4 04:18:28.885844 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 4 04:18:28.885853 kernel: landlock: Up and running.
Sep 4 04:18:28.885862 kernel: SELinux: Initializing.
Sep 4 04:18:28.885879 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 4 04:18:28.885888 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 4 04:18:28.885898 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 4 04:18:28.885907 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 4 04:18:28.885916 kernel: ... version:                0
Sep 4 04:18:28.885930 kernel: ... bit width:              48
Sep 4 04:18:28.885940 kernel: ... generic registers:      6
Sep 4 04:18:28.885949 kernel: ... value mask:             0000ffffffffffff
Sep 4 04:18:28.885958 kernel: ... max period:             00007fffffffffff
Sep 4 04:18:28.885968 kernel: ... fixed-purpose events:   0
Sep 4 04:18:28.885977 kernel: ... event mask:             000000000000003f
Sep 4 04:18:28.885986 kernel: signal: max sigframe size: 1776
Sep 4 04:18:28.885995 kernel: rcu: Hierarchical SRCU implementation.
Sep 4 04:18:28.886004 kernel: rcu: Max phase no-delay instances is 400.
Sep 4 04:18:28.886016 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 4 04:18:28.886025 kernel: smp: Bringing up secondary CPUs ...
Sep 4 04:18:28.886034 kernel: smpboot: x86: Booting SMP configuration:
Sep 4 04:18:28.886044 kernel: .... node #0, CPUs: #1 #2 #3
Sep 4 04:18:28.886053 kernel: smp: Brought up 1 node, 4 CPUs
Sep 4 04:18:28.886062 kernel: smpboot: Total of 4 processors activated (22358.00 BogoMIPS)
Sep 4 04:18:28.886073 kernel: Memory: 2426868K/2571752K available (14336K kernel code, 2428K rwdata, 9988K rodata, 57768K init, 1248K bss, 138952K reserved, 0K cma-reserved)
Sep 4 04:18:28.886084 kernel: devtmpfs: initialized
Sep 4 04:18:28.886100 kernel: x86/mm: Memory block size: 128MB
Sep 4 04:18:28.886115 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 4 04:18:28.886125 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Sep 4 04:18:28.886135 kernel: pinctrl core: initialized pinctrl subsystem
Sep 4 04:18:28.886150 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 4 04:18:28.886172 kernel: audit: initializing netlink subsys (disabled)
Sep 4 04:18:28.886192 kernel: audit: type=2000 audit(1756959505.621:1): state=initialized audit_enabled=0 res=1
Sep 4 04:18:28.886211 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 4 04:18:28.886220 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 4 04:18:28.886228 kernel: cpuidle: using governor menu
Sep 4 04:18:28.886240 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 4 04:18:28.886248 kernel: dca service started, version 1.12.1
Sep 4 04:18:28.886259 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Sep 4 04:18:28.886278 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Sep 4 04:18:28.886298 kernel: PCI: Using configuration type 1 for base access
Sep 4 04:18:28.886315 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 4 04:18:28.886324 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 4 04:18:28.886333 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 4 04:18:28.886343 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 4 04:18:28.886366 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 4 04:18:28.886386 kernel: ACPI: Added _OSI(Module Device)
Sep 4 04:18:28.886396 kernel: ACPI: Added _OSI(Processor Device)
Sep 4 04:18:28.886419 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 4 04:18:28.886434 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 4 04:18:28.886448 kernel: ACPI: Interpreter enabled
Sep 4 04:18:28.886456 kernel: ACPI: PM: (supports S0 S3 S5)
Sep 4 04:18:28.886463 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 4 04:18:28.886483 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 4 04:18:28.886505 kernel: PCI: Using E820 reservations for host bridge windows
Sep 4 04:18:28.886519 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 4 04:18:28.886527 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 4 04:18:28.887096 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 4 04:18:28.887285 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Sep 4 04:18:28.887494 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Sep 4 04:18:28.887519 kernel: PCI host bridge to bus 0000:00
Sep 4 04:18:28.887802 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 4 04:18:28.887949 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 4 04:18:28.888063 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 4 04:18:28.888195 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xafffffff window]
Sep 4 04:18:28.888622 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 4 04:18:28.888810 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Sep 4 04:18:28.888969 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 4 04:18:28.889238 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Sep 4 04:18:28.889433 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint
Sep 4 04:18:28.889623 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfd000000-0xfdffffff pref]
Sep 4 04:18:28.889804 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfebd0000-0xfebd0fff]
Sep 4 04:18:28.889962 kernel: pci 0000:00:01.0: ROM [mem 0xfebc0000-0xfebcffff pref]
Sep 4 04:18:28.890126 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 4 04:18:28.890310 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 4 04:18:28.890496 kernel: pci 0000:00:02.0: BAR 0 [io 0xc0c0-0xc0df]
Sep 4 04:18:28.890650 kernel: pci 0000:00:02.0: BAR 1 [mem 0xfebd1000-0xfebd1fff]
Sep 4 04:18:28.890898 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfe000000-0xfe003fff 64bit pref]
Sep 4 04:18:28.891079 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Sep 4 04:18:28.891235 kernel: pci 0000:00:03.0: BAR 0 [io 0xc000-0xc07f]
Sep 4 04:18:28.891390 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfebd2000-0xfebd2fff]
Sep 4 04:18:28.891592 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe004000-0xfe007fff 64bit pref]
Sep 4 04:18:28.891811 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 4 04:18:28.891984 kernel: pci 0000:00:04.0: BAR 0 [io 0xc0e0-0xc0ff]
Sep 4 04:18:28.892160 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfebd3000-0xfebd3fff]
Sep 4 04:18:28.892347 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe008000-0xfe00bfff 64bit pref]
Sep 4 04:18:28.892525 kernel: pci 0000:00:04.0: ROM [mem 0xfeb80000-0xfebbffff pref]
Sep 4 04:18:28.892744 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Sep 4 04:18:28.892951 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 4 04:18:28.893127 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Sep 4 04:18:28.893258 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc100-0xc11f]
Sep 4 04:18:28.893420 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfebd4000-0xfebd4fff]
Sep 4 04:18:28.893622 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Sep 4 04:18:28.893796 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Sep 4 04:18:28.893814 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 4 04:18:28.893831 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 4 04:18:28.893842 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 4 04:18:28.893853 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 4 04:18:28.893863 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 4 04:18:28.893873 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 4 04:18:28.893884 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 4 04:18:28.893894 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 4 04:18:28.893904 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 4 04:18:28.893915 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 4 04:18:28.893928 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 4 04:18:28.893938 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 4 04:18:28.893948 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 4 04:18:28.893959 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 4 04:18:28.893969 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 4 04:18:28.893979 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 4 04:18:28.893989 kernel: iommu: Default domain type: Translated
Sep 4 04:18:28.894010 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 4 04:18:28.894024 kernel: PCI: Using ACPI for IRQ routing
Sep 4 04:18:28.894034 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 4 04:18:28.894044 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 4 04:18:28.894201 kernel: e820: reserve RAM buffer [mem 0x9cfdc000-0x9fffffff]
Sep 4 04:18:28.894338 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 4 04:18:28.894495 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 4 04:18:28.894509 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 4 04:18:28.894518 kernel: vgaarb: loaded
Sep 4 04:18:28.894530 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Sep 4 04:18:28.894538 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Sep 4 04:18:28.894547 kernel: clocksource: Switched to clocksource kvm-clock
Sep 4 04:18:28.894556 kernel: VFS: Disk quotas dquot_6.6.0
Sep 4 04:18:28.894564 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 4 04:18:28.894738 kernel: pnp: PnP ACPI init
Sep 4 04:18:28.894755 kernel: system 00:05: [mem 0xb0000000-0xbfffffff window] has been reserved
Sep 4 04:18:28.894797 kernel: pnp: PnP ACPI: found 6 devices
Sep 4 04:18:28.894815 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 4 04:18:28.894826 kernel: NET: Registered PF_INET protocol family
Sep 4 04:18:28.894837 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 4 04:18:28.894848 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 4 04:18:28.894859 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 4 04:18:28.894870 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 4 04:18:28.894881 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 4 04:18:28.894891 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 4 04:18:28.894901 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 4 04:18:28.894912 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 4 04:18:28.894921 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 4 04:18:28.895084 kernel: NET: Registered PF_XDP protocol family
Sep 4 04:18:28.895253 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 4 04:18:28.895468 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 4 04:18:28.895663 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 4 04:18:28.895871 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xafffffff window]
Sep 4 04:18:28.896038 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Sep 4 04:18:28.896076 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Sep 4 04:18:28.896093 kernel: PCI: CLS 0 bytes, default 64
Sep 4 04:18:28.896104 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848e100549, max_idle_ns: 440795215505 ns
Sep 4 04:18:28.896115 kernel: Initialise system trusted keyrings
Sep 4 04:18:28.896128 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 4 04:18:28.896139 kernel: Key type asymmetric registered
Sep 4 04:18:28.896150 kernel: Asymmetric key parser 'x509' registered
Sep 4 04:18:28.896160 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 4 04:18:28.896170 kernel: io scheduler mq-deadline registered
Sep 4 04:18:28.896185 kernel: io scheduler kyber registered
Sep 4 04:18:28.896196 kernel: io scheduler bfq registered
Sep 4 04:18:28.896208 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 4 04:18:28.896218 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 4 04:18:28.896229 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Sep 4 04:18:28.896240 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Sep 4 04:18:28.896251 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 4 04:18:28.896262 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 4 04:18:28.896273 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 4 04:18:28.896287 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 4 04:18:28.896543 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 4 04:18:28.896568 kernel: rtc_cmos 00:04: RTC can wake from S4
Sep 4 04:18:28.896728 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Sep 4 04:18:28.896920 kernel: rtc_cmos 00:04: registered as rtc0
Sep 4 04:18:28.897070 kernel: rtc_cmos 00:04: setting system clock to 2025-09-04T04:18:28 UTC (1756959508)
Sep 4 04:18:28.897085 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram, hpet irqs
Sep 4 04:18:28.897096 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Sep 4 04:18:28.897113 kernel: NET: Registered PF_INET6 protocol family
Sep 4 04:18:28.897124 kernel: Segment Routing with IPv6
Sep 4 04:18:28.897135 kernel: In-situ OAM (IOAM) with IPv6
Sep 4 04:18:28.897145 kernel: NET: Registered PF_PACKET protocol family
Sep 4 04:18:28.897154 kernel: Key type dns_resolver registered
Sep 4 04:18:28.897162 kernel: IPI shorthand broadcast: enabled
Sep 4 04:18:28.897170 kernel: sched_clock: Marking stable (3485004878, 123631059)->(3641574512, -32938575)
Sep 4 04:18:28.897179 kernel: registered taskstats version 1
Sep 4 04:18:28.897187 kernel: Loading compiled-in X.509 certificates
Sep 4 04:18:28.897198 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.44-flatcar: 2c6c093c583f207375cbe16db1a23ce651c8380d'
Sep 4 04:18:28.897206 kernel: Demotion targets for Node 0: null
Sep 4 04:18:28.897224 kernel: Key type .fscrypt registered
Sep 4 04:18:28.897239 kernel: Key type fscrypt-provisioning registered
Sep 4 04:18:28.897253 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 4 04:18:28.897263 kernel: ima: Allocated hash algorithm: sha1
Sep 4 04:18:28.897271 kernel: ima: No architecture policies found
Sep 4 04:18:28.897279 kernel: clk: Disabling unused clocks
Sep 4 04:18:28.897288 kernel: Warning: unable to open an initial console.
Sep 4 04:18:28.897299 kernel: Freeing unused kernel image (initmem) memory: 57768K
Sep 4 04:18:28.897308 kernel: Write protecting the kernel read-only data: 24576k
Sep 4 04:18:28.897316 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K
Sep 4 04:18:28.897325 kernel: Run /init as init process
Sep 4 04:18:28.897333 kernel:   with arguments:
Sep 4 04:18:28.897343 kernel:     /init
Sep 4 04:18:28.897353 kernel:   with environment:
Sep 4 04:18:28.897362 kernel:     HOME=/
Sep 4 04:18:28.897372 kernel:     TERM=linux
Sep 4 04:18:28.897387 kernel:     BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 4 04:18:28.897415 systemd[1]: Successfully made /usr/ read-only.
Sep 4 04:18:28.897430 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 4 04:18:28.897441 systemd[1]: Detected virtualization kvm.
Sep 4 04:18:28.897451 systemd[1]: Detected architecture x86-64.
Sep 4 04:18:28.897463 systemd[1]: Running in initrd.
Sep 4 04:18:28.897472 systemd[1]: No hostname configured, using default hostname.
Sep 4 04:18:28.897483 systemd[1]: Hostname set to .
Sep 4 04:18:28.897492 systemd[1]: Initializing machine ID from VM UUID.
Sep 4 04:18:28.897500 systemd[1]: Queued start job for default target initrd.target.
Sep 4 04:18:28.897509 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 4 04:18:28.897519 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 4 04:18:28.897533 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 4 04:18:28.897552 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 4 04:18:28.897562 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 4 04:18:28.897582 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 4 04:18:28.897595 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 4 04:18:28.897607 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 4 04:18:28.897619 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 4 04:18:28.897630 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 4 04:18:28.897655 systemd[1]: Reached target paths.target - Path Units.
Sep 4 04:18:28.897669 systemd[1]: Reached target slices.target - Slice Units.
Sep 4 04:18:28.897683 systemd[1]: Reached target swap.target - Swaps.
Sep 4 04:18:28.897692 systemd[1]: Reached target timers.target - Timer Units.
Sep 4 04:18:28.897701 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 4 04:18:28.897713 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 4 04:18:28.897725 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 4 04:18:28.897737 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 4 04:18:28.897753 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 4 04:18:28.897764 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 4 04:18:28.897764 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 4 04:18:28.897817 systemd[1]: Reached target sockets.target - Socket Units. Sep 4 04:18:28.897830 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 4 04:18:28.898202 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 4 04:18:28.898222 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 4 04:18:28.898237 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 4 04:18:28.898249 systemd[1]: Starting systemd-fsck-usr.service... Sep 4 04:18:28.898261 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 4 04:18:28.898273 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 4 04:18:28.898285 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 04:18:28.898296 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 4 04:18:28.898311 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 04:18:28.898324 systemd[1]: Finished systemd-fsck-usr.service. Sep 4 04:18:28.898336 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 4 04:18:28.898453 systemd-journald[218]: Collecting audit messages is disabled. Sep 4 04:18:28.898499 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 4 04:18:28.898562 systemd-journald[218]: Journal started Sep 4 04:18:28.898605 systemd-journald[218]: Runtime Journal (/run/log/journal/53050d42310f4dea96543924f57657a3) is 6M, max 48.6M, 42.5M free. 
Sep 4 04:18:28.881325 systemd-modules-load[220]: Inserted module 'overlay' Sep 4 04:18:28.902836 systemd[1]: Started systemd-journald.service - Journal Service. Sep 4 04:18:28.912840 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 4 04:18:28.915178 systemd-modules-load[220]: Inserted module 'br_netfilter' Sep 4 04:18:28.948069 kernel: Bridge firewalling registered Sep 4 04:18:28.948890 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 4 04:18:28.952134 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 04:18:28.960815 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 4 04:18:28.966134 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 4 04:18:28.977828 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 4 04:18:29.002855 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 4 04:18:29.009283 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 4 04:18:29.019041 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 4 04:18:29.022453 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 4 04:18:29.024848 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 4 04:18:29.026978 systemd-tmpfiles[245]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 4 04:18:29.038954 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 4 04:18:29.042747 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Sep 4 04:18:29.058168 dracut-cmdline[261]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=d1884c9a158af3462973a912ddb17d2a643da411fd9cba6f05e0fc855c1b0a44 Sep 4 04:18:29.101750 systemd-resolved[264]: Positive Trust Anchors: Sep 4 04:18:29.101790 systemd-resolved[264]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 4 04:18:29.101821 systemd-resolved[264]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 4 04:18:29.104814 systemd-resolved[264]: Defaulting to hostname 'linux'. Sep 4 04:18:29.106096 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 4 04:18:29.121493 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 4 04:18:29.219823 kernel: SCSI subsystem initialized Sep 4 04:18:29.229834 kernel: Loading iSCSI transport class v2.0-870. Sep 4 04:18:29.240830 kernel: iscsi: registered transport (tcp) Sep 4 04:18:29.273847 kernel: iscsi: registered transport (qla4xxx) Sep 4 04:18:29.273954 kernel: QLogic iSCSI HBA Driver Sep 4 04:18:29.295726 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... 
Sep 4 04:18:29.331162 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 4 04:18:29.332903 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 4 04:18:29.407496 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 4 04:18:29.409426 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 4 04:18:29.486829 kernel: raid6: avx2x4 gen() 21600 MB/s Sep 4 04:18:29.503809 kernel: raid6: avx2x2 gen() 29015 MB/s Sep 4 04:18:29.521019 kernel: raid6: avx2x1 gen() 25032 MB/s Sep 4 04:18:29.521042 kernel: raid6: using algorithm avx2x2 gen() 29015 MB/s Sep 4 04:18:29.539299 kernel: raid6: .... xor() 13612 MB/s, rmw enabled Sep 4 04:18:29.539327 kernel: raid6: using avx2x2 recovery algorithm Sep 4 04:18:29.560806 kernel: xor: automatically using best checksumming function avx Sep 4 04:18:29.776952 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 4 04:18:29.787892 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 4 04:18:29.791819 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 4 04:18:29.827437 systemd-udevd[472]: Using default interface naming scheme 'v255'. Sep 4 04:18:29.833410 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 4 04:18:29.834558 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 4 04:18:29.867672 dracut-pre-trigger[475]: rd.md=0: removing MD RAID activation Sep 4 04:18:29.905820 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 4 04:18:29.910799 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 4 04:18:30.012792 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 04:18:30.017740 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
Sep 4 04:18:30.071824 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Sep 4 04:18:30.079611 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Sep 4 04:18:30.083855 kernel: cryptd: max_cpu_qlen set to 1000 Sep 4 04:18:30.091856 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Sep 4 04:18:30.091907 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 4 04:18:30.095440 kernel: GPT:9289727 != 19775487 Sep 4 04:18:30.095509 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 4 04:18:30.095521 kernel: GPT:9289727 != 19775487 Sep 4 04:18:30.095531 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 4 04:18:30.095542 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 4 04:18:30.112973 kernel: libata version 3.00 loaded. Sep 4 04:18:30.115840 kernel: AES CTR mode by8 optimization enabled Sep 4 04:18:30.129200 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 4 04:18:30.129392 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 04:18:30.147360 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 04:18:30.154449 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 04:18:30.159311 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
Sep 4 04:18:30.166963 kernel: ahci 0000:00:1f.2: version 3.0 Sep 4 04:18:30.167251 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 4 04:18:30.172814 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 4 04:18:30.173080 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 4 04:18:30.173277 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 4 04:18:30.177799 kernel: scsi host0: ahci Sep 4 04:18:30.178326 kernel: scsi host1: ahci Sep 4 04:18:30.179800 kernel: scsi host2: ahci Sep 4 04:18:30.181916 kernel: scsi host3: ahci Sep 4 04:18:30.182132 kernel: scsi host4: ahci Sep 4 04:18:30.182436 kernel: scsi host5: ahci Sep 4 04:18:30.187797 kernel: ata1: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4100 irq 34 lpm-pol 1 Sep 4 04:18:30.187847 kernel: ata2: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4180 irq 34 lpm-pol 1 Sep 4 04:18:30.187861 kernel: ata3: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4200 irq 34 lpm-pol 1 Sep 4 04:18:30.187873 kernel: ata4: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4280 irq 34 lpm-pol 1 Sep 4 04:18:30.187893 kernel: ata5: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4300 irq 34 lpm-pol 1 Sep 4 04:18:30.189475 kernel: ata6: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4380 irq 34 lpm-pol 1 Sep 4 04:18:30.196090 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 4 04:18:30.227996 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 04:18:30.242987 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 4 04:18:30.263278 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 4 04:18:30.282031 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. 
Sep 4 04:18:30.283469 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 4 04:18:30.289092 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 4 04:18:30.331523 disk-uuid[634]: Primary Header is updated. Sep 4 04:18:30.331523 disk-uuid[634]: Secondary Entries is updated. Sep 4 04:18:30.331523 disk-uuid[634]: Secondary Header is updated. Sep 4 04:18:30.335810 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 4 04:18:30.340818 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 4 04:18:30.502114 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 4 04:18:30.502192 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 4 04:18:30.502203 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 4 04:18:30.503817 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 4 04:18:30.503900 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Sep 4 04:18:30.504805 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 4 04:18:30.505810 kernel: ata3.00: LPM support broken, forcing max_power Sep 4 04:18:30.507249 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Sep 4 04:18:30.507272 kernel: ata3.00: applying bridge limits Sep 4 04:18:30.508026 kernel: ata3.00: LPM support broken, forcing max_power Sep 4 04:18:30.508044 kernel: ata3.00: configured for UDMA/100 Sep 4 04:18:30.510808 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Sep 4 04:18:30.554826 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Sep 4 04:18:30.555187 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 4 04:18:30.568815 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Sep 4 04:18:30.930054 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 4 04:18:30.933811 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Sep 4 04:18:30.937197 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 4 04:18:30.940166 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 4 04:18:30.944194 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 4 04:18:30.974800 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 4 04:18:31.342817 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 4 04:18:31.343338 disk-uuid[635]: The operation has completed successfully. Sep 4 04:18:31.379376 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 4 04:18:31.379515 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 4 04:18:31.414510 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 4 04:18:31.446100 sh[664]: Success Sep 4 04:18:31.465140 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 4 04:18:31.465208 kernel: device-mapper: uevent: version 1.0.3 Sep 4 04:18:31.466323 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 4 04:18:31.475788 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Sep 4 04:18:31.507752 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 4 04:18:31.511699 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 4 04:18:31.524813 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Sep 4 04:18:31.530861 kernel: BTRFS: device fsid c26d2db4-0109-42a5-bc6f-bbb834b82868 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (676) Sep 4 04:18:31.530905 kernel: BTRFS info (device dm-0): first mount of filesystem c26d2db4-0109-42a5-bc6f-bbb834b82868 Sep 4 04:18:31.532799 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 4 04:18:31.537299 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 4 04:18:31.537324 kernel: BTRFS info (device dm-0): enabling free space tree Sep 4 04:18:31.538585 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 4 04:18:31.541103 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 4 04:18:31.543831 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 4 04:18:31.547181 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 4 04:18:31.576199 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 4 04:18:31.611853 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (709) Sep 4 04:18:31.613964 kernel: BTRFS info (device vda6): first mount of filesystem 1535a26e-7205-4f17-83f6-e5f828340771 Sep 4 04:18:31.614032 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 4 04:18:31.617805 kernel: BTRFS info (device vda6): turning on async discard Sep 4 04:18:31.617838 kernel: BTRFS info (device vda6): enabling free space tree Sep 4 04:18:31.623855 kernel: BTRFS info (device vda6): last unmount of filesystem 1535a26e-7205-4f17-83f6-e5f828340771 Sep 4 04:18:31.625808 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 4 04:18:31.628877 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Sep 4 04:18:31.805013 ignition[762]: Ignition 2.22.0 Sep 4 04:18:31.805034 ignition[762]: Stage: fetch-offline Sep 4 04:18:31.805112 ignition[762]: no configs at "/usr/lib/ignition/base.d" Sep 4 04:18:31.805128 ignition[762]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 4 04:18:31.805244 ignition[762]: parsed url from cmdline: "" Sep 4 04:18:31.805250 ignition[762]: no config URL provided Sep 4 04:18:31.805257 ignition[762]: reading system config file "/usr/lib/ignition/user.ign" Sep 4 04:18:31.805269 ignition[762]: no config at "/usr/lib/ignition/user.ign" Sep 4 04:18:31.805328 ignition[762]: op(1): [started] loading QEMU firmware config module Sep 4 04:18:31.805335 ignition[762]: op(1): executing: "modprobe" "qemu_fw_cfg" Sep 4 04:18:31.821247 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 4 04:18:31.842036 ignition[762]: op(1): [finished] loading QEMU firmware config module Sep 4 04:18:31.848965 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 4 04:18:31.919089 ignition[762]: parsing config with SHA512: 1f5b31774c6316cafca9890b5a38682df1c159a9fa464ff534f36474e29533dd85a841ae3159d12b654c4c49f51e807031a6e0f9a584c596929172f48622f119 Sep 4 04:18:31.923309 unknown[762]: fetched base config from "system" Sep 4 04:18:31.923322 unknown[762]: fetched user config from "qemu" Sep 4 04:18:31.923746 ignition[762]: fetch-offline: fetch-offline passed Sep 4 04:18:31.926531 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 4 04:18:31.923827 ignition[762]: Ignition finished successfully Sep 4 04:18:31.931123 systemd-networkd[856]: lo: Link UP Sep 4 04:18:31.931128 systemd-networkd[856]: lo: Gained carrier Sep 4 04:18:31.933148 systemd-networkd[856]: Enumeration completed Sep 4 04:18:31.933305 systemd[1]: Started systemd-networkd.service - Network Configuration. 
Sep 4 04:18:31.933583 systemd-networkd[856]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 4 04:18:31.933589 systemd-networkd[856]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 4 04:18:31.946948 systemd-networkd[856]: eth0: Link UP Sep 4 04:18:31.947123 systemd-networkd[856]: eth0: Gained carrier Sep 4 04:18:31.947133 systemd-networkd[856]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 4 04:18:31.949389 systemd[1]: Reached target network.target - Network. Sep 4 04:18:31.953005 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 4 04:18:31.955535 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 4 04:18:32.000957 systemd-networkd[856]: eth0: DHCPv4 address 10.0.0.54/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 4 04:18:32.029211 ignition[860]: Ignition 2.22.0 Sep 4 04:18:32.029227 ignition[860]: Stage: kargs Sep 4 04:18:32.029366 ignition[860]: no configs at "/usr/lib/ignition/base.d" Sep 4 04:18:32.029376 ignition[860]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 4 04:18:32.030134 ignition[860]: kargs: kargs passed Sep 4 04:18:32.030184 ignition[860]: Ignition finished successfully Sep 4 04:18:32.048429 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 4 04:18:32.050838 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Sep 4 04:18:32.088958 ignition[869]: Ignition 2.22.0 Sep 4 04:18:32.088974 ignition[869]: Stage: disks Sep 4 04:18:32.089136 ignition[869]: no configs at "/usr/lib/ignition/base.d" Sep 4 04:18:32.089147 ignition[869]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 4 04:18:32.090258 ignition[869]: disks: disks passed Sep 4 04:18:32.090320 ignition[869]: Ignition finished successfully Sep 4 04:18:32.117334 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 4 04:18:32.118642 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 4 04:18:32.120523 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 4 04:18:32.122732 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 4 04:18:32.124953 systemd[1]: Reached target sysinit.target - System Initialization. Sep 4 04:18:32.126852 systemd[1]: Reached target basic.target - Basic System. Sep 4 04:18:32.129718 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 4 04:18:32.175899 systemd-fsck[879]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 4 04:18:32.315949 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 4 04:18:32.319507 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 4 04:18:32.444826 kernel: EXT4-fs (vda9): mounted filesystem d147a273-ffc0-4c78-a5f1-46a3b3f6b4ff r/w with ordered data mode. Quota mode: none. Sep 4 04:18:32.445473 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 4 04:18:32.447077 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 4 04:18:32.449661 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 4 04:18:32.450684 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 4 04:18:32.451936 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. 
Sep 4 04:18:32.452003 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 4 04:18:32.452038 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 4 04:18:32.469849 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 4 04:18:32.473681 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 4 04:18:32.479401 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (887) Sep 4 04:18:32.479430 kernel: BTRFS info (device vda6): first mount of filesystem 1535a26e-7205-4f17-83f6-e5f828340771 Sep 4 04:18:32.479442 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 4 04:18:32.481170 kernel: BTRFS info (device vda6): turning on async discard Sep 4 04:18:32.481196 kernel: BTRFS info (device vda6): enabling free space tree Sep 4 04:18:32.483911 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 4 04:18:32.517586 initrd-setup-root[911]: cut: /sysroot/etc/passwd: No such file or directory Sep 4 04:18:32.523598 initrd-setup-root[918]: cut: /sysroot/etc/group: No such file or directory Sep 4 04:18:32.531574 initrd-setup-root[925]: cut: /sysroot/etc/shadow: No such file or directory Sep 4 04:18:32.536899 initrd-setup-root[932]: cut: /sysroot/etc/gshadow: No such file or directory Sep 4 04:18:32.652369 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 4 04:18:32.656053 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 4 04:18:32.657271 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 4 04:18:32.684284 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 4 04:18:32.685586 kernel: BTRFS info (device vda6): last unmount of filesystem 1535a26e-7205-4f17-83f6-e5f828340771 Sep 4 04:18:32.713005 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Sep 4 04:18:32.751162 ignition[1001]: INFO : Ignition 2.22.0 Sep 4 04:18:32.751162 ignition[1001]: INFO : Stage: mount Sep 4 04:18:32.753717 ignition[1001]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 04:18:32.753717 ignition[1001]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 4 04:18:32.756792 ignition[1001]: INFO : mount: mount passed Sep 4 04:18:32.757599 ignition[1001]: INFO : Ignition finished successfully Sep 4 04:18:32.761275 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 4 04:18:32.763909 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 4 04:18:32.794264 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 4 04:18:32.828806 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1013) Sep 4 04:18:32.831256 kernel: BTRFS info (device vda6): first mount of filesystem 1535a26e-7205-4f17-83f6-e5f828340771 Sep 4 04:18:32.831290 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 4 04:18:32.834881 kernel: BTRFS info (device vda6): turning on async discard Sep 4 04:18:32.834912 kernel: BTRFS info (device vda6): enabling free space tree Sep 4 04:18:32.836922 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 4 04:18:32.891240 ignition[1030]: INFO : Ignition 2.22.0 Sep 4 04:18:32.891240 ignition[1030]: INFO : Stage: files Sep 4 04:18:32.893341 ignition[1030]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 04:18:32.893341 ignition[1030]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 4 04:18:32.896888 ignition[1030]: DEBUG : files: compiled without relabeling support, skipping Sep 4 04:18:32.899147 ignition[1030]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 4 04:18:32.899147 ignition[1030]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 4 04:18:32.904398 ignition[1030]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 4 04:18:32.906205 ignition[1030]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 4 04:18:32.908250 unknown[1030]: wrote ssh authorized keys file for user: core Sep 4 04:18:32.909595 ignition[1030]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 4 04:18:32.912410 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 4 04:18:32.914808 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Sep 4 04:18:33.006729 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 4 04:18:33.168478 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 4 04:18:33.168478 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 4 04:18:33.173765 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 4 
04:18:33.173765 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 4 04:18:33.173765 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 4 04:18:33.173765 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 4 04:18:33.173765 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 4 04:18:33.173765 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 4 04:18:33.173765 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 4 04:18:33.189850 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 4 04:18:33.189850 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 4 04:18:33.189850 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 4 04:18:33.189850 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 4 04:18:33.189850 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 4 04:18:33.189850 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Sep 4 04:18:33.577929 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 4 04:18:33.991951 systemd-networkd[856]: eth0: Gained IPv6LL Sep 4 04:18:34.791238 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 4 04:18:34.791238 ignition[1030]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 4 04:18:34.800263 ignition[1030]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 4 04:18:34.822910 ignition[1030]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 4 04:18:34.822910 ignition[1030]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 4 04:18:34.822910 ignition[1030]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 4 04:18:34.822910 ignition[1030]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 4 04:18:34.822910 ignition[1030]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 4 04:18:34.822910 ignition[1030]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 4 04:18:34.822910 ignition[1030]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Sep 4 04:18:34.930225 ignition[1030]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Sep 4 04:18:34.940719 ignition[1030]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Sep 4 04:18:34.940719 
ignition[1030]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Sep 4 04:18:34.940719 ignition[1030]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Sep 4 04:18:34.940719 ignition[1030]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Sep 4 04:18:34.940719 ignition[1030]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 4 04:18:34.940719 ignition[1030]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 4 04:18:34.940719 ignition[1030]: INFO : files: files passed Sep 4 04:18:34.940719 ignition[1030]: INFO : Ignition finished successfully Sep 4 04:18:34.951122 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 4 04:18:34.956874 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 4 04:18:34.961997 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 4 04:18:34.993668 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 4 04:18:34.993940 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 4 04:18:35.003912 initrd-setup-root-after-ignition[1059]: grep: /sysroot/oem/oem-release: No such file or directory Sep 4 04:18:35.010306 initrd-setup-root-after-ignition[1061]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 4 04:18:35.010306 initrd-setup-root-after-ignition[1061]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 4 04:18:35.017464 initrd-setup-root-after-ignition[1065]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 4 04:18:35.023333 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. 
Sep 4 04:18:35.025856 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 4 04:18:35.033824 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 4 04:18:35.116269 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 4 04:18:35.116524 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 4 04:18:35.119176 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 4 04:18:35.120297 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 4 04:18:35.120682 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 4 04:18:35.121685 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 4 04:18:35.160356 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 4 04:18:35.165724 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 4 04:18:35.196829 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 4 04:18:35.199966 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 4 04:18:35.200294 systemd[1]: Stopped target timers.target - Timer Units.
Sep 4 04:18:35.200669 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 4 04:18:35.200848 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 4 04:18:35.214523 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 4 04:18:35.216865 systemd[1]: Stopped target basic.target - Basic System.
Sep 4 04:18:35.218983 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 4 04:18:35.221584 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 4 04:18:35.224306 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 4 04:18:35.226906 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 4 04:18:35.229532 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 4 04:18:35.231072 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 4 04:18:35.231680 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 4 04:18:35.232227 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 4 04:18:35.232580 systemd[1]: Stopped target swap.target - Swaps.
Sep 4 04:18:35.233177 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 4 04:18:35.233365 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 4 04:18:35.243344 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 4 04:18:35.243750 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 4 04:18:35.259388 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 4 04:18:35.259529 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 4 04:18:35.261765 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 4 04:18:35.261943 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 4 04:18:35.264708 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 4 04:18:35.264883 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 4 04:18:35.265297 systemd[1]: Stopped target paths.target - Path Units.
Sep 4 04:18:35.268873 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 4 04:18:35.273865 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 4 04:18:35.275251 systemd[1]: Stopped target slices.target - Slice Units.
Sep 4 04:18:35.277588 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 4 04:18:35.279400 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 4 04:18:35.279494 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 4 04:18:35.281255 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 4 04:18:35.281341 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 4 04:18:35.283152 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 4 04:18:35.283270 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 4 04:18:35.285315 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 4 04:18:35.285421 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 4 04:18:35.288289 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 4 04:18:35.291984 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 4 04:18:35.295070 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 4 04:18:35.297288 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 4 04:18:35.298528 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 4 04:18:35.298738 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 4 04:18:35.306961 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 4 04:18:35.307115 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 4 04:18:35.335095 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 4 04:18:35.346909 ignition[1085]: INFO : Ignition 2.22.0
Sep 4 04:18:35.346909 ignition[1085]: INFO : Stage: umount
Sep 4 04:18:35.348751 ignition[1085]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 4 04:18:35.348751 ignition[1085]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 4 04:18:35.348751 ignition[1085]: INFO : umount: umount passed
Sep 4 04:18:35.348751 ignition[1085]: INFO : Ignition finished successfully
Sep 4 04:18:35.352526 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 4 04:18:35.352678 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 4 04:18:35.359889 systemd[1]: Stopped target network.target - Network.
Sep 4 04:18:35.361284 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 4 04:18:35.361352 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 4 04:18:35.363451 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 4 04:18:35.363525 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 4 04:18:35.365581 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 4 04:18:35.365663 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 4 04:18:35.366607 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 4 04:18:35.366674 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 4 04:18:35.367265 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 4 04:18:35.371256 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 4 04:18:35.380490 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 4 04:18:35.380650 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 4 04:18:35.384584 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 4 04:18:35.384864 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 4 04:18:35.384981 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 4 04:18:35.389303 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 4 04:18:35.390170 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 4 04:18:35.390905 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 4 04:18:35.390960 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 4 04:18:35.394469 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 4 04:18:35.395456 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 4 04:18:35.395521 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 4 04:18:35.396018 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 4 04:18:35.396070 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 4 04:18:35.400981 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 4 04:18:35.401054 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 4 04:18:35.402655 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 4 04:18:35.402706 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 4 04:18:35.407812 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 4 04:18:35.412533 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 4 04:18:35.412644 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 4 04:18:35.442954 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 4 04:18:35.443179 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 4 04:18:35.445836 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 4 04:18:35.445896 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 4 04:18:35.446987 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 4 04:18:35.447034 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 4 04:18:35.449997 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 4 04:18:35.450065 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 4 04:18:35.454229 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 4 04:18:35.454300 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 4 04:18:35.455579 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 4 04:18:35.455667 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 4 04:18:35.457471 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 4 04:18:35.461242 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 4 04:18:35.461413 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 4 04:18:35.489606 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 4 04:18:35.489712 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 4 04:18:35.493445 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 4 04:18:35.493506 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 4 04:18:35.501524 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 4 04:18:35.501595 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 4 04:18:35.503945 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 4 04:18:35.504011 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 4 04:18:35.507895 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 4 04:18:35.507974 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Sep 4 04:18:35.508035 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 4 04:18:35.508100 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 4 04:18:35.508538 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 4 04:18:35.508692 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 4 04:18:35.511029 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 4 04:18:35.511170 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 4 04:18:35.560384 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 4 04:18:35.560565 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 4 04:18:35.561934 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 4 04:18:35.563851 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 4 04:18:35.563928 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 4 04:18:35.567626 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 4 04:18:35.599888 systemd[1]: Switching root.
Sep 4 04:18:35.643569 systemd-journald[218]: Journal stopped
Sep 4 04:18:37.262950 systemd-journald[218]: Received SIGTERM from PID 1 (systemd).
Sep 4 04:18:37.263048 kernel: SELinux: policy capability network_peer_controls=1
Sep 4 04:18:37.263069 kernel: SELinux: policy capability open_perms=1
Sep 4 04:18:37.263085 kernel: SELinux: policy capability extended_socket_class=1
Sep 4 04:18:37.263101 kernel: SELinux: policy capability always_check_network=0
Sep 4 04:18:37.263117 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 4 04:18:37.263136 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 4 04:18:37.263163 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 4 04:18:37.263190 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 4 04:18:37.263206 kernel: SELinux: policy capability userspace_initial_context=0
Sep 4 04:18:37.263223 kernel: audit: type=1403 audit(1756959516.110:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 4 04:18:37.263240 systemd[1]: Successfully loaded SELinux policy in 74.744ms.
Sep 4 04:18:37.263278 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.241ms.
Sep 4 04:18:37.263302 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 4 04:18:37.263320 systemd[1]: Detected virtualization kvm.
Sep 4 04:18:37.263344 systemd[1]: Detected architecture x86-64.
Sep 4 04:18:37.263361 systemd[1]: Detected first boot.
Sep 4 04:18:37.263378 systemd[1]: Initializing machine ID from VM UUID.
Sep 4 04:18:37.263395 zram_generator::config[1130]: No configuration found.
Sep 4 04:18:37.263413 kernel: Guest personality initialized and is inactive
Sep 4 04:18:37.263429 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Sep 4 04:18:37.263444 kernel: Initialized host personality
Sep 4 04:18:37.263460 kernel: NET: Registered PF_VSOCK protocol family
Sep 4 04:18:37.263477 systemd[1]: Populated /etc with preset unit settings.
Sep 4 04:18:37.263499 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 4 04:18:37.263516 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 4 04:18:37.263532 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 4 04:18:37.263550 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 4 04:18:37.263568 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 4 04:18:37.263600 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 4 04:18:37.263625 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 4 04:18:37.263643 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 4 04:18:37.263661 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 4 04:18:37.263687 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 4 04:18:37.263705 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 4 04:18:37.263721 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 4 04:18:37.263738 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 4 04:18:37.263756 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 4 04:18:37.263799 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 4 04:18:37.263819 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 4 04:18:37.263836 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 4 04:18:37.263863 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 4 04:18:37.263880 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 4 04:18:37.263897 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 4 04:18:37.263914 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 4 04:18:37.263930 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 4 04:18:37.263948 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 4 04:18:37.263964 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 4 04:18:37.263982 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 4 04:18:37.264007 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 4 04:18:37.264026 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 4 04:18:37.264043 systemd[1]: Reached target slices.target - Slice Units.
Sep 4 04:18:37.264059 systemd[1]: Reached target swap.target - Swaps.
Sep 4 04:18:37.264076 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 4 04:18:37.264094 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 4 04:18:37.264111 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 4 04:18:37.264128 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 4 04:18:37.264145 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 4 04:18:37.264171 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 4 04:18:37.264189 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 4 04:18:37.264209 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 4 04:18:37.264228 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 4 04:18:37.264248 systemd[1]: Mounting media.mount - External Media Directory...
Sep 4 04:18:37.264266 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 04:18:37.264283 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 4 04:18:37.264300 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 4 04:18:37.264317 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 4 04:18:37.264340 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 4 04:18:37.264358 systemd[1]: Reached target machines.target - Containers.
Sep 4 04:18:37.264375 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 4 04:18:37.264392 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 4 04:18:37.264411 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 4 04:18:37.264427 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 4 04:18:37.264445 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 4 04:18:37.264462 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 4 04:18:37.264487 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 4 04:18:37.264505 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 4 04:18:37.264522 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 4 04:18:37.264539 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 4 04:18:37.264557 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 4 04:18:37.264584 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 4 04:18:37.264601 kernel: fuse: init (API version 7.41)
Sep 4 04:18:37.264619 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 4 04:18:37.264636 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 4 04:18:37.264658 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 4 04:18:37.264677 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 4 04:18:37.264693 kernel: loop: module loaded
Sep 4 04:18:37.264710 kernel: ACPI: bus type drm_connector registered
Sep 4 04:18:37.264728 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 4 04:18:37.264745 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 4 04:18:37.264762 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 4 04:18:37.264816 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 4 04:18:37.264838 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 4 04:18:37.264864 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 4 04:18:37.264883 systemd[1]: Stopped verity-setup.service.
Sep 4 04:18:37.264901 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 04:18:37.264925 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 4 04:18:37.264942 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 4 04:18:37.264959 systemd[1]: Mounted media.mount - External Media Directory.
Sep 4 04:18:37.265021 systemd-journald[1194]: Collecting audit messages is disabled.
Sep 4 04:18:37.265055 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 4 04:18:37.265082 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 4 04:18:37.265106 systemd-journald[1194]: Journal started
Sep 4 04:18:37.265138 systemd-journald[1194]: Runtime Journal (/run/log/journal/53050d42310f4dea96543924f57657a3) is 6M, max 48.6M, 42.5M free.
Sep 4 04:18:36.789289 systemd[1]: Queued start job for default target multi-user.target.
Sep 4 04:18:36.815386 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Sep 4 04:18:36.815971 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 4 04:18:37.287581 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 4 04:18:37.290001 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 4 04:18:37.303948 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 4 04:18:37.305829 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 4 04:18:37.306283 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 4 04:18:37.308162 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 4 04:18:37.308415 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 4 04:18:37.324285 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 4 04:18:37.324717 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 4 04:18:37.326708 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 4 04:18:37.327007 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 4 04:18:37.329026 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 4 04:18:37.330855 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 4 04:18:37.331150 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 4 04:18:37.332834 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 4 04:18:37.333101 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 4 04:18:37.334831 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 4 04:18:37.336475 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 4 04:18:37.338355 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 4 04:18:37.340229 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 4 04:18:37.365112 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 4 04:18:37.368627 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 4 04:18:37.372488 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 4 04:18:37.373894 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 4 04:18:37.373931 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 4 04:18:37.377024 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 4 04:18:37.393931 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 4 04:18:37.395703 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 4 04:18:37.398470 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 4 04:18:37.404020 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 4 04:18:37.405461 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 4 04:18:37.410188 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 4 04:18:37.412065 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 4 04:18:37.413548 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 4 04:18:37.416707 systemd-journald[1194]: Time spent on flushing to /var/log/journal/53050d42310f4dea96543924f57657a3 is 16.956ms for 984 entries.
Sep 4 04:18:37.416707 systemd-journald[1194]: System Journal (/var/log/journal/53050d42310f4dea96543924f57657a3) is 8M, max 195.6M, 187.6M free.
Sep 4 04:18:37.704452 systemd-journald[1194]: Received client request to flush runtime journal.
Sep 4 04:18:37.704519 kernel: loop0: detected capacity change from 0 to 110984
Sep 4 04:18:37.704546 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 4 04:18:37.704572 kernel: loop1: detected capacity change from 0 to 128016
Sep 4 04:18:37.420015 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 4 04:18:37.426123 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 4 04:18:37.430054 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 4 04:18:37.438673 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 4 04:18:37.440153 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 4 04:18:37.485350 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 4 04:18:37.490243 systemd-tmpfiles[1250]: ACLs are not supported, ignoring.
Sep 4 04:18:37.490256 systemd-tmpfiles[1250]: ACLs are not supported, ignoring.
Sep 4 04:18:37.496124 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 4 04:18:37.510417 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 4 04:18:37.512069 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 4 04:18:37.517312 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 4 04:18:37.523054 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 4 04:18:37.706145 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 4 04:18:37.708822 kernel: loop2: detected capacity change from 0 to 224512
Sep 4 04:18:37.776007 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 4 04:18:37.777836 kernel: loop3: detected capacity change from 0 to 110984
Sep 4 04:18:37.780917 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 4 04:18:37.815983 systemd-tmpfiles[1272]: ACLs are not supported, ignoring.
Sep 4 04:18:37.816001 systemd-tmpfiles[1272]: ACLs are not supported, ignoring.
Sep 4 04:18:37.820697 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 4 04:18:37.832825 kernel: loop4: detected capacity change from 0 to 128016
Sep 4 04:18:37.853813 kernel: loop5: detected capacity change from 0 to 224512
Sep 4 04:18:37.865805 (sd-merge)[1270]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
Sep 4 04:18:37.866513 (sd-merge)[1270]: Merged extensions into '/usr'.
Sep 4 04:18:37.895584 systemd[1]: Reload requested from client PID 1249 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 4 04:18:37.895608 systemd[1]: Reloading...
Sep 4 04:18:37.994747 zram_generator::config[1298]: No configuration found.
Sep 4 04:18:38.177674 ldconfig[1244]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 4 04:18:38.277183 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 4 04:18:38.277553 systemd[1]: Reloading finished in 381 ms.
Sep 4 04:18:38.304104 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 4 04:18:38.308643 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 4 04:18:38.310482 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 4 04:18:38.342053 systemd[1]: Starting ensure-sysext.service...
Sep 4 04:18:38.349906 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 4 04:18:38.380439 systemd-tmpfiles[1340]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 4 04:18:38.380495 systemd-tmpfiles[1340]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 4 04:18:38.380887 systemd-tmpfiles[1340]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 4 04:18:38.381212 systemd-tmpfiles[1340]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 4 04:18:38.382418 systemd-tmpfiles[1340]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 4 04:18:38.382859 systemd-tmpfiles[1340]: ACLs are not supported, ignoring.
Sep 4 04:18:38.382974 systemd-tmpfiles[1340]: ACLs are not supported, ignoring.
Sep 4 04:18:38.388757 systemd[1]: Reload requested from client PID 1339 ('systemctl') (unit ensure-sysext.service)... Sep 4 04:18:38.388794 systemd[1]: Reloading... Sep 4 04:18:38.410039 systemd-tmpfiles[1340]: Detected autofs mount point /boot during canonicalization of boot. Sep 4 04:18:38.410062 systemd-tmpfiles[1340]: Skipping /boot Sep 4 04:18:38.423812 systemd-tmpfiles[1340]: Detected autofs mount point /boot during canonicalization of boot. Sep 4 04:18:38.423837 systemd-tmpfiles[1340]: Skipping /boot Sep 4 04:18:38.506064 zram_generator::config[1367]: No configuration found. Sep 4 04:18:38.713394 systemd[1]: Reloading finished in 324 ms. Sep 4 04:18:38.739467 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 4 04:18:38.771180 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 4 04:18:38.810341 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 4 04:18:38.813845 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 4 04:18:38.819901 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 4 04:18:38.827706 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 4 04:18:38.833323 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 04:18:38.834266 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 4 04:18:38.836046 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 4 04:18:38.846697 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 4 04:18:38.849850 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Sep 4 04:18:38.851317 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 4 04:18:38.851472 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 4 04:18:38.864562 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 4 04:18:38.870046 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 04:18:38.872962 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 4 04:18:38.873185 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 4 04:18:38.875471 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 4 04:18:38.875826 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 4 04:18:38.878368 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 4 04:18:38.878688 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 4 04:18:38.888109 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 4 04:18:38.903919 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 4 04:18:38.913648 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 4 04:18:38.922936 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 04:18:38.923231 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 4 04:18:38.925070 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
Sep 4 04:18:38.927752 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 4 04:18:38.932015 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 4 04:18:38.940150 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 4 04:18:38.941794 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 4 04:18:38.941951 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 4 04:18:38.945018 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 4 04:18:38.949132 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 4 04:18:38.950618 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 04:18:38.954176 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 4 04:18:38.956226 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 4 04:18:38.957177 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 4 04:18:38.962108 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 4 04:18:38.963364 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 4 04:18:38.965488 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 4 04:18:38.970244 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 4 04:18:38.975377 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 4 04:18:38.975657 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. 
Sep 4 04:18:38.977749 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 4 04:18:38.978833 augenrules[1450]: No rules Sep 4 04:18:38.980934 systemd[1]: audit-rules.service: Deactivated successfully. Sep 4 04:18:38.981299 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 4 04:18:38.990791 systemd[1]: Finished ensure-sysext.service. Sep 4 04:18:38.997990 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 4 04:18:38.998079 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 4 04:18:39.000930 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 4 04:18:39.004374 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 4 04:18:39.007129 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 4 04:18:39.017854 systemd-udevd[1440]: Using default interface naming scheme 'v255'. Sep 4 04:18:39.040887 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 4 04:18:39.051932 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 4 04:18:39.068557 systemd-resolved[1410]: Positive Trust Anchors: Sep 4 04:18:39.068580 systemd-resolved[1410]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 4 04:18:39.068623 systemd-resolved[1410]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 4 04:18:39.081372 systemd-resolved[1410]: Defaulting to hostname 'linux'. Sep 4 04:18:39.084326 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 4 04:18:39.085860 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 4 04:18:39.132626 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 4 04:18:39.134350 systemd[1]: Reached target sysinit.target - System Initialization. Sep 4 04:18:39.135670 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 4 04:18:39.137924 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 4 04:18:39.139660 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 4 04:18:39.141275 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 4 04:18:39.142959 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 4 04:18:39.142992 systemd[1]: Reached target paths.target - Path Units. Sep 4 04:18:39.144421 systemd[1]: Reached target time-set.target - System Time Set. 
Sep 4 04:18:39.146289 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 4 04:18:39.148005 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 4 04:18:39.149588 systemd[1]: Reached target timers.target - Timer Units. Sep 4 04:18:39.152548 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 4 04:18:39.155748 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 4 04:18:39.164978 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 4 04:18:39.166794 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 4 04:18:39.168430 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 4 04:18:39.180043 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 4 04:18:39.181890 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 4 04:18:39.184162 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 4 04:18:39.189015 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 4 04:18:39.195806 kernel: mousedev: PS/2 mouse device common for all mice Sep 4 04:18:39.195667 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 4 04:18:39.211682 systemd[1]: Reached target sockets.target - Socket Units. Sep 4 04:18:39.213011 systemd[1]: Reached target basic.target - Basic System. Sep 4 04:18:39.214317 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 4 04:18:39.214347 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 4 04:18:39.216462 systemd[1]: Starting dbus.service - D-Bus System Message Bus... 
Sep 4 04:18:39.228796 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 4 04:18:39.220956 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 4 04:18:39.225633 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 4 04:18:39.229603 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 4 04:18:39.230890 systemd-networkd[1477]: lo: Link UP Sep 4 04:18:39.230899 systemd-networkd[1477]: lo: Gained carrier Sep 4 04:18:39.231111 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 4 04:18:39.234022 systemd-networkd[1477]: Enumeration completed Sep 4 04:18:39.234532 systemd-networkd[1477]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 4 04:18:39.234538 systemd-networkd[1477]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 4 04:18:39.235689 systemd-networkd[1477]: eth0: Link UP Sep 4 04:18:39.236117 systemd-networkd[1477]: eth0: Gained carrier Sep 4 04:18:39.236143 systemd-networkd[1477]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 4 04:18:39.240839 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 4 04:18:39.255877 systemd-networkd[1477]: eth0: DHCPv4 address 10.0.0.54/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 4 04:18:39.257811 systemd-timesyncd[1462]: Network configuration changed, trying to establish connection. Sep 4 04:18:39.258301 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 4 04:18:39.892730 systemd-timesyncd[1462]: Contacted time server 10.0.0.1:123 (10.0.0.1). 
Sep 4 04:18:39.892791 systemd-timesyncd[1462]: Initial clock synchronization to Thu 2025-09-04 04:18:39.892597 UTC. Sep 4 04:18:39.893113 systemd-resolved[1410]: Clock change detected. Flushing caches. Sep 4 04:18:39.894707 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 4 04:18:39.900002 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 4 04:18:39.903048 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 4 04:18:39.910251 jq[1509]: false Sep 4 04:18:39.910181 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 4 04:18:39.919548 google_oslogin_nss_cache[1511]: oslogin_cache_refresh[1511]: Refreshing passwd entry cache Sep 4 04:18:39.919548 google_oslogin_nss_cache[1511]: oslogin_cache_refresh[1511]: Failure getting users, quitting Sep 4 04:18:39.919548 google_oslogin_nss_cache[1511]: oslogin_cache_refresh[1511]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 4 04:18:39.919548 google_oslogin_nss_cache[1511]: oslogin_cache_refresh[1511]: Refreshing group entry cache Sep 4 04:18:39.905649 oslogin_cache_refresh[1511]: Refreshing passwd entry cache Sep 4 04:18:39.915801 oslogin_cache_refresh[1511]: Failure getting users, quitting Sep 4 04:18:39.915844 oslogin_cache_refresh[1511]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 4 04:18:39.915941 oslogin_cache_refresh[1511]: Refreshing group entry cache Sep 4 04:18:39.921943 extend-filesystems[1510]: Found /dev/vda6 Sep 4 04:18:39.923261 oslogin_cache_refresh[1511]: Failure getting groups, quitting Sep 4 04:18:39.923477 google_oslogin_nss_cache[1511]: oslogin_cache_refresh[1511]: Failure getting groups, quitting Sep 4 04:18:39.923477 google_oslogin_nss_cache[1511]: oslogin_cache_refresh[1511]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. 
Sep 4 04:18:39.923281 oslogin_cache_refresh[1511]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 4 04:18:39.928599 extend-filesystems[1510]: Found /dev/vda9 Sep 4 04:18:39.929240 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 4 04:18:39.931620 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 4 04:18:39.932742 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 4 04:18:39.934335 systemd[1]: Starting update-engine.service - Update Engine... Sep 4 04:18:39.937533 extend-filesystems[1510]: Checking size of /dev/vda9 Sep 4 04:18:39.946308 kernel: ACPI: button: Power Button [PWRF] Sep 4 04:18:39.937010 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 4 04:18:39.939175 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 4 04:18:39.948125 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 4 04:18:39.965567 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 4 04:18:39.965964 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 4 04:18:39.966670 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 4 04:18:39.967386 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 4 04:18:39.968887 systemd[1]: motdgen.service: Deactivated successfully. Sep 4 04:18:39.969483 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 4 04:18:39.972249 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 4 04:18:39.972530 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Sep 4 04:18:39.987343 jq[1525]: true Sep 4 04:18:39.986217 systemd[1]: Reached target network.target - Network. Sep 4 04:18:39.996825 extend-filesystems[1510]: Resized partition /dev/vda9 Sep 4 04:18:39.999182 update_engine[1524]: I20250904 04:18:39.991168 1524 main.cc:92] Flatcar Update Engine starting Sep 4 04:18:39.999545 extend-filesystems[1547]: resize2fs 1.47.3 (8-Jul-2025) Sep 4 04:18:39.997954 systemd[1]: Starting containerd.service - containerd container runtime... Sep 4 04:18:40.001944 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 4 04:18:40.006692 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 4 04:18:40.009446 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 4 04:18:40.009816 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 4 04:18:40.010591 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 4 04:18:40.087678 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Sep 4 04:18:40.103950 jq[1551]: true Sep 4 04:18:40.122548 (ntainerd)[1573]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 4 04:18:40.163373 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 04:18:40.232624 tar[1535]: linux-amd64/LICENSE Sep 4 04:18:40.232979 tar[1535]: linux-amd64/helm Sep 4 04:18:40.250093 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. 
Sep 4 04:18:40.268611 systemd-logind[1521]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 4 04:18:40.326088 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Sep 4 04:18:40.682243 kernel: kvm_amd: TSC scaling supported Sep 4 04:18:40.682331 kernel: kvm_amd: Nested Virtualization enabled Sep 4 04:18:40.682348 kernel: kvm_amd: Nested Paging enabled Sep 4 04:18:40.682361 kernel: kvm_amd: LBR virtualization supported Sep 4 04:18:40.682375 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported Sep 4 04:18:40.682410 kernel: kvm_amd: Virtual GIF supported Sep 4 04:18:40.682425 kernel: EDAC MC: Ver: 3.0.0 Sep 4 04:18:40.364269 systemd-logind[1521]: New seat seat0. Sep 4 04:18:40.389915 dbus-daemon[1507]: [system] SELinux support is enabled Sep 4 04:18:40.682903 update_engine[1524]: I20250904 04:18:40.410463 1524 update_check_scheduler.cc:74] Next update check in 6m16s Sep 4 04:18:40.375462 systemd[1]: Started systemd-logind.service - User Login Management. Sep 4 04:18:40.401950 dbus-daemon[1507]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 4 04:18:40.386272 systemd-logind[1521]: Watching system buttons on /dev/input/event2 (Power Button) Sep 4 04:18:40.390297 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 4 04:18:40.397857 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 4 04:18:40.397878 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 4 04:18:40.399196 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 4 04:18:40.399212 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. 
Sep 4 04:18:40.409206 systemd[1]: Started update-engine.service - Update Engine. Sep 4 04:18:40.413119 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 4 04:18:40.648037 locksmithd[1594]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 4 04:18:40.686109 sshd_keygen[1542]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 4 04:18:40.726612 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 4 04:18:40.736682 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 4 04:18:40.779086 systemd[1]: issuegen.service: Deactivated successfully. Sep 4 04:18:40.779448 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 4 04:18:40.785235 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 4 04:18:40.795751 extend-filesystems[1547]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 4 04:18:40.795751 extend-filesystems[1547]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 4 04:18:40.795751 extend-filesystems[1547]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Sep 4 04:18:40.803931 extend-filesystems[1510]: Resized filesystem in /dev/vda9 Sep 4 04:18:40.802465 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 4 04:18:40.802795 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 4 04:18:40.824485 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 04:18:40.872308 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 4 04:18:40.876398 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 4 04:18:40.882248 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 4 04:18:40.889450 bash[1587]: Updated "/home/core/.ssh/authorized_keys" Sep 4 04:18:40.890748 systemd[1]: Reached target getty.target - Login Prompts. 
Sep 4 04:18:40.894603 containerd[1573]: time="2025-09-04T04:18:40Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 4 04:18:40.895264 containerd[1573]: time="2025-09-04T04:18:40.895217092Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 4 04:18:40.908723 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 4 04:18:40.915627 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 4 04:18:40.919466 containerd[1573]: time="2025-09-04T04:18:40.919405105Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.101µs" Sep 4 04:18:40.919466 containerd[1573]: time="2025-09-04T04:18:40.919442365Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 4 04:18:40.919466 containerd[1573]: time="2025-09-04T04:18:40.919461461Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 4 04:18:40.919677 containerd[1573]: time="2025-09-04T04:18:40.919655184Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 4 04:18:40.919768 containerd[1573]: time="2025-09-04T04:18:40.919744862Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 4 04:18:40.919836 containerd[1573]: time="2025-09-04T04:18:40.919777754Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 4 04:18:40.919883 containerd[1573]: time="2025-09-04T04:18:40.919858876Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile 
type=io.containerd.snapshotter.v1 Sep 4 04:18:40.919973 containerd[1573]: time="2025-09-04T04:18:40.919932554Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 4 04:18:40.920297 containerd[1573]: time="2025-09-04T04:18:40.920268003Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 4 04:18:40.920406 containerd[1573]: time="2025-09-04T04:18:40.920379863Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 4 04:18:40.920406 containerd[1573]: time="2025-09-04T04:18:40.920399119Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 4 04:18:40.920406 containerd[1573]: time="2025-09-04T04:18:40.920408266Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 4 04:18:40.920533 containerd[1573]: time="2025-09-04T04:18:40.920500719Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 4 04:18:40.920833 containerd[1573]: time="2025-09-04T04:18:40.920807835Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 4 04:18:40.920873 containerd[1573]: time="2025-09-04T04:18:40.920844624Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 4 04:18:40.920873 containerd[1573]: time="2025-09-04T04:18:40.920854763Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 4 
04:18:40.920957 containerd[1573]: time="2025-09-04T04:18:40.920904216Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 4 04:18:40.921432 containerd[1573]: time="2025-09-04T04:18:40.921219237Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 4 04:18:40.921432 containerd[1573]: time="2025-09-04T04:18:40.921296431Z" level=info msg="metadata content store policy set" policy=shared Sep 4 04:18:40.974611 tar[1535]: linux-amd64/README.md Sep 4 04:18:41.009692 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 4 04:18:41.153353 containerd[1573]: time="2025-09-04T04:18:41.153205070Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 4 04:18:41.153353 containerd[1573]: time="2025-09-04T04:18:41.153363186Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 4 04:18:41.153572 containerd[1573]: time="2025-09-04T04:18:41.153393784Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 4 04:18:41.153572 containerd[1573]: time="2025-09-04T04:18:41.153411968Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 4 04:18:41.153572 containerd[1573]: time="2025-09-04T04:18:41.153427046Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 4 04:18:41.153572 containerd[1573]: time="2025-09-04T04:18:41.153447825Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 4 04:18:41.153572 containerd[1573]: time="2025-09-04T04:18:41.153466029Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 4 04:18:41.153572 containerd[1573]: time="2025-09-04T04:18:41.153481107Z" level=info msg="loading 
plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 4 04:18:41.153572 containerd[1573]: time="2025-09-04T04:18:41.153510002Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 4 04:18:41.153572 containerd[1573]: time="2025-09-04T04:18:41.153529989Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 4 04:18:41.153572 containerd[1573]: time="2025-09-04T04:18:41.153553964Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 4 04:18:41.153572 containerd[1573]: time="2025-09-04T04:18:41.153569363Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 4 04:18:41.153835 containerd[1573]: time="2025-09-04T04:18:41.153802751Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 4 04:18:41.153835 containerd[1573]: time="2025-09-04T04:18:41.153832166Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 4 04:18:41.153894 containerd[1573]: time="2025-09-04T04:18:41.153850109Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 4 04:18:41.153894 containerd[1573]: time="2025-09-04T04:18:41.153879885Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 4 04:18:41.153894 containerd[1573]: time="2025-09-04T04:18:41.153894683Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 4 04:18:41.153949 containerd[1573]: time="2025-09-04T04:18:41.153905143Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 4 04:18:41.153949 containerd[1573]: time="2025-09-04T04:18:41.153917416Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection 
type=io.containerd.grpc.v1
Sep 4 04:18:41.153949 containerd[1573]: time="2025-09-04T04:18:41.153927224Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 4 04:18:41.153949 containerd[1573]: time="2025-09-04T04:18:41.153938705Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 4 04:18:41.154111 containerd[1573]: time="2025-09-04T04:18:41.153954084Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 4 04:18:41.154111 containerd[1573]: time="2025-09-04T04:18:41.153967049Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 4 04:18:41.154111 containerd[1573]: time="2025-09-04T04:18:41.154041448Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 4 04:18:41.154111 containerd[1573]: time="2025-09-04T04:18:41.154053481Z" level=info msg="Start snapshots syncer"
Sep 4 04:18:41.154111 containerd[1573]: time="2025-09-04T04:18:41.154111840Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 4 04:18:41.154498 containerd[1573]: time="2025-09-04T04:18:41.154410079Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 4 04:18:41.154663 containerd[1573]: time="2025-09-04T04:18:41.154511059Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 4 04:18:41.154663 containerd[1573]: time="2025-09-04T04:18:41.154601158Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 4 04:18:41.155047 containerd[1573]: time="2025-09-04T04:18:41.155008091Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 4 04:18:41.155047 containerd[1573]: time="2025-09-04T04:18:41.155038878Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 4 04:18:41.155120 containerd[1573]: time="2025-09-04T04:18:41.155050560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 4 04:18:41.155120 containerd[1573]: time="2025-09-04T04:18:41.155077921Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 4 04:18:41.155120 containerd[1573]: time="2025-09-04T04:18:41.155089143Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 4 04:18:41.155120 containerd[1573]: time="2025-09-04T04:18:41.155103810Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 4 04:18:41.155120 containerd[1573]: time="2025-09-04T04:18:41.155119299Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 4 04:18:41.155268 containerd[1573]: time="2025-09-04T04:18:41.155145478Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 4 04:18:41.155268 containerd[1573]: time="2025-09-04T04:18:41.155156108Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 4 04:18:41.155268 containerd[1573]: time="2025-09-04T04:18:41.155168101Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 4 04:18:41.155268 containerd[1573]: time="2025-09-04T04:18:41.155228494Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 4 04:18:41.155268 containerd[1573]: time="2025-09-04T04:18:41.155258510Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 4 04:18:41.155268 containerd[1573]: time="2025-09-04T04:18:41.155270983Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 4 04:18:41.155459 containerd[1573]: time="2025-09-04T04:18:41.155280912Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 4 04:18:41.155459 containerd[1573]: time="2025-09-04T04:18:41.155288817Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 4 04:18:41.155459 containerd[1573]: time="2025-09-04T04:18:41.155300038Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Sep 4 04:18:41.155459 containerd[1573]: time="2025-09-04T04:18:41.155310427Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Sep 4 04:18:41.155459 containerd[1573]: time="2025-09-04T04:18:41.155331838Z" level=info msg="runtime interface created"
Sep 4 04:18:41.155459 containerd[1573]: time="2025-09-04T04:18:41.155339592Z" level=info msg="created NRI interface"
Sep 4 04:18:41.155459 containerd[1573]: time="2025-09-04T04:18:41.155348148Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Sep 4 04:18:41.155459 containerd[1573]: time="2025-09-04T04:18:41.155358347Z" level=info msg="Connect containerd service"
Sep 4 04:18:41.155459 containerd[1573]: time="2025-09-04T04:18:41.155381781Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 4 04:18:41.156394 containerd[1573]: time="2025-09-04T04:18:41.156343545Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 4 04:18:41.654669 containerd[1573]: time="2025-09-04T04:18:41.654597861Z" level=info msg="Start subscribing containerd event"
Sep 4 04:18:41.654669 containerd[1573]: time="2025-09-04T04:18:41.654682811Z" level=info msg="Start recovering state"
Sep 4 04:18:41.654932 containerd[1573]: time="2025-09-04T04:18:41.654909155Z" level=info msg="Start event monitor"
Sep 4 04:18:41.654969 containerd[1573]: time="2025-09-04T04:18:41.654921528Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 4 04:18:41.654969 containerd[1573]: time="2025-09-04T04:18:41.654934963Z" level=info msg="Start cni network conf syncer for default"
Sep 4 04:18:41.654969 containerd[1573]: time="2025-09-04T04:18:41.654967604Z" level=info msg="Start streaming server"
Sep 4 04:18:41.655046 containerd[1573]: time="2025-09-04T04:18:41.654987832Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 4 04:18:41.655046 containerd[1573]: time="2025-09-04T04:18:41.654996158Z" level=info msg="runtime interface starting up..."
Sep 4 04:18:41.655046 containerd[1573]: time="2025-09-04T04:18:41.655002460Z" level=info msg="starting plugins..."
Sep 4 04:18:41.655046 containerd[1573]: time="2025-09-04T04:18:41.655023028Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Sep 4 04:18:41.655216 containerd[1573]: time="2025-09-04T04:18:41.655004945Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 4 04:18:41.655340 containerd[1573]: time="2025-09-04T04:18:41.655316679Z" level=info msg="containerd successfully booted in 0.761447s"
Sep 4 04:18:41.655542 systemd[1]: Started containerd.service - containerd container runtime.
Sep 4 04:18:41.727375 systemd-networkd[1477]: eth0: Gained IPv6LL
Sep 4 04:18:41.730990 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 4 04:18:41.732870 systemd[1]: Reached target network-online.target - Network is Online.
Sep 4 04:18:41.735785 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent...
Sep 4 04:18:41.738458 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 04:18:41.740852 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 4 04:18:41.807316 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 4 04:18:41.822418 systemd[1]: coreos-metadata.service: Deactivated successfully.
Sep 4 04:18:41.822742 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent.
Sep 4 04:18:41.825317 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 4 04:18:43.291268 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 04:18:43.294946 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 4 04:18:43.296848 systemd[1]: Startup finished in 3.561s (kernel) + 7.454s (initrd) + 6.620s (userspace) = 17.636s.
Sep 4 04:18:43.307491 (kubelet)[1671]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 4 04:18:43.526963 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 4 04:18:43.528534 systemd[1]: Started sshd@0-10.0.0.54:22-10.0.0.1:53862.service - OpenSSH per-connection server daemon (10.0.0.1:53862).
Sep 4 04:18:43.622890 sshd[1682]: Accepted publickey for core from 10.0.0.1 port 53862 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:18:43.624806 sshd-session[1682]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:18:43.634093 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 4 04:18:43.635760 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 4 04:18:43.645954 systemd-logind[1521]: New session 1 of user core.
Sep 4 04:18:43.719633 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 4 04:18:43.724240 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 4 04:18:43.745207 (systemd)[1687]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 4 04:18:43.748261 systemd-logind[1521]: New session c1 of user core.
Sep 4 04:18:43.934372 systemd[1687]: Queued start job for default target default.target.
Sep 4 04:18:43.961388 systemd[1687]: Created slice app.slice - User Application Slice.
Sep 4 04:18:43.961421 systemd[1687]: Reached target paths.target - Paths.
Sep 4 04:18:43.961481 systemd[1687]: Reached target timers.target - Timers.
Sep 4 04:18:43.963702 systemd[1687]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 4 04:18:43.996156 systemd[1687]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 4 04:18:43.996314 systemd[1687]: Reached target sockets.target - Sockets.
Sep 4 04:18:43.996362 systemd[1687]: Reached target basic.target - Basic System.
Sep 4 04:18:43.996410 systemd[1687]: Reached target default.target - Main User Target.
Sep 4 04:18:43.996449 systemd[1687]: Startup finished in 237ms.
Sep 4 04:18:43.996992 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 4 04:18:44.006318 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 4 04:18:44.077610 systemd[1]: Started sshd@1-10.0.0.54:22-10.0.0.1:53902.service - OpenSSH per-connection server daemon (10.0.0.1:53902).
Sep 4 04:18:44.137470 kubelet[1671]: E0904 04:18:44.137361 1671 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 4 04:18:44.141289 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 4 04:18:44.141552 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 4 04:18:44.142159 systemd[1]: kubelet.service: Consumed 2.101s CPU time, 264.6M memory peak.
Sep 4 04:18:44.154629 sshd[1699]: Accepted publickey for core from 10.0.0.1 port 53902 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:18:44.156447 sshd-session[1699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:18:44.161532 systemd-logind[1521]: New session 2 of user core.
Sep 4 04:18:44.171353 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 4 04:18:44.231670 sshd[1703]: Connection closed by 10.0.0.1 port 53902
Sep 4 04:18:44.232119 sshd-session[1699]: pam_unix(sshd:session): session closed for user core
Sep 4 04:18:44.245311 systemd[1]: sshd@1-10.0.0.54:22-10.0.0.1:53902.service: Deactivated successfully.
Sep 4 04:18:44.247688 systemd[1]: session-2.scope: Deactivated successfully.
Sep 4 04:18:44.248589 systemd-logind[1521]: Session 2 logged out. Waiting for processes to exit.
Sep 4 04:18:44.252548 systemd[1]: Started sshd@2-10.0.0.54:22-10.0.0.1:53916.service - OpenSSH per-connection server daemon (10.0.0.1:53916).
Sep 4 04:18:44.253526 systemd-logind[1521]: Removed session 2.
Sep 4 04:18:44.324458 sshd[1709]: Accepted publickey for core from 10.0.0.1 port 53916 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:18:44.326440 sshd-session[1709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:18:44.331923 systemd-logind[1521]: New session 3 of user core.
Sep 4 04:18:44.351333 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 4 04:18:44.402536 sshd[1712]: Connection closed by 10.0.0.1 port 53916
Sep 4 04:18:44.402903 sshd-session[1709]: pam_unix(sshd:session): session closed for user core
Sep 4 04:18:44.447238 systemd[1]: sshd@2-10.0.0.54:22-10.0.0.1:53916.service: Deactivated successfully.
Sep 4 04:18:44.449512 systemd[1]: session-3.scope: Deactivated successfully.
Sep 4 04:18:44.450525 systemd-logind[1521]: Session 3 logged out. Waiting for processes to exit.
Sep 4 04:18:44.453986 systemd[1]: Started sshd@3-10.0.0.54:22-10.0.0.1:53930.service - OpenSSH per-connection server daemon (10.0.0.1:53930).
Sep 4 04:18:44.454623 systemd-logind[1521]: Removed session 3.
Sep 4 04:18:44.511084 sshd[1718]: Accepted publickey for core from 10.0.0.1 port 53930 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:18:44.512661 sshd-session[1718]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:18:44.517579 systemd-logind[1521]: New session 4 of user core.
Sep 4 04:18:44.531236 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 4 04:18:44.588245 sshd[1721]: Connection closed by 10.0.0.1 port 53930
Sep 4 04:18:44.588680 sshd-session[1718]: pam_unix(sshd:session): session closed for user core
Sep 4 04:18:44.612748 systemd[1]: sshd@3-10.0.0.54:22-10.0.0.1:53930.service: Deactivated successfully.
Sep 4 04:18:44.614437 systemd[1]: session-4.scope: Deactivated successfully.
Sep 4 04:18:44.615216 systemd-logind[1521]: Session 4 logged out. Waiting for processes to exit.
Sep 4 04:18:44.618405 systemd[1]: Started sshd@4-10.0.0.54:22-10.0.0.1:53946.service - OpenSSH per-connection server daemon (10.0.0.1:53946).
Sep 4 04:18:44.619196 systemd-logind[1521]: Removed session 4.
Sep 4 04:18:44.679109 sshd[1727]: Accepted publickey for core from 10.0.0.1 port 53946 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:18:44.680521 sshd-session[1727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:18:44.686814 systemd-logind[1521]: New session 5 of user core.
Sep 4 04:18:44.698391 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 4 04:18:44.760933 sudo[1731]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 4 04:18:44.761398 sudo[1731]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 4 04:18:44.782497 sudo[1731]: pam_unix(sudo:session): session closed for user root
Sep 4 04:18:44.784762 sshd[1730]: Connection closed by 10.0.0.1 port 53946
Sep 4 04:18:44.785364 sshd-session[1727]: pam_unix(sshd:session): session closed for user core
Sep 4 04:18:44.801985 systemd[1]: sshd@4-10.0.0.54:22-10.0.0.1:53946.service: Deactivated successfully.
Sep 4 04:18:44.804776 systemd[1]: session-5.scope: Deactivated successfully.
Sep 4 04:18:44.805845 systemd-logind[1521]: Session 5 logged out. Waiting for processes to exit.
Sep 4 04:18:44.809428 systemd[1]: Started sshd@5-10.0.0.54:22-10.0.0.1:53962.service - OpenSSH per-connection server daemon (10.0.0.1:53962).
Sep 4 04:18:44.810021 systemd-logind[1521]: Removed session 5.
Sep 4 04:18:44.877347 sshd[1737]: Accepted publickey for core from 10.0.0.1 port 53962 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:18:44.878806 sshd-session[1737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:18:44.883583 systemd-logind[1521]: New session 6 of user core.
Sep 4 04:18:44.898191 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 4 04:18:44.955315 sudo[1742]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 4 04:18:44.955662 sudo[1742]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 4 04:18:45.036146 sudo[1742]: pam_unix(sudo:session): session closed for user root
Sep 4 04:18:45.043538 sudo[1741]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 4 04:18:45.043948 sudo[1741]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 4 04:18:45.055254 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 4 04:18:45.108111 augenrules[1764]: No rules
Sep 4 04:18:45.110208 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 4 04:18:45.110541 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 4 04:18:45.112108 sudo[1741]: pam_unix(sudo:session): session closed for user root
Sep 4 04:18:45.113881 sshd[1740]: Connection closed by 10.0.0.1 port 53962
Sep 4 04:18:45.114401 sshd-session[1737]: pam_unix(sshd:session): session closed for user core
Sep 4 04:18:45.124534 systemd[1]: sshd@5-10.0.0.54:22-10.0.0.1:53962.service: Deactivated successfully.
Sep 4 04:18:45.126534 systemd[1]: session-6.scope: Deactivated successfully.
Sep 4 04:18:45.127278 systemd-logind[1521]: Session 6 logged out. Waiting for processes to exit.
Sep 4 04:18:45.131427 systemd[1]: Started sshd@6-10.0.0.54:22-10.0.0.1:53976.service - OpenSSH per-connection server daemon (10.0.0.1:53976).
Sep 4 04:18:45.132124 systemd-logind[1521]: Removed session 6.
Sep 4 04:18:45.194971 sshd[1773]: Accepted publickey for core from 10.0.0.1 port 53976 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:18:45.197100 sshd-session[1773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:18:45.202159 systemd-logind[1521]: New session 7 of user core.
Sep 4 04:18:45.218258 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 4 04:18:45.274125 sudo[1778]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 4 04:18:45.274464 sudo[1778]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 4 04:18:45.942191 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 4 04:18:45.959488 (dockerd)[1798]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 4 04:18:46.867073 dockerd[1798]: time="2025-09-04T04:18:46.866973524Z" level=info msg="Starting up"
Sep 4 04:18:46.867973 dockerd[1798]: time="2025-09-04T04:18:46.867937672Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 4 04:18:46.880896 dockerd[1798]: time="2025-09-04T04:18:46.880833351Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 4 04:18:48.085950 dockerd[1798]: time="2025-09-04T04:18:48.085871437Z" level=info msg="Loading containers: start."
Sep 4 04:18:48.179100 kernel: Initializing XFRM netlink socket
Sep 4 04:18:48.538878 systemd-networkd[1477]: docker0: Link UP
Sep 4 04:18:48.594505 dockerd[1798]: time="2025-09-04T04:18:48.594433281Z" level=info msg="Loading containers: done."
Sep 4 04:18:48.667483 dockerd[1798]: time="2025-09-04T04:18:48.667414681Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 4 04:18:48.667685 dockerd[1798]: time="2025-09-04T04:18:48.667557248Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 4 04:18:48.667712 dockerd[1798]: time="2025-09-04T04:18:48.667691861Z" level=info msg="Initializing buildkit"
Sep 4 04:18:48.947228 dockerd[1798]: time="2025-09-04T04:18:48.947176805Z" level=info msg="Completed buildkit initialization"
Sep 4 04:18:48.955951 dockerd[1798]: time="2025-09-04T04:18:48.955887958Z" level=info msg="Daemon has completed initialization"
Sep 4 04:18:48.956042 dockerd[1798]: time="2025-09-04T04:18:48.955990681Z" level=info msg="API listen on /run/docker.sock"
Sep 4 04:18:48.956242 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 4 04:18:49.889867 containerd[1573]: time="2025-09-04T04:18:49.889803490Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\""
Sep 4 04:18:51.259423 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2737771755.mount: Deactivated successfully.
Sep 4 04:18:52.856093 containerd[1573]: time="2025-09-04T04:18:52.856006359Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:18:52.871204 containerd[1573]: time="2025-09-04T04:18:52.871128453Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.8: active requests=0, bytes read=28800687"
Sep 4 04:18:52.921428 containerd[1573]: time="2025-09-04T04:18:52.921337044Z" level=info msg="ImageCreate event name:\"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:18:52.940150 containerd[1573]: time="2025-09-04T04:18:52.940049921Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:18:52.941278 containerd[1573]: time="2025-09-04T04:18:52.941224433Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.8\" with image id \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\", size \"28797487\" in 3.051374527s"
Sep 4 04:18:52.941349 containerd[1573]: time="2025-09-04T04:18:52.941333137Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\" returns image reference \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\""
Sep 4 04:18:52.942498 containerd[1573]: time="2025-09-04T04:18:52.942438449Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\""
Sep 4 04:18:54.159264 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 4 04:18:54.161117 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 04:18:54.417026 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 04:18:54.435422 (kubelet)[2081]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 4 04:18:54.588222 kubelet[2081]: E0904 04:18:54.588152 2081 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 4 04:18:54.595419 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 4 04:18:54.595652 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 4 04:18:54.596152 systemd[1]: kubelet.service: Consumed 290ms CPU time, 111.1M memory peak.
Sep 4 04:18:54.659080 containerd[1573]: time="2025-09-04T04:18:54.658972420Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:18:54.660271 containerd[1573]: time="2025-09-04T04:18:54.659864813Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.8: active requests=0, bytes read=24784128"
Sep 4 04:18:54.665125 containerd[1573]: time="2025-09-04T04:18:54.665039726Z" level=info msg="ImageCreate event name:\"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:18:54.676868 containerd[1573]: time="2025-09-04T04:18:54.676714597Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:18:54.677910 containerd[1573]: time="2025-09-04T04:18:54.677861217Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.8\" with image id \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\", size \"26387322\" in 1.735381029s"
Sep 4 04:18:54.677910 containerd[1573]: time="2025-09-04T04:18:54.677907884Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\" returns image reference \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\""
Sep 4 04:18:54.678572 containerd[1573]: time="2025-09-04T04:18:54.678524039Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\""
Sep 4 04:18:56.963407 containerd[1573]: time="2025-09-04T04:18:56.963270338Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:18:56.981995 containerd[1573]: time="2025-09-04T04:18:56.981886193Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.8: active requests=0, bytes read=19175036"
Sep 4 04:18:57.016283 containerd[1573]: time="2025-09-04T04:18:57.016162521Z" level=info msg="ImageCreate event name:\"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:18:57.030684 containerd[1573]: time="2025-09-04T04:18:57.030575826Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:18:57.032284 containerd[1573]: time="2025-09-04T04:18:57.032191505Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.8\" with image id \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\", size \"20778248\" in 2.353626008s"
Sep 4 04:18:57.032284 containerd[1573]: time="2025-09-04T04:18:57.032262278Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\" returns image reference \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\""
Sep 4 04:18:57.034187 containerd[1573]: time="2025-09-04T04:18:57.034148905Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\""
Sep 4 04:18:59.706429 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2852370813.mount: Deactivated successfully.
Sep 4 04:19:01.701652 containerd[1573]: time="2025-09-04T04:19:01.701550101Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:01.728160 containerd[1573]: time="2025-09-04T04:19:01.728010174Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.8: active requests=0, bytes read=30897170"
Sep 4 04:19:01.742777 containerd[1573]: time="2025-09-04T04:19:01.742669861Z" level=info msg="ImageCreate event name:\"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:01.767900 containerd[1573]: time="2025-09-04T04:19:01.767809828Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:01.768547 containerd[1573]: time="2025-09-04T04:19:01.768487359Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.8\" with image id \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\", repo tag \"registry.k8s.io/kube-proxy:v1.32.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\", size \"30896189\" in 4.734298989s"
Sep 4 04:19:01.768547 containerd[1573]: time="2025-09-04T04:19:01.768537984Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\" returns image reference \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\""
Sep 4 04:19:01.769190 containerd[1573]: time="2025-09-04T04:19:01.769128300Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 4 04:19:04.079553 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3269234259.mount: Deactivated successfully.
Sep 4 04:19:04.659447 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 4 04:19:04.661523 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 04:19:04.886909 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 04:19:04.893326 (kubelet)[2113]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 4 04:19:04.962200 kubelet[2113]: E0904 04:19:04.962000 2113 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 4 04:19:04.967250 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 4 04:19:04.967501 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 4 04:19:04.967886 systemd[1]: kubelet.service: Consumed 271ms CPU time, 110.6M memory peak.
Sep 4 04:19:11.140900 containerd[1573]: time="2025-09-04T04:19:11.140805102Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:11.141759 containerd[1573]: time="2025-09-04T04:19:11.141656498Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241"
Sep 4 04:19:11.143572 containerd[1573]: time="2025-09-04T04:19:11.143528178Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:11.146506 containerd[1573]: time="2025-09-04T04:19:11.146468661Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:11.147520 containerd[1573]: time="2025-09-04T04:19:11.147483815Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 9.37831632s"
Sep 4 04:19:11.147589 containerd[1573]: time="2025-09-04T04:19:11.147529681Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Sep 4 04:19:11.148318 containerd[1573]: time="2025-09-04T04:19:11.148288674Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 4 04:19:11.597402 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount914323824.mount: Deactivated successfully.
Sep 4 04:19:11.604222 containerd[1573]: time="2025-09-04T04:19:11.604153009Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 4 04:19:11.605069 containerd[1573]: time="2025-09-04T04:19:11.604982605Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Sep 4 04:19:11.606698 containerd[1573]: time="2025-09-04T04:19:11.606639852Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 4 04:19:11.608901 containerd[1573]: time="2025-09-04T04:19:11.608841450Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 4 04:19:11.609608 containerd[1573]: time="2025-09-04T04:19:11.609556841Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 461.239334ms"
Sep 4 04:19:11.609608 containerd[1573]: time="2025-09-04T04:19:11.609586377Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 4 04:19:11.610191 containerd[1573]: time="2025-09-04T04:19:11.610159632Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Sep 4 04:19:12.804336 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4156758342.mount: Deactivated successfully.
Sep 4 04:19:15.159457 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Sep 4 04:19:15.163353 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 04:19:15.463771 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 04:19:15.493500 (kubelet)[2234]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 4 04:19:15.776506 kubelet[2234]: E0904 04:19:15.776352 2234 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 4 04:19:15.782088 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 4 04:19:15.782319 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 4 04:19:15.782958 systemd[1]: kubelet.service: Consumed 675ms CPU time, 112.1M memory peak.
Sep 4 04:19:16.450114 containerd[1573]: time="2025-09-04T04:19:16.448857297Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:19:16.464463 containerd[1573]: time="2025-09-04T04:19:16.464257017Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682056" Sep 4 04:19:16.469123 containerd[1573]: time="2025-09-04T04:19:16.468662903Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:19:16.480352 containerd[1573]: time="2025-09-04T04:19:16.479047554Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:19:16.484021 containerd[1573]: time="2025-09-04T04:19:16.482651501Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 4.872458686s" Sep 4 04:19:16.484021 containerd[1573]: time="2025-09-04T04:19:16.483459281Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Sep 4 04:19:20.062010 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 04:19:20.062327 systemd[1]: kubelet.service: Consumed 675ms CPU time, 112.1M memory peak. Sep 4 04:19:20.065198 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 04:19:20.096875 systemd[1]: Reload requested from client PID 2275 ('systemctl') (unit session-7.scope)... 
Sep 4 04:19:20.096912 systemd[1]: Reloading... Sep 4 04:19:20.202094 zram_generator::config[2320]: No configuration found. Sep 4 04:19:20.496173 systemd[1]: Reloading finished in 398 ms. Sep 4 04:19:20.558795 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 4 04:19:20.558898 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 4 04:19:20.559236 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 04:19:20.559280 systemd[1]: kubelet.service: Consumed 180ms CPU time, 98.4M memory peak. Sep 4 04:19:20.561035 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 04:19:20.772481 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 04:19:20.788458 (kubelet)[2365]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 4 04:19:20.850508 kubelet[2365]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 04:19:20.850508 kubelet[2365]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 4 04:19:20.850508 kubelet[2365]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 4 04:19:20.851024 kubelet[2365]: I0904 04:19:20.850572 2365 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 4 04:19:21.085551 kubelet[2365]: I0904 04:19:21.085360 2365 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 4 04:19:21.085551 kubelet[2365]: I0904 04:19:21.085402 2365 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 4 04:19:21.085744 kubelet[2365]: I0904 04:19:21.085715 2365 server.go:954] "Client rotation is on, will bootstrap in background" Sep 4 04:19:21.115624 kubelet[2365]: E0904 04:19:21.115532 2365 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.54:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" Sep 4 04:19:21.119524 kubelet[2365]: I0904 04:19:21.119446 2365 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 4 04:19:21.168539 kubelet[2365]: I0904 04:19:21.168496 2365 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 4 04:19:21.174535 kubelet[2365]: I0904 04:19:21.174472 2365 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 4 04:19:21.177131 kubelet[2365]: I0904 04:19:21.177045 2365 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 4 04:19:21.177384 kubelet[2365]: I0904 04:19:21.177119 2365 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 4 04:19:21.177592 kubelet[2365]: I0904 04:19:21.177393 2365 topology_manager.go:138] "Creating topology manager with none policy" Sep 
4 04:19:21.177592 kubelet[2365]: I0904 04:19:21.177406 2365 container_manager_linux.go:304] "Creating device plugin manager" Sep 4 04:19:21.177662 kubelet[2365]: I0904 04:19:21.177613 2365 state_mem.go:36] "Initialized new in-memory state store" Sep 4 04:19:21.181820 kubelet[2365]: I0904 04:19:21.181771 2365 kubelet.go:446] "Attempting to sync node with API server" Sep 4 04:19:21.185737 kubelet[2365]: I0904 04:19:21.185648 2365 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 4 04:19:21.185853 kubelet[2365]: I0904 04:19:21.185784 2365 kubelet.go:352] "Adding apiserver pod source" Sep 4 04:19:21.185853 kubelet[2365]: I0904 04:19:21.185808 2365 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 4 04:19:21.191000 kubelet[2365]: I0904 04:19:21.190944 2365 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 4 04:19:21.191571 kubelet[2365]: I0904 04:19:21.191543 2365 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 4 04:19:21.191800 kubelet[2365]: W0904 04:19:21.191713 2365 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.54:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.54:6443: connect: connection refused Sep 4 04:19:21.191800 kubelet[2365]: W0904 04:19:21.191777 2365 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.54:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.54:6443: connect: connection refused Sep 4 04:19:21.191800 kubelet[2365]: E0904 04:19:21.191806 2365 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://10.0.0.54:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" Sep 4 04:19:21.192036 kubelet[2365]: E0904 04:19:21.191831 2365 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.54:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" Sep 4 04:19:21.192655 kubelet[2365]: W0904 04:19:21.192618 2365 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 4 04:19:21.195555 kubelet[2365]: I0904 04:19:21.195504 2365 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 4 04:19:21.195617 kubelet[2365]: I0904 04:19:21.195570 2365 server.go:1287] "Started kubelet" Sep 4 04:19:21.195766 kubelet[2365]: I0904 04:19:21.195723 2365 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 4 04:19:21.197089 kubelet[2365]: I0904 04:19:21.197028 2365 server.go:479] "Adding debug handlers to kubelet server" Sep 4 04:19:21.198731 kubelet[2365]: I0904 04:19:21.198591 2365 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 4 04:19:21.199462 kubelet[2365]: I0904 04:19:21.199385 2365 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 4 04:19:21.199719 kubelet[2365]: I0904 04:19:21.199633 2365 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 4 04:19:21.202320 kubelet[2365]: I0904 04:19:21.200808 2365 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 4 04:19:21.202320 kubelet[2365]: E0904 04:19:21.201183 2365 
kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 04:19:21.202320 kubelet[2365]: I0904 04:19:21.201210 2365 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 4 04:19:21.202320 kubelet[2365]: I0904 04:19:21.201387 2365 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 4 04:19:21.202320 kubelet[2365]: I0904 04:19:21.201443 2365 reconciler.go:26] "Reconciler: start to sync state" Sep 4 04:19:21.205075 kubelet[2365]: E0904 04:19:21.205010 2365 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 4 04:19:21.205588 kubelet[2365]: I0904 04:19:21.205560 2365 factory.go:221] Registration of the systemd container factory successfully Sep 4 04:19:21.205761 kubelet[2365]: I0904 04:19:21.205726 2365 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 4 04:19:21.206405 kubelet[2365]: E0904 04:19:21.206347 2365 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.54:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.54:6443: connect: connection refused" interval="200ms" Sep 4 04:19:21.209253 kubelet[2365]: W0904 04:19:21.206867 2365 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.54:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.54:6443: connect: connection refused Sep 4 04:19:21.209253 kubelet[2365]: E0904 04:19:21.206956 2365 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://10.0.0.54:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" Sep 4 04:19:21.209253 kubelet[2365]: I0904 04:19:21.207138 2365 factory.go:221] Registration of the containerd container factory successfully Sep 4 04:19:21.209253 kubelet[2365]: E0904 04:19:21.206879 2365 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.54:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.54:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1861f96981c926ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-04 04:19:21.195529983 +0000 UTC m=+0.401651674,LastTimestamp:2025-09-04 04:19:21.195529983 +0000 UTC m=+0.401651674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Sep 4 04:19:21.224483 kubelet[2365]: I0904 04:19:21.224442 2365 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 4 04:19:21.224483 kubelet[2365]: I0904 04:19:21.224465 2365 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 4 04:19:21.224483 kubelet[2365]: I0904 04:19:21.224490 2365 state_mem.go:36] "Initialized new in-memory state store" Sep 4 04:19:21.233087 kubelet[2365]: I0904 04:19:21.232704 2365 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 4 04:19:21.234448 kubelet[2365]: I0904 04:19:21.234417 2365 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 4 04:19:21.234519 kubelet[2365]: I0904 04:19:21.234459 2365 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 4 04:19:21.234519 kubelet[2365]: I0904 04:19:21.234490 2365 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 4 04:19:21.234519 kubelet[2365]: I0904 04:19:21.234500 2365 kubelet.go:2382] "Starting kubelet main sync loop" Sep 4 04:19:21.234603 kubelet[2365]: E0904 04:19:21.234558 2365 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 4 04:19:21.236084 kubelet[2365]: W0904 04:19:21.235391 2365 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.54:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.54:6443: connect: connection refused Sep 4 04:19:21.236084 kubelet[2365]: E0904 04:19:21.235434 2365 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.54:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" Sep 4 04:19:21.302136 kubelet[2365]: E0904 04:19:21.302014 2365 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 04:19:21.335032 kubelet[2365]: E0904 04:19:21.334920 2365 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 4 04:19:21.402692 kubelet[2365]: E0904 04:19:21.402398 2365 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 04:19:21.407538 kubelet[2365]: E0904 04:19:21.407466 2365 controller.go:145] "Failed to ensure 
lease exists, will retry" err="Get \"https://10.0.0.54:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.54:6443: connect: connection refused" interval="400ms" Sep 4 04:19:21.502827 kubelet[2365]: E0904 04:19:21.502735 2365 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 04:19:21.535515 kubelet[2365]: E0904 04:19:21.535407 2365 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 4 04:19:21.602979 kubelet[2365]: E0904 04:19:21.602866 2365 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 04:19:21.703300 kubelet[2365]: E0904 04:19:21.703208 2365 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 04:19:21.754127 kubelet[2365]: I0904 04:19:21.754004 2365 policy_none.go:49] "None policy: Start" Sep 4 04:19:21.754127 kubelet[2365]: I0904 04:19:21.754086 2365 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 4 04:19:21.754127 kubelet[2365]: I0904 04:19:21.754105 2365 state_mem.go:35] "Initializing new in-memory state store" Sep 4 04:19:21.764371 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 4 04:19:21.783350 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 4 04:19:21.786824 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Sep 4 04:19:21.804200 kubelet[2365]: E0904 04:19:21.804135 2365 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 04:19:21.805580 kubelet[2365]: I0904 04:19:21.805555 2365 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 4 04:19:21.805909 kubelet[2365]: I0904 04:19:21.805885 2365 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 4 04:19:21.805988 kubelet[2365]: I0904 04:19:21.805905 2365 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 4 04:19:21.806224 kubelet[2365]: I0904 04:19:21.806195 2365 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 4 04:19:21.808350 kubelet[2365]: E0904 04:19:21.807725 2365 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 4 04:19:21.808350 kubelet[2365]: E0904 04:19:21.807786 2365 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 4 04:19:21.808350 kubelet[2365]: E0904 04:19:21.808125 2365 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.54:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.54:6443: connect: connection refused" interval="800ms" Sep 4 04:19:21.907828 kubelet[2365]: I0904 04:19:21.907761 2365 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 4 04:19:21.908330 kubelet[2365]: E0904 04:19:21.908286 2365 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.54:6443/api/v1/nodes\": dial tcp 10.0.0.54:6443: connect: connection refused" node="localhost" Sep 4 04:19:21.946444 systemd[1]: Created slice 
kubepods-burstable-podc700ac377fb3db79fe14b7ad0801eadd.slice - libcontainer container kubepods-burstable-podc700ac377fb3db79fe14b7ad0801eadd.slice. Sep 4 04:19:21.954421 kubelet[2365]: E0904 04:19:21.954282 2365 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 04:19:21.956903 systemd[1]: Created slice kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice - libcontainer container kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice. Sep 4 04:19:21.982172 kubelet[2365]: E0904 04:19:21.982125 2365 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 04:19:21.985294 systemd[1]: Created slice kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice - libcontainer container kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice. Sep 4 04:19:21.987422 kubelet[2365]: E0904 04:19:21.987386 2365 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 04:19:22.005849 kubelet[2365]: I0904 04:19:22.005792 2365 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 04:19:22.005849 kubelet[2365]: I0904 04:19:22.005849 2365 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " 
pod="kube-system/kube-controller-manager-localhost" Sep 4 04:19:22.006098 kubelet[2365]: I0904 04:19:22.005879 2365 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost" Sep 4 04:19:22.006098 kubelet[2365]: I0904 04:19:22.005897 2365 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 04:19:22.006098 kubelet[2365]: I0904 04:19:22.005933 2365 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c700ac377fb3db79fe14b7ad0801eadd-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"c700ac377fb3db79fe14b7ad0801eadd\") " pod="kube-system/kube-apiserver-localhost" Sep 4 04:19:22.006098 kubelet[2365]: I0904 04:19:22.005960 2365 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c700ac377fb3db79fe14b7ad0801eadd-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"c700ac377fb3db79fe14b7ad0801eadd\") " pod="kube-system/kube-apiserver-localhost" Sep 4 04:19:22.006098 kubelet[2365]: I0904 04:19:22.005979 2365 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c700ac377fb3db79fe14b7ad0801eadd-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"c700ac377fb3db79fe14b7ad0801eadd\") " 
pod="kube-system/kube-apiserver-localhost" Sep 4 04:19:22.006224 kubelet[2365]: I0904 04:19:22.005995 2365 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 04:19:22.006224 kubelet[2365]: I0904 04:19:22.006079 2365 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 04:19:22.110091 kubelet[2365]: I0904 04:19:22.110020 2365 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 4 04:19:22.110539 kubelet[2365]: E0904 04:19:22.110490 2365 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.54:6443/api/v1/nodes\": dial tcp 10.0.0.54:6443: connect: connection refused" node="localhost" Sep 4 04:19:22.256292 containerd[1573]: time="2025-09-04T04:19:22.256117675Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:c700ac377fb3db79fe14b7ad0801eadd,Namespace:kube-system,Attempt:0,}" Sep 4 04:19:22.284187 containerd[1573]: time="2025-09-04T04:19:22.284135534Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,}" Sep 4 04:19:22.285102 containerd[1573]: time="2025-09-04T04:19:22.285028100Z" level=info msg="connecting to shim 7c977ea968b7d3440d885537f8fd5fe701e0b470b49ee906bbf895a3fab8b7f6" address="unix:///run/containerd/s/8eae84df7990f5885f5258e19a823b2c5d287520637a3cc832e9761c151058bd" 
namespace=k8s.io protocol=ttrpc version=3 Sep 4 04:19:22.288583 containerd[1573]: time="2025-09-04T04:19:22.288538307Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,}" Sep 4 04:19:22.321503 systemd[1]: Started cri-containerd-7c977ea968b7d3440d885537f8fd5fe701e0b470b49ee906bbf895a3fab8b7f6.scope - libcontainer container 7c977ea968b7d3440d885537f8fd5fe701e0b470b49ee906bbf895a3fab8b7f6. Sep 4 04:19:22.322898 containerd[1573]: time="2025-09-04T04:19:22.322845171Z" level=info msg="connecting to shim b198507d8a053d5910d328bdf3147378314e58a7d83fce4e49c7561b80aa406b" address="unix:///run/containerd/s/9c3d6c704dcc5ba00d77faa116f5eb5c8734f1c3d7e76521d34d5485f6e5a0d3" namespace=k8s.io protocol=ttrpc version=3 Sep 4 04:19:22.334313 containerd[1573]: time="2025-09-04T04:19:22.334256946Z" level=info msg="connecting to shim 7ed49dec5e8972bb1e15d7e9f2407ec4ad955823783baff83cccfd2f40d15dd8" address="unix:///run/containerd/s/94ad439b56a6cef03b546782e75a42201901f6b5c94b56a880b6261f55a26f5b" namespace=k8s.io protocol=ttrpc version=3 Sep 4 04:19:22.366293 systemd[1]: Started cri-containerd-b198507d8a053d5910d328bdf3147378314e58a7d83fce4e49c7561b80aa406b.scope - libcontainer container b198507d8a053d5910d328bdf3147378314e58a7d83fce4e49c7561b80aa406b. Sep 4 04:19:22.373373 systemd[1]: Started cri-containerd-7ed49dec5e8972bb1e15d7e9f2407ec4ad955823783baff83cccfd2f40d15dd8.scope - libcontainer container 7ed49dec5e8972bb1e15d7e9f2407ec4ad955823783baff83cccfd2f40d15dd8. 
Sep 4 04:19:22.444108 kubelet[2365]: W0904 04:19:22.443860 2365 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.54:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.54:6443: connect: connection refused Sep 4 04:19:22.444108 kubelet[2365]: E0904 04:19:22.444115 2365 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.54:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" Sep 4 04:19:22.466540 containerd[1573]: time="2025-09-04T04:19:22.466478078Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:c700ac377fb3db79fe14b7ad0801eadd,Namespace:kube-system,Attempt:0,} returns sandbox id \"7c977ea968b7d3440d885537f8fd5fe701e0b470b49ee906bbf895a3fab8b7f6\"" Sep 4 04:19:22.471091 kubelet[2365]: W0904 04:19:22.470864 2365 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.54:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.54:6443: connect: connection refused Sep 4 04:19:22.471091 kubelet[2365]: E0904 04:19:22.470925 2365 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.54:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" Sep 4 04:19:22.471301 containerd[1573]: time="2025-09-04T04:19:22.470939621Z" level=info msg="CreateContainer within sandbox \"7c977ea968b7d3440d885537f8fd5fe701e0b470b49ee906bbf895a3fab8b7f6\" for container 
&ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 4 04:19:22.483863 containerd[1573]: time="2025-09-04T04:19:22.483813998Z" level=info msg="Container 560bebba3230d0ffd9b4ef1f8fbc64a0541a346d7ba0a5c9a4825c8a2f55bd2d: CDI devices from CRI Config.CDIDevices: []" Sep 4 04:19:22.492619 containerd[1573]: time="2025-09-04T04:19:22.492572625Z" level=info msg="CreateContainer within sandbox \"7c977ea968b7d3440d885537f8fd5fe701e0b470b49ee906bbf895a3fab8b7f6\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"560bebba3230d0ffd9b4ef1f8fbc64a0541a346d7ba0a5c9a4825c8a2f55bd2d\"" Sep 4 04:19:22.496093 containerd[1573]: time="2025-09-04T04:19:22.494452916Z" level=info msg="StartContainer for \"560bebba3230d0ffd9b4ef1f8fbc64a0541a346d7ba0a5c9a4825c8a2f55bd2d\"" Sep 4 04:19:22.496093 containerd[1573]: time="2025-09-04T04:19:22.495213101Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,} returns sandbox id \"b198507d8a053d5910d328bdf3147378314e58a7d83fce4e49c7561b80aa406b\"" Sep 4 04:19:22.497353 containerd[1573]: time="2025-09-04T04:19:22.497289562Z" level=info msg="connecting to shim 560bebba3230d0ffd9b4ef1f8fbc64a0541a346d7ba0a5c9a4825c8a2f55bd2d" address="unix:///run/containerd/s/8eae84df7990f5885f5258e19a823b2c5d287520637a3cc832e9761c151058bd" protocol=ttrpc version=3 Sep 4 04:19:22.499824 containerd[1573]: time="2025-09-04T04:19:22.499799250Z" level=info msg="CreateContainer within sandbox \"b198507d8a053d5910d328bdf3147378314e58a7d83fce4e49c7561b80aa406b\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 4 04:19:22.509535 containerd[1573]: time="2025-09-04T04:19:22.508974494Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,} returns sandbox id 
\"7ed49dec5e8972bb1e15d7e9f2407ec4ad955823783baff83cccfd2f40d15dd8\"" Sep 4 04:19:22.512003 kubelet[2365]: I0904 04:19:22.511980 2365 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 4 04:19:22.512951 kubelet[2365]: E0904 04:19:22.512551 2365 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.54:6443/api/v1/nodes\": dial tcp 10.0.0.54:6443: connect: connection refused" node="localhost" Sep 4 04:19:22.513556 containerd[1573]: time="2025-09-04T04:19:22.513165015Z" level=info msg="CreateContainer within sandbox \"7ed49dec5e8972bb1e15d7e9f2407ec4ad955823783baff83cccfd2f40d15dd8\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 4 04:19:22.514098 containerd[1573]: time="2025-09-04T04:19:22.514035719Z" level=info msg="Container 0ce6d785009f6feb8138aac75c6b15d80c12ef24a2df2cd7b1d4e6c5a7246bc5: CDI devices from CRI Config.CDIDevices: []" Sep 4 04:19:22.523281 containerd[1573]: time="2025-09-04T04:19:22.523240508Z" level=info msg="CreateContainer within sandbox \"b198507d8a053d5910d328bdf3147378314e58a7d83fce4e49c7561b80aa406b\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"0ce6d785009f6feb8138aac75c6b15d80c12ef24a2df2cd7b1d4e6c5a7246bc5\"" Sep 4 04:19:22.523665 containerd[1573]: time="2025-09-04T04:19:22.523636415Z" level=info msg="StartContainer for \"0ce6d785009f6feb8138aac75c6b15d80c12ef24a2df2cd7b1d4e6c5a7246bc5\"" Sep 4 04:19:22.524929 containerd[1573]: time="2025-09-04T04:19:22.524843374Z" level=info msg="connecting to shim 0ce6d785009f6feb8138aac75c6b15d80c12ef24a2df2cd7b1d4e6c5a7246bc5" address="unix:///run/containerd/s/9c3d6c704dcc5ba00d77faa116f5eb5c8734f1c3d7e76521d34d5485f6e5a0d3" protocol=ttrpc version=3 Sep 4 04:19:22.526292 containerd[1573]: time="2025-09-04T04:19:22.526253036Z" level=info msg="Container 417dfdad85ac895231f44bd02131343fa2980cb640da03b356a54d01c5f0264a: CDI devices from CRI Config.CDIDevices: []" Sep 4 
04:19:22.530370 kubelet[2365]: W0904 04:19:22.530247 2365 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.54:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.54:6443: connect: connection refused Sep 4 04:19:22.530370 kubelet[2365]: E0904 04:19:22.530339 2365 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.54:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" Sep 4 04:19:22.534262 systemd[1]: Started cri-containerd-560bebba3230d0ffd9b4ef1f8fbc64a0541a346d7ba0a5c9a4825c8a2f55bd2d.scope - libcontainer container 560bebba3230d0ffd9b4ef1f8fbc64a0541a346d7ba0a5c9a4825c8a2f55bd2d. Sep 4 04:19:22.536810 containerd[1573]: time="2025-09-04T04:19:22.536776885Z" level=info msg="CreateContainer within sandbox \"7ed49dec5e8972bb1e15d7e9f2407ec4ad955823783baff83cccfd2f40d15dd8\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"417dfdad85ac895231f44bd02131343fa2980cb640da03b356a54d01c5f0264a\"" Sep 4 04:19:22.537415 containerd[1573]: time="2025-09-04T04:19:22.537375305Z" level=info msg="StartContainer for \"417dfdad85ac895231f44bd02131343fa2980cb640da03b356a54d01c5f0264a\"" Sep 4 04:19:22.538580 containerd[1573]: time="2025-09-04T04:19:22.538552509Z" level=info msg="connecting to shim 417dfdad85ac895231f44bd02131343fa2980cb640da03b356a54d01c5f0264a" address="unix:///run/containerd/s/94ad439b56a6cef03b546782e75a42201901f6b5c94b56a880b6261f55a26f5b" protocol=ttrpc version=3 Sep 4 04:19:22.596765 systemd[1]: Started cri-containerd-417dfdad85ac895231f44bd02131343fa2980cb640da03b356a54d01c5f0264a.scope - libcontainer container 417dfdad85ac895231f44bd02131343fa2980cb640da03b356a54d01c5f0264a. 
Sep 4 04:19:22.609725 kubelet[2365]: E0904 04:19:22.609345 2365 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.54:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.54:6443: connect: connection refused" interval="1.6s" Sep 4 04:19:22.613276 systemd[1]: Started cri-containerd-0ce6d785009f6feb8138aac75c6b15d80c12ef24a2df2cd7b1d4e6c5a7246bc5.scope - libcontainer container 0ce6d785009f6feb8138aac75c6b15d80c12ef24a2df2cd7b1d4e6c5a7246bc5. Sep 4 04:19:22.644944 kubelet[2365]: W0904 04:19:22.644856 2365 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.54:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.54:6443: connect: connection refused Sep 4 04:19:22.645041 kubelet[2365]: E0904 04:19:22.644954 2365 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.54:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.54:6443: connect: connection refused" logger="UnhandledError" Sep 4 04:19:22.657083 containerd[1573]: time="2025-09-04T04:19:22.656863726Z" level=info msg="StartContainer for \"560bebba3230d0ffd9b4ef1f8fbc64a0541a346d7ba0a5c9a4825c8a2f55bd2d\" returns successfully" Sep 4 04:19:22.670464 containerd[1573]: time="2025-09-04T04:19:22.670414431Z" level=info msg="StartContainer for \"417dfdad85ac895231f44bd02131343fa2980cb640da03b356a54d01c5f0264a\" returns successfully" Sep 4 04:19:22.687100 containerd[1573]: time="2025-09-04T04:19:22.686192711Z" level=info msg="StartContainer for \"0ce6d785009f6feb8138aac75c6b15d80c12ef24a2df2cd7b1d4e6c5a7246bc5\" returns successfully" Sep 4 04:19:23.242657 kubelet[2365]: E0904 04:19:23.242605 2365 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the 
cluster" err="node \"localhost\" not found" node="localhost" Sep 4 04:19:23.245120 kubelet[2365]: E0904 04:19:23.245079 2365 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 04:19:23.247022 kubelet[2365]: E0904 04:19:23.246993 2365 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 04:19:23.315827 kubelet[2365]: I0904 04:19:23.315782 2365 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 4 04:19:24.067453 kubelet[2365]: I0904 04:19:24.067364 2365 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 4 04:19:24.067453 kubelet[2365]: E0904 04:19:24.067428 2365 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Sep 4 04:19:24.194551 kubelet[2365]: E0904 04:19:24.194490 2365 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 04:19:24.248837 kubelet[2365]: E0904 04:19:24.248790 2365 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 04:19:24.249392 kubelet[2365]: E0904 04:19:24.249259 2365 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 4 04:19:24.295087 kubelet[2365]: E0904 04:19:24.295013 2365 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 04:19:24.396344 kubelet[2365]: E0904 04:19:24.396163 2365 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 04:19:24.504927 kubelet[2365]: I0904 04:19:24.504858 2365 kubelet.go:3194] 
"Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 4 04:19:24.519617 kubelet[2365]: E0904 04:19:24.519532 2365 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 4 04:19:24.519617 kubelet[2365]: I0904 04:19:24.519584 2365 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 4 04:19:24.523314 kubelet[2365]: E0904 04:19:24.523251 2365 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 4 04:19:24.523314 kubelet[2365]: I0904 04:19:24.523301 2365 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 4 04:19:24.528051 kubelet[2365]: E0904 04:19:24.527969 2365 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 4 04:19:25.191924 kubelet[2365]: I0904 04:19:25.191846 2365 apiserver.go:52] "Watching apiserver" Sep 4 04:19:25.301731 kubelet[2365]: I0904 04:19:25.301663 2365 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 4 04:19:25.662270 update_engine[1524]: I20250904 04:19:25.662143 1524 update_attempter.cc:509] Updating boot flags... Sep 4 04:19:26.251482 systemd[1]: Reload requested from client PID 2657 ('systemctl') (unit session-7.scope)... Sep 4 04:19:26.251503 systemd[1]: Reloading... Sep 4 04:19:26.362180 zram_generator::config[2703]: No configuration found. Sep 4 04:19:26.653418 systemd[1]: Reloading finished in 401 ms. 
Sep 4 04:19:26.680332 kubelet[2365]: I0904 04:19:26.680168 2365 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 4 04:19:26.680805 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 04:19:26.701712 systemd[1]: kubelet.service: Deactivated successfully. Sep 4 04:19:26.702091 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 04:19:26.702160 systemd[1]: kubelet.service: Consumed 1.002s CPU time, 132.5M memory peak. Sep 4 04:19:26.704424 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 04:19:26.946690 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 04:19:26.968547 (kubelet)[2745]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 4 04:19:27.010078 kubelet[2745]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 04:19:27.010078 kubelet[2745]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 4 04:19:27.010078 kubelet[2745]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 4 04:19:27.010678 kubelet[2745]: I0904 04:19:27.010108 2745 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 4 04:19:27.017246 kubelet[2745]: I0904 04:19:27.017188 2745 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 4 04:19:27.017246 kubelet[2745]: I0904 04:19:27.017214 2745 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 4 04:19:27.017452 kubelet[2745]: I0904 04:19:27.017399 2745 server.go:954] "Client rotation is on, will bootstrap in background" Sep 4 04:19:27.018499 kubelet[2745]: I0904 04:19:27.018478 2745 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 4 04:19:27.021250 kubelet[2745]: I0904 04:19:27.020986 2745 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 4 04:19:27.028945 kubelet[2745]: I0904 04:19:27.028896 2745 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 4 04:19:27.035694 kubelet[2745]: I0904 04:19:27.035632 2745 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 4 04:19:27.036011 kubelet[2745]: I0904 04:19:27.035971 2745 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 4 04:19:27.036234 kubelet[2745]: I0904 04:19:27.036007 2745 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 4 04:19:27.036394 kubelet[2745]: I0904 04:19:27.036251 2745 topology_manager.go:138] "Creating topology manager with none policy" Sep 
4 04:19:27.036394 kubelet[2745]: I0904 04:19:27.036262 2745 container_manager_linux.go:304] "Creating device plugin manager" Sep 4 04:19:27.036394 kubelet[2745]: I0904 04:19:27.036341 2745 state_mem.go:36] "Initialized new in-memory state store" Sep 4 04:19:27.036560 kubelet[2745]: I0904 04:19:27.036535 2745 kubelet.go:446] "Attempting to sync node with API server" Sep 4 04:19:27.036606 kubelet[2745]: I0904 04:19:27.036576 2745 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 4 04:19:27.036606 kubelet[2745]: I0904 04:19:27.036604 2745 kubelet.go:352] "Adding apiserver pod source" Sep 4 04:19:27.036680 kubelet[2745]: I0904 04:19:27.036633 2745 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 4 04:19:27.037862 kubelet[2745]: I0904 04:19:27.037826 2745 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 4 04:19:27.038530 kubelet[2745]: I0904 04:19:27.038470 2745 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 4 04:19:27.039447 kubelet[2745]: I0904 04:19:27.039388 2745 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 4 04:19:27.039492 kubelet[2745]: I0904 04:19:27.039464 2745 server.go:1287] "Started kubelet" Sep 4 04:19:27.039606 kubelet[2745]: I0904 04:19:27.039567 2745 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 4 04:19:27.041105 kubelet[2745]: I0904 04:19:27.040464 2745 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 4 04:19:27.041105 kubelet[2745]: I0904 04:19:27.041022 2745 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 4 04:19:27.042745 kubelet[2745]: I0904 04:19:27.042707 2745 server.go:479] "Adding debug handlers to kubelet server" Sep 4 04:19:27.043507 kubelet[2745]: I0904 04:19:27.043424 2745 
fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 4 04:19:27.043507 kubelet[2745]: I0904 04:19:27.043478 2745 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 4 04:19:27.053299 kubelet[2745]: E0904 04:19:27.053264 2745 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 4 04:19:27.053792 kubelet[2745]: I0904 04:19:27.053507 2745 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 4 04:19:27.053792 kubelet[2745]: I0904 04:19:27.053615 2745 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 4 04:19:27.053792 kubelet[2745]: I0904 04:19:27.053727 2745 reconciler.go:26] "Reconciler: start to sync state" Sep 4 04:19:27.054144 kubelet[2745]: E0904 04:19:27.053976 2745 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 04:19:27.055710 kubelet[2745]: I0904 04:19:27.055670 2745 factory.go:221] Registration of the systemd container factory successfully Sep 4 04:19:27.055908 kubelet[2745]: I0904 04:19:27.055753 2745 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 4 04:19:27.057238 kubelet[2745]: I0904 04:19:27.057173 2745 factory.go:221] Registration of the containerd container factory successfully Sep 4 04:19:27.065959 kubelet[2745]: I0904 04:19:27.065914 2745 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 4 04:19:27.067623 kubelet[2745]: I0904 04:19:27.067555 2745 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 4 04:19:27.067623 kubelet[2745]: I0904 04:19:27.067609 2745 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 4 04:19:27.067708 kubelet[2745]: I0904 04:19:27.067646 2745 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 4 04:19:27.067708 kubelet[2745]: I0904 04:19:27.067661 2745 kubelet.go:2382] "Starting kubelet main sync loop" Sep 4 04:19:27.067802 kubelet[2745]: E0904 04:19:27.067748 2745 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 4 04:19:27.099630 kubelet[2745]: I0904 04:19:27.099580 2745 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 4 04:19:27.099630 kubelet[2745]: I0904 04:19:27.099611 2745 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 4 04:19:27.099793 kubelet[2745]: I0904 04:19:27.099652 2745 state_mem.go:36] "Initialized new in-memory state store" Sep 4 04:19:27.099968 kubelet[2745]: I0904 04:19:27.099941 2745 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 4 04:19:27.099994 kubelet[2745]: I0904 04:19:27.099965 2745 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 4 04:19:27.100051 kubelet[2745]: I0904 04:19:27.100001 2745 policy_none.go:49] "None policy: Start" Sep 4 04:19:27.100051 kubelet[2745]: I0904 04:19:27.100019 2745 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 4 04:19:27.100051 kubelet[2745]: I0904 04:19:27.100038 2745 state_mem.go:35] "Initializing new in-memory state store" Sep 4 04:19:27.100249 kubelet[2745]: I0904 04:19:27.100227 2745 state_mem.go:75] "Updated machine memory state" Sep 4 04:19:27.106666 kubelet[2745]: I0904 04:19:27.106277 2745 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 4 04:19:27.106666 kubelet[2745]: I0904 04:19:27.106546 
2745 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 4 04:19:27.106666 kubelet[2745]: I0904 04:19:27.106562 2745 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 4 04:19:27.107417 kubelet[2745]: I0904 04:19:27.107395 2745 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 4 04:19:27.112949 kubelet[2745]: E0904 04:19:27.112907 2745 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 4 04:19:27.168609 kubelet[2745]: I0904 04:19:27.168540 2745 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 4 04:19:27.168931 kubelet[2745]: I0904 04:19:27.168668 2745 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 4 04:19:27.168931 kubelet[2745]: I0904 04:19:27.168560 2745 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 4 04:19:27.209545 kubelet[2745]: I0904 04:19:27.208829 2745 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 4 04:19:27.255250 kubelet[2745]: I0904 04:19:27.255188 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 04:19:27.255250 kubelet[2745]: I0904 04:19:27.255245 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " 
pod="kube-system/kube-controller-manager-localhost" Sep 4 04:19:27.255515 kubelet[2745]: I0904 04:19:27.255272 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 04:19:27.255515 kubelet[2745]: I0904 04:19:27.255294 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 04:19:27.255515 kubelet[2745]: I0904 04:19:27.255319 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c700ac377fb3db79fe14b7ad0801eadd-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"c700ac377fb3db79fe14b7ad0801eadd\") " pod="kube-system/kube-apiserver-localhost" Sep 4 04:19:27.255626 kubelet[2745]: I0904 04:19:27.255440 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c700ac377fb3db79fe14b7ad0801eadd-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"c700ac377fb3db79fe14b7ad0801eadd\") " pod="kube-system/kube-apiserver-localhost" Sep 4 04:19:27.255626 kubelet[2745]: I0904 04:19:27.255575 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c700ac377fb3db79fe14b7ad0801eadd-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: 
\"c700ac377fb3db79fe14b7ad0801eadd\") " pod="kube-system/kube-apiserver-localhost" Sep 4 04:19:27.255626 kubelet[2745]: I0904 04:19:27.255602 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 04:19:27.255626 kubelet[2745]: I0904 04:19:27.255626 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost" Sep 4 04:19:27.337470 kubelet[2745]: I0904 04:19:27.337118 2745 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 4 04:19:27.337470 kubelet[2745]: I0904 04:19:27.337272 2745 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 4 04:19:28.037593 kubelet[2745]: I0904 04:19:28.037516 2745 apiserver.go:52] "Watching apiserver" Sep 4 04:19:28.053993 kubelet[2745]: I0904 04:19:28.053920 2745 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 4 04:19:28.081718 kubelet[2745]: I0904 04:19:28.081532 2745 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 4 04:19:28.119541 kubelet[2745]: E0904 04:19:28.119381 2745 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 4 04:19:28.160308 kubelet[2745]: I0904 04:19:28.160202 2745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.160178119 
podStartE2EDuration="1.160178119s" podCreationTimestamp="2025-09-04 04:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 04:19:28.160038546 +0000 UTC m=+1.184076493" watchObservedRunningTime="2025-09-04 04:19:28.160178119 +0000 UTC m=+1.184216066" Sep 4 04:19:28.172200 kubelet[2745]: I0904 04:19:28.171533 2745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.171504686 podStartE2EDuration="1.171504686s" podCreationTimestamp="2025-09-04 04:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 04:19:28.171228666 +0000 UTC m=+1.195266623" watchObservedRunningTime="2025-09-04 04:19:28.171504686 +0000 UTC m=+1.195542633" Sep 4 04:19:28.195375 kubelet[2745]: I0904 04:19:28.195285 2745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.195255489 podStartE2EDuration="1.195255489s" podCreationTimestamp="2025-09-04 04:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 04:19:28.183708176 +0000 UTC m=+1.207746113" watchObservedRunningTime="2025-09-04 04:19:28.195255489 +0000 UTC m=+1.219293426" Sep 4 04:19:31.303911 kubelet[2745]: I0904 04:19:31.303860 2745 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 4 04:19:31.304566 kubelet[2745]: I0904 04:19:31.304461 2745 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 4 04:19:31.304614 containerd[1573]: time="2025-09-04T04:19:31.304273000Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Sep 4 04:19:32.221520 systemd[1]: Created slice kubepods-besteffort-pod2daa5717_5a25_414e_97f3_de6d04363278.slice - libcontainer container kubepods-besteffort-pod2daa5717_5a25_414e_97f3_de6d04363278.slice. Sep 4 04:19:32.296805 kubelet[2745]: I0904 04:19:32.296721 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/2daa5717-5a25-414e-97f3-de6d04363278-kube-proxy\") pod \"kube-proxy-8529s\" (UID: \"2daa5717-5a25-414e-97f3-de6d04363278\") " pod="kube-system/kube-proxy-8529s" Sep 4 04:19:32.296805 kubelet[2745]: I0904 04:19:32.296775 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2daa5717-5a25-414e-97f3-de6d04363278-xtables-lock\") pod \"kube-proxy-8529s\" (UID: \"2daa5717-5a25-414e-97f3-de6d04363278\") " pod="kube-system/kube-proxy-8529s" Sep 4 04:19:32.296805 kubelet[2745]: I0904 04:19:32.296805 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2daa5717-5a25-414e-97f3-de6d04363278-lib-modules\") pod \"kube-proxy-8529s\" (UID: \"2daa5717-5a25-414e-97f3-de6d04363278\") " pod="kube-system/kube-proxy-8529s" Sep 4 04:19:32.296805 kubelet[2745]: I0904 04:19:32.296820 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4lgm\" (UniqueName: \"kubernetes.io/projected/2daa5717-5a25-414e-97f3-de6d04363278-kube-api-access-w4lgm\") pod \"kube-proxy-8529s\" (UID: \"2daa5717-5a25-414e-97f3-de6d04363278\") " pod="kube-system/kube-proxy-8529s" Sep 4 04:19:32.399617 systemd[1]: Created slice kubepods-besteffort-pod1479e961_e663_4ccb_ba09_d7e8b5d24aac.slice - libcontainer container kubepods-besteffort-pod1479e961_e663_4ccb_ba09_d7e8b5d24aac.slice. 
Sep 4 04:19:32.499121 kubelet[2745]: I0904 04:19:32.498878 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv8zl\" (UniqueName: \"kubernetes.io/projected/1479e961-e663-4ccb-ba09-d7e8b5d24aac-kube-api-access-cv8zl\") pod \"tigera-operator-755d956888-gbbq5\" (UID: \"1479e961-e663-4ccb-ba09-d7e8b5d24aac\") " pod="tigera-operator/tigera-operator-755d956888-gbbq5" Sep 4 04:19:32.499121 kubelet[2745]: I0904 04:19:32.498936 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1479e961-e663-4ccb-ba09-d7e8b5d24aac-var-lib-calico\") pod \"tigera-operator-755d956888-gbbq5\" (UID: \"1479e961-e663-4ccb-ba09-d7e8b5d24aac\") " pod="tigera-operator/tigera-operator-755d956888-gbbq5" Sep 4 04:19:32.534637 containerd[1573]: time="2025-09-04T04:19:32.534572785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8529s,Uid:2daa5717-5a25-414e-97f3-de6d04363278,Namespace:kube-system,Attempt:0,}" Sep 4 04:19:32.594252 containerd[1573]: time="2025-09-04T04:19:32.594193833Z" level=info msg="connecting to shim 5e9c95127b1a0516c1bfadde1c7f3453cb9574a408c99222b6cec87e6beaaa8c" address="unix:///run/containerd/s/385a8161d73705987e348dadb2b7d2fb3a05b018fa86ccd59b42f23658cc78c4" namespace=k8s.io protocol=ttrpc version=3 Sep 4 04:19:32.631201 systemd[1]: Started cri-containerd-5e9c95127b1a0516c1bfadde1c7f3453cb9574a408c99222b6cec87e6beaaa8c.scope - libcontainer container 5e9c95127b1a0516c1bfadde1c7f3453cb9574a408c99222b6cec87e6beaaa8c. 
Sep 4 04:19:32.656857 containerd[1573]: time="2025-09-04T04:19:32.656799217Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8529s,Uid:2daa5717-5a25-414e-97f3-de6d04363278,Namespace:kube-system,Attempt:0,} returns sandbox id \"5e9c95127b1a0516c1bfadde1c7f3453cb9574a408c99222b6cec87e6beaaa8c\""
Sep 4 04:19:32.659255 containerd[1573]: time="2025-09-04T04:19:32.659228929Z" level=info msg="CreateContainer within sandbox \"5e9c95127b1a0516c1bfadde1c7f3453cb9574a408c99222b6cec87e6beaaa8c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 4 04:19:32.671212 containerd[1573]: time="2025-09-04T04:19:32.671165212Z" level=info msg="Container b0a3c24e4ab162ec648c1aa5e62c0430df5fd97aed53384d90a3c648eca29c11: CDI devices from CRI Config.CDIDevices: []"
Sep 4 04:19:32.681477 containerd[1573]: time="2025-09-04T04:19:32.681435498Z" level=info msg="CreateContainer within sandbox \"5e9c95127b1a0516c1bfadde1c7f3453cb9574a408c99222b6cec87e6beaaa8c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"b0a3c24e4ab162ec648c1aa5e62c0430df5fd97aed53384d90a3c648eca29c11\""
Sep 4 04:19:32.682183 containerd[1573]: time="2025-09-04T04:19:32.682152117Z" level=info msg="StartContainer for \"b0a3c24e4ab162ec648c1aa5e62c0430df5fd97aed53384d90a3c648eca29c11\""
Sep 4 04:19:32.683825 containerd[1573]: time="2025-09-04T04:19:32.683786472Z" level=info msg="connecting to shim b0a3c24e4ab162ec648c1aa5e62c0430df5fd97aed53384d90a3c648eca29c11" address="unix:///run/containerd/s/385a8161d73705987e348dadb2b7d2fb3a05b018fa86ccd59b42f23658cc78c4" protocol=ttrpc version=3
Sep 4 04:19:32.704664 containerd[1573]: time="2025-09-04T04:19:32.704611282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-gbbq5,Uid:1479e961-e663-4ccb-ba09-d7e8b5d24aac,Namespace:tigera-operator,Attempt:0,}"
Sep 4 04:19:32.712253 systemd[1]: Started cri-containerd-b0a3c24e4ab162ec648c1aa5e62c0430df5fd97aed53384d90a3c648eca29c11.scope - libcontainer container b0a3c24e4ab162ec648c1aa5e62c0430df5fd97aed53384d90a3c648eca29c11.
Sep 4 04:19:32.734860 containerd[1573]: time="2025-09-04T04:19:32.734792097Z" level=info msg="connecting to shim 167863485f48e002d866eb9ade23e9495a9e6502d4a35bb751b9ed06f5eea6d1" address="unix:///run/containerd/s/693a098c0f14473d08508162c651bd244c4af78d9ee706b1f9b12a40cf929577" namespace=k8s.io protocol=ttrpc version=3
Sep 4 04:19:32.770388 systemd[1]: Started cri-containerd-167863485f48e002d866eb9ade23e9495a9e6502d4a35bb751b9ed06f5eea6d1.scope - libcontainer container 167863485f48e002d866eb9ade23e9495a9e6502d4a35bb751b9ed06f5eea6d1.
Sep 4 04:19:32.779405 containerd[1573]: time="2025-09-04T04:19:32.779351682Z" level=info msg="StartContainer for \"b0a3c24e4ab162ec648c1aa5e62c0430df5fd97aed53384d90a3c648eca29c11\" returns successfully"
Sep 4 04:19:32.830080 containerd[1573]: time="2025-09-04T04:19:32.830009342Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-gbbq5,Uid:1479e961-e663-4ccb-ba09-d7e8b5d24aac,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"167863485f48e002d866eb9ade23e9495a9e6502d4a35bb751b9ed06f5eea6d1\""
Sep 4 04:19:32.832488 containerd[1573]: time="2025-09-04T04:19:32.832464192Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 4 04:19:33.415687 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4284918559.mount: Deactivated successfully.
Sep 4 04:19:34.573666 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2122521387.mount: Deactivated successfully.
Sep 4 04:19:35.302751 containerd[1573]: time="2025-09-04T04:19:35.302683657Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:35.303343 containerd[1573]: time="2025-09-04T04:19:35.303317890Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609"
Sep 4 04:19:35.307074 containerd[1573]: time="2025-09-04T04:19:35.304551490Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:35.310913 containerd[1573]: time="2025-09-04T04:19:35.310882327Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:35.311487 containerd[1573]: time="2025-09-04T04:19:35.311448060Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 2.478955835s"
Sep 4 04:19:35.311487 containerd[1573]: time="2025-09-04T04:19:35.311477626Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\""
Sep 4 04:19:35.313231 containerd[1573]: time="2025-09-04T04:19:35.313208611Z" level=info msg="CreateContainer within sandbox \"167863485f48e002d866eb9ade23e9495a9e6502d4a35bb751b9ed06f5eea6d1\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 4 04:19:35.324901 containerd[1573]: time="2025-09-04T04:19:35.324862218Z" level=info msg="Container 2763d560928d83affe74da6df4584454bbe0758a43f5fbfea50686cf8b1f2fba: CDI devices from CRI Config.CDIDevices: []"
Sep 4 04:19:35.336507 containerd[1573]: time="2025-09-04T04:19:35.336465739Z" level=info msg="CreateContainer within sandbox \"167863485f48e002d866eb9ade23e9495a9e6502d4a35bb751b9ed06f5eea6d1\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"2763d560928d83affe74da6df4584454bbe0758a43f5fbfea50686cf8b1f2fba\""
Sep 4 04:19:35.336908 containerd[1573]: time="2025-09-04T04:19:35.336886141Z" level=info msg="StartContainer for \"2763d560928d83affe74da6df4584454bbe0758a43f5fbfea50686cf8b1f2fba\""
Sep 4 04:19:35.337780 containerd[1573]: time="2025-09-04T04:19:35.337744164Z" level=info msg="connecting to shim 2763d560928d83affe74da6df4584454bbe0758a43f5fbfea50686cf8b1f2fba" address="unix:///run/containerd/s/693a098c0f14473d08508162c651bd244c4af78d9ee706b1f9b12a40cf929577" protocol=ttrpc version=3
Sep 4 04:19:35.393212 systemd[1]: Started cri-containerd-2763d560928d83affe74da6df4584454bbe0758a43f5fbfea50686cf8b1f2fba.scope - libcontainer container 2763d560928d83affe74da6df4584454bbe0758a43f5fbfea50686cf8b1f2fba.
Sep 4 04:19:35.428106 containerd[1573]: time="2025-09-04T04:19:35.428051169Z" level=info msg="StartContainer for \"2763d560928d83affe74da6df4584454bbe0758a43f5fbfea50686cf8b1f2fba\" returns successfully"
Sep 4 04:19:35.937311 kubelet[2745]: I0904 04:19:35.937207 2745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-8529s" podStartSLOduration=3.937180509 podStartE2EDuration="3.937180509s" podCreationTimestamp="2025-09-04 04:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 04:19:33.104511889 +0000 UTC m=+6.128549846" watchObservedRunningTime="2025-09-04 04:19:35.937180509 +0000 UTC m=+8.961218446"
Sep 4 04:19:37.148163 kubelet[2745]: I0904 04:19:37.147812 2745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-gbbq5" podStartSLOduration=2.667019371 podStartE2EDuration="5.147786404s" podCreationTimestamp="2025-09-04 04:19:32 +0000 UTC" firstStartedPulling="2025-09-04 04:19:32.831269223 +0000 UTC m=+5.855307160" lastFinishedPulling="2025-09-04 04:19:35.312036266 +0000 UTC m=+8.336074193" observedRunningTime="2025-09-04 04:19:36.116870641 +0000 UTC m=+9.140908578" watchObservedRunningTime="2025-09-04 04:19:37.147786404 +0000 UTC m=+10.171824341"
Sep 4 04:19:41.525649 sudo[1778]: pam_unix(sudo:session): session closed for user root
Sep 4 04:19:41.529763 sshd[1777]: Connection closed by 10.0.0.1 port 53976
Sep 4 04:19:41.530829 sshd-session[1773]: pam_unix(sshd:session): session closed for user core
Sep 4 04:19:41.538588 systemd[1]: sshd@6-10.0.0.54:22-10.0.0.1:53976.service: Deactivated successfully.
Sep 4 04:19:41.543894 systemd[1]: session-7.scope: Deactivated successfully.
Sep 4 04:19:41.544152 systemd[1]: session-7.scope: Consumed 6.158s CPU time, 227.1M memory peak.
Sep 4 04:19:41.546343 systemd-logind[1521]: Session 7 logged out. Waiting for processes to exit.
Sep 4 04:19:41.549239 systemd-logind[1521]: Removed session 7.
Sep 4 04:19:44.013189 systemd[1]: Created slice kubepods-besteffort-podbded69f5_8481_43a8_8ec3_993770f93523.slice - libcontainer container kubepods-besteffort-podbded69f5_8481_43a8_8ec3_993770f93523.slice.
Sep 4 04:19:44.063407 kubelet[2745]: I0904 04:19:44.063336 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkkv4\" (UniqueName: \"kubernetes.io/projected/bded69f5-8481-43a8-8ec3-993770f93523-kube-api-access-xkkv4\") pod \"calico-typha-cd99b56b4-sr64f\" (UID: \"bded69f5-8481-43a8-8ec3-993770f93523\") " pod="calico-system/calico-typha-cd99b56b4-sr64f"
Sep 4 04:19:44.063407 kubelet[2745]: I0904 04:19:44.063397 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/bded69f5-8481-43a8-8ec3-993770f93523-typha-certs\") pod \"calico-typha-cd99b56b4-sr64f\" (UID: \"bded69f5-8481-43a8-8ec3-993770f93523\") " pod="calico-system/calico-typha-cd99b56b4-sr64f"
Sep 4 04:19:44.063407 kubelet[2745]: I0904 04:19:44.063422 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bded69f5-8481-43a8-8ec3-993770f93523-tigera-ca-bundle\") pod \"calico-typha-cd99b56b4-sr64f\" (UID: \"bded69f5-8481-43a8-8ec3-993770f93523\") " pod="calico-system/calico-typha-cd99b56b4-sr64f"
Sep 4 04:19:44.319731 containerd[1573]: time="2025-09-04T04:19:44.319410370Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-cd99b56b4-sr64f,Uid:bded69f5-8481-43a8-8ec3-993770f93523,Namespace:calico-system,Attempt:0,}"
Sep 4 04:19:44.439335 systemd[1]: Created slice kubepods-besteffort-pod3b21ba42_b698_4f8f_8ecf_2a7a21ebd7a2.slice - libcontainer container kubepods-besteffort-pod3b21ba42_b698_4f8f_8ecf_2a7a21ebd7a2.slice.
Sep 4 04:19:44.451475 containerd[1573]: time="2025-09-04T04:19:44.451330917Z" level=info msg="connecting to shim a12f99ef57814e2365de63cd6f6d2e5e825b93469ebb6bf71d2b5e1e4874567b" address="unix:///run/containerd/s/197b37b4a02a65041d19865c45a9a4a00ff24b1c02eaa5f139f71b4235628920" namespace=k8s.io protocol=ttrpc version=3
Sep 4 04:19:44.465862 kubelet[2745]: I0904 04:19:44.465805 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2-cni-net-dir\") pod \"calico-node-j88qj\" (UID: \"3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2\") " pod="calico-system/calico-node-j88qj"
Sep 4 04:19:44.465862 kubelet[2745]: I0904 04:19:44.465857 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2-flexvol-driver-host\") pod \"calico-node-j88qj\" (UID: \"3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2\") " pod="calico-system/calico-node-j88qj"
Sep 4 04:19:44.465862 kubelet[2745]: I0904 04:19:44.465874 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2-var-run-calico\") pod \"calico-node-j88qj\" (UID: \"3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2\") " pod="calico-system/calico-node-j88qj"
Sep 4 04:19:44.466283 kubelet[2745]: I0904 04:19:44.465891 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2-cni-log-dir\") pod \"calico-node-j88qj\" (UID: \"3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2\") " pod="calico-system/calico-node-j88qj"
Sep 4 04:19:44.466283 kubelet[2745]: I0904 04:19:44.465906 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2-lib-modules\") pod \"calico-node-j88qj\" (UID: \"3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2\") " pod="calico-system/calico-node-j88qj"
Sep 4 04:19:44.466283 kubelet[2745]: I0904 04:19:44.465923 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2-node-certs\") pod \"calico-node-j88qj\" (UID: \"3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2\") " pod="calico-system/calico-node-j88qj"
Sep 4 04:19:44.466283 kubelet[2745]: I0904 04:19:44.465935 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2-policysync\") pod \"calico-node-j88qj\" (UID: \"3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2\") " pod="calico-system/calico-node-j88qj"
Sep 4 04:19:44.466283 kubelet[2745]: I0904 04:19:44.465948 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2-xtables-lock\") pod \"calico-node-j88qj\" (UID: \"3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2\") " pod="calico-system/calico-node-j88qj"
Sep 4 04:19:44.466905 kubelet[2745]: I0904 04:19:44.465962 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jljgn\" (UniqueName: \"kubernetes.io/projected/3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2-kube-api-access-jljgn\") pod \"calico-node-j88qj\" (UID: \"3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2\") " pod="calico-system/calico-node-j88qj"
Sep 4 04:19:44.466905 kubelet[2745]: I0904 04:19:44.465976 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2-var-lib-calico\") pod \"calico-node-j88qj\" (UID: \"3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2\") " pod="calico-system/calico-node-j88qj"
Sep 4 04:19:44.466905 kubelet[2745]: I0904 04:19:44.465989 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2-cni-bin-dir\") pod \"calico-node-j88qj\" (UID: \"3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2\") " pod="calico-system/calico-node-j88qj"
Sep 4 04:19:44.466905 kubelet[2745]: I0904 04:19:44.466012 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2-tigera-ca-bundle\") pod \"calico-node-j88qj\" (UID: \"3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2\") " pod="calico-system/calico-node-j88qj"
Sep 4 04:19:44.509367 systemd[1]: Started cri-containerd-a12f99ef57814e2365de63cd6f6d2e5e825b93469ebb6bf71d2b5e1e4874567b.scope - libcontainer container a12f99ef57814e2365de63cd6f6d2e5e825b93469ebb6bf71d2b5e1e4874567b.
Sep 4 04:19:44.572246 containerd[1573]: time="2025-09-04T04:19:44.571705329Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-cd99b56b4-sr64f,Uid:bded69f5-8481-43a8-8ec3-993770f93523,Namespace:calico-system,Attempt:0,} returns sandbox id \"a12f99ef57814e2365de63cd6f6d2e5e825b93469ebb6bf71d2b5e1e4874567b\""
Sep 4 04:19:44.578617 kubelet[2745]: E0904 04:19:44.578576 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.578617 kubelet[2745]: W0904 04:19:44.578603 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.578756 kubelet[2745]: E0904 04:19:44.578651 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.579458 containerd[1573]: time="2025-09-04T04:19:44.579397707Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 4 04:19:44.585463 kubelet[2745]: E0904 04:19:44.585421 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.585463 kubelet[2745]: W0904 04:19:44.585454 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.585588 kubelet[2745]: E0904 04:19:44.585483 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.674281 kubelet[2745]: E0904 04:19:44.674181 2745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7q5r5" podUID="586c1b6d-b550-4d19-9b28-3936c94d31f1"
Sep 4 04:19:44.745823 containerd[1573]: time="2025-09-04T04:19:44.745751927Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-j88qj,Uid:3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2,Namespace:calico-system,Attempt:0,}"
Sep 4 04:19:44.760436 kubelet[2745]: E0904 04:19:44.760369 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.760436 kubelet[2745]: W0904 04:19:44.760405 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.760436 kubelet[2745]: E0904 04:19:44.760438 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.760820 kubelet[2745]: E0904 04:19:44.760786 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.760820 kubelet[2745]: W0904 04:19:44.760823 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.760934 kubelet[2745]: E0904 04:19:44.760834 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.761074 kubelet[2745]: E0904 04:19:44.761029 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.761074 kubelet[2745]: W0904 04:19:44.761041 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.761074 kubelet[2745]: E0904 04:19:44.761070 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.762727 kubelet[2745]: E0904 04:19:44.762654 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.762727 kubelet[2745]: W0904 04:19:44.762693 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.762727 kubelet[2745]: E0904 04:19:44.762708 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.763090 kubelet[2745]: E0904 04:19:44.762994 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.763090 kubelet[2745]: W0904 04:19:44.763033 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.763393 kubelet[2745]: E0904 04:19:44.763092 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.763865 kubelet[2745]: E0904 04:19:44.763442 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.763865 kubelet[2745]: W0904 04:19:44.763456 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.763865 kubelet[2745]: E0904 04:19:44.763470 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.763865 kubelet[2745]: E0904 04:19:44.763739 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.763865 kubelet[2745]: W0904 04:19:44.763751 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.763865 kubelet[2745]: E0904 04:19:44.763780 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.764161 kubelet[2745]: E0904 04:19:44.764038 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.764161 kubelet[2745]: W0904 04:19:44.764050 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.764161 kubelet[2745]: E0904 04:19:44.764078 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.764412 kubelet[2745]: E0904 04:19:44.764394 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.764412 kubelet[2745]: W0904 04:19:44.764408 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.764556 kubelet[2745]: E0904 04:19:44.764427 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.764719 kubelet[2745]: E0904 04:19:44.764702 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.764719 kubelet[2745]: W0904 04:19:44.764715 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.764813 kubelet[2745]: E0904 04:19:44.764727 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.764952 kubelet[2745]: E0904 04:19:44.764924 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.764952 kubelet[2745]: W0904 04:19:44.764933 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.764952 kubelet[2745]: E0904 04:19:44.764942 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.765244 kubelet[2745]: E0904 04:19:44.765197 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.765244 kubelet[2745]: W0904 04:19:44.765209 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.765244 kubelet[2745]: E0904 04:19:44.765219 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.765459 kubelet[2745]: E0904 04:19:44.765438 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.765459 kubelet[2745]: W0904 04:19:44.765455 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.765559 kubelet[2745]: E0904 04:19:44.765471 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.765771 kubelet[2745]: E0904 04:19:44.765747 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.765771 kubelet[2745]: W0904 04:19:44.765765 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.765864 kubelet[2745]: E0904 04:19:44.765779 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.766103 kubelet[2745]: E0904 04:19:44.766083 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.766103 kubelet[2745]: W0904 04:19:44.766100 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.766182 kubelet[2745]: E0904 04:19:44.766113 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.766393 kubelet[2745]: E0904 04:19:44.766320 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.766393 kubelet[2745]: W0904 04:19:44.766338 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.766393 kubelet[2745]: E0904 04:19:44.766349 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.766700 kubelet[2745]: E0904 04:19:44.766680 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.766700 kubelet[2745]: W0904 04:19:44.766696 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.766784 kubelet[2745]: E0904 04:19:44.766710 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.766961 kubelet[2745]: E0904 04:19:44.766941 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.766961 kubelet[2745]: W0904 04:19:44.766956 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.767047 kubelet[2745]: E0904 04:19:44.766968 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.767386 kubelet[2745]: E0904 04:19:44.767255 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.767386 kubelet[2745]: W0904 04:19:44.767271 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.767386 kubelet[2745]: E0904 04:19:44.767284 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.767937 kubelet[2745]: E0904 04:19:44.767885 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.768157 kubelet[2745]: W0904 04:19:44.768026 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.768250 kubelet[2745]: E0904 04:19:44.768048 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.772544 kubelet[2745]: E0904 04:19:44.772386 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.772544 kubelet[2745]: W0904 04:19:44.772414 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.772544 kubelet[2745]: E0904 04:19:44.772438 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.772544 kubelet[2745]: I0904 04:19:44.772468 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/586c1b6d-b550-4d19-9b28-3936c94d31f1-registration-dir\") pod \"csi-node-driver-7q5r5\" (UID: \"586c1b6d-b550-4d19-9b28-3936c94d31f1\") " pod="calico-system/csi-node-driver-7q5r5"
Sep 4 04:19:44.773132 kubelet[2745]: E0904 04:19:44.773112 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.773132 kubelet[2745]: W0904 04:19:44.773131 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.773299 kubelet[2745]: E0904 04:19:44.773264 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.773299 kubelet[2745]: I0904 04:19:44.773293 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/586c1b6d-b550-4d19-9b28-3936c94d31f1-kubelet-dir\") pod \"csi-node-driver-7q5r5\" (UID: \"586c1b6d-b550-4d19-9b28-3936c94d31f1\") " pod="calico-system/csi-node-driver-7q5r5"
Sep 4 04:19:44.774242 kubelet[2745]: E0904 04:19:44.774191 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.774242 kubelet[2745]: W0904 04:19:44.774228 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.775190 kubelet[2745]: E0904 04:19:44.774569 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.775290 kubelet[2745]: E0904 04:19:44.774960 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.775340 kubelet[2745]: W0904 04:19:44.775295 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.776139 kubelet[2745]: E0904 04:19:44.776093 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.776207 kubelet[2745]: I0904 04:19:44.776158 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/586c1b6d-b550-4d19-9b28-3936c94d31f1-varrun\") pod \"csi-node-driver-7q5r5\" (UID: \"586c1b6d-b550-4d19-9b28-3936c94d31f1\") " pod="calico-system/csi-node-driver-7q5r5"
Sep 4 04:19:44.776821 kubelet[2745]: E0904 04:19:44.776786 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.776821 kubelet[2745]: W0904 04:19:44.776814 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.776927 kubelet[2745]: E0904 04:19:44.776839 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:44.777156 kubelet[2745]: E0904 04:19:44.777093 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.777156 kubelet[2745]: W0904 04:19:44.777112 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.777156 kubelet[2745]: E0904 04:19:44.777124 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.777788 kubelet[2745]: E0904 04:19:44.777754 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.777788 kubelet[2745]: W0904 04:19:44.777771 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.778002 kubelet[2745]: E0904 04:19:44.777908 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.778182 kubelet[2745]: I0904 04:19:44.777937 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg4vg\" (UniqueName: \"kubernetes.io/projected/586c1b6d-b550-4d19-9b28-3936c94d31f1-kube-api-access-gg4vg\") pod \"csi-node-driver-7q5r5\" (UID: \"586c1b6d-b550-4d19-9b28-3936c94d31f1\") " pod="calico-system/csi-node-driver-7q5r5" Sep 4 04:19:44.778540 kubelet[2745]: E0904 04:19:44.778490 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.778540 kubelet[2745]: W0904 04:19:44.778510 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.778745 kubelet[2745]: E0904 04:19:44.778650 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.779480 kubelet[2745]: E0904 04:19:44.779391 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.779480 kubelet[2745]: W0904 04:19:44.779445 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.779480 kubelet[2745]: E0904 04:19:44.779461 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.779719 containerd[1573]: time="2025-09-04T04:19:44.779593256Z" level=info msg="connecting to shim e112d3bec1859152a9a793c47b7a7ba2115b24673eb9c7640a3d6c0aa497e623" address="unix:///run/containerd/s/53cb9382ef46e2548683df883c3105b86eb3a5e9f52d1ea8af019798ea554017" namespace=k8s.io protocol=ttrpc version=3 Sep 4 04:19:44.780385 kubelet[2745]: E0904 04:19:44.780347 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.780385 kubelet[2745]: W0904 04:19:44.780363 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.780385 kubelet[2745]: E0904 04:19:44.780385 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.780647 kubelet[2745]: E0904 04:19:44.780628 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.780647 kubelet[2745]: W0904 04:19:44.780643 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.780773 kubelet[2745]: E0904 04:19:44.780670 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.780773 kubelet[2745]: I0904 04:19:44.780693 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/586c1b6d-b550-4d19-9b28-3936c94d31f1-socket-dir\") pod \"csi-node-driver-7q5r5\" (UID: \"586c1b6d-b550-4d19-9b28-3936c94d31f1\") " pod="calico-system/csi-node-driver-7q5r5" Sep 4 04:19:44.780968 kubelet[2745]: E0904 04:19:44.780926 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.780968 kubelet[2745]: W0904 04:19:44.780947 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.780968 kubelet[2745]: E0904 04:19:44.780959 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.781247 kubelet[2745]: E0904 04:19:44.781227 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.781247 kubelet[2745]: W0904 04:19:44.781243 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.781329 kubelet[2745]: E0904 04:19:44.781274 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.781555 kubelet[2745]: E0904 04:19:44.781535 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.781555 kubelet[2745]: W0904 04:19:44.781552 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.781633 kubelet[2745]: E0904 04:19:44.781564 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.781878 kubelet[2745]: E0904 04:19:44.781859 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.781878 kubelet[2745]: W0904 04:19:44.781874 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.781955 kubelet[2745]: E0904 04:19:44.781886 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.817542 systemd[1]: Started cri-containerd-e112d3bec1859152a9a793c47b7a7ba2115b24673eb9c7640a3d6c0aa497e623.scope - libcontainer container e112d3bec1859152a9a793c47b7a7ba2115b24673eb9c7640a3d6c0aa497e623. Sep 4 04:19:44.867638 containerd[1573]: time="2025-09-04T04:19:44.867008174Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-j88qj,Uid:3b21ba42-b698-4f8f-8ecf-2a7a21ebd7a2,Namespace:calico-system,Attempt:0,} returns sandbox id \"e112d3bec1859152a9a793c47b7a7ba2115b24673eb9c7640a3d6c0aa497e623\"" Sep 4 04:19:44.883288 kubelet[2745]: E0904 04:19:44.883238 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.883288 kubelet[2745]: W0904 04:19:44.883270 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.883288 kubelet[2745]: E0904 04:19:44.883298 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.884388 kubelet[2745]: E0904 04:19:44.884240 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.884388 kubelet[2745]: W0904 04:19:44.884265 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.884388 kubelet[2745]: E0904 04:19:44.884378 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.885080 kubelet[2745]: E0904 04:19:44.885036 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.886925 kubelet[2745]: W0904 04:19:44.885052 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.886925 kubelet[2745]: E0904 04:19:44.885201 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.886925 kubelet[2745]: E0904 04:19:44.886098 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.886925 kubelet[2745]: W0904 04:19:44.886108 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.886925 kubelet[2745]: E0904 04:19:44.886194 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.887179 kubelet[2745]: E0904 04:19:44.887137 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.887179 kubelet[2745]: W0904 04:19:44.887156 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.887362 kubelet[2745]: E0904 04:19:44.887220 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.887482 kubelet[2745]: E0904 04:19:44.887393 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.887482 kubelet[2745]: W0904 04:19:44.887420 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.887482 kubelet[2745]: E0904 04:19:44.887476 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.887938 kubelet[2745]: E0904 04:19:44.887684 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.887938 kubelet[2745]: W0904 04:19:44.887698 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.887938 kubelet[2745]: E0904 04:19:44.887779 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.888028 kubelet[2745]: E0904 04:19:44.887975 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.888028 kubelet[2745]: W0904 04:19:44.887983 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.888107 kubelet[2745]: E0904 04:19:44.888046 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.888468 kubelet[2745]: E0904 04:19:44.888421 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.888468 kubelet[2745]: W0904 04:19:44.888435 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.888545 kubelet[2745]: E0904 04:19:44.888472 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.888963 kubelet[2745]: E0904 04:19:44.888937 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.888963 kubelet[2745]: W0904 04:19:44.888951 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.889115 kubelet[2745]: E0904 04:19:44.888981 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.889384 kubelet[2745]: E0904 04:19:44.889350 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.889384 kubelet[2745]: W0904 04:19:44.889365 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.889473 kubelet[2745]: E0904 04:19:44.889404 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.889894 kubelet[2745]: E0904 04:19:44.889861 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.889894 kubelet[2745]: W0904 04:19:44.889880 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.890030 kubelet[2745]: E0904 04:19:44.890006 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.890156 kubelet[2745]: E0904 04:19:44.890139 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.890156 kubelet[2745]: W0904 04:19:44.890150 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.890262 kubelet[2745]: E0904 04:19:44.890241 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.890516 kubelet[2745]: E0904 04:19:44.890488 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.890516 kubelet[2745]: W0904 04:19:44.890507 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.890812 kubelet[2745]: E0904 04:19:44.890722 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.890907 kubelet[2745]: E0904 04:19:44.890886 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.890907 kubelet[2745]: W0904 04:19:44.890899 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.890974 kubelet[2745]: E0904 04:19:44.890948 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.891718 kubelet[2745]: E0904 04:19:44.891360 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.891718 kubelet[2745]: W0904 04:19:44.891375 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.891718 kubelet[2745]: E0904 04:19:44.891676 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.891922 kubelet[2745]: E0904 04:19:44.891897 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.891922 kubelet[2745]: W0904 04:19:44.891914 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.892023 kubelet[2745]: E0904 04:19:44.891956 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.892159 kubelet[2745]: E0904 04:19:44.892136 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.892224 kubelet[2745]: W0904 04:19:44.892170 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.892224 kubelet[2745]: E0904 04:19:44.892210 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.892451 kubelet[2745]: E0904 04:19:44.892424 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.892451 kubelet[2745]: W0904 04:19:44.892440 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.892625 kubelet[2745]: E0904 04:19:44.892492 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.892721 kubelet[2745]: E0904 04:19:44.892703 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.892721 kubelet[2745]: W0904 04:19:44.892718 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.892918 kubelet[2745]: E0904 04:19:44.892804 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.893353 kubelet[2745]: E0904 04:19:44.893335 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.893353 kubelet[2745]: W0904 04:19:44.893348 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.894030 kubelet[2745]: E0904 04:19:44.893880 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.894432 kubelet[2745]: E0904 04:19:44.894415 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.894432 kubelet[2745]: W0904 04:19:44.894428 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.894682 kubelet[2745]: E0904 04:19:44.894639 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.895689 kubelet[2745]: E0904 04:19:44.895670 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.895689 kubelet[2745]: W0904 04:19:44.895684 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.895787 kubelet[2745]: E0904 04:19:44.895778 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 04:19:44.895966 kubelet[2745]: E0904 04:19:44.895951 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.895966 kubelet[2745]: W0904 04:19:44.895961 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.896032 kubelet[2745]: E0904 04:19:44.895976 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 04:19:44.896357 kubelet[2745]: E0904 04:19:44.896240 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 04:19:44.896357 kubelet[2745]: W0904 04:19:44.896253 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 04:19:44.896357 kubelet[2745]: E0904 04:19:44.896262 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 4 04:19:44.906314 kubelet[2745]: E0904 04:19:44.906284 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:44.906547 kubelet[2745]: W0904 04:19:44.906478 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:44.906547 kubelet[2745]: E0904 04:19:44.906504 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:45.925979 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2177025114.mount: Deactivated successfully.
Sep 4 04:19:46.068470 kubelet[2745]: E0904 04:19:46.068419 2745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7q5r5" podUID="586c1b6d-b550-4d19-9b28-3936c94d31f1"
Sep 4 04:19:46.288980 containerd[1573]: time="2025-09-04T04:19:46.288817117Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:46.289975 containerd[1573]: time="2025-09-04T04:19:46.289939596Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 4 04:19:46.291219 containerd[1573]: time="2025-09-04T04:19:46.291182800Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:46.293367 containerd[1573]: time="2025-09-04T04:19:46.293335552Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:46.293871 containerd[1573]: time="2025-09-04T04:19:46.293846783Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 1.714405483s"
Sep 4 04:19:46.293923 containerd[1573]: time="2025-09-04T04:19:46.293876518Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 4 04:19:46.294666 containerd[1573]: time="2025-09-04T04:19:46.294648408Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 4 04:19:46.303507 containerd[1573]: time="2025-09-04T04:19:46.303461616Z" level=info msg="CreateContainer within sandbox \"a12f99ef57814e2365de63cd6f6d2e5e825b93469ebb6bf71d2b5e1e4874567b\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 4 04:19:46.314051 containerd[1573]: time="2025-09-04T04:19:46.313981901Z" level=info msg="Container 9c81bbba5f95207539827f8d78bacb4368ee7da58702ac1679edb0a5ae248a29: CDI devices from CRI Config.CDIDevices: []"
Sep 4 04:19:46.324877 containerd[1573]: time="2025-09-04T04:19:46.324822558Z" level=info msg="CreateContainer within sandbox \"a12f99ef57814e2365de63cd6f6d2e5e825b93469ebb6bf71d2b5e1e4874567b\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"9c81bbba5f95207539827f8d78bacb4368ee7da58702ac1679edb0a5ae248a29\""
Sep 4 04:19:46.325552 containerd[1573]: time="2025-09-04T04:19:46.325473099Z" level=info msg="StartContainer for \"9c81bbba5f95207539827f8d78bacb4368ee7da58702ac1679edb0a5ae248a29\""
Sep 4 04:19:46.326860 containerd[1573]: time="2025-09-04T04:19:46.326805902Z" level=info msg="connecting to shim 9c81bbba5f95207539827f8d78bacb4368ee7da58702ac1679edb0a5ae248a29" address="unix:///run/containerd/s/197b37b4a02a65041d19865c45a9a4a00ff24b1c02eaa5f139f71b4235628920" protocol=ttrpc version=3
Sep 4 04:19:46.354368 systemd[1]: Started cri-containerd-9c81bbba5f95207539827f8d78bacb4368ee7da58702ac1679edb0a5ae248a29.scope - libcontainer container 9c81bbba5f95207539827f8d78bacb4368ee7da58702ac1679edb0a5ae248a29.
Sep 4 04:19:46.408545 containerd[1573]: time="2025-09-04T04:19:46.408505425Z" level=info msg="StartContainer for \"9c81bbba5f95207539827f8d78bacb4368ee7da58702ac1679edb0a5ae248a29\" returns successfully"
Sep 4 04:19:47.142666 kubelet[2745]: I0904 04:19:47.142308 2745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-cd99b56b4-sr64f" podStartSLOduration=2.426582495 podStartE2EDuration="4.142285904s" podCreationTimestamp="2025-09-04 04:19:43 +0000 UTC" firstStartedPulling="2025-09-04 04:19:44.578836574 +0000 UTC m=+17.602874501" lastFinishedPulling="2025-09-04 04:19:46.294539963 +0000 UTC m=+19.318577910" observedRunningTime="2025-09-04 04:19:47.141505679 +0000 UTC m=+20.165543617" watchObservedRunningTime="2025-09-04 04:19:47.142285904 +0000 UTC m=+20.166323841"
Sep 4 04:19:47.184783 kubelet[2745]: E0904 04:19:47.184739 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.184783 kubelet[2745]: W0904 04:19:47.184780 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.184977 kubelet[2745]: E0904 04:19:47.184807 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.185086 kubelet[2745]: E0904 04:19:47.185016 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.185086 kubelet[2745]: W0904 04:19:47.185033 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.185086 kubelet[2745]: E0904 04:19:47.185046 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.185335 kubelet[2745]: E0904 04:19:47.185295 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.185335 kubelet[2745]: W0904 04:19:47.185306 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.185335 kubelet[2745]: E0904 04:19:47.185320 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.185626 kubelet[2745]: E0904 04:19:47.185595 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.185626 kubelet[2745]: W0904 04:19:47.185609 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.185626 kubelet[2745]: E0904 04:19:47.185620 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.185859 kubelet[2745]: E0904 04:19:47.185841 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.185859 kubelet[2745]: W0904 04:19:47.185854 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.185944 kubelet[2745]: E0904 04:19:47.185865 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.186099 kubelet[2745]: E0904 04:19:47.186079 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.186099 kubelet[2745]: W0904 04:19:47.186092 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.186192 kubelet[2745]: E0904 04:19:47.186104 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.186346 kubelet[2745]: E0904 04:19:47.186326 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.186346 kubelet[2745]: W0904 04:19:47.186340 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.186433 kubelet[2745]: E0904 04:19:47.186354 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.186623 kubelet[2745]: E0904 04:19:47.186585 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.186623 kubelet[2745]: W0904 04:19:47.186600 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.186623 kubelet[2745]: E0904 04:19:47.186610 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.186879 kubelet[2745]: E0904 04:19:47.186833 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.186879 kubelet[2745]: W0904 04:19:47.186848 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.186879 kubelet[2745]: E0904 04:19:47.186859 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.187150 kubelet[2745]: E0904 04:19:47.187103 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.187150 kubelet[2745]: W0904 04:19:47.187119 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.187150 kubelet[2745]: E0904 04:19:47.187131 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.187361 kubelet[2745]: E0904 04:19:47.187344 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.187361 kubelet[2745]: W0904 04:19:47.187356 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.187521 kubelet[2745]: E0904 04:19:47.187368 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.187632 kubelet[2745]: E0904 04:19:47.187613 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.187632 kubelet[2745]: W0904 04:19:47.187628 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.187742 kubelet[2745]: E0904 04:19:47.187639 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.187862 kubelet[2745]: E0904 04:19:47.187841 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.187862 kubelet[2745]: W0904 04:19:47.187855 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.187942 kubelet[2745]: E0904 04:19:47.187866 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.188321 kubelet[2745]: E0904 04:19:47.188284 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.188321 kubelet[2745]: W0904 04:19:47.188305 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.188321 kubelet[2745]: E0904 04:19:47.188318 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.188596 kubelet[2745]: E0904 04:19:47.188572 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.188596 kubelet[2745]: W0904 04:19:47.188590 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.188686 kubelet[2745]: E0904 04:19:47.188605 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.205024 kubelet[2745]: E0904 04:19:47.204966 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.205024 kubelet[2745]: W0904 04:19:47.204999 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.205024 kubelet[2745]: E0904 04:19:47.205022 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.205355 kubelet[2745]: E0904 04:19:47.205328 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.205394 kubelet[2745]: W0904 04:19:47.205374 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.205431 kubelet[2745]: E0904 04:19:47.205393 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.205745 kubelet[2745]: E0904 04:19:47.205725 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.205745 kubelet[2745]: W0904 04:19:47.205740 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.205825 kubelet[2745]: E0904 04:19:47.205759 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.206118 kubelet[2745]: E0904 04:19:47.206092 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.206118 kubelet[2745]: W0904 04:19:47.206111 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.206189 kubelet[2745]: E0904 04:19:47.206130 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.206429 kubelet[2745]: E0904 04:19:47.206386 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.206429 kubelet[2745]: W0904 04:19:47.206419 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.206492 kubelet[2745]: E0904 04:19:47.206461 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.206728 kubelet[2745]: E0904 04:19:47.206710 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.206728 kubelet[2745]: W0904 04:19:47.206722 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.206787 kubelet[2745]: E0904 04:19:47.206737 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.206959 kubelet[2745]: E0904 04:19:47.206939 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.206959 kubelet[2745]: W0904 04:19:47.206956 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.207024 kubelet[2745]: E0904 04:19:47.207005 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.207250 kubelet[2745]: E0904 04:19:47.207230 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.207250 kubelet[2745]: W0904 04:19:47.207245 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.207314 kubelet[2745]: E0904 04:19:47.207275 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.207465 kubelet[2745]: E0904 04:19:47.207446 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.207465 kubelet[2745]: W0904 04:19:47.207461 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.207525 kubelet[2745]: E0904 04:19:47.207507 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.207748 kubelet[2745]: E0904 04:19:47.207720 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.207748 kubelet[2745]: W0904 04:19:47.207737 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.207797 kubelet[2745]: E0904 04:19:47.207754 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.208046 kubelet[2745]: E0904 04:19:47.208017 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.208046 kubelet[2745]: W0904 04:19:47.208036 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.208109 kubelet[2745]: E0904 04:19:47.208051 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.208330 kubelet[2745]: E0904 04:19:47.208304 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.208363 kubelet[2745]: W0904 04:19:47.208328 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.208363 kubelet[2745]: E0904 04:19:47.208351 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.208671 kubelet[2745]: E0904 04:19:47.208604 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.208671 kubelet[2745]: W0904 04:19:47.208623 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.208671 kubelet[2745]: E0904 04:19:47.208639 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.208938 kubelet[2745]: E0904 04:19:47.208918 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.208938 kubelet[2745]: W0904 04:19:47.208934 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.209007 kubelet[2745]: E0904 04:19:47.208950 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.209210 kubelet[2745]: E0904 04:19:47.209186 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.209210 kubelet[2745]: W0904 04:19:47.209201 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.209267 kubelet[2745]: E0904 04:19:47.209217 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.209448 kubelet[2745]: E0904 04:19:47.209433 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.209448 kubelet[2745]: W0904 04:19:47.209445 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.209513 kubelet[2745]: E0904 04:19:47.209461 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.209695 kubelet[2745]: E0904 04:19:47.209669 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.209695 kubelet[2745]: W0904 04:19:47.209680 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.209695 kubelet[2745]: E0904 04:19:47.209688 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.209987 kubelet[2745]: E0904 04:19:47.209970 2745 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 04:19:47.209987 kubelet[2745]: W0904 04:19:47.209981 2745 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 04:19:47.210042 kubelet[2745]: E0904 04:19:47.209990 2745 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 04:19:47.707395 containerd[1573]: time="2025-09-04T04:19:47.707296679Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:47.708859 containerd[1573]: time="2025-09-04T04:19:47.708681710Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660"
Sep 4 04:19:47.710028 containerd[1573]: time="2025-09-04T04:19:47.709967433Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:47.712146 containerd[1573]: time="2025-09-04T04:19:47.712112411Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:47.712810 containerd[1573]: time="2025-09-04T04:19:47.712776508Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.418102573s"
Sep 4 04:19:47.712863 containerd[1573]: time="2025-09-04T04:19:47.712811855Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\""
Sep 4 04:19:47.715119 containerd[1573]: time="2025-09-04T04:19:47.715080464Z" level=info msg="CreateContainer within sandbox \"e112d3bec1859152a9a793c47b7a7ba2115b24673eb9c7640a3d6c0aa497e623\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Sep 4 04:19:47.726330 containerd[1573]: time="2025-09-04T04:19:47.726259364Z" level=info msg="Container 7b7b4708e4f25549b68d00d21f27e8d18706a11ab47be7b14c890ebc5d39c60c: CDI devices from CRI Config.CDIDevices: []"
Sep 4 04:19:47.835777 containerd[1573]: time="2025-09-04T04:19:47.835697475Z" level=info msg="CreateContainer within sandbox \"e112d3bec1859152a9a793c47b7a7ba2115b24673eb9c7640a3d6c0aa497e623\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"7b7b4708e4f25549b68d00d21f27e8d18706a11ab47be7b14c890ebc5d39c60c\""
Sep 4 04:19:47.836318 containerd[1573]: time="2025-09-04T04:19:47.836268257Z" level=info msg="StartContainer for \"7b7b4708e4f25549b68d00d21f27e8d18706a11ab47be7b14c890ebc5d39c60c\""
Sep 4 04:19:47.838113 containerd[1573]: time="2025-09-04T04:19:47.838030996Z" level=info msg="connecting to shim 7b7b4708e4f25549b68d00d21f27e8d18706a11ab47be7b14c890ebc5d39c60c" address="unix:///run/containerd/s/53cb9382ef46e2548683df883c3105b86eb3a5e9f52d1ea8af019798ea554017" protocol=ttrpc version=3
Sep 4 04:19:47.864284 systemd[1]: Started cri-containerd-7b7b4708e4f25549b68d00d21f27e8d18706a11ab47be7b14c890ebc5d39c60c.scope - libcontainer container 7b7b4708e4f25549b68d00d21f27e8d18706a11ab47be7b14c890ebc5d39c60c.
Sep 4 04:19:47.921041 containerd[1573]: time="2025-09-04T04:19:47.920982956Z" level=info msg="StartContainer for \"7b7b4708e4f25549b68d00d21f27e8d18706a11ab47be7b14c890ebc5d39c60c\" returns successfully"
Sep 4 04:19:47.933179 systemd[1]: cri-containerd-7b7b4708e4f25549b68d00d21f27e8d18706a11ab47be7b14c890ebc5d39c60c.scope: Deactivated successfully.
Sep 4 04:19:47.936076 containerd[1573]: time="2025-09-04T04:19:47.935326188Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7b7b4708e4f25549b68d00d21f27e8d18706a11ab47be7b14c890ebc5d39c60c\" id:\"7b7b4708e4f25549b68d00d21f27e8d18706a11ab47be7b14c890ebc5d39c60c\" pid:3429 exited_at:{seconds:1756959587 nanos:934421799}"
Sep 4 04:19:47.936076 containerd[1573]: time="2025-09-04T04:19:47.935380981Z" level=info msg="received exit event container_id:\"7b7b4708e4f25549b68d00d21f27e8d18706a11ab47be7b14c890ebc5d39c60c\" id:\"7b7b4708e4f25549b68d00d21f27e8d18706a11ab47be7b14c890ebc5d39c60c\" pid:3429 exited_at:{seconds:1756959587 nanos:934421799}"
Sep 4 04:19:47.972236 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7b7b4708e4f25549b68d00d21f27e8d18706a11ab47be7b14c890ebc5d39c60c-rootfs.mount: Deactivated successfully.
Sep 4 04:19:48.069128 kubelet[2745]: E0904 04:19:48.069030 2745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7q5r5" podUID="586c1b6d-b550-4d19-9b28-3936c94d31f1"
Sep 4 04:19:48.129674 kubelet[2745]: I0904 04:19:48.129626 2745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 4 04:19:49.148612 containerd[1573]: time="2025-09-04T04:19:49.148336669Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\""
Sep 4 04:19:50.068826 kubelet[2745]: E0904 04:19:50.068753 2745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7q5r5" podUID="586c1b6d-b550-4d19-9b28-3936c94d31f1"
Sep 4 04:19:52.068676 kubelet[2745]: E0904 04:19:52.068573 2745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7q5r5" podUID="586c1b6d-b550-4d19-9b28-3936c94d31f1"
Sep 4 04:19:53.864475 containerd[1573]: time="2025-09-04T04:19:53.864401990Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:53.865435 containerd[1573]: time="2025-09-04T04:19:53.865366561Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613"
Sep 4 04:19:53.867084 containerd[1573]: time="2025-09-04T04:19:53.866653255Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:53.869513 containerd[1573]: time="2025-09-04T04:19:53.869469080Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 04:19:53.870070 containerd[1573]: time="2025-09-04T04:19:53.870022298Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 4.721565655s"
Sep 4 04:19:53.870070 containerd[1573]: time="2025-09-04T04:19:53.870049229Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\""
Sep 4 04:19:53.872112 containerd[1573]: time="2025-09-04T04:19:53.872087264Z" level=info msg="CreateContainer within sandbox \"e112d3bec1859152a9a793c47b7a7ba2115b24673eb9c7640a3d6c0aa497e623\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Sep 4 04:19:53.882890 containerd[1573]: time="2025-09-04T04:19:53.882848711Z" level=info msg="Container 33774f1d1c6154b9470780e5e2a154baae7c91170c7317bf90dd18ce8bef13d3: CDI devices from CRI Config.CDIDevices: []"
Sep 4 04:19:53.896327 containerd[1573]: time="2025-09-04T04:19:53.896269999Z" level=info msg="CreateContainer within sandbox \"e112d3bec1859152a9a793c47b7a7ba2115b24673eb9c7640a3d6c0aa497e623\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"33774f1d1c6154b9470780e5e2a154baae7c91170c7317bf90dd18ce8bef13d3\""
Sep 4 04:19:53.896889 containerd[1573]: time="2025-09-04T04:19:53.896863112Z" level=info msg="StartContainer for \"33774f1d1c6154b9470780e5e2a154baae7c91170c7317bf90dd18ce8bef13d3\""
Sep 4 04:19:53.898465 containerd[1573]: time="2025-09-04T04:19:53.898432156Z" level=info msg="connecting to shim 33774f1d1c6154b9470780e5e2a154baae7c91170c7317bf90dd18ce8bef13d3" address="unix:///run/containerd/s/53cb9382ef46e2548683df883c3105b86eb3a5e9f52d1ea8af019798ea554017" protocol=ttrpc version=3
Sep 4 04:19:53.928270 systemd[1]: Started cri-containerd-33774f1d1c6154b9470780e5e2a154baae7c91170c7317bf90dd18ce8bef13d3.scope - libcontainer container 33774f1d1c6154b9470780e5e2a154baae7c91170c7317bf90dd18ce8bef13d3.
Sep 4 04:19:53.978564 containerd[1573]: time="2025-09-04T04:19:53.978495256Z" level=info msg="StartContainer for \"33774f1d1c6154b9470780e5e2a154baae7c91170c7317bf90dd18ce8bef13d3\" returns successfully"
Sep 4 04:19:54.069435 kubelet[2745]: E0904 04:19:54.068600 2745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7q5r5" podUID="586c1b6d-b550-4d19-9b28-3936c94d31f1"
Sep 4 04:19:55.201273 containerd[1573]: time="2025-09-04T04:19:55.201204863Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 4 04:19:55.204994 systemd[1]: cri-containerd-33774f1d1c6154b9470780e5e2a154baae7c91170c7317bf90dd18ce8bef13d3.scope: Deactivated successfully.
Sep 4 04:19:55.205515 systemd[1]: cri-containerd-33774f1d1c6154b9470780e5e2a154baae7c91170c7317bf90dd18ce8bef13d3.scope: Consumed 723ms CPU time, 181.1M memory peak, 4.2M read from disk, 171.3M written to disk.
Sep 4 04:19:55.207093 containerd[1573]: time="2025-09-04T04:19:55.207036186Z" level=info msg="received exit event container_id:\"33774f1d1c6154b9470780e5e2a154baae7c91170c7317bf90dd18ce8bef13d3\" id:\"33774f1d1c6154b9470780e5e2a154baae7c91170c7317bf90dd18ce8bef13d3\" pid:3490 exited_at:{seconds:1756959595 nanos:206827043}"
Sep 4 04:19:55.207370 containerd[1573]: time="2025-09-04T04:19:55.207321391Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33774f1d1c6154b9470780e5e2a154baae7c91170c7317bf90dd18ce8bef13d3\" id:\"33774f1d1c6154b9470780e5e2a154baae7c91170c7317bf90dd18ce8bef13d3\" pid:3490 exited_at:{seconds:1756959595 nanos:206827043}"
Sep 4 04:19:55.233307 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-33774f1d1c6154b9470780e5e2a154baae7c91170c7317bf90dd18ce8bef13d3-rootfs.mount: Deactivated successfully.
Sep 4 04:19:55.266897 kubelet[2745]: I0904 04:19:55.266839 2745 kubelet_node_status.go:501] "Fast updating node status as it just became ready"
Sep 4 04:19:55.574202 systemd[1]: Created slice kubepods-burstable-pode7a8b161_507c_42a3_a943_83edd2ebf502.slice - libcontainer container kubepods-burstable-pode7a8b161_507c_42a3_a943_83edd2ebf502.slice.
Sep 4 04:19:55.666315 kubelet[2745]: I0904 04:19:55.666239 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7a8b161-507c-42a3-a943-83edd2ebf502-config-volume\") pod \"coredns-668d6bf9bc-zm9wc\" (UID: \"e7a8b161-507c-42a3-a943-83edd2ebf502\") " pod="kube-system/coredns-668d6bf9bc-zm9wc" Sep 4 04:19:55.666315 kubelet[2745]: I0904 04:19:55.666317 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6pjm\" (UniqueName: \"kubernetes.io/projected/e7a8b161-507c-42a3-a943-83edd2ebf502-kube-api-access-d6pjm\") pod \"coredns-668d6bf9bc-zm9wc\" (UID: \"e7a8b161-507c-42a3-a943-83edd2ebf502\") " pod="kube-system/coredns-668d6bf9bc-zm9wc" Sep 4 04:19:55.814040 systemd[1]: Created slice kubepods-burstable-pod00c7f901_355b_43d3_894a_6d038427ef01.slice - libcontainer container kubepods-burstable-pod00c7f901_355b_43d3_894a_6d038427ef01.slice. Sep 4 04:19:55.819966 systemd[1]: Created slice kubepods-besteffort-pod6857fb4d_09bc_4243_9e00_1619a4fbac6b.slice - libcontainer container kubepods-besteffort-pod6857fb4d_09bc_4243_9e00_1619a4fbac6b.slice. Sep 4 04:19:55.824770 systemd[1]: Created slice kubepods-besteffort-pod206d3408_826b_403f_9856_5b773f0ee6ff.slice - libcontainer container kubepods-besteffort-pod206d3408_826b_403f_9856_5b773f0ee6ff.slice. Sep 4 04:19:55.829332 systemd[1]: Created slice kubepods-besteffort-poda21fd64b_400b_4064_b1d8_df6a0495db00.slice - libcontainer container kubepods-besteffort-poda21fd64b_400b_4064_b1d8_df6a0495db00.slice. Sep 4 04:19:55.834224 systemd[1]: Created slice kubepods-besteffort-pod0c459fef_8814_434e_82d4_dedf8b1c5faa.slice - libcontainer container kubepods-besteffort-pod0c459fef_8814_434e_82d4_dedf8b1c5faa.slice. 
Sep 4 04:19:55.837860 systemd[1]: Created slice kubepods-besteffort-podef72b028_b203_4e62_b9a0_96331c5964b9.slice - libcontainer container kubepods-besteffort-podef72b028_b203_4e62_b9a0_96331c5964b9.slice. Sep 4 04:19:55.868199 kubelet[2745]: I0904 04:19:55.868141 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a21fd64b-400b-4064-b1d8-df6a0495db00-whisker-backend-key-pair\") pod \"whisker-846495449b-nhdh2\" (UID: \"a21fd64b-400b-4064-b1d8-df6a0495db00\") " pod="calico-system/whisker-846495449b-nhdh2" Sep 4 04:19:55.868199 kubelet[2745]: I0904 04:19:55.868184 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00c7f901-355b-43d3-894a-6d038427ef01-config-volume\") pod \"coredns-668d6bf9bc-nfhxx\" (UID: \"00c7f901-355b-43d3-894a-6d038427ef01\") " pod="kube-system/coredns-668d6bf9bc-nfhxx" Sep 4 04:19:55.868199 kubelet[2745]: I0904 04:19:55.868203 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h67w\" (UniqueName: \"kubernetes.io/projected/a21fd64b-400b-4064-b1d8-df6a0495db00-kube-api-access-8h67w\") pod \"whisker-846495449b-nhdh2\" (UID: \"a21fd64b-400b-4064-b1d8-df6a0495db00\") " pod="calico-system/whisker-846495449b-nhdh2" Sep 4 04:19:55.868399 kubelet[2745]: I0904 04:19:55.868221 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6857fb4d-09bc-4243-9e00-1619a4fbac6b-calico-apiserver-certs\") pod \"calico-apiserver-66557b5466-8jz66\" (UID: \"6857fb4d-09bc-4243-9e00-1619a4fbac6b\") " pod="calico-apiserver/calico-apiserver-66557b5466-8jz66" Sep 4 04:19:55.868399 kubelet[2745]: I0904 04:19:55.868239 2745 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef72b028-b203-4e62-b9a0-96331c5964b9-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-lxjhq\" (UID: \"ef72b028-b203-4e62-b9a0-96331c5964b9\") " pod="calico-system/goldmane-54d579b49d-lxjhq" Sep 4 04:19:55.868399 kubelet[2745]: I0904 04:19:55.868256 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlkth\" (UniqueName: \"kubernetes.io/projected/ef72b028-b203-4e62-b9a0-96331c5964b9-kube-api-access-wlkth\") pod \"goldmane-54d579b49d-lxjhq\" (UID: \"ef72b028-b203-4e62-b9a0-96331c5964b9\") " pod="calico-system/goldmane-54d579b49d-lxjhq" Sep 4 04:19:55.868399 kubelet[2745]: I0904 04:19:55.868284 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/206d3408-826b-403f-9856-5b773f0ee6ff-calico-apiserver-certs\") pod \"calico-apiserver-66557b5466-5h544\" (UID: \"206d3408-826b-403f-9856-5b773f0ee6ff\") " pod="calico-apiserver/calico-apiserver-66557b5466-5h544" Sep 4 04:19:55.868399 kubelet[2745]: I0904 04:19:55.868301 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl5s8\" (UniqueName: \"kubernetes.io/projected/206d3408-826b-403f-9856-5b773f0ee6ff-kube-api-access-xl5s8\") pod \"calico-apiserver-66557b5466-5h544\" (UID: \"206d3408-826b-403f-9856-5b773f0ee6ff\") " pod="calico-apiserver/calico-apiserver-66557b5466-5h544" Sep 4 04:19:55.868518 kubelet[2745]: I0904 04:19:55.868320 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/ef72b028-b203-4e62-b9a0-96331c5964b9-goldmane-key-pair\") pod \"goldmane-54d579b49d-lxjhq\" (UID: \"ef72b028-b203-4e62-b9a0-96331c5964b9\") " 
pod="calico-system/goldmane-54d579b49d-lxjhq" Sep 4 04:19:55.868518 kubelet[2745]: I0904 04:19:55.868338 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzxtl\" (UniqueName: \"kubernetes.io/projected/6857fb4d-09bc-4243-9e00-1619a4fbac6b-kube-api-access-rzxtl\") pod \"calico-apiserver-66557b5466-8jz66\" (UID: \"6857fb4d-09bc-4243-9e00-1619a4fbac6b\") " pod="calico-apiserver/calico-apiserver-66557b5466-8jz66" Sep 4 04:19:55.868518 kubelet[2745]: I0904 04:19:55.868359 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef72b028-b203-4e62-b9a0-96331c5964b9-config\") pod \"goldmane-54d579b49d-lxjhq\" (UID: \"ef72b028-b203-4e62-b9a0-96331c5964b9\") " pod="calico-system/goldmane-54d579b49d-lxjhq" Sep 4 04:19:55.868518 kubelet[2745]: I0904 04:19:55.868375 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c459fef-8814-434e-82d4-dedf8b1c5faa-tigera-ca-bundle\") pod \"calico-kube-controllers-9444b947f-288sl\" (UID: \"0c459fef-8814-434e-82d4-dedf8b1c5faa\") " pod="calico-system/calico-kube-controllers-9444b947f-288sl" Sep 4 04:19:55.868518 kubelet[2745]: I0904 04:19:55.868393 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t5fm\" (UniqueName: \"kubernetes.io/projected/0c459fef-8814-434e-82d4-dedf8b1c5faa-kube-api-access-6t5fm\") pod \"calico-kube-controllers-9444b947f-288sl\" (UID: \"0c459fef-8814-434e-82d4-dedf8b1c5faa\") " pod="calico-system/calico-kube-controllers-9444b947f-288sl" Sep 4 04:19:55.868654 kubelet[2745]: I0904 04:19:55.868418 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8cbk\" (UniqueName: 
\"kubernetes.io/projected/00c7f901-355b-43d3-894a-6d038427ef01-kube-api-access-v8cbk\") pod \"coredns-668d6bf9bc-nfhxx\" (UID: \"00c7f901-355b-43d3-894a-6d038427ef01\") " pod="kube-system/coredns-668d6bf9bc-nfhxx" Sep 4 04:19:55.868654 kubelet[2745]: I0904 04:19:55.868433 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a21fd64b-400b-4064-b1d8-df6a0495db00-whisker-ca-bundle\") pod \"whisker-846495449b-nhdh2\" (UID: \"a21fd64b-400b-4064-b1d8-df6a0495db00\") " pod="calico-system/whisker-846495449b-nhdh2" Sep 4 04:19:55.877817 containerd[1573]: time="2025-09-04T04:19:55.877776124Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zm9wc,Uid:e7a8b161-507c-42a3-a943-83edd2ebf502,Namespace:kube-system,Attempt:0,}" Sep 4 04:19:55.985358 containerd[1573]: time="2025-09-04T04:19:55.983592715Z" level=error msg="Failed to destroy network for sandbox \"913fdc08fb3b5938f302a7a101e4881fc3ccc8190a4f3cea4828bce238dc88da\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:55.985358 containerd[1573]: time="2025-09-04T04:19:55.985129999Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zm9wc,Uid:e7a8b161-507c-42a3-a943-83edd2ebf502,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"913fdc08fb3b5938f302a7a101e4881fc3ccc8190a4f3cea4828bce238dc88da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:55.986958 kubelet[2745]: E0904 04:19:55.986911 2745 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"913fdc08fb3b5938f302a7a101e4881fc3ccc8190a4f3cea4828bce238dc88da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:55.989826 systemd[1]: run-netns-cni\x2d17fdcc13\x2dd97b\x2d3ee0\x2df20b\x2db917dba3944c.mount: Deactivated successfully. Sep 4 04:19:55.992777 kubelet[2745]: E0904 04:19:55.991599 2745 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"913fdc08fb3b5938f302a7a101e4881fc3ccc8190a4f3cea4828bce238dc88da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-zm9wc" Sep 4 04:19:55.992777 kubelet[2745]: E0904 04:19:55.991656 2745 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"913fdc08fb3b5938f302a7a101e4881fc3ccc8190a4f3cea4828bce238dc88da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-zm9wc" Sep 4 04:19:55.992777 kubelet[2745]: E0904 04:19:55.991723 2745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-zm9wc_kube-system(e7a8b161-507c-42a3-a943-83edd2ebf502)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-zm9wc_kube-system(e7a8b161-507c-42a3-a943-83edd2ebf502)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"913fdc08fb3b5938f302a7a101e4881fc3ccc8190a4f3cea4828bce238dc88da\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-zm9wc" podUID="e7a8b161-507c-42a3-a943-83edd2ebf502" Sep 4 04:19:56.075375 systemd[1]: Created slice kubepods-besteffort-pod586c1b6d_b550_4d19_9b28_3936c94d31f1.slice - libcontainer container kubepods-besteffort-pod586c1b6d_b550_4d19_9b28_3936c94d31f1.slice. Sep 4 04:19:56.078946 containerd[1573]: time="2025-09-04T04:19:56.078892842Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7q5r5,Uid:586c1b6d-b550-4d19-9b28-3936c94d31f1,Namespace:calico-system,Attempt:0,}" Sep 4 04:19:56.117277 containerd[1573]: time="2025-09-04T04:19:56.117207007Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nfhxx,Uid:00c7f901-355b-43d3-894a-6d038427ef01,Namespace:kube-system,Attempt:0,}" Sep 4 04:19:56.123524 containerd[1573]: time="2025-09-04T04:19:56.123481801Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66557b5466-8jz66,Uid:6857fb4d-09bc-4243-9e00-1619a4fbac6b,Namespace:calico-apiserver,Attempt:0,}" Sep 4 04:19:56.128515 containerd[1573]: time="2025-09-04T04:19:56.128476865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66557b5466-5h544,Uid:206d3408-826b-403f-9856-5b773f0ee6ff,Namespace:calico-apiserver,Attempt:0,}" Sep 4 04:19:56.132592 containerd[1573]: time="2025-09-04T04:19:56.132557271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-846495449b-nhdh2,Uid:a21fd64b-400b-4064-b1d8-df6a0495db00,Namespace:calico-system,Attempt:0,}" Sep 4 04:19:56.138162 containerd[1573]: time="2025-09-04T04:19:56.138111834Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9444b947f-288sl,Uid:0c459fef-8814-434e-82d4-dedf8b1c5faa,Namespace:calico-system,Attempt:0,}" Sep 4 04:19:56.144076 containerd[1573]: time="2025-09-04T04:19:56.143960559Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-54d579b49d-lxjhq,Uid:ef72b028-b203-4e62-b9a0-96331c5964b9,Namespace:calico-system,Attempt:0,}" Sep 4 04:19:56.145282 containerd[1573]: time="2025-09-04T04:19:56.145219762Z" level=error msg="Failed to destroy network for sandbox \"500d78a3ffa34477a88fe2d0e1c71b0fe3638471721cdd844bf42844c6cc97f1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.155538 containerd[1573]: time="2025-09-04T04:19:56.155456200Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7q5r5,Uid:586c1b6d-b550-4d19-9b28-3936c94d31f1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"500d78a3ffa34477a88fe2d0e1c71b0fe3638471721cdd844bf42844c6cc97f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.155823 kubelet[2745]: E0904 04:19:56.155768 2745 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"500d78a3ffa34477a88fe2d0e1c71b0fe3638471721cdd844bf42844c6cc97f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.155877 kubelet[2745]: E0904 04:19:56.155860 2745 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"500d78a3ffa34477a88fe2d0e1c71b0fe3638471721cdd844bf42844c6cc97f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/csi-node-driver-7q5r5" Sep 4 04:19:56.155906 kubelet[2745]: E0904 04:19:56.155888 2745 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"500d78a3ffa34477a88fe2d0e1c71b0fe3638471721cdd844bf42844c6cc97f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7q5r5" Sep 4 04:19:56.155995 kubelet[2745]: E0904 04:19:56.155960 2745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7q5r5_calico-system(586c1b6d-b550-4d19-9b28-3936c94d31f1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7q5r5_calico-system(586c1b6d-b550-4d19-9b28-3936c94d31f1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"500d78a3ffa34477a88fe2d0e1c71b0fe3638471721cdd844bf42844c6cc97f1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7q5r5" podUID="586c1b6d-b550-4d19-9b28-3936c94d31f1" Sep 4 04:19:56.177376 containerd[1573]: time="2025-09-04T04:19:56.177323002Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 4 04:19:56.254879 containerd[1573]: time="2025-09-04T04:19:56.254822126Z" level=error msg="Failed to destroy network for sandbox \"425bda70588c9b5d2765ca91c71f8b6c39edb3cb48641d322586a50c6a6a1a47\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.256674 containerd[1573]: time="2025-09-04T04:19:56.256648843Z" level=error msg="Failed to destroy network for sandbox 
\"0a8860ea99ae07aa98b581d77df360265bc6f531ee2074cde74d9ceb2bc0841b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.258451 containerd[1573]: time="2025-09-04T04:19:56.258363160Z" level=error msg="Failed to destroy network for sandbox \"6e09d3ae02b3957d3a63179423b99476ba0fbd06823e27c410e5086372ef21cb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.258893 systemd[1]: run-netns-cni\x2d1eea4b3d\x2d5db9\x2d8e55\x2d3871\x2d6878cb478a15.mount: Deactivated successfully. Sep 4 04:19:56.262928 containerd[1573]: time="2025-09-04T04:19:56.261957705Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9444b947f-288sl,Uid:0c459fef-8814-434e-82d4-dedf8b1c5faa,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a8860ea99ae07aa98b581d77df360265bc6f531ee2074cde74d9ceb2bc0841b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.263109 systemd[1]: run-netns-cni\x2dd1487ad9\x2d21a0\x2db49a\x2d667e\x2d133a9e4d0723.mount: Deactivated successfully. Sep 4 04:19:56.263219 systemd[1]: run-netns-cni\x2d29dc796b\x2d5176\x2d0589\x2d439a\x2dfe5121b449ac.mount: Deactivated successfully. 
Sep 4 04:19:56.263458 containerd[1573]: time="2025-09-04T04:19:56.263343245Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66557b5466-5h544,Uid:206d3408-826b-403f-9856-5b773f0ee6ff,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"425bda70588c9b5d2765ca91c71f8b6c39edb3cb48641d322586a50c6a6a1a47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.264279 kubelet[2745]: E0904 04:19:56.263736 2745 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"425bda70588c9b5d2765ca91c71f8b6c39edb3cb48641d322586a50c6a6a1a47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.264279 kubelet[2745]: E0904 04:19:56.263806 2745 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"425bda70588c9b5d2765ca91c71f8b6c39edb3cb48641d322586a50c6a6a1a47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66557b5466-5h544" Sep 4 04:19:56.264279 kubelet[2745]: E0904 04:19:56.263827 2745 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"425bda70588c9b5d2765ca91c71f8b6c39edb3cb48641d322586a50c6a6a1a47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-66557b5466-5h544" Sep 4 04:19:56.264420 kubelet[2745]: E0904 04:19:56.263872 2745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-66557b5466-5h544_calico-apiserver(206d3408-826b-403f-9856-5b773f0ee6ff)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-66557b5466-5h544_calico-apiserver(206d3408-826b-403f-9856-5b773f0ee6ff)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"425bda70588c9b5d2765ca91c71f8b6c39edb3cb48641d322586a50c6a6a1a47\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-66557b5466-5h544" podUID="206d3408-826b-403f-9856-5b773f0ee6ff" Sep 4 04:19:56.264420 kubelet[2745]: E0904 04:19:56.264148 2745 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a8860ea99ae07aa98b581d77df360265bc6f531ee2074cde74d9ceb2bc0841b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.264420 kubelet[2745]: E0904 04:19:56.264169 2745 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a8860ea99ae07aa98b581d77df360265bc6f531ee2074cde74d9ceb2bc0841b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-9444b947f-288sl" Sep 4 04:19:56.264520 kubelet[2745]: E0904 04:19:56.264183 2745 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown 
desc = failed to setup network for sandbox \"0a8860ea99ae07aa98b581d77df360265bc6f531ee2074cde74d9ceb2bc0841b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-9444b947f-288sl" Sep 4 04:19:56.264520 kubelet[2745]: E0904 04:19:56.264205 2745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-9444b947f-288sl_calico-system(0c459fef-8814-434e-82d4-dedf8b1c5faa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-9444b947f-288sl_calico-system(0c459fef-8814-434e-82d4-dedf8b1c5faa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0a8860ea99ae07aa98b581d77df360265bc6f531ee2074cde74d9ceb2bc0841b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-9444b947f-288sl" podUID="0c459fef-8814-434e-82d4-dedf8b1c5faa" Sep 4 04:19:56.265640 containerd[1573]: time="2025-09-04T04:19:56.265567208Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-846495449b-nhdh2,Uid:a21fd64b-400b-4064-b1d8-df6a0495db00,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6e09d3ae02b3957d3a63179423b99476ba0fbd06823e27c410e5086372ef21cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.266000 kubelet[2745]: E0904 04:19:56.265840 2745 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"6e09d3ae02b3957d3a63179423b99476ba0fbd06823e27c410e5086372ef21cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.266000 kubelet[2745]: E0904 04:19:56.265900 2745 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6e09d3ae02b3957d3a63179423b99476ba0fbd06823e27c410e5086372ef21cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-846495449b-nhdh2" Sep 4 04:19:56.266000 kubelet[2745]: E0904 04:19:56.265914 2745 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6e09d3ae02b3957d3a63179423b99476ba0fbd06823e27c410e5086372ef21cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-846495449b-nhdh2" Sep 4 04:19:56.266227 kubelet[2745]: E0904 04:19:56.265943 2745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-846495449b-nhdh2_calico-system(a21fd64b-400b-4064-b1d8-df6a0495db00)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-846495449b-nhdh2_calico-system(a21fd64b-400b-4064-b1d8-df6a0495db00)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6e09d3ae02b3957d3a63179423b99476ba0fbd06823e27c410e5086372ef21cb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-846495449b-nhdh2" 
podUID="a21fd64b-400b-4064-b1d8-df6a0495db00" Sep 4 04:19:56.272107 containerd[1573]: time="2025-09-04T04:19:56.271568660Z" level=error msg="Failed to destroy network for sandbox \"14a92d368efcf6100943fc947a448f869c603a16e45217edc0b71c5a5855b7e9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.275517 systemd[1]: run-netns-cni\x2dcf57e830\x2d666a\x2d8043\x2dec53\x2d70be18804417.mount: Deactivated successfully. Sep 4 04:19:56.277046 containerd[1573]: time="2025-09-04T04:19:56.276239895Z" level=error msg="Failed to destroy network for sandbox \"1fa5787f655dc2206208a7644ae1c4a654a6c14ec54e607ab35dca28798c4a19\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.277567 containerd[1573]: time="2025-09-04T04:19:56.277521961Z" level=error msg="Failed to destroy network for sandbox \"6cf9affa5d200de466ff5a0c1d07a516cd523b45d0a88fefe56a87e0ddfb99a5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.278113 containerd[1573]: time="2025-09-04T04:19:56.278083655Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nfhxx,Uid:00c7f901-355b-43d3-894a-6d038427ef01,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fa5787f655dc2206208a7644ae1c4a654a6c14ec54e607ab35dca28798c4a19\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.278912 systemd[1]: 
run-netns-cni\x2d75b834ea\x2d014b\x2d5886\x2d9c30\x2d4e3bc47ca4f6.mount: Deactivated successfully. Sep 4 04:19:56.279503 kubelet[2745]: E0904 04:19:56.279378 2745 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fa5787f655dc2206208a7644ae1c4a654a6c14ec54e607ab35dca28798c4a19\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.279825 containerd[1573]: time="2025-09-04T04:19:56.279448907Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66557b5466-8jz66,Uid:6857fb4d-09bc-4243-9e00-1619a4fbac6b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"14a92d368efcf6100943fc947a448f869c603a16e45217edc0b71c5a5855b7e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.279874 kubelet[2745]: E0904 04:19:56.279636 2745 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fa5787f655dc2206208a7644ae1c4a654a6c14ec54e607ab35dca28798c4a19\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-nfhxx" Sep 4 04:19:56.279874 kubelet[2745]: E0904 04:19:56.279669 2745 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fa5787f655dc2206208a7644ae1c4a654a6c14ec54e607ab35dca28798c4a19\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-nfhxx" Sep 4 04:19:56.279874 kubelet[2745]: E0904 04:19:56.279688 2745 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14a92d368efcf6100943fc947a448f869c603a16e45217edc0b71c5a5855b7e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.279945 kubelet[2745]: E0904 04:19:56.279719 2745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-nfhxx_kube-system(00c7f901-355b-43d3-894a-6d038427ef01)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-nfhxx_kube-system(00c7f901-355b-43d3-894a-6d038427ef01)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1fa5787f655dc2206208a7644ae1c4a654a6c14ec54e607ab35dca28798c4a19\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-nfhxx" podUID="00c7f901-355b-43d3-894a-6d038427ef01" Sep 4 04:19:56.279945 kubelet[2745]: E0904 04:19:56.279745 2745 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14a92d368efcf6100943fc947a448f869c603a16e45217edc0b71c5a5855b7e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66557b5466-8jz66" Sep 4 04:19:56.279945 kubelet[2745]: E0904 04:19:56.279784 2745 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"14a92d368efcf6100943fc947a448f869c603a16e45217edc0b71c5a5855b7e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66557b5466-8jz66" Sep 4 04:19:56.280530 kubelet[2745]: E0904 04:19:56.279833 2745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-66557b5466-8jz66_calico-apiserver(6857fb4d-09bc-4243-9e00-1619a4fbac6b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-66557b5466-8jz66_calico-apiserver(6857fb4d-09bc-4243-9e00-1619a4fbac6b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"14a92d368efcf6100943fc947a448f869c603a16e45217edc0b71c5a5855b7e9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-66557b5466-8jz66" podUID="6857fb4d-09bc-4243-9e00-1619a4fbac6b" Sep 4 04:19:56.280996 containerd[1573]: time="2025-09-04T04:19:56.280956456Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-lxjhq,Uid:ef72b028-b203-4e62-b9a0-96331c5964b9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6cf9affa5d200de466ff5a0c1d07a516cd523b45d0a88fefe56a87e0ddfb99a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.281396 kubelet[2745]: E0904 04:19:56.281364 2745 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"6cf9affa5d200de466ff5a0c1d07a516cd523b45d0a88fefe56a87e0ddfb99a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 04:19:56.281457 kubelet[2745]: E0904 04:19:56.281408 2745 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6cf9affa5d200de466ff5a0c1d07a516cd523b45d0a88fefe56a87e0ddfb99a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-lxjhq" Sep 4 04:19:56.281457 kubelet[2745]: E0904 04:19:56.281426 2745 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6cf9affa5d200de466ff5a0c1d07a516cd523b45d0a88fefe56a87e0ddfb99a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-lxjhq" Sep 4 04:19:56.281507 kubelet[2745]: E0904 04:19:56.281462 2745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-lxjhq_calico-system(ef72b028-b203-4e62-b9a0-96331c5964b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-lxjhq_calico-system(ef72b028-b203-4e62-b9a0-96331c5964b9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6cf9affa5d200de466ff5a0c1d07a516cd523b45d0a88fefe56a87e0ddfb99a5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-lxjhq" 
podUID="ef72b028-b203-4e62-b9a0-96331c5964b9" Sep 4 04:19:57.235203 systemd[1]: run-netns-cni\x2d35f57c33\x2dfd5b\x2d25fa\x2d3a5b\x2da8b75af8f0eb.mount: Deactivated successfully. Sep 4 04:20:01.160453 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1562975346.mount: Deactivated successfully. Sep 4 04:20:02.262284 containerd[1573]: time="2025-09-04T04:20:02.262191565Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:02.273905 containerd[1573]: time="2025-09-04T04:20:02.262610001Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 4 04:20:02.273905 containerd[1573]: time="2025-09-04T04:20:02.265400085Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:02.274581 containerd[1573]: time="2025-09-04T04:20:02.270338289Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 6.092936089s" Sep 4 04:20:02.274581 containerd[1573]: time="2025-09-04T04:20:02.274085599Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 4 04:20:02.276128 containerd[1573]: time="2025-09-04T04:20:02.276069190Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:02.284973 containerd[1573]: 
time="2025-09-04T04:20:02.284856725Z" level=info msg="CreateContainer within sandbox \"e112d3bec1859152a9a793c47b7a7ba2115b24673eb9c7640a3d6c0aa497e623\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 4 04:20:02.305991 containerd[1573]: time="2025-09-04T04:20:02.304255166Z" level=info msg="Container c99448522e661cf8a9ad37633c93830b9159fdc14588033e7ea53bebfbd5030c: CDI devices from CRI Config.CDIDevices: []" Sep 4 04:20:02.330676 containerd[1573]: time="2025-09-04T04:20:02.330594939Z" level=info msg="CreateContainer within sandbox \"e112d3bec1859152a9a793c47b7a7ba2115b24673eb9c7640a3d6c0aa497e623\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"c99448522e661cf8a9ad37633c93830b9159fdc14588033e7ea53bebfbd5030c\"" Sep 4 04:20:02.331384 containerd[1573]: time="2025-09-04T04:20:02.331348874Z" level=info msg="StartContainer for \"c99448522e661cf8a9ad37633c93830b9159fdc14588033e7ea53bebfbd5030c\"" Sep 4 04:20:02.332992 containerd[1573]: time="2025-09-04T04:20:02.332961089Z" level=info msg="connecting to shim c99448522e661cf8a9ad37633c93830b9159fdc14588033e7ea53bebfbd5030c" address="unix:///run/containerd/s/53cb9382ef46e2548683df883c3105b86eb3a5e9f52d1ea8af019798ea554017" protocol=ttrpc version=3 Sep 4 04:20:02.356213 systemd[1]: Started cri-containerd-c99448522e661cf8a9ad37633c93830b9159fdc14588033e7ea53bebfbd5030c.scope - libcontainer container c99448522e661cf8a9ad37633c93830b9159fdc14588033e7ea53bebfbd5030c. Sep 4 04:20:02.411091 containerd[1573]: time="2025-09-04T04:20:02.410975232Z" level=info msg="StartContainer for \"c99448522e661cf8a9ad37633c93830b9159fdc14588033e7ea53bebfbd5030c\" returns successfully" Sep 4 04:20:02.494838 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 4 04:20:02.495764 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 4 04:20:02.715698 kubelet[2745]: I0904 04:20:02.715622 2745 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h67w\" (UniqueName: \"kubernetes.io/projected/a21fd64b-400b-4064-b1d8-df6a0495db00-kube-api-access-8h67w\") pod \"a21fd64b-400b-4064-b1d8-df6a0495db00\" (UID: \"a21fd64b-400b-4064-b1d8-df6a0495db00\") " Sep 4 04:20:02.715698 kubelet[2745]: I0904 04:20:02.715696 2745 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a21fd64b-400b-4064-b1d8-df6a0495db00-whisker-ca-bundle\") pod \"a21fd64b-400b-4064-b1d8-df6a0495db00\" (UID: \"a21fd64b-400b-4064-b1d8-df6a0495db00\") " Sep 4 04:20:02.716297 kubelet[2745]: I0904 04:20:02.715720 2745 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a21fd64b-400b-4064-b1d8-df6a0495db00-whisker-backend-key-pair\") pod \"a21fd64b-400b-4064-b1d8-df6a0495db00\" (UID: \"a21fd64b-400b-4064-b1d8-df6a0495db00\") " Sep 4 04:20:02.722619 kubelet[2745]: I0904 04:20:02.722555 2745 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a21fd64b-400b-4064-b1d8-df6a0495db00-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "a21fd64b-400b-4064-b1d8-df6a0495db00" (UID: "a21fd64b-400b-4064-b1d8-df6a0495db00"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 4 04:20:02.725954 kubelet[2745]: I0904 04:20:02.725912 2745 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21fd64b-400b-4064-b1d8-df6a0495db00-kube-api-access-8h67w" (OuterVolumeSpecName: "kube-api-access-8h67w") pod "a21fd64b-400b-4064-b1d8-df6a0495db00" (UID: "a21fd64b-400b-4064-b1d8-df6a0495db00"). InnerVolumeSpecName "kube-api-access-8h67w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 4 04:20:02.726020 kubelet[2745]: I0904 04:20:02.725895 2745 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a21fd64b-400b-4064-b1d8-df6a0495db00-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "a21fd64b-400b-4064-b1d8-df6a0495db00" (UID: "a21fd64b-400b-4064-b1d8-df6a0495db00"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 4 04:20:02.816459 kubelet[2745]: I0904 04:20:02.816393 2745 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a21fd64b-400b-4064-b1d8-df6a0495db00-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 4 04:20:02.816459 kubelet[2745]: I0904 04:20:02.816437 2745 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a21fd64b-400b-4064-b1d8-df6a0495db00-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 4 04:20:02.816459 kubelet[2745]: I0904 04:20:02.816447 2745 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8h67w\" (UniqueName: \"kubernetes.io/projected/a21fd64b-400b-4064-b1d8-df6a0495db00-kube-api-access-8h67w\") on node \"localhost\" DevicePath \"\"" Sep 4 04:20:03.093008 systemd[1]: Removed slice kubepods-besteffort-poda21fd64b_400b_4064_b1d8_df6a0495db00.slice - libcontainer container kubepods-besteffort-poda21fd64b_400b_4064_b1d8_df6a0495db00.slice. 
Sep 4 04:20:03.217670 kubelet[2745]: I0904 04:20:03.217582 2745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-j88qj" podStartSLOduration=1.8145039120000002 podStartE2EDuration="19.217545754s" podCreationTimestamp="2025-09-04 04:19:44 +0000 UTC" firstStartedPulling="2025-09-04 04:19:44.872188644 +0000 UTC m=+17.896226571" lastFinishedPulling="2025-09-04 04:20:02.275230476 +0000 UTC m=+35.299268413" observedRunningTime="2025-09-04 04:20:03.216699448 +0000 UTC m=+36.240737405" watchObservedRunningTime="2025-09-04 04:20:03.217545754 +0000 UTC m=+36.241583691" Sep 4 04:20:03.275444 systemd[1]: Created slice kubepods-besteffort-pod27cb634e_c103_41f3_bfa9_ce4b26b14497.slice - libcontainer container kubepods-besteffort-pod27cb634e_c103_41f3_bfa9_ce4b26b14497.slice. Sep 4 04:20:03.283842 systemd[1]: var-lib-kubelet-pods-a21fd64b\x2d400b\x2d4064\x2db1d8\x2ddf6a0495db00-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d8h67w.mount: Deactivated successfully. Sep 4 04:20:03.284022 systemd[1]: var-lib-kubelet-pods-a21fd64b\x2d400b\x2d4064\x2db1d8\x2ddf6a0495db00-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Sep 4 04:20:03.320697 kubelet[2745]: I0904 04:20:03.320623 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x59xn\" (UniqueName: \"kubernetes.io/projected/27cb634e-c103-41f3-bfa9-ce4b26b14497-kube-api-access-x59xn\") pod \"whisker-68c69cccff-fn8ng\" (UID: \"27cb634e-c103-41f3-bfa9-ce4b26b14497\") " pod="calico-system/whisker-68c69cccff-fn8ng" Sep 4 04:20:03.320697 kubelet[2745]: I0904 04:20:03.320686 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/27cb634e-c103-41f3-bfa9-ce4b26b14497-whisker-backend-key-pair\") pod \"whisker-68c69cccff-fn8ng\" (UID: \"27cb634e-c103-41f3-bfa9-ce4b26b14497\") " pod="calico-system/whisker-68c69cccff-fn8ng" Sep 4 04:20:03.320697 kubelet[2745]: I0904 04:20:03.320711 2745 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27cb634e-c103-41f3-bfa9-ce4b26b14497-whisker-ca-bundle\") pod \"whisker-68c69cccff-fn8ng\" (UID: \"27cb634e-c103-41f3-bfa9-ce4b26b14497\") " pod="calico-system/whisker-68c69cccff-fn8ng" Sep 4 04:20:03.580285 containerd[1573]: time="2025-09-04T04:20:03.580203714Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-68c69cccff-fn8ng,Uid:27cb634e-c103-41f3-bfa9-ce4b26b14497,Namespace:calico-system,Attempt:0,}" Sep 4 04:20:03.792242 systemd-networkd[1477]: calid13363a982a: Link UP Sep 4 04:20:03.792460 systemd-networkd[1477]: calid13363a982a: Gained carrier Sep 4 04:20:03.812564 containerd[1573]: 2025-09-04 04:20:03.630 [INFO][3868] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 4 04:20:03.812564 containerd[1573]: 2025-09-04 04:20:03.650 [INFO][3868] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--68c69cccff--fn8ng-eth0 
whisker-68c69cccff- calico-system 27cb634e-c103-41f3-bfa9-ce4b26b14497 861 0 2025-09-04 04:20:03 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:68c69cccff projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-68c69cccff-fn8ng eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calid13363a982a [] [] }} ContainerID="2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" Namespace="calico-system" Pod="whisker-68c69cccff-fn8ng" WorkloadEndpoint="localhost-k8s-whisker--68c69cccff--fn8ng-" Sep 4 04:20:03.812564 containerd[1573]: 2025-09-04 04:20:03.650 [INFO][3868] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" Namespace="calico-system" Pod="whisker-68c69cccff-fn8ng" WorkloadEndpoint="localhost-k8s-whisker--68c69cccff--fn8ng-eth0" Sep 4 04:20:03.812564 containerd[1573]: 2025-09-04 04:20:03.726 [INFO][3883] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" HandleID="k8s-pod-network.2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" Workload="localhost-k8s-whisker--68c69cccff--fn8ng-eth0" Sep 4 04:20:03.812886 containerd[1573]: 2025-09-04 04:20:03.727 [INFO][3883] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" HandleID="k8s-pod-network.2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" Workload="localhost-k8s-whisker--68c69cccff--fn8ng-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003856f0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-68c69cccff-fn8ng", "timestamp":"2025-09-04 04:20:03.726969907 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 04:20:03.812886 containerd[1573]: 2025-09-04 04:20:03.727 [INFO][3883] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 04:20:03.812886 containerd[1573]: 2025-09-04 04:20:03.727 [INFO][3883] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 04:20:03.812886 containerd[1573]: 2025-09-04 04:20:03.728 [INFO][3883] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 04:20:03.812886 containerd[1573]: 2025-09-04 04:20:03.739 [INFO][3883] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" host="localhost" Sep 4 04:20:03.812886 containerd[1573]: 2025-09-04 04:20:03.747 [INFO][3883] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 04:20:03.812886 containerd[1573]: 2025-09-04 04:20:03.751 [INFO][3883] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 04:20:03.812886 containerd[1573]: 2025-09-04 04:20:03.753 [INFO][3883] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:03.812886 containerd[1573]: 2025-09-04 04:20:03.756 [INFO][3883] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:03.812886 containerd[1573]: 2025-09-04 04:20:03.756 [INFO][3883] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" host="localhost" Sep 4 04:20:03.813165 containerd[1573]: 2025-09-04 04:20:03.758 [INFO][3883] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d Sep 4 04:20:03.813165 containerd[1573]: 
2025-09-04 04:20:03.768 [INFO][3883] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" host="localhost" Sep 4 04:20:03.813165 containerd[1573]: 2025-09-04 04:20:03.779 [INFO][3883] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" host="localhost" Sep 4 04:20:03.813165 containerd[1573]: 2025-09-04 04:20:03.779 [INFO][3883] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" host="localhost" Sep 4 04:20:03.813165 containerd[1573]: 2025-09-04 04:20:03.779 [INFO][3883] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 04:20:03.813165 containerd[1573]: 2025-09-04 04:20:03.779 [INFO][3883] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" HandleID="k8s-pod-network.2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" Workload="localhost-k8s-whisker--68c69cccff--fn8ng-eth0" Sep 4 04:20:03.813308 containerd[1573]: 2025-09-04 04:20:03.783 [INFO][3868] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" Namespace="calico-system" Pod="whisker-68c69cccff-fn8ng" WorkloadEndpoint="localhost-k8s-whisker--68c69cccff--fn8ng-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--68c69cccff--fn8ng-eth0", GenerateName:"whisker-68c69cccff-", Namespace:"calico-system", SelfLink:"", UID:"27cb634e-c103-41f3-bfa9-ce4b26b14497", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2025, time.September, 
4, 4, 20, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"68c69cccff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-68c69cccff-fn8ng", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calid13363a982a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:03.813308 containerd[1573]: 2025-09-04 04:20:03.783 [INFO][3868] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" Namespace="calico-system" Pod="whisker-68c69cccff-fn8ng" WorkloadEndpoint="localhost-k8s-whisker--68c69cccff--fn8ng-eth0" Sep 4 04:20:03.813383 containerd[1573]: 2025-09-04 04:20:03.783 [INFO][3868] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid13363a982a ContainerID="2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" Namespace="calico-system" Pod="whisker-68c69cccff-fn8ng" WorkloadEndpoint="localhost-k8s-whisker--68c69cccff--fn8ng-eth0" Sep 4 04:20:03.813383 containerd[1573]: 2025-09-04 04:20:03.792 [INFO][3868] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" Namespace="calico-system" Pod="whisker-68c69cccff-fn8ng" 
WorkloadEndpoint="localhost-k8s-whisker--68c69cccff--fn8ng-eth0" Sep 4 04:20:03.813440 containerd[1573]: 2025-09-04 04:20:03.792 [INFO][3868] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" Namespace="calico-system" Pod="whisker-68c69cccff-fn8ng" WorkloadEndpoint="localhost-k8s-whisker--68c69cccff--fn8ng-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--68c69cccff--fn8ng-eth0", GenerateName:"whisker-68c69cccff-", Namespace:"calico-system", SelfLink:"", UID:"27cb634e-c103-41f3-bfa9-ce4b26b14497", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 4, 20, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"68c69cccff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d", Pod:"whisker-68c69cccff-fn8ng", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calid13363a982a", MAC:"f2:f6:e6:33:52:ed", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:03.813496 containerd[1573]: 2025-09-04 04:20:03.807 [INFO][3868] cni-plugin/k8s.go 532: 
Wrote updated endpoint to datastore ContainerID="2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" Namespace="calico-system" Pod="whisker-68c69cccff-fn8ng" WorkloadEndpoint="localhost-k8s-whisker--68c69cccff--fn8ng-eth0" Sep 4 04:20:04.215095 containerd[1573]: time="2025-09-04T04:20:04.214352328Z" level=info msg="connecting to shim 2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d" address="unix:///run/containerd/s/8e930e038077d9e07ca355b70743c78a1a7a7ef62974b4625808122e6605ef50" namespace=k8s.io protocol=ttrpc version=3 Sep 4 04:20:04.291455 systemd[1]: Started cri-containerd-2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d.scope - libcontainer container 2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d. Sep 4 04:20:04.318879 systemd-resolved[1410]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 04:20:04.411357 containerd[1573]: time="2025-09-04T04:20:04.411302119Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-68c69cccff-fn8ng,Uid:27cb634e-c103-41f3-bfa9-ce4b26b14497,Namespace:calico-system,Attempt:0,} returns sandbox id \"2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d\"" Sep 4 04:20:04.415466 containerd[1573]: time="2025-09-04T04:20:04.415407972Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 4 04:20:04.916495 systemd-networkd[1477]: vxlan.calico: Link UP Sep 4 04:20:04.916510 systemd-networkd[1477]: vxlan.calico: Gained carrier Sep 4 04:20:05.072464 kubelet[2745]: I0904 04:20:05.072387 2745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21fd64b-400b-4064-b1d8-df6a0495db00" path="/var/lib/kubelet/pods/a21fd64b-400b-4064-b1d8-df6a0495db00/volumes" Sep 4 04:20:05.248260 systemd-networkd[1477]: calid13363a982a: Gained IPv6LL Sep 4 04:20:06.449213 containerd[1573]: time="2025-09-04T04:20:06.449032372Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:06.450544 containerd[1573]: time="2025-09-04T04:20:06.450506427Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 4 04:20:06.452032 containerd[1573]: time="2025-09-04T04:20:06.451976544Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:06.454632 containerd[1573]: time="2025-09-04T04:20:06.454583264Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:06.455513 containerd[1573]: time="2025-09-04T04:20:06.455468104Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 2.040006953s" Sep 4 04:20:06.455513 containerd[1573]: time="2025-09-04T04:20:06.455509531Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 4 04:20:06.458114 containerd[1573]: time="2025-09-04T04:20:06.458043344Z" level=info msg="CreateContainer within sandbox \"2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 4 04:20:06.464332 systemd-networkd[1477]: vxlan.calico: Gained IPv6LL Sep 4 04:20:06.492830 containerd[1573]: time="2025-09-04T04:20:06.492110837Z" level=info msg="Container 
b8be7bbbb0fb1c93ecba0f87eb29097548388dca89baf9476d75e91c4b39f7e7: CDI devices from CRI Config.CDIDevices: []" Sep 4 04:20:06.503501 containerd[1573]: time="2025-09-04T04:20:06.503422835Z" level=info msg="CreateContainer within sandbox \"2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"b8be7bbbb0fb1c93ecba0f87eb29097548388dca89baf9476d75e91c4b39f7e7\"" Sep 4 04:20:06.504114 containerd[1573]: time="2025-09-04T04:20:06.504075580Z" level=info msg="StartContainer for \"b8be7bbbb0fb1c93ecba0f87eb29097548388dca89baf9476d75e91c4b39f7e7\"" Sep 4 04:20:06.505248 containerd[1573]: time="2025-09-04T04:20:06.505223854Z" level=info msg="connecting to shim b8be7bbbb0fb1c93ecba0f87eb29097548388dca89baf9476d75e91c4b39f7e7" address="unix:///run/containerd/s/8e930e038077d9e07ca355b70743c78a1a7a7ef62974b4625808122e6605ef50" protocol=ttrpc version=3 Sep 4 04:20:06.553543 systemd[1]: Started cri-containerd-b8be7bbbb0fb1c93ecba0f87eb29097548388dca89baf9476d75e91c4b39f7e7.scope - libcontainer container b8be7bbbb0fb1c93ecba0f87eb29097548388dca89baf9476d75e91c4b39f7e7. Sep 4 04:20:06.636716 containerd[1573]: time="2025-09-04T04:20:06.636656540Z" level=info msg="StartContainer for \"b8be7bbbb0fb1c93ecba0f87eb29097548388dca89baf9476d75e91c4b39f7e7\" returns successfully" Sep 4 04:20:06.638695 containerd[1573]: time="2025-09-04T04:20:06.638660329Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 4 04:20:07.031153 systemd[1]: Started sshd@7-10.0.0.54:22-10.0.0.1:33306.service - OpenSSH per-connection server daemon (10.0.0.1:33306). 
Sep 4 04:20:07.074500 containerd[1573]: time="2025-09-04T04:20:07.074415718Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7q5r5,Uid:586c1b6d-b550-4d19-9b28-3936c94d31f1,Namespace:calico-system,Attempt:0,}" Sep 4 04:20:07.079087 containerd[1573]: time="2025-09-04T04:20:07.077081098Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9444b947f-288sl,Uid:0c459fef-8814-434e-82d4-dedf8b1c5faa,Namespace:calico-system,Attempt:0,}" Sep 4 04:20:07.112601 sshd[4187]: Accepted publickey for core from 10.0.0.1 port 33306 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U Sep 4 04:20:07.114379 sshd-session[4187]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 04:20:07.121144 systemd-logind[1521]: New session 8 of user core. Sep 4 04:20:07.130506 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 4 04:20:07.237979 systemd-networkd[1477]: cali88e3b973d25: Link UP Sep 4 04:20:07.238987 systemd-networkd[1477]: cali88e3b973d25: Gained carrier Sep 4 04:20:07.263796 containerd[1573]: 2025-09-04 04:20:07.133 [INFO][4190] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--7q5r5-eth0 csi-node-driver- calico-system 586c1b6d-b550-4d19-9b28-3936c94d31f1 684 0 2025-09-04 04:19:44 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-7q5r5 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali88e3b973d25 [] [] }} ContainerID="b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" Namespace="calico-system" Pod="csi-node-driver-7q5r5" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--7q5r5-" Sep 4 04:20:07.263796 containerd[1573]: 2025-09-04 04:20:07.133 [INFO][4190] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" Namespace="calico-system" Pod="csi-node-driver-7q5r5" WorkloadEndpoint="localhost-k8s-csi--node--driver--7q5r5-eth0" Sep 4 04:20:07.263796 containerd[1573]: 2025-09-04 04:20:07.179 [INFO][4220] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" HandleID="k8s-pod-network.b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" Workload="localhost-k8s-csi--node--driver--7q5r5-eth0" Sep 4 04:20:07.264209 containerd[1573]: 2025-09-04 04:20:07.180 [INFO][4220] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" HandleID="k8s-pod-network.b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" Workload="localhost-k8s-csi--node--driver--7q5r5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000188850), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-7q5r5", "timestamp":"2025-09-04 04:20:07.179787306 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 04:20:07.264209 containerd[1573]: 2025-09-04 04:20:07.180 [INFO][4220] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 04:20:07.264209 containerd[1573]: 2025-09-04 04:20:07.180 [INFO][4220] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 04:20:07.264209 containerd[1573]: 2025-09-04 04:20:07.180 [INFO][4220] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 04:20:07.264209 containerd[1573]: 2025-09-04 04:20:07.188 [INFO][4220] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" host="localhost" Sep 4 04:20:07.264209 containerd[1573]: 2025-09-04 04:20:07.200 [INFO][4220] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 04:20:07.264209 containerd[1573]: 2025-09-04 04:20:07.206 [INFO][4220] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 04:20:07.264209 containerd[1573]: 2025-09-04 04:20:07.208 [INFO][4220] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:07.264209 containerd[1573]: 2025-09-04 04:20:07.210 [INFO][4220] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:07.264209 containerd[1573]: 2025-09-04 04:20:07.210 [INFO][4220] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" host="localhost" Sep 4 04:20:07.264471 containerd[1573]: 2025-09-04 04:20:07.212 [INFO][4220] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec Sep 4 04:20:07.264471 containerd[1573]: 2025-09-04 04:20:07.219 [INFO][4220] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" host="localhost" Sep 4 04:20:07.264471 containerd[1573]: 2025-09-04 04:20:07.226 [INFO][4220] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" host="localhost" Sep 4 04:20:07.264471 containerd[1573]: 2025-09-04 04:20:07.226 [INFO][4220] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" host="localhost" Sep 4 04:20:07.264471 containerd[1573]: 2025-09-04 04:20:07.226 [INFO][4220] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 04:20:07.264471 containerd[1573]: 2025-09-04 04:20:07.226 [INFO][4220] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" HandleID="k8s-pod-network.b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" Workload="localhost-k8s-csi--node--driver--7q5r5-eth0" Sep 4 04:20:07.264663 containerd[1573]: 2025-09-04 04:20:07.233 [INFO][4190] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" Namespace="calico-system" Pod="csi-node-driver-7q5r5" WorkloadEndpoint="localhost-k8s-csi--node--driver--7q5r5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7q5r5-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"586c1b6d-b550-4d19-9b28-3936c94d31f1", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 4, 19, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-7q5r5", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali88e3b973d25", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:07.264720 containerd[1573]: 2025-09-04 04:20:07.233 [INFO][4190] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" Namespace="calico-system" Pod="csi-node-driver-7q5r5" WorkloadEndpoint="localhost-k8s-csi--node--driver--7q5r5-eth0" Sep 4 04:20:07.264720 containerd[1573]: 2025-09-04 04:20:07.233 [INFO][4190] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali88e3b973d25 ContainerID="b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" Namespace="calico-system" Pod="csi-node-driver-7q5r5" WorkloadEndpoint="localhost-k8s-csi--node--driver--7q5r5-eth0" Sep 4 04:20:07.264720 containerd[1573]: 2025-09-04 04:20:07.240 [INFO][4190] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" Namespace="calico-system" Pod="csi-node-driver-7q5r5" WorkloadEndpoint="localhost-k8s-csi--node--driver--7q5r5-eth0" Sep 4 04:20:07.264787 containerd[1573]: 2025-09-04 04:20:07.240 [INFO][4190] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" 
Namespace="calico-system" Pod="csi-node-driver-7q5r5" WorkloadEndpoint="localhost-k8s-csi--node--driver--7q5r5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7q5r5-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"586c1b6d-b550-4d19-9b28-3936c94d31f1", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 4, 19, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec", Pod:"csi-node-driver-7q5r5", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali88e3b973d25", MAC:"2a:42:2f:46:f9:83", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:07.264838 containerd[1573]: 2025-09-04 04:20:07.255 [INFO][4190] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" Namespace="calico-system" Pod="csi-node-driver-7q5r5" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--7q5r5-eth0" Sep 4 04:20:07.346210 containerd[1573]: time="2025-09-04T04:20:07.346014992Z" level=info msg="connecting to shim b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec" address="unix:///run/containerd/s/aef3c6b5db6c3d6470c917690b8e27add067a7279d64652ee897049856f365f3" namespace=k8s.io protocol=ttrpc version=3 Sep 4 04:20:07.352247 sshd[4218]: Connection closed by 10.0.0.1 port 33306 Sep 4 04:20:07.352891 sshd-session[4187]: pam_unix(sshd:session): session closed for user core Sep 4 04:20:07.357291 systemd-networkd[1477]: caliea59ae673a4: Link UP Sep 4 04:20:07.358711 systemd-networkd[1477]: caliea59ae673a4: Gained carrier Sep 4 04:20:07.363196 systemd[1]: sshd@7-10.0.0.54:22-10.0.0.1:33306.service: Deactivated successfully. Sep 4 04:20:07.368664 systemd[1]: session-8.scope: Deactivated successfully. Sep 4 04:20:07.375425 systemd-logind[1521]: Session 8 logged out. Waiting for processes to exit. Sep 4 04:20:07.379937 systemd-logind[1521]: Removed session 8. 
Sep 4 04:20:07.398495 containerd[1573]: 2025-09-04 04:20:07.149 [INFO][4200] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--9444b947f--288sl-eth0 calico-kube-controllers-9444b947f- calico-system 0c459fef-8814-434e-82d4-dedf8b1c5faa 801 0 2025-09-04 04:19:44 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:9444b947f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-9444b947f-288sl eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] caliea59ae673a4 [] [] }} ContainerID="4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" Namespace="calico-system" Pod="calico-kube-controllers-9444b947f-288sl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9444b947f--288sl-" Sep 4 04:20:07.398495 containerd[1573]: 2025-09-04 04:20:07.149 [INFO][4200] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" Namespace="calico-system" Pod="calico-kube-controllers-9444b947f-288sl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9444b947f--288sl-eth0" Sep 4 04:20:07.398495 containerd[1573]: 2025-09-04 04:20:07.194 [INFO][4227] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" HandleID="k8s-pod-network.4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" Workload="localhost-k8s-calico--kube--controllers--9444b947f--288sl-eth0" Sep 4 04:20:07.398331 systemd[1]: Started cri-containerd-b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec.scope - libcontainer container 
b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec. Sep 4 04:20:07.398950 containerd[1573]: 2025-09-04 04:20:07.194 [INFO][4227] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" HandleID="k8s-pod-network.4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" Workload="localhost-k8s-calico--kube--controllers--9444b947f--288sl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000494b30), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-9444b947f-288sl", "timestamp":"2025-09-04 04:20:07.194324443 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 04:20:07.398950 containerd[1573]: 2025-09-04 04:20:07.194 [INFO][4227] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 04:20:07.398950 containerd[1573]: 2025-09-04 04:20:07.226 [INFO][4227] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 04:20:07.398950 containerd[1573]: 2025-09-04 04:20:07.227 [INFO][4227] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 04:20:07.398950 containerd[1573]: 2025-09-04 04:20:07.289 [INFO][4227] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" host="localhost" Sep 4 04:20:07.398950 containerd[1573]: 2025-09-04 04:20:07.302 [INFO][4227] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 04:20:07.398950 containerd[1573]: 2025-09-04 04:20:07.315 [INFO][4227] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 04:20:07.398950 containerd[1573]: 2025-09-04 04:20:07.319 [INFO][4227] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:07.398950 containerd[1573]: 2025-09-04 04:20:07.321 [INFO][4227] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:07.398950 containerd[1573]: 2025-09-04 04:20:07.321 [INFO][4227] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" host="localhost" Sep 4 04:20:07.399283 containerd[1573]: 2025-09-04 04:20:07.324 [INFO][4227] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84 Sep 4 04:20:07.399283 containerd[1573]: 2025-09-04 04:20:07.330 [INFO][4227] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" host="localhost" Sep 4 04:20:07.399283 containerd[1573]: 2025-09-04 04:20:07.344 [INFO][4227] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" host="localhost" Sep 4 04:20:07.399283 containerd[1573]: 2025-09-04 04:20:07.344 [INFO][4227] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" host="localhost" Sep 4 04:20:07.399283 containerd[1573]: 2025-09-04 04:20:07.345 [INFO][4227] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 04:20:07.399283 containerd[1573]: 2025-09-04 04:20:07.346 [INFO][4227] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" HandleID="k8s-pod-network.4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" Workload="localhost-k8s-calico--kube--controllers--9444b947f--288sl-eth0" Sep 4 04:20:07.399410 containerd[1573]: 2025-09-04 04:20:07.350 [INFO][4200] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" Namespace="calico-system" Pod="calico-kube-controllers-9444b947f-288sl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9444b947f--288sl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--9444b947f--288sl-eth0", GenerateName:"calico-kube-controllers-9444b947f-", Namespace:"calico-system", SelfLink:"", UID:"0c459fef-8814-434e-82d4-dedf8b1c5faa", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 4, 19, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"9444b947f", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-9444b947f-288sl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"caliea59ae673a4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:07.399467 containerd[1573]: 2025-09-04 04:20:07.350 [INFO][4200] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" Namespace="calico-system" Pod="calico-kube-controllers-9444b947f-288sl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9444b947f--288sl-eth0" Sep 4 04:20:07.399467 containerd[1573]: 2025-09-04 04:20:07.351 [INFO][4200] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliea59ae673a4 ContainerID="4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" Namespace="calico-system" Pod="calico-kube-controllers-9444b947f-288sl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9444b947f--288sl-eth0" Sep 4 04:20:07.399467 containerd[1573]: 2025-09-04 04:20:07.360 [INFO][4200] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" Namespace="calico-system" Pod="calico-kube-controllers-9444b947f-288sl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9444b947f--288sl-eth0" Sep 4 04:20:07.399549 containerd[1573]: 2025-09-04 
04:20:07.360 [INFO][4200] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" Namespace="calico-system" Pod="calico-kube-controllers-9444b947f-288sl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9444b947f--288sl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--9444b947f--288sl-eth0", GenerateName:"calico-kube-controllers-9444b947f-", Namespace:"calico-system", SelfLink:"", UID:"0c459fef-8814-434e-82d4-dedf8b1c5faa", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 4, 19, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"9444b947f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84", Pod:"calico-kube-controllers-9444b947f-288sl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"caliea59ae673a4", MAC:"32:71:cf:87:e5:0a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:07.399603 containerd[1573]: 2025-09-04 
04:20:07.393 [INFO][4200] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" Namespace="calico-system" Pod="calico-kube-controllers-9444b947f-288sl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9444b947f--288sl-eth0" Sep 4 04:20:07.429719 systemd-resolved[1410]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 04:20:07.439839 containerd[1573]: time="2025-09-04T04:20:07.439775369Z" level=info msg="connecting to shim 4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84" address="unix:///run/containerd/s/2ac8bf1337be7b9d35902b351b23e90ac03e0ee4d35575065387e89816cc5552" namespace=k8s.io protocol=ttrpc version=3 Sep 4 04:20:07.454336 containerd[1573]: time="2025-09-04T04:20:07.454275428Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7q5r5,Uid:586c1b6d-b550-4d19-9b28-3936c94d31f1,Namespace:calico-system,Attempt:0,} returns sandbox id \"b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec\"" Sep 4 04:20:07.476366 systemd[1]: Started cri-containerd-4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84.scope - libcontainer container 4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84. 
Sep 4 04:20:07.491746 systemd-resolved[1410]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 04:20:07.526646 containerd[1573]: time="2025-09-04T04:20:07.526592343Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9444b947f-288sl,Uid:0c459fef-8814-434e-82d4-dedf8b1c5faa,Namespace:calico-system,Attempt:0,} returns sandbox id \"4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84\"" Sep 4 04:20:08.069898 containerd[1573]: time="2025-09-04T04:20:08.069765628Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nfhxx,Uid:00c7f901-355b-43d3-894a-6d038427ef01,Namespace:kube-system,Attempt:0,}" Sep 4 04:20:08.210361 systemd-networkd[1477]: cali069200c5e81: Link UP Sep 4 04:20:08.211160 systemd-networkd[1477]: cali069200c5e81: Gained carrier Sep 4 04:20:08.228767 containerd[1573]: 2025-09-04 04:20:08.130 [INFO][4364] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--nfhxx-eth0 coredns-668d6bf9bc- kube-system 00c7f901-355b-43d3-894a-6d038427ef01 800 0 2025-09-04 04:19:32 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-nfhxx eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali069200c5e81 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" Namespace="kube-system" Pod="coredns-668d6bf9bc-nfhxx" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nfhxx-" Sep 4 04:20:08.228767 containerd[1573]: 2025-09-04 04:20:08.130 [INFO][4364] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" 
Namespace="kube-system" Pod="coredns-668d6bf9bc-nfhxx" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nfhxx-eth0" Sep 4 04:20:08.228767 containerd[1573]: 2025-09-04 04:20:08.162 [INFO][4379] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" HandleID="k8s-pod-network.a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" Workload="localhost-k8s-coredns--668d6bf9bc--nfhxx-eth0" Sep 4 04:20:08.229134 containerd[1573]: 2025-09-04 04:20:08.163 [INFO][4379] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" HandleID="k8s-pod-network.a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" Workload="localhost-k8s-coredns--668d6bf9bc--nfhxx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000357600), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-nfhxx", "timestamp":"2025-09-04 04:20:08.162788178 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 04:20:08.229134 containerd[1573]: 2025-09-04 04:20:08.163 [INFO][4379] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 04:20:08.229134 containerd[1573]: 2025-09-04 04:20:08.163 [INFO][4379] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 04:20:08.229134 containerd[1573]: 2025-09-04 04:20:08.163 [INFO][4379] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 04:20:08.229134 containerd[1573]: 2025-09-04 04:20:08.172 [INFO][4379] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" host="localhost" Sep 4 04:20:08.229134 containerd[1573]: 2025-09-04 04:20:08.178 [INFO][4379] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 04:20:08.229134 containerd[1573]: 2025-09-04 04:20:08.184 [INFO][4379] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 04:20:08.229134 containerd[1573]: 2025-09-04 04:20:08.186 [INFO][4379] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:08.229134 containerd[1573]: 2025-09-04 04:20:08.189 [INFO][4379] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:08.229134 containerd[1573]: 2025-09-04 04:20:08.189 [INFO][4379] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" host="localhost" Sep 4 04:20:08.229852 containerd[1573]: 2025-09-04 04:20:08.191 [INFO][4379] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355 Sep 4 04:20:08.229852 containerd[1573]: 2025-09-04 04:20:08.197 [INFO][4379] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" host="localhost" Sep 4 04:20:08.229852 containerd[1573]: 2025-09-04 04:20:08.203 [INFO][4379] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" host="localhost" Sep 4 04:20:08.229852 containerd[1573]: 2025-09-04 04:20:08.203 [INFO][4379] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" host="localhost" Sep 4 04:20:08.229852 containerd[1573]: 2025-09-04 04:20:08.203 [INFO][4379] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 04:20:08.229852 containerd[1573]: 2025-09-04 04:20:08.203 [INFO][4379] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" HandleID="k8s-pod-network.a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" Workload="localhost-k8s-coredns--668d6bf9bc--nfhxx-eth0" Sep 4 04:20:08.230019 containerd[1573]: 2025-09-04 04:20:08.207 [INFO][4364] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" Namespace="kube-system" Pod="coredns-668d6bf9bc-nfhxx" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nfhxx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--nfhxx-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"00c7f901-355b-43d3-894a-6d038427ef01", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 4, 19, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-nfhxx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali069200c5e81", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:08.230142 containerd[1573]: 2025-09-04 04:20:08.207 [INFO][4364] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" Namespace="kube-system" Pod="coredns-668d6bf9bc-nfhxx" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nfhxx-eth0" Sep 4 04:20:08.230142 containerd[1573]: 2025-09-04 04:20:08.207 [INFO][4364] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali069200c5e81 ContainerID="a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" Namespace="kube-system" Pod="coredns-668d6bf9bc-nfhxx" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nfhxx-eth0" Sep 4 04:20:08.230142 containerd[1573]: 2025-09-04 04:20:08.210 [INFO][4364] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" Namespace="kube-system" Pod="coredns-668d6bf9bc-nfhxx" 
WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nfhxx-eth0" Sep 4 04:20:08.230231 containerd[1573]: 2025-09-04 04:20:08.213 [INFO][4364] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" Namespace="kube-system" Pod="coredns-668d6bf9bc-nfhxx" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nfhxx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--nfhxx-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"00c7f901-355b-43d3-894a-6d038427ef01", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 4, 19, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355", Pod:"coredns-668d6bf9bc-nfhxx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali069200c5e81", MAC:"5e:0d:28:14:b1:3a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:08.230231 containerd[1573]: 2025-09-04 04:20:08.224 [INFO][4364] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" Namespace="kube-system" Pod="coredns-668d6bf9bc-nfhxx" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nfhxx-eth0" Sep 4 04:20:08.284809 containerd[1573]: time="2025-09-04T04:20:08.284749567Z" level=info msg="connecting to shim a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355" address="unix:///run/containerd/s/aea4818d476a903ac395e2f124265b0d0662d9239422ad0e192b6b106832559f" namespace=k8s.io protocol=ttrpc version=3 Sep 4 04:20:08.321289 systemd[1]: Started cri-containerd-a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355.scope - libcontainer container a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355. 
Sep 4 04:20:08.340790 systemd-resolved[1410]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 04:20:08.383217 containerd[1573]: time="2025-09-04T04:20:08.383140627Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nfhxx,Uid:00c7f901-355b-43d3-894a-6d038427ef01,Namespace:kube-system,Attempt:0,} returns sandbox id \"a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355\"" Sep 4 04:20:08.386896 containerd[1573]: time="2025-09-04T04:20:08.386834885Z" level=info msg="CreateContainer within sandbox \"a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 04:20:08.425254 containerd[1573]: time="2025-09-04T04:20:08.425183515Z" level=info msg="Container 8c7806ad617f334be7a24e26dd58c2b166c71af33deceb61647abb6659a3a904: CDI devices from CRI Config.CDIDevices: []" Sep 4 04:20:08.433951 containerd[1573]: time="2025-09-04T04:20:08.433884325Z" level=info msg="CreateContainer within sandbox \"a354687d7f5d686f7794da868f63c4bbfb0cb91fe593d1194935edeaaac19355\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8c7806ad617f334be7a24e26dd58c2b166c71af33deceb61647abb6659a3a904\"" Sep 4 04:20:08.434838 containerd[1573]: time="2025-09-04T04:20:08.434767422Z" level=info msg="StartContainer for \"8c7806ad617f334be7a24e26dd58c2b166c71af33deceb61647abb6659a3a904\"" Sep 4 04:20:08.436105 containerd[1573]: time="2025-09-04T04:20:08.436039598Z" level=info msg="connecting to shim 8c7806ad617f334be7a24e26dd58c2b166c71af33deceb61647abb6659a3a904" address="unix:///run/containerd/s/aea4818d476a903ac395e2f124265b0d0662d9239422ad0e192b6b106832559f" protocol=ttrpc version=3 Sep 4 04:20:08.461384 systemd[1]: Started cri-containerd-8c7806ad617f334be7a24e26dd58c2b166c71af33deceb61647abb6659a3a904.scope - libcontainer container 8c7806ad617f334be7a24e26dd58c2b166c71af33deceb61647abb6659a3a904. 
Sep 4 04:20:08.497004 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2376421588.mount: Deactivated successfully. Sep 4 04:20:08.600550 containerd[1573]: time="2025-09-04T04:20:08.600420993Z" level=info msg="StartContainer for \"8c7806ad617f334be7a24e26dd58c2b166c71af33deceb61647abb6659a3a904\" returns successfully" Sep 4 04:20:08.896681 systemd-networkd[1477]: cali88e3b973d25: Gained IPv6LL Sep 4 04:20:09.022272 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount485563072.mount: Deactivated successfully. Sep 4 04:20:09.072078 containerd[1573]: time="2025-09-04T04:20:09.072007661Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zm9wc,Uid:e7a8b161-507c-42a3-a943-83edd2ebf502,Namespace:kube-system,Attempt:0,}" Sep 4 04:20:09.131235 containerd[1573]: time="2025-09-04T04:20:09.131168425Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:09.132283 containerd[1573]: time="2025-09-04T04:20:09.132247900Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 4 04:20:09.134583 containerd[1573]: time="2025-09-04T04:20:09.134541611Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:09.137933 containerd[1573]: time="2025-09-04T04:20:09.137863562Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:09.138662 containerd[1573]: time="2025-09-04T04:20:09.138630721Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo 
tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.499939074s" Sep 4 04:20:09.138805 containerd[1573]: time="2025-09-04T04:20:09.138667360Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 4 04:20:09.142124 containerd[1573]: time="2025-09-04T04:20:09.142050576Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 4 04:20:09.143092 containerd[1573]: time="2025-09-04T04:20:09.143023631Z" level=info msg="CreateContainer within sandbox \"2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 4 04:20:09.159421 containerd[1573]: time="2025-09-04T04:20:09.159375451Z" level=info msg="Container eb2982fa481e597a800f99c774cdce56ed18f5c1e9b9a3e8e7858dc818810002: CDI devices from CRI Config.CDIDevices: []" Sep 4 04:20:09.195872 containerd[1573]: time="2025-09-04T04:20:09.195797325Z" level=info msg="CreateContainer within sandbox \"2e1d6f0708857a17d3727f474c3e91a569051aa487bf1b6f0f08f2c31275305d\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"eb2982fa481e597a800f99c774cdce56ed18f5c1e9b9a3e8e7858dc818810002\"" Sep 4 04:20:09.196939 containerd[1573]: time="2025-09-04T04:20:09.196672567Z" level=info msg="StartContainer for \"eb2982fa481e597a800f99c774cdce56ed18f5c1e9b9a3e8e7858dc818810002\"" Sep 4 04:20:09.198168 containerd[1573]: time="2025-09-04T04:20:09.198138215Z" level=info msg="connecting to shim eb2982fa481e597a800f99c774cdce56ed18f5c1e9b9a3e8e7858dc818810002" address="unix:///run/containerd/s/8e930e038077d9e07ca355b70743c78a1a7a7ef62974b4625808122e6605ef50" protocol=ttrpc version=3 Sep 4 04:20:09.215357 systemd-networkd[1477]: caliea59ae673a4: Gained IPv6LL 
Sep 4 04:20:09.233107 systemd-networkd[1477]: calid3da9f30e05: Link UP Sep 4 04:20:09.233304 systemd[1]: Started cri-containerd-eb2982fa481e597a800f99c774cdce56ed18f5c1e9b9a3e8e7858dc818810002.scope - libcontainer container eb2982fa481e597a800f99c774cdce56ed18f5c1e9b9a3e8e7858dc818810002. Sep 4 04:20:09.234643 systemd-networkd[1477]: calid3da9f30e05: Gained carrier Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.146 [INFO][4482] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--zm9wc-eth0 coredns-668d6bf9bc- kube-system e7a8b161-507c-42a3-a943-83edd2ebf502 791 0 2025-09-04 04:19:32 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-zm9wc eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid3da9f30e05 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-zm9wc" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zm9wc-" Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.146 [INFO][4482] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-zm9wc" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zm9wc-eth0" Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.179 [INFO][4504] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" HandleID="k8s-pod-network.a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" Workload="localhost-k8s-coredns--668d6bf9bc--zm9wc-eth0" Sep 4 04:20:09.260087 
containerd[1573]: 2025-09-04 04:20:09.179 [INFO][4504] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" HandleID="k8s-pod-network.a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" Workload="localhost-k8s-coredns--668d6bf9bc--zm9wc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000c17a0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-zm9wc", "timestamp":"2025-09-04 04:20:09.17959753 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.179 [INFO][4504] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.180 [INFO][4504] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.180 [INFO][4504] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.186 [INFO][4504] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" host="localhost" Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.191 [INFO][4504] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.197 [INFO][4504] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.204 [INFO][4504] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.208 [INFO][4504] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.208 [INFO][4504] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" host="localhost" Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.210 [INFO][4504] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9 Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.215 [INFO][4504] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" host="localhost" Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.226 [INFO][4504] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" host="localhost" Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.226 [INFO][4504] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" host="localhost" Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.226 [INFO][4504] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 04:20:09.260087 containerd[1573]: 2025-09-04 04:20:09.226 [INFO][4504] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" HandleID="k8s-pod-network.a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" Workload="localhost-k8s-coredns--668d6bf9bc--zm9wc-eth0" Sep 4 04:20:09.260687 containerd[1573]: 2025-09-04 04:20:09.230 [INFO][4482] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-zm9wc" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zm9wc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--zm9wc-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e7a8b161-507c-42a3-a943-83edd2ebf502", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 4, 19, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-zm9wc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid3da9f30e05", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:09.260687 containerd[1573]: 2025-09-04 04:20:09.230 [INFO][4482] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-zm9wc" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zm9wc-eth0" Sep 4 04:20:09.260687 containerd[1573]: 2025-09-04 04:20:09.230 [INFO][4482] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid3da9f30e05 ContainerID="a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-zm9wc" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zm9wc-eth0" Sep 4 04:20:09.260687 containerd[1573]: 2025-09-04 04:20:09.232 [INFO][4482] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-zm9wc" 
WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zm9wc-eth0" Sep 4 04:20:09.260687 containerd[1573]: 2025-09-04 04:20:09.234 [INFO][4482] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-zm9wc" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zm9wc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--zm9wc-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e7a8b161-507c-42a3-a943-83edd2ebf502", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 4, 19, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9", Pod:"coredns-668d6bf9bc-zm9wc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid3da9f30e05", MAC:"ea:ad:f0:ec:24:cb", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:09.260687 containerd[1573]: 2025-09-04 04:20:09.246 [INFO][4482] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" Namespace="kube-system" Pod="coredns-668d6bf9bc-zm9wc" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zm9wc-eth0" Sep 4 04:20:09.279490 systemd-networkd[1477]: cali069200c5e81: Gained IPv6LL Sep 4 04:20:09.287562 kubelet[2745]: I0904 04:20:09.286204 2745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-nfhxx" podStartSLOduration=37.286165211 podStartE2EDuration="37.286165211s" podCreationTimestamp="2025-09-04 04:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 04:20:09.268137779 +0000 UTC m=+42.292175746" watchObservedRunningTime="2025-09-04 04:20:09.286165211 +0000 UTC m=+42.310203148" Sep 4 04:20:09.296977 containerd[1573]: time="2025-09-04T04:20:09.296917829Z" level=info msg="connecting to shim a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9" address="unix:///run/containerd/s/9100451a20324a10929c2b366551b979100e5af243b5f2019dcfd350c506e9b5" namespace=k8s.io protocol=ttrpc version=3 Sep 4 04:20:09.329510 systemd[1]: Started cri-containerd-a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9.scope - libcontainer container a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9. 
Sep 4 04:20:09.347657 systemd-resolved[1410]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 04:20:09.380529 containerd[1573]: time="2025-09-04T04:20:09.380486082Z" level=info msg="StartContainer for \"eb2982fa481e597a800f99c774cdce56ed18f5c1e9b9a3e8e7858dc818810002\" returns successfully" Sep 4 04:20:09.382977 containerd[1573]: time="2025-09-04T04:20:09.382473801Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zm9wc,Uid:e7a8b161-507c-42a3-a943-83edd2ebf502,Namespace:kube-system,Attempt:0,} returns sandbox id \"a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9\"" Sep 4 04:20:09.384991 containerd[1573]: time="2025-09-04T04:20:09.384931720Z" level=info msg="CreateContainer within sandbox \"a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 04:20:09.395422 containerd[1573]: time="2025-09-04T04:20:09.395375409Z" level=info msg="Container 175cb4f8f0abda877978a5253f7bee2e66bce32d98f5a9c3a3cfbb09b4402629: CDI devices from CRI Config.CDIDevices: []" Sep 4 04:20:09.404283 containerd[1573]: time="2025-09-04T04:20:09.404226310Z" level=info msg="CreateContainer within sandbox \"a194577cca760df75b1855ccad86b60fd1025ed8e9ef9bae021cd51b3735abb9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"175cb4f8f0abda877978a5253f7bee2e66bce32d98f5a9c3a3cfbb09b4402629\"" Sep 4 04:20:09.404804 containerd[1573]: time="2025-09-04T04:20:09.404768758Z" level=info msg="StartContainer for \"175cb4f8f0abda877978a5253f7bee2e66bce32d98f5a9c3a3cfbb09b4402629\"" Sep 4 04:20:09.405713 containerd[1573]: time="2025-09-04T04:20:09.405658486Z" level=info msg="connecting to shim 175cb4f8f0abda877978a5253f7bee2e66bce32d98f5a9c3a3cfbb09b4402629" address="unix:///run/containerd/s/9100451a20324a10929c2b366551b979100e5af243b5f2019dcfd350c506e9b5" protocol=ttrpc version=3 Sep 4 04:20:09.438353 systemd[1]: 
Started cri-containerd-175cb4f8f0abda877978a5253f7bee2e66bce32d98f5a9c3a3cfbb09b4402629.scope - libcontainer container 175cb4f8f0abda877978a5253f7bee2e66bce32d98f5a9c3a3cfbb09b4402629. Sep 4 04:20:09.479272 containerd[1573]: time="2025-09-04T04:20:09.479222365Z" level=info msg="StartContainer for \"175cb4f8f0abda877978a5253f7bee2e66bce32d98f5a9c3a3cfbb09b4402629\" returns successfully" Sep 4 04:20:10.069257 containerd[1573]: time="2025-09-04T04:20:10.069185705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-lxjhq,Uid:ef72b028-b203-4e62-b9a0-96331c5964b9,Namespace:calico-system,Attempt:0,}" Sep 4 04:20:10.168959 systemd-networkd[1477]: cali531e363e10f: Link UP Sep 4 04:20:10.169765 systemd-networkd[1477]: cali531e363e10f: Gained carrier Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.102 [INFO][4640] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--lxjhq-eth0 goldmane-54d579b49d- calico-system ef72b028-b203-4e62-b9a0-96331c5964b9 797 0 2025-09-04 04:19:43 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-lxjhq eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali531e363e10f [] [] }} ContainerID="57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" Namespace="calico-system" Pod="goldmane-54d579b49d-lxjhq" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--lxjhq-" Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.102 [INFO][4640] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" Namespace="calico-system" Pod="goldmane-54d579b49d-lxjhq" 
WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--lxjhq-eth0" Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.130 [INFO][4655] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" HandleID="k8s-pod-network.57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" Workload="localhost-k8s-goldmane--54d579b49d--lxjhq-eth0" Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.130 [INFO][4655] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" HandleID="k8s-pod-network.57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" Workload="localhost-k8s-goldmane--54d579b49d--lxjhq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f850), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-lxjhq", "timestamp":"2025-09-04 04:20:10.130436666 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.130 [INFO][4655] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.130 [INFO][4655] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.130 [INFO][4655] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.137 [INFO][4655] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" host="localhost" Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.141 [INFO][4655] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.146 [INFO][4655] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.148 [INFO][4655] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.151 [INFO][4655] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.151 [INFO][4655] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" host="localhost" Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.152 [INFO][4655] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5 Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.156 [INFO][4655] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" host="localhost" Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.162 [INFO][4655] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" host="localhost" Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.162 [INFO][4655] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" host="localhost" Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.162 [INFO][4655] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 04:20:10.186145 containerd[1573]: 2025-09-04 04:20:10.162 [INFO][4655] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" HandleID="k8s-pod-network.57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" Workload="localhost-k8s-goldmane--54d579b49d--lxjhq-eth0" Sep 4 04:20:10.186999 containerd[1573]: 2025-09-04 04:20:10.166 [INFO][4640] cni-plugin/k8s.go 418: Populated endpoint ContainerID="57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" Namespace="calico-system" Pod="goldmane-54d579b49d-lxjhq" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--lxjhq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--lxjhq-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"ef72b028-b203-4e62-b9a0-96331c5964b9", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 4, 19, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-lxjhq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali531e363e10f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:10.186999 containerd[1573]: 2025-09-04 04:20:10.166 [INFO][4640] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" Namespace="calico-system" Pod="goldmane-54d579b49d-lxjhq" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--lxjhq-eth0" Sep 4 04:20:10.186999 containerd[1573]: 2025-09-04 04:20:10.166 [INFO][4640] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali531e363e10f ContainerID="57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" Namespace="calico-system" Pod="goldmane-54d579b49d-lxjhq" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--lxjhq-eth0" Sep 4 04:20:10.186999 containerd[1573]: 2025-09-04 04:20:10.171 [INFO][4640] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" Namespace="calico-system" Pod="goldmane-54d579b49d-lxjhq" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--lxjhq-eth0" Sep 4 04:20:10.186999 containerd[1573]: 2025-09-04 04:20:10.171 [INFO][4640] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" Namespace="calico-system" Pod="goldmane-54d579b49d-lxjhq" 
WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--lxjhq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--lxjhq-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"ef72b028-b203-4e62-b9a0-96331c5964b9", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 4, 19, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5", Pod:"goldmane-54d579b49d-lxjhq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali531e363e10f", MAC:"be:f8:80:91:0d:51", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:10.186999 containerd[1573]: 2025-09-04 04:20:10.182 [INFO][4640] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" Namespace="calico-system" Pod="goldmane-54d579b49d-lxjhq" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--lxjhq-eth0" Sep 4 04:20:10.214338 containerd[1573]: time="2025-09-04T04:20:10.214272247Z" level=info msg="connecting to shim 
57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5" address="unix:///run/containerd/s/f2aa94dcbad9d4deb9de5a6f255b025e4b3e04166070bb620a38a87d20e18a99" namespace=k8s.io protocol=ttrpc version=3 Sep 4 04:20:10.242607 systemd[1]: Started cri-containerd-57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5.scope - libcontainer container 57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5. Sep 4 04:20:10.260413 systemd-resolved[1410]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 04:20:10.312505 kubelet[2745]: I0904 04:20:10.312133 2745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-68c69cccff-fn8ng" podStartSLOduration=2.587151735 podStartE2EDuration="7.312045279s" podCreationTimestamp="2025-09-04 04:20:03 +0000 UTC" firstStartedPulling="2025-09-04 04:20:04.414633809 +0000 UTC m=+37.438671746" lastFinishedPulling="2025-09-04 04:20:09.139527353 +0000 UTC m=+42.163565290" observedRunningTime="2025-09-04 04:20:10.309893794 +0000 UTC m=+43.333931731" watchObservedRunningTime="2025-09-04 04:20:10.312045279 +0000 UTC m=+43.336083216" Sep 4 04:20:10.314794 kubelet[2745]: I0904 04:20:10.314668 2745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-zm9wc" podStartSLOduration=38.314633975 podStartE2EDuration="38.314633975s" podCreationTimestamp="2025-09-04 04:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 04:20:10.281005121 +0000 UTC m=+43.305043058" watchObservedRunningTime="2025-09-04 04:20:10.314633975 +0000 UTC m=+43.338671912" Sep 4 04:20:10.404897 containerd[1573]: time="2025-09-04T04:20:10.404730459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-lxjhq,Uid:ef72b028-b203-4e62-b9a0-96331c5964b9,Namespace:calico-system,Attempt:0,} returns 
sandbox id \"57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5\"" Sep 4 04:20:10.823536 containerd[1573]: time="2025-09-04T04:20:10.823468990Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:10.824432 containerd[1573]: time="2025-09-04T04:20:10.824177479Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 4 04:20:10.825448 containerd[1573]: time="2025-09-04T04:20:10.825404350Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:10.827647 containerd[1573]: time="2025-09-04T04:20:10.827607000Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:10.828248 containerd[1573]: time="2025-09-04T04:20:10.828206726Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.686078505s" Sep 4 04:20:10.828248 containerd[1573]: time="2025-09-04T04:20:10.828243545Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 4 04:20:10.829168 containerd[1573]: time="2025-09-04T04:20:10.829136680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 4 04:20:10.830708 containerd[1573]: time="2025-09-04T04:20:10.830683190Z" level=info msg="CreateContainer within sandbox 
\"b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 4 04:20:10.842794 containerd[1573]: time="2025-09-04T04:20:10.842733371Z" level=info msg="Container 4f3fd2b634d4a3c9be8f398663a1b975bc9fbc399ef5fdb7991687f5f5ea3d40: CDI devices from CRI Config.CDIDevices: []" Sep 4 04:20:10.878467 containerd[1573]: time="2025-09-04T04:20:10.878405878Z" level=info msg="CreateContainer within sandbox \"b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"4f3fd2b634d4a3c9be8f398663a1b975bc9fbc399ef5fdb7991687f5f5ea3d40\"" Sep 4 04:20:10.879299 containerd[1573]: time="2025-09-04T04:20:10.879040329Z" level=info msg="StartContainer for \"4f3fd2b634d4a3c9be8f398663a1b975bc9fbc399ef5fdb7991687f5f5ea3d40\"" Sep 4 04:20:10.880797 containerd[1573]: time="2025-09-04T04:20:10.880767007Z" level=info msg="connecting to shim 4f3fd2b634d4a3c9be8f398663a1b975bc9fbc399ef5fdb7991687f5f5ea3d40" address="unix:///run/containerd/s/aef3c6b5db6c3d6470c917690b8e27add067a7279d64652ee897049856f365f3" protocol=ttrpc version=3 Sep 4 04:20:10.918370 systemd[1]: Started cri-containerd-4f3fd2b634d4a3c9be8f398663a1b975bc9fbc399ef5fdb7991687f5f5ea3d40.scope - libcontainer container 4f3fd2b634d4a3c9be8f398663a1b975bc9fbc399ef5fdb7991687f5f5ea3d40. 
Sep 4 04:20:10.943469 systemd-networkd[1477]: calid3da9f30e05: Gained IPv6LL Sep 4 04:20:10.994439 containerd[1573]: time="2025-09-04T04:20:10.994370761Z" level=info msg="StartContainer for \"4f3fd2b634d4a3c9be8f398663a1b975bc9fbc399ef5fdb7991687f5f5ea3d40\" returns successfully" Sep 4 04:20:11.072318 containerd[1573]: time="2025-09-04T04:20:11.072273833Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66557b5466-8jz66,Uid:6857fb4d-09bc-4243-9e00-1619a4fbac6b,Namespace:calico-apiserver,Attempt:0,}" Sep 4 04:20:11.182997 systemd-networkd[1477]: calif43243b12a4: Link UP Sep 4 04:20:11.183333 systemd-networkd[1477]: calif43243b12a4: Gained carrier Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.106 [INFO][4764] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--66557b5466--8jz66-eth0 calico-apiserver-66557b5466- calico-apiserver 6857fb4d-09bc-4243-9e00-1619a4fbac6b 799 0 2025-09-04 04:19:41 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:66557b5466 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-66557b5466-8jz66 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif43243b12a4 [] [] }} ContainerID="1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-8jz66" WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--8jz66-" Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.106 [INFO][4764] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-8jz66" 
WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--8jz66-eth0" Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.137 [INFO][4778] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" HandleID="k8s-pod-network.1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" Workload="localhost-k8s-calico--apiserver--66557b5466--8jz66-eth0" Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.137 [INFO][4778] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" HandleID="k8s-pod-network.1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" Workload="localhost-k8s-calico--apiserver--66557b5466--8jz66-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f2a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-66557b5466-8jz66", "timestamp":"2025-09-04 04:20:11.137570661 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.137 [INFO][4778] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.137 [INFO][4778] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.138 [INFO][4778] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.145 [INFO][4778] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" host="localhost" Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.150 [INFO][4778] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.154 [INFO][4778] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.157 [INFO][4778] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.159 [INFO][4778] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.159 [INFO][4778] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" host="localhost" Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.161 [INFO][4778] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.167 [INFO][4778] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" host="localhost" Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.175 [INFO][4778] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" host="localhost" Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.175 [INFO][4778] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" host="localhost" Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.175 [INFO][4778] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 4 04:20:11.197727 containerd[1573]: 2025-09-04 04:20:11.175 [INFO][4778] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" HandleID="k8s-pod-network.1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" Workload="localhost-k8s-calico--apiserver--66557b5466--8jz66-eth0" Sep 4 04:20:11.198621 containerd[1573]: 2025-09-04 04:20:11.179 [INFO][4764] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-8jz66" WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--8jz66-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--66557b5466--8jz66-eth0", GenerateName:"calico-apiserver-66557b5466-", Namespace:"calico-apiserver", SelfLink:"", UID:"6857fb4d-09bc-4243-9e00-1619a4fbac6b", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 4, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"66557b5466", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-66557b5466-8jz66", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif43243b12a4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:11.198621 containerd[1573]: 2025-09-04 04:20:11.179 [INFO][4764] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-8jz66" WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--8jz66-eth0" Sep 4 04:20:11.198621 containerd[1573]: 2025-09-04 04:20:11.179 [INFO][4764] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif43243b12a4 ContainerID="1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-8jz66" WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--8jz66-eth0" Sep 4 04:20:11.198621 containerd[1573]: 2025-09-04 04:20:11.183 [INFO][4764] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-8jz66" WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--8jz66-eth0" Sep 4 04:20:11.198621 containerd[1573]: 2025-09-04 04:20:11.184 [INFO][4764] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-8jz66" WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--8jz66-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--66557b5466--8jz66-eth0", GenerateName:"calico-apiserver-66557b5466-", Namespace:"calico-apiserver", SelfLink:"", UID:"6857fb4d-09bc-4243-9e00-1619a4fbac6b", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 4, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"66557b5466", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c", Pod:"calico-apiserver-66557b5466-8jz66", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif43243b12a4", MAC:"52:de:53:55:b2:8b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:11.198621 containerd[1573]: 2025-09-04 04:20:11.193 [INFO][4764] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-8jz66" WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--8jz66-eth0" Sep 4 04:20:11.237911 containerd[1573]: time="2025-09-04T04:20:11.237832286Z" level=info msg="connecting to shim 1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c" address="unix:///run/containerd/s/a5fcd651d31a5289e10603a9a7f4b442acacdff770c77814f8bfde57aa943e54" namespace=k8s.io protocol=ttrpc version=3 Sep 4 04:20:11.273217 systemd[1]: Started cri-containerd-1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c.scope - libcontainer container 1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c. Sep 4 04:20:11.292262 systemd-resolved[1410]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 04:20:11.327818 containerd[1573]: time="2025-09-04T04:20:11.327776472Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66557b5466-8jz66,Uid:6857fb4d-09bc-4243-9e00-1619a4fbac6b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c\"" Sep 4 04:20:11.520325 systemd-networkd[1477]: cali531e363e10f: Gained IPv6LL Sep 4 04:20:12.069394 containerd[1573]: time="2025-09-04T04:20:12.069334490Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66557b5466-5h544,Uid:206d3408-826b-403f-9856-5b773f0ee6ff,Namespace:calico-apiserver,Attempt:0,}" Sep 4 04:20:12.236534 systemd-networkd[1477]: cali775447e145d: Link UP Sep 4 04:20:12.237176 systemd-networkd[1477]: cali775447e145d: Gained carrier Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.104 [INFO][4846] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--66557b5466--5h544-eth0 calico-apiserver-66557b5466- 
calico-apiserver 206d3408-826b-403f-9856-5b773f0ee6ff 798 0 2025-09-04 04:19:41 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:66557b5466 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-66557b5466-5h544 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali775447e145d [] [] }} ContainerID="8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-5h544" WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--5h544-" Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.104 [INFO][4846] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-5h544" WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--5h544-eth0" Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.145 [INFO][4861] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" HandleID="k8s-pod-network.8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" Workload="localhost-k8s-calico--apiserver--66557b5466--5h544-eth0" Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.145 [INFO][4861] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" HandleID="k8s-pod-network.8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" Workload="localhost-k8s-calico--apiserver--66557b5466--5h544-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001b0e30), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", 
"pod":"calico-apiserver-66557b5466-5h544", "timestamp":"2025-09-04 04:20:12.145169901 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.145 [INFO][4861] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.145 [INFO][4861] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.145 [INFO][4861] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.158 [INFO][4861] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" host="localhost" Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.168 [INFO][4861] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.180 [INFO][4861] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.191 [INFO][4861] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.195 [INFO][4861] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.195 [INFO][4861] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" host="localhost" Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.197 [INFO][4861] ipam/ipam.go 1764: 
Creating new handle: k8s-pod-network.8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553 Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.203 [INFO][4861] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" host="localhost" Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.227 [INFO][4861] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" host="localhost" Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.227 [INFO][4861] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" host="localhost" Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.227 [INFO][4861] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 04:20:12.265240 containerd[1573]: 2025-09-04 04:20:12.227 [INFO][4861] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" HandleID="k8s-pod-network.8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" Workload="localhost-k8s-calico--apiserver--66557b5466--5h544-eth0" Sep 4 04:20:12.266500 containerd[1573]: 2025-09-04 04:20:12.231 [INFO][4846] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-5h544" WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--5h544-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--66557b5466--5h544-eth0", GenerateName:"calico-apiserver-66557b5466-", Namespace:"calico-apiserver", SelfLink:"", UID:"206d3408-826b-403f-9856-5b773f0ee6ff", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 4, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"66557b5466", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-66557b5466-5h544", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali775447e145d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:12.266500 containerd[1573]: 2025-09-04 04:20:12.231 [INFO][4846] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-5h544" WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--5h544-eth0" Sep 4 04:20:12.266500 containerd[1573]: 2025-09-04 04:20:12.231 [INFO][4846] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali775447e145d ContainerID="8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-5h544" WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--5h544-eth0" Sep 4 04:20:12.266500 containerd[1573]: 2025-09-04 04:20:12.235 [INFO][4846] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-5h544" WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--5h544-eth0" Sep 4 04:20:12.266500 containerd[1573]: 2025-09-04 04:20:12.236 [INFO][4846] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-5h544" WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--5h544-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--66557b5466--5h544-eth0", 
GenerateName:"calico-apiserver-66557b5466-", Namespace:"calico-apiserver", SelfLink:"", UID:"206d3408-826b-403f-9856-5b773f0ee6ff", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 4, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"66557b5466", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553", Pod:"calico-apiserver-66557b5466-5h544", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali775447e145d", MAC:"d6:19:f4:70:4c:ba", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 04:20:12.266500 containerd[1573]: 2025-09-04 04:20:12.255 [INFO][4846] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" Namespace="calico-apiserver" Pod="calico-apiserver-66557b5466-5h544" WorkloadEndpoint="localhost-k8s-calico--apiserver--66557b5466--5h544-eth0" Sep 4 04:20:12.365860 systemd[1]: Started sshd@8-10.0.0.54:22-10.0.0.1:41926.service - OpenSSH per-connection server daemon (10.0.0.1:41926). 
Sep 4 04:20:12.467255 sshd[4879]: Accepted publickey for core from 10.0.0.1 port 41926 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U Sep 4 04:20:12.469863 sshd-session[4879]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 04:20:12.472173 containerd[1573]: time="2025-09-04T04:20:12.472114322Z" level=info msg="connecting to shim 8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553" address="unix:///run/containerd/s/4ac19693a4c279d992e76885dd4275a76a003450578b0c157bf162d33d7275e6" namespace=k8s.io protocol=ttrpc version=3 Sep 4 04:20:12.475861 systemd-logind[1521]: New session 9 of user core. Sep 4 04:20:12.480314 systemd-networkd[1477]: calif43243b12a4: Gained IPv6LL Sep 4 04:20:12.482338 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 4 04:20:12.503236 systemd[1]: Started cri-containerd-8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553.scope - libcontainer container 8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553. Sep 4 04:20:12.517298 systemd-resolved[1410]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 04:20:12.614752 containerd[1573]: time="2025-09-04T04:20:12.614664607Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66557b5466-5h544,Uid:206d3408-826b-403f-9856-5b773f0ee6ff,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553\"" Sep 4 04:20:12.701244 sshd[4916]: Connection closed by 10.0.0.1 port 41926 Sep 4 04:20:12.701697 sshd-session[4879]: pam_unix(sshd:session): session closed for user core Sep 4 04:20:12.707993 systemd-logind[1521]: Session 9 logged out. Waiting for processes to exit. Sep 4 04:20:12.708914 systemd[1]: sshd@8-10.0.0.54:22-10.0.0.1:41926.service: Deactivated successfully. Sep 4 04:20:12.711753 systemd[1]: session-9.scope: Deactivated successfully. 
Sep 4 04:20:12.713757 systemd-logind[1521]: Removed session 9. Sep 4 04:20:13.411625 containerd[1573]: time="2025-09-04T04:20:13.411559474Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:13.428879 containerd[1573]: time="2025-09-04T04:20:13.428816770Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 4 04:20:13.479522 containerd[1573]: time="2025-09-04T04:20:13.479455292Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:13.488302 containerd[1573]: time="2025-09-04T04:20:13.488252200Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:13.488915 containerd[1573]: time="2025-09-04T04:20:13.488870260Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 2.659702482s" Sep 4 04:20:13.488975 containerd[1573]: time="2025-09-04T04:20:13.488917789Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 4 04:20:13.489955 containerd[1573]: time="2025-09-04T04:20:13.489922282Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 4 04:20:13.503269 systemd-networkd[1477]: cali775447e145d: Gained IPv6LL 
Sep 4 04:20:13.506265 containerd[1573]: time="2025-09-04T04:20:13.505787918Z" level=info msg="CreateContainer within sandbox \"4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 4 04:20:13.517954 containerd[1573]: time="2025-09-04T04:20:13.517893442Z" level=info msg="Container 2966e17ea5d921ad4925552ea4151cd325c619ff96ca9681d6795fd900156d9d: CDI devices from CRI Config.CDIDevices: []" Sep 4 04:20:13.538088 containerd[1573]: time="2025-09-04T04:20:13.538004539Z" level=info msg="CreateContainer within sandbox \"4f3dcd3a0664c793a3cbc885ff5e6e4db80f68f4fb207fc56d57251396929c84\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"2966e17ea5d921ad4925552ea4151cd325c619ff96ca9681d6795fd900156d9d\"" Sep 4 04:20:13.538771 containerd[1573]: time="2025-09-04T04:20:13.538718700Z" level=info msg="StartContainer for \"2966e17ea5d921ad4925552ea4151cd325c619ff96ca9681d6795fd900156d9d\"" Sep 4 04:20:13.541663 containerd[1573]: time="2025-09-04T04:20:13.541625661Z" level=info msg="connecting to shim 2966e17ea5d921ad4925552ea4151cd325c619ff96ca9681d6795fd900156d9d" address="unix:///run/containerd/s/2ac8bf1337be7b9d35902b351b23e90ac03e0ee4d35575065387e89816cc5552" protocol=ttrpc version=3 Sep 4 04:20:13.586297 systemd[1]: Started cri-containerd-2966e17ea5d921ad4925552ea4151cd325c619ff96ca9681d6795fd900156d9d.scope - libcontainer container 2966e17ea5d921ad4925552ea4151cd325c619ff96ca9681d6795fd900156d9d. 
Sep 4 04:20:13.651589 containerd[1573]: time="2025-09-04T04:20:13.651536818Z" level=info msg="StartContainer for \"2966e17ea5d921ad4925552ea4151cd325c619ff96ca9681d6795fd900156d9d\" returns successfully" Sep 4 04:20:13.834100 kubelet[2745]: I0904 04:20:13.833878 2745 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 4 04:20:14.011370 containerd[1573]: time="2025-09-04T04:20:14.011302513Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c99448522e661cf8a9ad37633c93830b9159fdc14588033e7ea53bebfbd5030c\" id:\"7c19a351772aca392593ade946393c2657a3f64aec0793d21357d5b04ebda00f\" pid:5004 exited_at:{seconds:1756959614 nanos:10606568}" Sep 4 04:20:14.122398 containerd[1573]: time="2025-09-04T04:20:14.122139874Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c99448522e661cf8a9ad37633c93830b9159fdc14588033e7ea53bebfbd5030c\" id:\"6fc66e415806403edba1441f3bd4f33b2c9552e138e739321e9c44a029cd75c4\" pid:5028 exited_at:{seconds:1756959614 nanos:121725176}" Sep 4 04:20:15.353729 containerd[1573]: time="2025-09-04T04:20:15.353672850Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2966e17ea5d921ad4925552ea4151cd325c619ff96ca9681d6795fd900156d9d\" id:\"43085da4ccdda522101498c613b4e99dc0d11aff2cc14fd657fb0df9a354260e\" pid:5060 exited_at:{seconds:1756959615 nanos:353260827}" Sep 4 04:20:15.429204 kubelet[2745]: I0904 04:20:15.429093 2745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-9444b947f-288sl" podStartSLOduration=25.467057333 podStartE2EDuration="31.429047479s" podCreationTimestamp="2025-09-04 04:19:44 +0000 UTC" firstStartedPulling="2025-09-04 04:20:07.52779554 +0000 UTC m=+40.551833467" lastFinishedPulling="2025-09-04 04:20:13.489785676 +0000 UTC m=+46.513823613" observedRunningTime="2025-09-04 04:20:14.40148364 +0000 UTC m=+47.425521577" watchObservedRunningTime="2025-09-04 04:20:15.429047479 +0000 UTC m=+48.453085416" Sep 
4 04:20:15.834712 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3025330516.mount: Deactivated successfully. Sep 4 04:20:16.752118 containerd[1573]: time="2025-09-04T04:20:16.752017913Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:16.753459 containerd[1573]: time="2025-09-04T04:20:16.753264329Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 4 04:20:16.755376 containerd[1573]: time="2025-09-04T04:20:16.755343010Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:16.768516 containerd[1573]: time="2025-09-04T04:20:16.768443134Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:16.769570 containerd[1573]: time="2025-09-04T04:20:16.769503249Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 3.279548295s" Sep 4 04:20:16.769570 containerd[1573]: time="2025-09-04T04:20:16.769556189Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 4 04:20:16.771100 containerd[1573]: time="2025-09-04T04:20:16.770643875Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 4 04:20:16.772860 containerd[1573]: 
time="2025-09-04T04:20:16.772822405Z" level=info msg="CreateContainer within sandbox \"57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 4 04:20:16.783302 containerd[1573]: time="2025-09-04T04:20:16.783238979Z" level=info msg="Container 5670c764b428d4d92a048cb6462d51720a0e1ce5b5ccb7024411a2baee970cab: CDI devices from CRI Config.CDIDevices: []" Sep 4 04:20:16.798198 containerd[1573]: time="2025-09-04T04:20:16.798125955Z" level=info msg="CreateContainer within sandbox \"57ff6f5a193e6b47eef37253972915fa44d1bf8f6dca0e0e095403bc3a0f08e5\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"5670c764b428d4d92a048cb6462d51720a0e1ce5b5ccb7024411a2baee970cab\"" Sep 4 04:20:16.798780 containerd[1573]: time="2025-09-04T04:20:16.798727477Z" level=info msg="StartContainer for \"5670c764b428d4d92a048cb6462d51720a0e1ce5b5ccb7024411a2baee970cab\"" Sep 4 04:20:16.800405 containerd[1573]: time="2025-09-04T04:20:16.800374355Z" level=info msg="connecting to shim 5670c764b428d4d92a048cb6462d51720a0e1ce5b5ccb7024411a2baee970cab" address="unix:///run/containerd/s/f2aa94dcbad9d4deb9de5a6f255b025e4b3e04166070bb620a38a87d20e18a99" protocol=ttrpc version=3 Sep 4 04:20:16.849308 systemd[1]: Started cri-containerd-5670c764b428d4d92a048cb6462d51720a0e1ce5b5ccb7024411a2baee970cab.scope - libcontainer container 5670c764b428d4d92a048cb6462d51720a0e1ce5b5ccb7024411a2baee970cab. 
Sep 4 04:20:16.906476 containerd[1573]: time="2025-09-04T04:20:16.906418343Z" level=info msg="StartContainer for \"5670c764b428d4d92a048cb6462d51720a0e1ce5b5ccb7024411a2baee970cab\" returns successfully" Sep 4 04:20:17.319363 kubelet[2745]: I0904 04:20:17.319144 2745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-lxjhq" podStartSLOduration=27.954861805 podStartE2EDuration="34.31912239s" podCreationTimestamp="2025-09-04 04:19:43 +0000 UTC" firstStartedPulling="2025-09-04 04:20:10.406228198 +0000 UTC m=+43.430266135" lastFinishedPulling="2025-09-04 04:20:16.770488773 +0000 UTC m=+49.794526720" observedRunningTime="2025-09-04 04:20:17.318407798 +0000 UTC m=+50.342445735" watchObservedRunningTime="2025-09-04 04:20:17.31912239 +0000 UTC m=+50.343160327" Sep 4 04:20:17.643443 containerd[1573]: time="2025-09-04T04:20:17.643287426Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5670c764b428d4d92a048cb6462d51720a0e1ce5b5ccb7024411a2baee970cab\" id:\"e54dfe5f88c478e221a50c49df43dcbf7d15788eb1111c4910f90d4c9060c394\" pid:5134 exit_status:1 exited_at:{seconds:1756959617 nanos:642645003}" Sep 4 04:20:17.714838 systemd[1]: Started sshd@9-10.0.0.54:22-10.0.0.1:41936.service - OpenSSH per-connection server daemon (10.0.0.1:41936). 
Sep 4 04:20:17.745228 containerd[1573]: time="2025-09-04T04:20:17.745164745Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5670c764b428d4d92a048cb6462d51720a0e1ce5b5ccb7024411a2baee970cab\" id:\"99abf53845d0acad957a97e2d737638e83f868f153b152c4a7ecb7f770b0cb14\" pid:5157 exit_status:1 exited_at:{seconds:1756959617 nanos:744603911}" Sep 4 04:20:17.789255 sshd[5170]: Accepted publickey for core from 10.0.0.1 port 41936 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U Sep 4 04:20:17.791253 sshd-session[5170]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 04:20:17.797045 systemd-logind[1521]: New session 10 of user core. Sep 4 04:20:17.808250 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 4 04:20:17.958103 sshd[5175]: Connection closed by 10.0.0.1 port 41936 Sep 4 04:20:17.958552 sshd-session[5170]: pam_unix(sshd:session): session closed for user core Sep 4 04:20:17.964975 systemd-logind[1521]: Session 10 logged out. Waiting for processes to exit. Sep 4 04:20:17.965154 systemd[1]: sshd@9-10.0.0.54:22-10.0.0.1:41936.service: Deactivated successfully. Sep 4 04:20:17.967574 systemd[1]: session-10.scope: Deactivated successfully. Sep 4 04:20:17.970697 systemd-logind[1521]: Removed session 10. 
Sep 4 04:20:18.409796 containerd[1573]: time="2025-09-04T04:20:18.409347875Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5670c764b428d4d92a048cb6462d51720a0e1ce5b5ccb7024411a2baee970cab\" id:\"9fa07d2c732944c63a2047078cd55585653f75d21bbd1e1620c04335b046a3b3\" pid:5201 exit_status:1 exited_at:{seconds:1756959618 nanos:408878217}" Sep 4 04:20:19.073921 containerd[1573]: time="2025-09-04T04:20:19.073827356Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:19.075118 containerd[1573]: time="2025-09-04T04:20:19.075038153Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 4 04:20:19.077282 containerd[1573]: time="2025-09-04T04:20:19.077204956Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:19.079900 containerd[1573]: time="2025-09-04T04:20:19.079811799Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:19.080729 containerd[1573]: time="2025-09-04T04:20:19.080582937Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.309908434s" Sep 4 04:20:19.080729 containerd[1573]: time="2025-09-04T04:20:19.080628886Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" 
returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 4 04:20:19.089311 containerd[1573]: time="2025-09-04T04:20:19.088988905Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 4 04:20:19.098986 containerd[1573]: time="2025-09-04T04:20:19.098931107Z" level=info msg="CreateContainer within sandbox \"b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 4 04:20:19.112761 containerd[1573]: time="2025-09-04T04:20:19.112552785Z" level=info msg="Container e041fbbcc44ece5b90385433ad08912993452c1481e032c6fb12c46ffb8b8909: CDI devices from CRI Config.CDIDevices: []" Sep 4 04:20:19.151169 containerd[1573]: time="2025-09-04T04:20:19.151082857Z" level=info msg="CreateContainer within sandbox \"b1f287dec228fd8675d39e2882f4b1e2d8bbf0f78b73b72aabcc7e7c5a4641ec\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"e041fbbcc44ece5b90385433ad08912993452c1481e032c6fb12c46ffb8b8909\"" Sep 4 04:20:19.154735 containerd[1573]: time="2025-09-04T04:20:19.154660101Z" level=info msg="StartContainer for \"e041fbbcc44ece5b90385433ad08912993452c1481e032c6fb12c46ffb8b8909\"" Sep 4 04:20:19.156999 containerd[1573]: time="2025-09-04T04:20:19.156935735Z" level=info msg="connecting to shim e041fbbcc44ece5b90385433ad08912993452c1481e032c6fb12c46ffb8b8909" address="unix:///run/containerd/s/aef3c6b5db6c3d6470c917690b8e27add067a7279d64652ee897049856f365f3" protocol=ttrpc version=3 Sep 4 04:20:19.183386 systemd[1]: Started cri-containerd-e041fbbcc44ece5b90385433ad08912993452c1481e032c6fb12c46ffb8b8909.scope - libcontainer container e041fbbcc44ece5b90385433ad08912993452c1481e032c6fb12c46ffb8b8909. 
Sep 4 04:20:19.244576 containerd[1573]: time="2025-09-04T04:20:19.244527189Z" level=info msg="StartContainer for \"e041fbbcc44ece5b90385433ad08912993452c1481e032c6fb12c46ffb8b8909\" returns successfully" Sep 4 04:20:19.355993 kubelet[2745]: I0904 04:20:19.355432 2745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-7q5r5" podStartSLOduration=23.722340791 podStartE2EDuration="35.3554116s" podCreationTimestamp="2025-09-04 04:19:44 +0000 UTC" firstStartedPulling="2025-09-04 04:20:07.455792834 +0000 UTC m=+40.479830761" lastFinishedPulling="2025-09-04 04:20:19.088863613 +0000 UTC m=+52.112901570" observedRunningTime="2025-09-04 04:20:19.3510452 +0000 UTC m=+52.375083137" watchObservedRunningTime="2025-09-04 04:20:19.3554116 +0000 UTC m=+52.379449537" Sep 4 04:20:19.420372 containerd[1573]: time="2025-09-04T04:20:19.420308982Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5670c764b428d4d92a048cb6462d51720a0e1ce5b5ccb7024411a2baee970cab\" id:\"9d4250191c90a20d101d7003a3092d75ade2aeabe9c2c7cf18d1d5f6ff3fe0f1\" pid:5260 exit_status:1 exited_at:{seconds:1756959619 nanos:419807655}" Sep 4 04:20:20.146874 kubelet[2745]: I0904 04:20:20.146795 2745 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 4 04:20:20.147052 kubelet[2745]: I0904 04:20:20.146908 2745 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 4 04:20:22.234433 containerd[1573]: time="2025-09-04T04:20:22.234362828Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:22.235909 containerd[1573]: time="2025-09-04T04:20:22.235827168Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active 
requests=0, bytes read=47333864" Sep 4 04:20:22.237404 containerd[1573]: time="2025-09-04T04:20:22.237361313Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:22.239983 containerd[1573]: time="2025-09-04T04:20:22.239942915Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:22.240770 containerd[1573]: time="2025-09-04T04:20:22.240738057Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 3.151699135s" Sep 4 04:20:22.240770 containerd[1573]: time="2025-09-04T04:20:22.240769146Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 4 04:20:22.243280 containerd[1573]: time="2025-09-04T04:20:22.243248792Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 4 04:20:22.245164 containerd[1573]: time="2025-09-04T04:20:22.245139043Z" level=info msg="CreateContainer within sandbox \"1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 4 04:20:22.255963 containerd[1573]: time="2025-09-04T04:20:22.255881643Z" level=info msg="Container 4b223b11964fe4b0c90dd0a915be9a5b77b85ddbc521da6071e22c035cc44b4a: CDI devices from CRI Config.CDIDevices: []" Sep 4 04:20:22.269959 containerd[1573]: time="2025-09-04T04:20:22.269884143Z" level=info 
msg="CreateContainer within sandbox \"1cb91dc5eeaa78ad54363ff2481435288e79917573ed0599b20860365434218c\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4b223b11964fe4b0c90dd0a915be9a5b77b85ddbc521da6071e22c035cc44b4a\"" Sep 4 04:20:22.270612 containerd[1573]: time="2025-09-04T04:20:22.270467877Z" level=info msg="StartContainer for \"4b223b11964fe4b0c90dd0a915be9a5b77b85ddbc521da6071e22c035cc44b4a\"" Sep 4 04:20:22.271794 containerd[1573]: time="2025-09-04T04:20:22.271742332Z" level=info msg="connecting to shim 4b223b11964fe4b0c90dd0a915be9a5b77b85ddbc521da6071e22c035cc44b4a" address="unix:///run/containerd/s/a5fcd651d31a5289e10603a9a7f4b442acacdff770c77814f8bfde57aa943e54" protocol=ttrpc version=3 Sep 4 04:20:22.304354 systemd[1]: Started cri-containerd-4b223b11964fe4b0c90dd0a915be9a5b77b85ddbc521da6071e22c035cc44b4a.scope - libcontainer container 4b223b11964fe4b0c90dd0a915be9a5b77b85ddbc521da6071e22c035cc44b4a. Sep 4 04:20:22.518827 containerd[1573]: time="2025-09-04T04:20:22.518628413Z" level=info msg="StartContainer for \"4b223b11964fe4b0c90dd0a915be9a5b77b85ddbc521da6071e22c035cc44b4a\" returns successfully" Sep 4 04:20:22.674087 containerd[1573]: time="2025-09-04T04:20:22.673556861Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 04:20:22.676973 containerd[1573]: time="2025-09-04T04:20:22.676878299Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 4 04:20:22.679306 containerd[1573]: time="2025-09-04T04:20:22.679257370Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size 
\"48826583\" in 435.976256ms" Sep 4 04:20:22.679306 containerd[1573]: time="2025-09-04T04:20:22.679301906Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 4 04:20:22.683549 containerd[1573]: time="2025-09-04T04:20:22.683460446Z" level=info msg="CreateContainer within sandbox \"8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 4 04:20:22.702607 containerd[1573]: time="2025-09-04T04:20:22.702445331Z" level=info msg="Container 820ef1573fdac49f557dec9bb9e69975db0c2cc17244e7b6372fa7c29997f240: CDI devices from CRI Config.CDIDevices: []" Sep 4 04:20:22.723152 containerd[1573]: time="2025-09-04T04:20:22.723086116Z" level=info msg="CreateContainer within sandbox \"8292f921fce5a9e183a9e61c9f5aeb9625b7f6c96c3f736f7417310991821553\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"820ef1573fdac49f557dec9bb9e69975db0c2cc17244e7b6372fa7c29997f240\"" Sep 4 04:20:22.724362 containerd[1573]: time="2025-09-04T04:20:22.724292570Z" level=info msg="StartContainer for \"820ef1573fdac49f557dec9bb9e69975db0c2cc17244e7b6372fa7c29997f240\"" Sep 4 04:20:22.726348 containerd[1573]: time="2025-09-04T04:20:22.726306779Z" level=info msg="connecting to shim 820ef1573fdac49f557dec9bb9e69975db0c2cc17244e7b6372fa7c29997f240" address="unix:///run/containerd/s/4ac19693a4c279d992e76885dd4275a76a003450578b0c157bf162d33d7275e6" protocol=ttrpc version=3 Sep 4 04:20:22.764399 systemd[1]: Started cri-containerd-820ef1573fdac49f557dec9bb9e69975db0c2cc17244e7b6372fa7c29997f240.scope - libcontainer container 820ef1573fdac49f557dec9bb9e69975db0c2cc17244e7b6372fa7c29997f240. 
Sep 4 04:20:22.895692 containerd[1573]: time="2025-09-04T04:20:22.895447099Z" level=info msg="StartContainer for \"820ef1573fdac49f557dec9bb9e69975db0c2cc17244e7b6372fa7c29997f240\" returns successfully" Sep 4 04:20:22.978834 systemd[1]: Started sshd@10-10.0.0.54:22-10.0.0.1:45100.service - OpenSSH per-connection server daemon (10.0.0.1:45100). Sep 4 04:20:23.095727 sshd[5355]: Accepted publickey for core from 10.0.0.1 port 45100 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U Sep 4 04:20:23.097955 sshd-session[5355]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 04:20:23.105337 systemd-logind[1521]: New session 11 of user core. Sep 4 04:20:23.114348 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 4 04:20:23.285149 sshd[5360]: Connection closed by 10.0.0.1 port 45100 Sep 4 04:20:23.285668 sshd-session[5355]: pam_unix(sshd:session): session closed for user core Sep 4 04:20:23.299280 systemd[1]: sshd@10-10.0.0.54:22-10.0.0.1:45100.service: Deactivated successfully. Sep 4 04:20:23.302280 systemd[1]: session-11.scope: Deactivated successfully. Sep 4 04:20:23.303573 systemd-logind[1521]: Session 11 logged out. Waiting for processes to exit. Sep 4 04:20:23.307884 systemd[1]: Started sshd@11-10.0.0.54:22-10.0.0.1:45108.service - OpenSSH per-connection server daemon (10.0.0.1:45108). Sep 4 04:20:23.309131 systemd-logind[1521]: Removed session 11. Sep 4 04:20:23.389087 sshd[5375]: Accepted publickey for core from 10.0.0.1 port 45108 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U Sep 4 04:20:23.391465 sshd-session[5375]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 04:20:23.397979 systemd-logind[1521]: New session 12 of user core. Sep 4 04:20:23.404816 systemd[1]: Started session-12.scope - Session 12 of User core. 
Sep 4 04:20:23.406446 kubelet[2745]: I0904 04:20:23.406356 2745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-66557b5466-5h544" podStartSLOduration=32.343405122 podStartE2EDuration="42.406329303s" podCreationTimestamp="2025-09-04 04:19:41 +0000 UTC" firstStartedPulling="2025-09-04 04:20:12.617463747 +0000 UTC m=+45.641501684" lastFinishedPulling="2025-09-04 04:20:22.680387918 +0000 UTC m=+55.704425865" observedRunningTime="2025-09-04 04:20:23.404574907 +0000 UTC m=+56.428612874" watchObservedRunningTime="2025-09-04 04:20:23.406329303 +0000 UTC m=+56.430367240" Sep 4 04:20:23.487100 kubelet[2745]: I0904 04:20:23.486878 2745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-66557b5466-8jz66" podStartSLOduration=31.572919324 podStartE2EDuration="42.486850979s" podCreationTimestamp="2025-09-04 04:19:41 +0000 UTC" firstStartedPulling="2025-09-04 04:20:11.329002762 +0000 UTC m=+44.353040699" lastFinishedPulling="2025-09-04 04:20:22.242934417 +0000 UTC m=+55.266972354" observedRunningTime="2025-09-04 04:20:23.482745167 +0000 UTC m=+56.506783104" watchObservedRunningTime="2025-09-04 04:20:23.486850979 +0000 UTC m=+56.510888946" Sep 4 04:20:23.630812 sshd[5378]: Connection closed by 10.0.0.1 port 45108 Sep 4 04:20:23.631420 sshd-session[5375]: pam_unix(sshd:session): session closed for user core Sep 4 04:20:23.644273 systemd[1]: sshd@11-10.0.0.54:22-10.0.0.1:45108.service: Deactivated successfully. Sep 4 04:20:23.649302 systemd[1]: session-12.scope: Deactivated successfully. Sep 4 04:20:23.651141 systemd-logind[1521]: Session 12 logged out. Waiting for processes to exit. Sep 4 04:20:23.659476 systemd[1]: Started sshd@12-10.0.0.54:22-10.0.0.1:45116.service - OpenSSH per-connection server daemon (10.0.0.1:45116). Sep 4 04:20:23.662246 systemd-logind[1521]: Removed session 12. 
Sep 4 04:20:23.724183 sshd[5394]: Accepted publickey for core from 10.0.0.1 port 45116 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U Sep 4 04:20:23.726840 sshd-session[5394]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 04:20:23.736010 systemd-logind[1521]: New session 13 of user core. Sep 4 04:20:23.742458 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 4 04:20:23.908701 sshd[5397]: Connection closed by 10.0.0.1 port 45116 Sep 4 04:20:23.908916 sshd-session[5394]: pam_unix(sshd:session): session closed for user core Sep 4 04:20:23.914829 systemd[1]: sshd@12-10.0.0.54:22-10.0.0.1:45116.service: Deactivated successfully. Sep 4 04:20:23.917105 systemd[1]: session-13.scope: Deactivated successfully. Sep 4 04:20:23.918037 systemd-logind[1521]: Session 13 logged out. Waiting for processes to exit. Sep 4 04:20:23.919565 systemd-logind[1521]: Removed session 13. Sep 4 04:20:28.932376 systemd[1]: Started sshd@13-10.0.0.54:22-10.0.0.1:45120.service - OpenSSH per-connection server daemon (10.0.0.1:45120). Sep 4 04:20:29.014724 sshd[5424]: Accepted publickey for core from 10.0.0.1 port 45120 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U Sep 4 04:20:29.017513 sshd-session[5424]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 04:20:29.023494 systemd-logind[1521]: New session 14 of user core. Sep 4 04:20:29.033418 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 4 04:20:29.194697 sshd[5427]: Connection closed by 10.0.0.1 port 45120 Sep 4 04:20:29.195401 sshd-session[5424]: pam_unix(sshd:session): session closed for user core Sep 4 04:20:29.200448 systemd[1]: sshd@13-10.0.0.54:22-10.0.0.1:45120.service: Deactivated successfully. Sep 4 04:20:29.202902 systemd[1]: session-14.scope: Deactivated successfully. Sep 4 04:20:29.204715 systemd-logind[1521]: Session 14 logged out. Waiting for processes to exit. 
Sep 4 04:20:29.206493 systemd-logind[1521]: Removed session 14. Sep 4 04:20:34.209164 systemd[1]: Started sshd@14-10.0.0.54:22-10.0.0.1:32910.service - OpenSSH per-connection server daemon (10.0.0.1:32910). Sep 4 04:20:34.295046 sshd[5448]: Accepted publickey for core from 10.0.0.1 port 32910 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U Sep 4 04:20:34.297298 sshd-session[5448]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 04:20:34.303805 systemd-logind[1521]: New session 15 of user core. Sep 4 04:20:34.322388 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 4 04:20:34.457337 sshd[5451]: Connection closed by 10.0.0.1 port 32910 Sep 4 04:20:34.457736 sshd-session[5448]: pam_unix(sshd:session): session closed for user core Sep 4 04:20:34.462600 systemd[1]: sshd@14-10.0.0.54:22-10.0.0.1:32910.service: Deactivated successfully. Sep 4 04:20:34.465171 systemd[1]: session-15.scope: Deactivated successfully. Sep 4 04:20:34.466112 systemd-logind[1521]: Session 15 logged out. Waiting for processes to exit. Sep 4 04:20:34.467934 systemd-logind[1521]: Removed session 15. Sep 4 04:20:39.477196 systemd[1]: Started sshd@15-10.0.0.54:22-10.0.0.1:32912.service - OpenSSH per-connection server daemon (10.0.0.1:32912). Sep 4 04:20:39.548663 sshd[5469]: Accepted publickey for core from 10.0.0.1 port 32912 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U Sep 4 04:20:39.550686 sshd-session[5469]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 04:20:39.556091 systemd-logind[1521]: New session 16 of user core. Sep 4 04:20:39.565341 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 4 04:20:39.706413 sshd[5472]: Connection closed by 10.0.0.1 port 32912 Sep 4 04:20:39.706987 sshd-session[5469]: pam_unix(sshd:session): session closed for user core Sep 4 04:20:39.714657 systemd[1]: sshd@15-10.0.0.54:22-10.0.0.1:32912.service: Deactivated successfully. 
Sep 4 04:20:39.717247 systemd[1]: session-16.scope: Deactivated successfully.
Sep 4 04:20:39.718340 systemd-logind[1521]: Session 16 logged out. Waiting for processes to exit.
Sep 4 04:20:39.719777 systemd-logind[1521]: Removed session 16.
Sep 4 04:20:44.124725 containerd[1573]: time="2025-09-04T04:20:44.124669765Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c99448522e661cf8a9ad37633c93830b9159fdc14588033e7ea53bebfbd5030c\" id:\"9a850ba83bdef4fb7dacc57ff642a0573b95833ae120a1aae268f82303416314\" pid:5497 exited_at:{seconds:1756959644 nanos:124322994}"
Sep 4 04:20:44.749641 systemd[1]: Started sshd@16-10.0.0.54:22-10.0.0.1:48796.service - OpenSSH per-connection server daemon (10.0.0.1:48796).
Sep 4 04:20:44.781294 containerd[1573]: time="2025-09-04T04:20:44.774754344Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5670c764b428d4d92a048cb6462d51720a0e1ce5b5ccb7024411a2baee970cab\" id:\"aa17171f85335244f8340bf9b9ee537a972e75879047ed9c52d85a857fa67a23\" pid:5522 exited_at:{seconds:1756959644 nanos:771984632}"
Sep 4 04:20:44.988122 sshd[5534]: Accepted publickey for core from 10.0.0.1 port 48796 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:20:44.992862 sshd-session[5534]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:20:45.019518 systemd-logind[1521]: New session 17 of user core.
Sep 4 04:20:45.030363 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 4 04:20:45.350207 sshd[5537]: Connection closed by 10.0.0.1 port 48796
Sep 4 04:20:45.352607 sshd-session[5534]: pam_unix(sshd:session): session closed for user core
Sep 4 04:20:45.381618 systemd[1]: sshd@16-10.0.0.54:22-10.0.0.1:48796.service: Deactivated successfully.
Sep 4 04:20:45.392827 systemd[1]: session-17.scope: Deactivated successfully.
Sep 4 04:20:45.401360 systemd-logind[1521]: Session 17 logged out. Waiting for processes to exit.
Sep 4 04:20:45.404996 systemd[1]: Started sshd@17-10.0.0.54:22-10.0.0.1:48798.service - OpenSSH per-connection server daemon (10.0.0.1:48798).
Sep 4 04:20:45.410991 systemd-logind[1521]: Removed session 17.
Sep 4 04:20:45.463278 containerd[1573]: time="2025-09-04T04:20:45.463174816Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2966e17ea5d921ad4925552ea4151cd325c619ff96ca9681d6795fd900156d9d\" id:\"7a18e9bf3aae2e5a28007d43fffe30c96a94eb57e94339fd936d209f4db1b0f7\" pid:5567 exited_at:{seconds:1756959645 nanos:462495583}"
Sep 4 04:20:45.560980 sshd[5576]: Accepted publickey for core from 10.0.0.1 port 48798 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:20:45.567011 sshd-session[5576]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:20:45.577746 systemd-logind[1521]: New session 18 of user core.
Sep 4 04:20:45.591440 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 4 04:20:46.480017 sshd[5583]: Connection closed by 10.0.0.1 port 48798
Sep 4 04:20:46.482252 sshd-session[5576]: pam_unix(sshd:session): session closed for user core
Sep 4 04:20:46.497846 systemd[1]: Started sshd@18-10.0.0.54:22-10.0.0.1:48806.service - OpenSSH per-connection server daemon (10.0.0.1:48806).
Sep 4 04:20:46.500474 systemd[1]: sshd@17-10.0.0.54:22-10.0.0.1:48798.service: Deactivated successfully.
Sep 4 04:20:46.504858 systemd[1]: session-18.scope: Deactivated successfully.
Sep 4 04:20:46.508247 systemd-logind[1521]: Session 18 logged out. Waiting for processes to exit.
Sep 4 04:20:46.515753 systemd-logind[1521]: Removed session 18.
Sep 4 04:20:46.587707 sshd[5591]: Accepted publickey for core from 10.0.0.1 port 48806 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:20:46.589319 sshd-session[5591]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:20:46.595107 systemd-logind[1521]: New session 19 of user core.
Sep 4 04:20:46.600222 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 4 04:20:47.555955 sshd[5598]: Connection closed by 10.0.0.1 port 48806
Sep 4 04:20:47.557901 sshd-session[5591]: pam_unix(sshd:session): session closed for user core
Sep 4 04:20:47.566135 systemd[1]: sshd@18-10.0.0.54:22-10.0.0.1:48806.service: Deactivated successfully.
Sep 4 04:20:47.568305 systemd[1]: session-19.scope: Deactivated successfully.
Sep 4 04:20:47.569525 systemd-logind[1521]: Session 19 logged out. Waiting for processes to exit.
Sep 4 04:20:47.572868 systemd[1]: Started sshd@19-10.0.0.54:22-10.0.0.1:48812.service - OpenSSH per-connection server daemon (10.0.0.1:48812).
Sep 4 04:20:47.573876 systemd-logind[1521]: Removed session 19.
Sep 4 04:20:47.641108 sshd[5617]: Accepted publickey for core from 10.0.0.1 port 48812 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:20:47.643077 sshd-session[5617]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:20:47.647965 systemd-logind[1521]: New session 20 of user core.
Sep 4 04:20:47.658229 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 4 04:20:48.033076 sshd[5620]: Connection closed by 10.0.0.1 port 48812
Sep 4 04:20:48.033581 sshd-session[5617]: pam_unix(sshd:session): session closed for user core
Sep 4 04:20:48.046397 systemd[1]: sshd@19-10.0.0.54:22-10.0.0.1:48812.service: Deactivated successfully.
Sep 4 04:20:48.048806 systemd[1]: session-20.scope: Deactivated successfully.
Sep 4 04:20:48.049748 systemd-logind[1521]: Session 20 logged out. Waiting for processes to exit.
Sep 4 04:20:48.053658 systemd[1]: Started sshd@20-10.0.0.54:22-10.0.0.1:48824.service - OpenSSH per-connection server daemon (10.0.0.1:48824).
Sep 4 04:20:48.054975 systemd-logind[1521]: Removed session 20.
Sep 4 04:20:48.122764 sshd[5631]: Accepted publickey for core from 10.0.0.1 port 48824 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:20:48.125346 sshd-session[5631]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:20:48.133147 systemd-logind[1521]: New session 21 of user core.
Sep 4 04:20:48.136392 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 4 04:20:48.142454 containerd[1573]: time="2025-09-04T04:20:48.142399343Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2966e17ea5d921ad4925552ea4151cd325c619ff96ca9681d6795fd900156d9d\" id:\"78305686ab2b6e966c684756c7ead5aecf8c41b82c6f549932f555224c1d7e05\" pid:5647 exited_at:{seconds:1756959648 nanos:141889113}"
Sep 4 04:20:48.287719 sshd[5658]: Connection closed by 10.0.0.1 port 48824
Sep 4 04:20:48.288217 sshd-session[5631]: pam_unix(sshd:session): session closed for user core
Sep 4 04:20:48.293792 systemd[1]: sshd@20-10.0.0.54:22-10.0.0.1:48824.service: Deactivated successfully.
Sep 4 04:20:48.296893 systemd[1]: session-21.scope: Deactivated successfully.
Sep 4 04:20:48.298192 systemd-logind[1521]: Session 21 logged out. Waiting for processes to exit.
Sep 4 04:20:48.300306 systemd-logind[1521]: Removed session 21.
Sep 4 04:20:49.400865 containerd[1573]: time="2025-09-04T04:20:49.400816136Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5670c764b428d4d92a048cb6462d51720a0e1ce5b5ccb7024411a2baee970cab\" id:\"95a03c98d6feecccc8d28ae66dcf569389a6456c4449c97bb1d93a6e53c75d7b\" pid:5684 exited_at:{seconds:1756959649 nanos:400465289}"
Sep 4 04:20:53.307380 systemd[1]: Started sshd@21-10.0.0.54:22-10.0.0.1:42178.service - OpenSSH per-connection server daemon (10.0.0.1:42178).
Sep 4 04:20:53.395187 sshd[5700]: Accepted publickey for core from 10.0.0.1 port 42178 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:20:53.397404 sshd-session[5700]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:20:53.404837 systemd-logind[1521]: New session 22 of user core.
Sep 4 04:20:53.419324 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 4 04:20:53.593875 sshd[5703]: Connection closed by 10.0.0.1 port 42178
Sep 4 04:20:53.594217 sshd-session[5700]: pam_unix(sshd:session): session closed for user core
Sep 4 04:20:53.598981 systemd[1]: sshd@21-10.0.0.54:22-10.0.0.1:42178.service: Deactivated successfully.
Sep 4 04:20:53.601399 systemd[1]: session-22.scope: Deactivated successfully.
Sep 4 04:20:53.602336 systemd-logind[1521]: Session 22 logged out. Waiting for processes to exit.
Sep 4 04:20:53.604277 systemd-logind[1521]: Removed session 22.
Sep 4 04:20:58.611390 systemd[1]: Started sshd@22-10.0.0.54:22-10.0.0.1:42180.service - OpenSSH per-connection server daemon (10.0.0.1:42180).
Sep 4 04:20:58.672585 sshd[5718]: Accepted publickey for core from 10.0.0.1 port 42180 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:20:58.674198 sshd-session[5718]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:20:58.678420 systemd-logind[1521]: New session 23 of user core.
Sep 4 04:20:58.689178 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 4 04:20:58.807433 sshd[5721]: Connection closed by 10.0.0.1 port 42180
Sep 4 04:20:58.807787 sshd-session[5718]: pam_unix(sshd:session): session closed for user core
Sep 4 04:20:58.811379 systemd[1]: sshd@22-10.0.0.54:22-10.0.0.1:42180.service: Deactivated successfully.
Sep 4 04:20:58.813537 systemd[1]: session-23.scope: Deactivated successfully.
Sep 4 04:20:58.814977 systemd-logind[1521]: Session 23 logged out. Waiting for processes to exit.
Sep 4 04:20:58.816305 systemd-logind[1521]: Removed session 23.
Sep 4 04:21:03.826291 systemd[1]: Started sshd@23-10.0.0.54:22-10.0.0.1:44262.service - OpenSSH per-connection server daemon (10.0.0.1:44262).
Sep 4 04:21:03.876931 sshd[5737]: Accepted publickey for core from 10.0.0.1 port 44262 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:21:03.879028 sshd-session[5737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:21:03.884899 systemd-logind[1521]: New session 24 of user core.
Sep 4 04:21:03.889248 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 4 04:21:04.011688 sshd[5740]: Connection closed by 10.0.0.1 port 44262
Sep 4 04:21:04.012116 sshd-session[5737]: pam_unix(sshd:session): session closed for user core
Sep 4 04:21:04.016981 systemd[1]: sshd@23-10.0.0.54:22-10.0.0.1:44262.service: Deactivated successfully.
Sep 4 04:21:04.019497 systemd[1]: session-24.scope: Deactivated successfully.
Sep 4 04:21:04.020389 systemd-logind[1521]: Session 24 logged out. Waiting for processes to exit.
Sep 4 04:21:04.022098 systemd-logind[1521]: Removed session 24.
Sep 4 04:21:09.027719 systemd[1]: Started sshd@24-10.0.0.54:22-10.0.0.1:44264.service - OpenSSH per-connection server daemon (10.0.0.1:44264).
Sep 4 04:21:09.091995 sshd[5754]: Accepted publickey for core from 10.0.0.1 port 44264 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:21:09.094973 sshd-session[5754]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:21:09.101933 systemd-logind[1521]: New session 25 of user core.
Sep 4 04:21:09.110335 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 4 04:21:09.259689 sshd[5757]: Connection closed by 10.0.0.1 port 44264
Sep 4 04:21:09.261462 sshd-session[5754]: pam_unix(sshd:session): session closed for user core
Sep 4 04:21:09.267252 systemd[1]: sshd@24-10.0.0.54:22-10.0.0.1:44264.service: Deactivated successfully.
Sep 4 04:21:09.270109 systemd[1]: session-25.scope: Deactivated successfully.
Sep 4 04:21:09.271040 systemd-logind[1521]: Session 25 logged out. Waiting for processes to exit.
Sep 4 04:21:09.272881 systemd-logind[1521]: Removed session 25.
Sep 4 04:21:14.133202 containerd[1573]: time="2025-09-04T04:21:14.133150322Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c99448522e661cf8a9ad37633c93830b9159fdc14588033e7ea53bebfbd5030c\" id:\"7fb3393e1dfbb52ac437f496743f4bbb4a13b807da961d94b5f6228316dd3ef3\" pid:5781 exited_at:{seconds:1756959674 nanos:132371089}"
Sep 4 04:21:14.276343 systemd[1]: Started sshd@25-10.0.0.54:22-10.0.0.1:43658.service - OpenSSH per-connection server daemon (10.0.0.1:43658).
Sep 4 04:21:14.356763 sshd[5794]: Accepted publickey for core from 10.0.0.1 port 43658 ssh2: RSA SHA256:9+vpZc6EfwWxHenC1ZKsuuGVz7bQEj3BE+z2aG6aI0U
Sep 4 04:21:14.359216 sshd-session[5794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 4 04:21:14.366907 systemd-logind[1521]: New session 26 of user core.
Sep 4 04:21:14.373338 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 4 04:21:14.605571 sshd[5797]: Connection closed by 10.0.0.1 port 43658
Sep 4 04:21:14.606349 sshd-session[5794]: pam_unix(sshd:session): session closed for user core
Sep 4 04:21:14.612035 systemd[1]: sshd@25-10.0.0.54:22-10.0.0.1:43658.service: Deactivated successfully.
Sep 4 04:21:14.615396 systemd[1]: session-26.scope: Deactivated successfully.
Sep 4 04:21:14.616514 systemd-logind[1521]: Session 26 logged out. Waiting for processes to exit.
Sep 4 04:21:14.618959 systemd-logind[1521]: Removed session 26.
Sep 4 04:21:15.361253 containerd[1573]: time="2025-09-04T04:21:15.360977417Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2966e17ea5d921ad4925552ea4151cd325c619ff96ca9681d6795fd900156d9d\" id:\"ce86d5c5babc81f8b77849afe486a79dc96db0ed6143f67010cdb40fffb21ab3\" pid:5821 exited_at:{seconds:1756959675 nanos:360348849}"