Sep 9 22:02:10.287819 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Sep 9 19:55:16 -00 2025
Sep 9 22:02:10.287849 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=f0ebd120fc09fb344715b1492c3f1d02e1457be2c9792ea5ffb3fe4b15efa812
Sep 9 22:02:10.287861 kernel: BIOS-provided physical RAM map:
Sep 9 22:02:10.287869 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 9 22:02:10.287877 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 9 22:02:10.287885 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 9 22:02:10.287895 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000009cfdbfff] usable
Sep 9 22:02:10.287904 kernel: BIOS-e820: [mem 0x000000009cfdc000-0x000000009cffffff] reserved
Sep 9 22:02:10.287915 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Sep 9 22:02:10.287923 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Sep 9 22:02:10.287932 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 9 22:02:10.287940 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 9 22:02:10.287948 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Sep 9 22:02:10.287956 kernel: NX (Execute Disable) protection: active
Sep 9 22:02:10.287969 kernel: APIC: Static calls initialized
Sep 9 22:02:10.287978 kernel: SMBIOS 2.8 present.
Sep 9 22:02:10.287991 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 1.16.2-debian-1.16.2-1 04/01/2014
Sep 9 22:02:10.288000 kernel: DMI: Memory slots populated: 1/1
Sep 9 22:02:10.288009 kernel: Hypervisor detected: KVM
Sep 9 22:02:10.288018 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 9 22:02:10.288027 kernel: kvm-clock: using sched offset of 4498461681 cycles
Sep 9 22:02:10.288037 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 9 22:02:10.288047 kernel: tsc: Detected 2794.748 MHz processor
Sep 9 22:02:10.288083 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 9 22:02:10.288094 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 9 22:02:10.288103 kernel: last_pfn = 0x9cfdc max_arch_pfn = 0x400000000
Sep 9 22:02:10.288113 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 9 22:02:10.288122 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 9 22:02:10.288132 kernel: Using GB pages for direct mapping
Sep 9 22:02:10.288141 kernel: ACPI: Early table checksum verification disabled
Sep 9 22:02:10.288160 kernel: ACPI: RSDP 0x00000000000F59D0 000014 (v00 BOCHS )
Sep 9 22:02:10.288170 kernel: ACPI: RSDT 0x000000009CFE241A 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 22:02:10.288183 kernel: ACPI: FACP 0x000000009CFE21FA 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 22:02:10.288193 kernel: ACPI: DSDT 0x000000009CFE0040 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 22:02:10.288202 kernel: ACPI: FACS 0x000000009CFE0000 000040
Sep 9 22:02:10.288212 kernel: ACPI: APIC 0x000000009CFE22EE 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 22:02:10.288221 kernel: ACPI: HPET 0x000000009CFE237E 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 22:02:10.288230 kernel: ACPI: MCFG 0x000000009CFE23B6 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 22:02:10.288240 kernel: ACPI: WAET 0x000000009CFE23F2 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 22:02:10.288249 kernel: ACPI: Reserving FACP table memory at [mem 0x9cfe21fa-0x9cfe22ed]
Sep 9 22:02:10.288266 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cfe0040-0x9cfe21f9]
Sep 9 22:02:10.288276 kernel: ACPI: Reserving FACS table memory at [mem 0x9cfe0000-0x9cfe003f]
Sep 9 22:02:10.288286 kernel: ACPI: Reserving APIC table memory at [mem 0x9cfe22ee-0x9cfe237d]
Sep 9 22:02:10.288296 kernel: ACPI: Reserving HPET table memory at [mem 0x9cfe237e-0x9cfe23b5]
Sep 9 22:02:10.288306 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cfe23b6-0x9cfe23f1]
Sep 9 22:02:10.288315 kernel: ACPI: Reserving WAET table memory at [mem 0x9cfe23f2-0x9cfe2419]
Sep 9 22:02:10.288328 kernel: No NUMA configuration found
Sep 9 22:02:10.288338 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cfdbfff]
Sep 9 22:02:10.288348 kernel: NODE_DATA(0) allocated [mem 0x9cfd4dc0-0x9cfdbfff]
Sep 9 22:02:10.288358 kernel: Zone ranges:
Sep 9 22:02:10.288368 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 9 22:02:10.288377 kernel: DMA32 [mem 0x0000000001000000-0x000000009cfdbfff]
Sep 9 22:02:10.288387 kernel: Normal empty
Sep 9 22:02:10.288397 kernel: Device empty
Sep 9 22:02:10.288406 kernel: Movable zone start for each node
Sep 9 22:02:10.288416 kernel: Early memory node ranges
Sep 9 22:02:10.288429 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Sep 9 22:02:10.288439 kernel: node 0: [mem 0x0000000000100000-0x000000009cfdbfff]
Sep 9 22:02:10.288449 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cfdbfff]
Sep 9 22:02:10.288459 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 9 22:02:10.288469 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 9 22:02:10.288479 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Sep 9 22:02:10.288489 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 9 22:02:10.288503 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 9 22:02:10.288513 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 9 22:02:10.288527 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 9 22:02:10.288537 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 9 22:02:10.288547 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 9 22:02:10.288558 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 9 22:02:10.288568 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 9 22:02:10.288578 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 9 22:02:10.288587 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 9 22:02:10.288597 kernel: TSC deadline timer available
Sep 9 22:02:10.288607 kernel: CPU topo: Max. logical packages: 1
Sep 9 22:02:10.288621 kernel: CPU topo: Max. logical dies: 1
Sep 9 22:02:10.288631 kernel: CPU topo: Max. dies per package: 1
Sep 9 22:02:10.288640 kernel: CPU topo: Max. threads per core: 1
Sep 9 22:02:10.288650 kernel: CPU topo: Num. cores per package: 4
Sep 9 22:02:10.288660 kernel: CPU topo: Num. threads per package: 4
Sep 9 22:02:10.288670 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
Sep 9 22:02:10.288680 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 9 22:02:10.288690 kernel: kvm-guest: KVM setup pv remote TLB flush
Sep 9 22:02:10.288700 kernel: kvm-guest: setup PV sched yield
Sep 9 22:02:10.288710 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Sep 9 22:02:10.288724 kernel: Booting paravirtualized kernel on KVM
Sep 9 22:02:10.288735 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 9 22:02:10.288744 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Sep 9 22:02:10.288755 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288
Sep 9 22:02:10.288764 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152
Sep 9 22:02:10.288774 kernel: pcpu-alloc: [0] 0 1 2 3
Sep 9 22:02:10.288784 kernel: kvm-guest: PV spinlocks enabled
Sep 9 22:02:10.288794 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Sep 9 22:02:10.288806 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=f0ebd120fc09fb344715b1492c3f1d02e1457be2c9792ea5ffb3fe4b15efa812
Sep 9 22:02:10.288821 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 22:02:10.288831 kernel: random: crng init done
Sep 9 22:02:10.288841 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 9 22:02:10.288852 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 9 22:02:10.288862 kernel: Fallback order for Node 0: 0
Sep 9 22:02:10.288872 kernel: Built 1 zonelists, mobility grouping on. Total pages: 642938
Sep 9 22:02:10.288882 kernel: Policy zone: DMA32
Sep 9 22:02:10.288893 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 22:02:10.288908 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Sep 9 22:02:10.288918 kernel: ftrace: allocating 40102 entries in 157 pages
Sep 9 22:02:10.288928 kernel: ftrace: allocated 157 pages with 5 groups
Sep 9 22:02:10.288938 kernel: Dynamic Preempt: voluntary
Sep 9 22:02:10.288948 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 22:02:10.288959 kernel: rcu: RCU event tracing is enabled.
Sep 9 22:02:10.288970 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Sep 9 22:02:10.288980 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 22:02:10.288995 kernel: Rude variant of Tasks RCU enabled.
Sep 9 22:02:10.289010 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 22:02:10.289020 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 22:02:10.289030 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Sep 9 22:02:10.289040 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 9 22:02:10.289051 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 9 22:02:10.289094 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 9 22:02:10.289104 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
Sep 9 22:02:10.289115 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 9 22:02:10.289137 kernel: Console: colour VGA+ 80x25
Sep 9 22:02:10.289158 kernel: printk: legacy console [ttyS0] enabled
Sep 9 22:02:10.289169 kernel: ACPI: Core revision 20240827
Sep 9 22:02:10.289180 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Sep 9 22:02:10.289194 kernel: APIC: Switch to symmetric I/O mode setup
Sep 9 22:02:10.289204 kernel: x2apic enabled
Sep 9 22:02:10.289215 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 9 22:02:10.289226 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Sep 9 22:02:10.289237 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Sep 9 22:02:10.289250 kernel: kvm-guest: setup PV IPIs
Sep 9 22:02:10.289260 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 9 22:02:10.289271 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns
Sep 9 22:02:10.289282 kernel: Calibrating delay loop (skipped) preset value.. 5589.49 BogoMIPS (lpj=2794748)
Sep 9 22:02:10.289293 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 9 22:02:10.289303 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 9 22:02:10.289314 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 9 22:02:10.289324 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 9 22:02:10.289335 kernel: Spectre V2 : Mitigation: Retpolines
Sep 9 22:02:10.289349 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 9 22:02:10.289360 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 9 22:02:10.289371 kernel: active return thunk: retbleed_return_thunk
Sep 9 22:02:10.289381 kernel: RETBleed: Mitigation: untrained return thunk
Sep 9 22:02:10.289392 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 9 22:02:10.289402 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 9 22:02:10.289413 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Sep 9 22:02:10.289424 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Sep 9 22:02:10.289438 kernel: active return thunk: srso_return_thunk
Sep 9 22:02:10.289449 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Sep 9 22:02:10.289460 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 9 22:02:10.289470 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 9 22:02:10.289481 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 9 22:02:10.289491 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 9 22:02:10.289502 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 9 22:02:10.289512 kernel: Freeing SMP alternatives memory: 32K
Sep 9 22:02:10.289522 kernel: pid_max: default: 32768 minimum: 301
Sep 9 22:02:10.289537 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 9 22:02:10.289548 kernel: landlock: Up and running.
Sep 9 22:02:10.289558 kernel: SELinux: Initializing.
Sep 9 22:02:10.289573 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 22:02:10.289584 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 22:02:10.289595 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 9 22:02:10.289605 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 9 22:02:10.289616 kernel: ... version: 0
Sep 9 22:02:10.289627 kernel: ... bit width: 48
Sep 9 22:02:10.289643 kernel: ... generic registers: 6
Sep 9 22:02:10.289654 kernel: ... value mask: 0000ffffffffffff
Sep 9 22:02:10.289665 kernel: ... max period: 00007fffffffffff
Sep 9 22:02:10.289676 kernel: ... fixed-purpose events: 0
Sep 9 22:02:10.289687 kernel: ... event mask: 000000000000003f
Sep 9 22:02:10.289697 kernel: signal: max sigframe size: 1776
Sep 9 22:02:10.289708 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 22:02:10.289719 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 22:02:10.289729 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 9 22:02:10.289744 kernel: smp: Bringing up secondary CPUs ...
Sep 9 22:02:10.289755 kernel: smpboot: x86: Booting SMP configuration:
Sep 9 22:02:10.289765 kernel: .... node #0, CPUs: #1 #2 #3
Sep 9 22:02:10.289775 kernel: smp: Brought up 1 node, 4 CPUs
Sep 9 22:02:10.289786 kernel: smpboot: Total of 4 processors activated (22357.98 BogoMIPS)
Sep 9 22:02:10.289797 kernel: Memory: 2428916K/2571752K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54092K init, 2876K bss, 136908K reserved, 0K cma-reserved)
Sep 9 22:02:10.289807 kernel: devtmpfs: initialized
Sep 9 22:02:10.289818 kernel: x86/mm: Memory block size: 128MB
Sep 9 22:02:10.289829 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 22:02:10.289845 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Sep 9 22:02:10.289855 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 22:02:10.289866 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 22:02:10.289877 kernel: audit: initializing netlink subsys (disabled)
Sep 9 22:02:10.289889 kernel: audit: type=2000 audit(1757455325.535:1): state=initialized audit_enabled=0 res=1
Sep 9 22:02:10.289899 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 22:02:10.289911 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 9 22:02:10.289921 kernel: cpuidle: using governor menu
Sep 9 22:02:10.289933 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 22:02:10.289950 kernel: dca service started, version 1.12.1
Sep 9 22:02:10.289961 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Sep 9 22:02:10.289973 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Sep 9 22:02:10.289984 kernel: PCI: Using configuration type 1 for base access
Sep 9 22:02:10.289996 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 9 22:02:10.290008 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 22:02:10.290019 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 22:02:10.290030 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 22:02:10.290041 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 22:02:10.290085 kernel: ACPI: Added _OSI(Module Device)
Sep 9 22:02:10.290097 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 22:02:10.290108 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 22:02:10.290120 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 9 22:02:10.290131 kernel: ACPI: Interpreter enabled
Sep 9 22:02:10.290143 kernel: ACPI: PM: (supports S0 S3 S5)
Sep 9 22:02:10.290166 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 9 22:02:10.290177 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 9 22:02:10.290187 kernel: PCI: Using E820 reservations for host bridge windows
Sep 9 22:02:10.290203 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 9 22:02:10.290214 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 9 22:02:10.290474 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 9 22:02:10.290636 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Sep 9 22:02:10.290792 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Sep 9 22:02:10.290807 kernel: PCI host bridge to bus 0000:00
Sep 9 22:02:10.290953 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 9 22:02:10.291093 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 9 22:02:10.291241 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 9 22:02:10.291392 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xafffffff window]
Sep 9 22:02:10.291513 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 9 22:02:10.291660 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Sep 9 22:02:10.291799 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 9 22:02:10.292022 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Sep 9 22:02:10.292229 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint
Sep 9 22:02:10.292386 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfd000000-0xfdffffff pref]
Sep 9 22:02:10.292535 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfebd0000-0xfebd0fff]
Sep 9 22:02:10.292673 kernel: pci 0000:00:01.0: ROM [mem 0xfebc0000-0xfebcffff pref]
Sep 9 22:02:10.292834 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 9 22:02:10.292996 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 9 22:02:10.293246 kernel: pci 0000:00:02.0: BAR 0 [io 0xc0c0-0xc0df]
Sep 9 22:02:10.293428 kernel: pci 0000:00:02.0: BAR 1 [mem 0xfebd1000-0xfebd1fff]
Sep 9 22:02:10.293608 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfe000000-0xfe003fff 64bit pref]
Sep 9 22:02:10.293815 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Sep 9 22:02:10.294023 kernel: pci 0000:00:03.0: BAR 0 [io 0xc000-0xc07f]
Sep 9 22:02:10.294275 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfebd2000-0xfebd2fff]
Sep 9 22:02:10.294440 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe004000-0xfe007fff 64bit pref]
Sep 9 22:02:10.294636 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 9 22:02:10.294813 kernel: pci 0000:00:04.0: BAR 0 [io 0xc0e0-0xc0ff]
Sep 9 22:02:10.294991 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfebd3000-0xfebd3fff]
Sep 9 22:02:10.295244 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe008000-0xfe00bfff 64bit pref]
Sep 9 22:02:10.295436 kernel: pci 0000:00:04.0: ROM [mem 0xfeb80000-0xfebbffff pref]
Sep 9 22:02:10.295641 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Sep 9 22:02:10.298100 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 9 22:02:10.298361 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Sep 9 22:02:10.298528 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc100-0xc11f]
Sep 9 22:02:10.298691 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfebd4000-0xfebd4fff]
Sep 9 22:02:10.298882 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Sep 9 22:02:10.299050 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Sep 9 22:02:10.299116 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 9 22:02:10.299127 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 9 22:02:10.299140 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 9 22:02:10.299157 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 9 22:02:10.299167 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 9 22:02:10.299177 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 9 22:02:10.299188 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 9 22:02:10.299199 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 9 22:02:10.299209 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 9 22:02:10.299216 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 9 22:02:10.299225 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 9 22:02:10.299236 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 9 22:02:10.299244 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 9 22:02:10.299252 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 9 22:02:10.299260 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 9 22:02:10.299268 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 9 22:02:10.299276 kernel: iommu: Default domain type: Translated
Sep 9 22:02:10.299285 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 9 22:02:10.299293 kernel: PCI: Using ACPI for IRQ routing
Sep 9 22:02:10.299301 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 9 22:02:10.299312 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 9 22:02:10.299320 kernel: e820: reserve RAM buffer [mem 0x9cfdc000-0x9fffffff]
Sep 9 22:02:10.299473 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 9 22:02:10.299599 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 9 22:02:10.299720 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 9 22:02:10.299731 kernel: vgaarb: loaded
Sep 9 22:02:10.299739 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Sep 9 22:02:10.299748 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Sep 9 22:02:10.299760 kernel: clocksource: Switched to clocksource kvm-clock
Sep 9 22:02:10.299771 kernel: VFS: Disk quotas dquot_6.6.0
Sep 9 22:02:10.299788 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 9 22:02:10.299799 kernel: pnp: PnP ACPI init
Sep 9 22:02:10.299982 kernel: system 00:05: [mem 0xb0000000-0xbfffffff window] has been reserved
Sep 9 22:02:10.299997 kernel: pnp: PnP ACPI: found 6 devices
Sep 9 22:02:10.300005 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 9 22:02:10.300013 kernel: NET: Registered PF_INET protocol family
Sep 9 22:02:10.300026 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 9 22:02:10.300034 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 9 22:02:10.300042 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 9 22:02:10.300050 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 9 22:02:10.300076 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 9 22:02:10.300084 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 9 22:02:10.300093 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 22:02:10.300101 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 22:02:10.300109 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 9 22:02:10.300120 kernel: NET: Registered PF_XDP protocol family
Sep 9 22:02:10.300256 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 9 22:02:10.300395 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 9 22:02:10.300515 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 9 22:02:10.300634 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xafffffff window]
Sep 9 22:02:10.300745 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Sep 9 22:02:10.300856 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Sep 9 22:02:10.300867 kernel: PCI: CLS 0 bytes, default 64
Sep 9 22:02:10.300879 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns
Sep 9 22:02:10.300887 kernel: Initialise system trusted keyrings
Sep 9 22:02:10.300895 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 9 22:02:10.300903 kernel: Key type asymmetric registered
Sep 9 22:02:10.300911 kernel: Asymmetric key parser 'x509' registered
Sep 9 22:02:10.300919 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 9 22:02:10.300927 kernel: io scheduler mq-deadline registered
Sep 9 22:02:10.300935 kernel: io scheduler kyber registered
Sep 9 22:02:10.300943 kernel: io scheduler bfq registered
Sep 9 22:02:10.300953 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 9 22:02:10.300962 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 9 22:02:10.300971 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Sep 9 22:02:10.300979 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Sep 9 22:02:10.300995 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 9 22:02:10.301008 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 9 22:02:10.301019 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 9 22:02:10.301030 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 9 22:02:10.301041 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 9 22:02:10.302388 kernel: rtc_cmos 00:04: RTC can wake from S4
Sep 9 22:02:10.302421 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Sep 9 22:02:10.302565 kernel: rtc_cmos 00:04: registered as rtc0
Sep 9 22:02:10.302710 kernel: rtc_cmos 00:04: setting system clock to 2025-09-09T22:02:09 UTC (1757455329)
Sep 9 22:02:10.302852 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram, hpet irqs
Sep 9 22:02:10.302867 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Sep 9 22:02:10.302879 kernel: hpet: Lost 2 RTC interrupts
Sep 9 22:02:10.303025 kernel: NET: Registered PF_INET6 protocol family
Sep 9 22:02:10.303048 kernel: Segment Routing with IPv6
Sep 9 22:02:10.303078 kernel: In-situ OAM (IOAM) with IPv6
Sep 9 22:02:10.303089 kernel: NET: Registered PF_PACKET protocol family
Sep 9 22:02:10.303100 kernel: Key type dns_resolver registered
Sep 9 22:02:10.303111 kernel: IPI shorthand broadcast: enabled
Sep 9 22:02:10.303122 kernel: sched_clock: Marking stable (3669002979, 596330891)->(4778428589, -513094719)
Sep 9 22:02:10.303132 kernel: registered taskstats version 1
Sep 9 22:02:10.303144 kernel: Loading compiled-in X.509 certificates
Sep 9 22:02:10.303169 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 003b39862f2a560eb5545d7d88a07fc5bdfce075'
Sep 9 22:02:10.303185 kernel: Demotion targets for Node 0: null
Sep 9 22:02:10.303196 kernel: Key type .fscrypt registered
Sep 9 22:02:10.303207 kernel: Key type fscrypt-provisioning registered
Sep 9 22:02:10.303218 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 9 22:02:10.303229 kernel: ima: Allocated hash algorithm: sha1
Sep 9 22:02:10.303240 kernel: ima: No architecture policies found
Sep 9 22:02:10.303251 kernel: clk: Disabling unused clocks
Sep 9 22:02:10.303261 kernel: Warning: unable to open an initial console.
Sep 9 22:02:10.303273 kernel: Freeing unused kernel image (initmem) memory: 54092K
Sep 9 22:02:10.303287 kernel: Write protecting the kernel read-only data: 24576k
Sep 9 22:02:10.303298 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K
Sep 9 22:02:10.303308 kernel: Run /init as init process
Sep 9 22:02:10.303320 kernel: with arguments:
Sep 9 22:02:10.303331 kernel: /init
Sep 9 22:02:10.303341 kernel: with environment:
Sep 9 22:02:10.303352 kernel: HOME=/
Sep 9 22:02:10.303363 kernel: TERM=linux
Sep 9 22:02:10.303374 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 9 22:02:10.303405 systemd[1]: Successfully made /usr/ read-only.
Sep 9 22:02:10.303425 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 22:02:10.303438 systemd[1]: Detected virtualization kvm.
Sep 9 22:02:10.303450 systemd[1]: Detected architecture x86-64.
Sep 9 22:02:10.303461 systemd[1]: Running in initrd.
Sep 9 22:02:10.303472 systemd[1]: No hostname configured, using default hostname.
Sep 9 22:02:10.303483 systemd[1]: Hostname set to .
Sep 9 22:02:10.303498 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 22:02:10.303510 systemd[1]: Queued start job for default target initrd.target.
Sep 9 22:02:10.303521 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 22:02:10.303533 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 22:02:10.303547 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 9 22:02:10.303559 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 22:02:10.303570 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 9 22:02:10.303587 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 9 22:02:10.303600 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 9 22:02:10.303613 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 9 22:02:10.303624 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 22:02:10.303636 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 22:02:10.303647 systemd[1]: Reached target paths.target - Path Units.
Sep 9 22:02:10.303658 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 22:02:10.303670 systemd[1]: Reached target swap.target - Swaps.
Sep 9 22:02:10.303684 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 22:02:10.303696 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 22:02:10.303707 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 22:02:10.303718 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 9 22:02:10.303730 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 9 22:02:10.303741 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 22:02:10.303753 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 22:02:10.303764 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 22:02:10.303776 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 22:02:10.303790 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 9 22:02:10.303802 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 22:02:10.303817 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 9 22:02:10.303828 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 9 22:02:10.303840 systemd[1]: Starting systemd-fsck-usr.service...
Sep 9 22:02:10.303853 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 22:02:10.303864 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 22:02:10.303878 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 22:02:10.303889 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 9 22:02:10.303900 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 22:02:10.303914 systemd[1]: Finished systemd-fsck-usr.service.
Sep 9 22:02:10.303925 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 22:02:10.303975 systemd-journald[221]: Collecting audit messages is disabled.
Sep 9 22:02:10.304009 systemd-journald[221]: Journal started
Sep 9 22:02:10.304034 systemd-journald[221]: Runtime Journal (/run/log/journal/8dff244a37a24e7388f540e7201cacc9) is 6M, max 48.6M, 42.5M free.
Sep 9 22:02:10.294374 systemd-modules-load[222]: Inserted module 'overlay'
Sep 9 22:02:10.339749 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 22:02:10.339782 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 9 22:02:10.339798 kernel: Bridge firewalling registered
Sep 9 22:02:10.332780 systemd-modules-load[222]: Inserted module 'br_netfilter'
Sep 9 22:02:10.339344 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 22:02:10.341130 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 22:02:10.428447 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 22:02:10.434241 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 9 22:02:10.436292 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 22:02:10.438988 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 22:02:10.444213 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 22:02:10.457326 systemd-tmpfiles[240]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 9 22:02:10.460578 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 22:02:10.460872 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 22:02:10.462705 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 22:02:10.466814 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 22:02:10.479314 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 22:02:10.480621 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 9 22:02:10.517116 dracut-cmdline[263]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=f0ebd120fc09fb344715b1492c3f1d02e1457be2c9792ea5ffb3fe4b15efa812
Sep 9 22:02:10.526996 systemd-resolved[254]: Positive Trust Anchors:
Sep 9 22:02:10.527007 systemd-resolved[254]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 22:02:10.527035 systemd-resolved[254]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 22:02:10.529895 systemd-resolved[254]: Defaulting to hostname 'linux'.
Sep 9 22:02:10.531423 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 22:02:10.637445 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 22:02:10.660160 kernel: SCSI subsystem initialized
Sep 9 22:02:10.669094 kernel: Loading iSCSI transport class v2.0-870.
Sep 9 22:02:10.681100 kernel: iscsi: registered transport (tcp)
Sep 9 22:02:10.702329 kernel: iscsi: registered transport (qla4xxx)
Sep 9 22:02:10.702420 kernel: QLogic iSCSI HBA Driver
Sep 9 22:02:10.725639 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 22:02:10.866475 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 22:02:10.868029 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 22:02:10.937984 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 9 22:02:10.939748 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 9 22:02:11.047119 kernel: raid6: avx2x4 gen() 29211 MB/s
Sep 9 22:02:11.064107 kernel: raid6: avx2x2 gen() 28738 MB/s
Sep 9 22:02:11.081566 kernel: raid6: avx2x1 gen() 20595 MB/s
Sep 9 22:02:11.081652 kernel: raid6: using algorithm avx2x4 gen() 29211 MB/s
Sep 9 22:02:11.099505 kernel: raid6: .... xor() 5537 MB/s, rmw enabled
Sep 9 22:02:11.099619 kernel: raid6: using avx2x2 recovery algorithm
Sep 9 22:02:11.126146 kernel: xor: automatically using best checksumming function avx
Sep 9 22:02:11.393240 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 9 22:02:11.405813 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 22:02:11.409466 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 22:02:11.453806 systemd-udevd[472]: Using default interface naming scheme 'v255'.
Sep 9 22:02:11.461454 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 22:02:11.462765 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 9 22:02:11.692818 dracut-pre-trigger[474]: rd.md=0: removing MD RAID activation
Sep 9 22:02:11.722272 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 22:02:11.723835 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 22:02:11.799680 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 22:02:12.011216 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
Sep 9 22:02:12.015092 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
Sep 9 22:02:12.015468 kernel: cryptd: max_cpu_qlen set to 1000
Sep 9 22:02:12.020794 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 9 22:02:12.020825 kernel: GPT:9289727 != 19775487
Sep 9 22:02:12.020836 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 9 22:02:12.020854 kernel: GPT:9289727 != 19775487
Sep 9 22:02:12.020875 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 9 22:02:12.020886 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 22:02:12.022020 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 9 22:02:12.024680 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 22:02:12.026180 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 22:02:12.031366 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 22:02:12.037165 kernel: libata version 3.00 loaded.
Sep 9 22:02:12.039279 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 22:02:12.045451 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
Sep 9 22:02:12.043599 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 9 22:02:12.048076 kernel: AES CTR mode by8 optimization enabled
Sep 9 22:02:12.059081 kernel: ahci 0000:00:1f.2: version 3.0
Sep 9 22:02:12.059349 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Sep 9 22:02:12.062079 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Sep 9 22:02:12.062317 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Sep 9 22:02:12.063504 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Sep 9 22:02:12.116210 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Sep 9 22:02:12.263784 kernel: scsi host0: ahci
Sep 9 22:02:12.263979 kernel: scsi host1: ahci
Sep 9 22:02:12.264156 kernel: scsi host2: ahci
Sep 9 22:02:12.264322 kernel: scsi host3: ahci
Sep 9 22:02:12.264582 kernel: scsi host4: ahci
Sep 9 22:02:12.264766 kernel: scsi host5: ahci
Sep 9 22:02:12.264917 kernel: ata1: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4100 irq 31 lpm-pol 1
Sep 9 22:02:12.264929 kernel: ata2: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4180 irq 31 lpm-pol 1
Sep 9 22:02:12.264940 kernel: ata3: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4200 irq 31 lpm-pol 1
Sep 9 22:02:12.264951 kernel: ata4: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4280 irq 31 lpm-pol 1
Sep 9 22:02:12.264961 kernel: ata5: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4300 irq 31 lpm-pol 1
Sep 9 22:02:12.264972 kernel: ata6: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4380 irq 31 lpm-pol 1
Sep 9 22:02:12.261507 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Sep 9 22:02:12.302987 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 22:02:12.317762 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Sep 9 22:02:12.319153 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Sep 9 22:02:12.331188 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 9 22:02:12.391364 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 9 22:02:12.574742 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Sep 9 22:02:12.574839 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Sep 9 22:02:12.574872 kernel: ata1: SATA link down (SStatus 0 SControl 300)
Sep 9 22:02:12.574887 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Sep 9 22:02:12.576110 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Sep 9 22:02:12.577108 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Sep 9 22:02:12.578146 kernel: ata3.00: LPM support broken, forcing max_power
Sep 9 22:02:12.578175 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Sep 9 22:02:12.579386 kernel: ata3.00: applying bridge limits
Sep 9 22:02:12.580188 kernel: ata3.00: LPM support broken, forcing max_power
Sep 9 22:02:12.580213 kernel: ata3.00: configured for UDMA/100
Sep 9 22:02:12.583119 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Sep 9 22:02:12.785121 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Sep 9 22:02:12.785501 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 9 22:02:12.803099 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
Sep 9 22:02:13.191396 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 9 22:02:13.372295 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 22:02:13.372427 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 22:02:13.375833 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 22:02:13.379007 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 9 22:02:13.410590 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 22:02:14.301090 disk-uuid[634]: Primary Header is updated.
Sep 9 22:02:14.301090 disk-uuid[634]: Secondary Entries is updated.
Sep 9 22:02:14.301090 disk-uuid[634]: Secondary Header is updated.
Sep 9 22:02:14.305308 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 22:02:14.310089 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 22:02:15.327093 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 22:02:15.328723 disk-uuid[652]: The operation has completed successfully.
Sep 9 22:02:15.369222 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 9 22:02:15.369378 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 9 22:02:15.412968 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 9 22:02:15.565255 sh[663]: Success
Sep 9 22:02:15.586340 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 9 22:02:15.586432 kernel: device-mapper: uevent: version 1.0.3
Sep 9 22:02:15.586451 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 9 22:02:15.597133 kernel: device-mapper: verity: sha256 using shash "sha256-ni"
Sep 9 22:02:15.632179 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 9 22:02:15.636204 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 9 22:02:15.739041 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 9 22:02:15.745616 kernel: BTRFS: device fsid f72d0a81-8b28-47a3-b3ab-bf6ecd8938f0 devid 1 transid 35 /dev/mapper/usr (253:0) scanned by mount (675)
Sep 9 22:02:15.745648 kernel: BTRFS info (device dm-0): first mount of filesystem f72d0a81-8b28-47a3-b3ab-bf6ecd8938f0
Sep 9 22:02:15.745666 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Sep 9 22:02:15.751625 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 9 22:02:15.751659 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 9 22:02:15.753350 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 9 22:02:15.754167 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 22:02:15.755401 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 9 22:02:15.756551 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 9 22:02:15.758719 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 9 22:02:15.850098 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (706)
Sep 9 22:02:15.850158 kernel: BTRFS info (device vda6): first mount of filesystem 0420e4c2-e4f2-4134-b76b-6a7c4e652ed7
Sep 9 22:02:15.852086 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 9 22:02:15.855203 kernel: BTRFS info (device vda6): turning on async discard
Sep 9 22:02:15.855232 kernel: BTRFS info (device vda6): enabling free space tree
Sep 9 22:02:15.861116 kernel: BTRFS info (device vda6): last unmount of filesystem 0420e4c2-e4f2-4134-b76b-6a7c4e652ed7
Sep 9 22:02:15.861775 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 9 22:02:15.862890 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 9 22:02:15.965649 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 22:02:16.040386 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 22:02:16.215429 systemd-networkd[847]: lo: Link UP
Sep 9 22:02:16.215443 systemd-networkd[847]: lo: Gained carrier
Sep 9 22:02:16.238750 ignition[749]: Ignition 2.22.0
Sep 9 22:02:16.217198 systemd-networkd[847]: Enumeration completed
Sep 9 22:02:16.238757 ignition[749]: Stage: fetch-offline
Sep 9 22:02:16.217933 systemd-networkd[847]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 22:02:16.238787 ignition[749]: no configs at "/usr/lib/ignition/base.d"
Sep 9 22:02:16.217938 systemd-networkd[847]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 22:02:16.238797 ignition[749]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 22:02:16.218191 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 22:02:16.238886 ignition[749]: parsed url from cmdline: ""
Sep 9 22:02:16.234011 systemd-networkd[847]: eth0: Link UP
Sep 9 22:02:16.238890 ignition[749]: no config URL provided
Sep 9 22:02:16.314888 systemd-networkd[847]: eth0: Gained carrier
Sep 9 22:02:16.238895 ignition[749]: reading system config file "/usr/lib/ignition/user.ign"
Sep 9 22:02:16.314913 systemd-networkd[847]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 22:02:16.238904 ignition[749]: no config at "/usr/lib/ignition/user.ign"
Sep 9 22:02:16.315101 systemd[1]: Reached target network.target - Network.
Sep 9 22:02:16.238926 ignition[749]: op(1): [started] loading QEMU firmware config module
Sep 9 22:02:16.238930 ignition[749]: op(1): executing: "modprobe" "qemu_fw_cfg"
Sep 9 22:02:16.328866 ignition[749]: op(1): [finished] loading QEMU firmware config module
Sep 9 22:02:16.349211 systemd-networkd[847]: eth0: DHCPv4 address 10.0.0.72/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 9 22:02:16.380474 ignition[749]: parsing config with SHA512: 31daa9e49bdf9661a2ebfc140add4695ad19ff4b85bc2b85980ae25fec9e097a9a8e83d1e53949e2d10aa135a2d8f0b69d97227315054334583693e5e935a753
Sep 9 22:02:16.414450 unknown[749]: fetched base config from "system"
Sep 9 22:02:16.414465 unknown[749]: fetched user config from "qemu"
Sep 9 22:02:16.415072 ignition[749]: fetch-offline: fetch-offline passed
Sep 9 22:02:16.417626 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 22:02:16.415165 ignition[749]: Ignition finished successfully
Sep 9 22:02:16.434741 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Sep 9 22:02:16.435931 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 9 22:02:16.518603 ignition[860]: Ignition 2.22.0
Sep 9 22:02:16.518619 ignition[860]: Stage: kargs
Sep 9 22:02:16.518750 ignition[860]: no configs at "/usr/lib/ignition/base.d"
Sep 9 22:02:16.518762 ignition[860]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 22:02:16.519679 ignition[860]: kargs: kargs passed
Sep 9 22:02:16.519733 ignition[860]: Ignition finished successfully
Sep 9 22:02:16.583798 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 9 22:02:16.587218 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 9 22:02:16.657464 ignition[868]: Ignition 2.22.0
Sep 9 22:02:16.657478 ignition[868]: Stage: disks
Sep 9 22:02:16.657615 ignition[868]: no configs at "/usr/lib/ignition/base.d"
Sep 9 22:02:16.657628 ignition[868]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 22:02:16.658397 ignition[868]: disks: disks passed
Sep 9 22:02:16.658457 ignition[868]: Ignition finished successfully
Sep 9 22:02:16.711737 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 9 22:02:16.713247 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 9 22:02:16.714042 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 9 22:02:16.718410 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 22:02:16.718504 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 22:02:16.720393 systemd[1]: Reached target basic.target - Basic System.
Sep 9 22:02:16.724899 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 9 22:02:16.766495 systemd-fsck[878]: ROOT: clean, 15/553520 files, 52789/553472 blocks
Sep 9 22:02:17.486297 systemd-networkd[847]: eth0: Gained IPv6LL
Sep 9 22:02:18.788374 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 9 22:02:18.790886 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 9 22:02:18.973116 kernel: EXT4-fs (vda9): mounted filesystem b54acc07-9600-49db-baed-d5fd6f41a1a5 r/w with ordered data mode. Quota mode: none.
Sep 9 22:02:18.973990 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 9 22:02:18.976200 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 9 22:02:18.979540 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 22:02:18.982149 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 9 22:02:18.984451 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 9 22:02:18.984510 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 9 22:02:18.984539 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 22:02:19.005339 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 9 22:02:19.008709 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 9 22:02:19.012218 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (886)
Sep 9 22:02:19.014307 kernel: BTRFS info (device vda6): first mount of filesystem 0420e4c2-e4f2-4134-b76b-6a7c4e652ed7
Sep 9 22:02:19.014335 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 9 22:02:19.017232 kernel: BTRFS info (device vda6): turning on async discard
Sep 9 22:02:19.017265 kernel: BTRFS info (device vda6): enabling free space tree
Sep 9 22:02:19.018807 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 22:02:19.054703 initrd-setup-root[910]: cut: /sysroot/etc/passwd: No such file or directory
Sep 9 22:02:19.096810 initrd-setup-root[917]: cut: /sysroot/etc/group: No such file or directory
Sep 9 22:02:19.102100 initrd-setup-root[924]: cut: /sysroot/etc/shadow: No such file or directory
Sep 9 22:02:19.107450 initrd-setup-root[931]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 9 22:02:19.216979 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 9 22:02:19.218940 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 9 22:02:19.221479 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 9 22:02:19.245084 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 9 22:02:19.292761 kernel: BTRFS info (device vda6): last unmount of filesystem 0420e4c2-e4f2-4134-b76b-6a7c4e652ed7
Sep 9 22:02:19.314364 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 9 22:02:19.348919 ignition[1000]: INFO : Ignition 2.22.0
Sep 9 22:02:19.348919 ignition[1000]: INFO : Stage: mount
Sep 9 22:02:19.381900 ignition[1000]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 22:02:19.381900 ignition[1000]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 22:02:19.381900 ignition[1000]: INFO : mount: mount passed
Sep 9 22:02:19.381900 ignition[1000]: INFO : Ignition finished successfully
Sep 9 22:02:19.353974 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 9 22:02:19.381993 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 9 22:02:19.976582 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 22:02:20.010875 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1012)
Sep 9 22:02:20.010930 kernel: BTRFS info (device vda6): first mount of filesystem 0420e4c2-e4f2-4134-b76b-6a7c4e652ed7
Sep 9 22:02:20.010942 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 9 22:02:20.015560 kernel: BTRFS info (device vda6): turning on async discard
Sep 9 22:02:20.015596 kernel: BTRFS info (device vda6): enabling free space tree
Sep 9 22:02:20.017683 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 22:02:20.053136 ignition[1029]: INFO : Ignition 2.22.0
Sep 9 22:02:20.053136 ignition[1029]: INFO : Stage: files
Sep 9 22:02:20.091590 ignition[1029]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 22:02:20.091590 ignition[1029]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 22:02:20.091590 ignition[1029]: DEBUG : files: compiled without relabeling support, skipping
Sep 9 22:02:20.091590 ignition[1029]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 9 22:02:20.091590 ignition[1029]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 9 22:02:20.099030 ignition[1029]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 9 22:02:20.099030 ignition[1029]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 9 22:02:20.099030 ignition[1029]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 9 22:02:20.099030 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Sep 9 22:02:20.099030 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Sep 9 22:02:20.095000 unknown[1029]: wrote ssh authorized keys file for user: core
Sep 9 22:02:20.171290 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 9 22:02:20.480039 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Sep 9 22:02:20.492508 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 9 22:02:20.492508 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 9 22:02:20.492508 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 22:02:20.492508 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 22:02:20.492508 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 22:02:20.492508 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 22:02:20.492508 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 22:02:20.492508 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 22:02:20.879272 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 22:02:20.944639 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 22:02:20.946895 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 9 22:02:21.128134 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 9 22:02:21.128134 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 9 22:02:21.175462 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1
Sep 9 22:02:21.486531 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 9 22:02:22.164643 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 9 22:02:22.164643 ignition[1029]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 9 22:02:22.188682 ignition[1029]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 22:02:22.480912 ignition[1029]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 22:02:22.480912 ignition[1029]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 9 22:02:22.480912 ignition[1029]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 9 22:02:22.480912 ignition[1029]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 9 22:02:22.570074 ignition[1029]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 9 22:02:22.570074 ignition[1029]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 9 22:02:22.570074 ignition[1029]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Sep 9 22:02:22.595314 ignition[1029]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Sep 9 22:02:22.601923 ignition[1029]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Sep 9 22:02:22.603722 ignition[1029]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 9 22:02:22.603722 ignition[1029]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Sep 9 22:02:22.606553 ignition[1029]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Sep 9 22:02:22.606553 ignition[1029]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 22:02:22.606553 ignition[1029]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 22:02:22.606553 ignition[1029]: INFO : files: files passed
Sep 9 22:02:22.606553 ignition[1029]: INFO : Ignition finished successfully
Sep 9 22:02:22.616944 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 9 22:02:22.618620 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 9 22:02:22.622364 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 9 22:02:22.637867 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 9 22:02:22.638036 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 9 22:02:22.728831 initrd-setup-root-after-ignition[1059]: grep: /sysroot/oem/oem-release: No such file or directory
Sep 9 22:02:22.733397 initrd-setup-root-after-ignition[1065]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 22:02:22.736303 initrd-setup-root-after-ignition[1061]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 22:02:22.736303 initrd-setup-root-after-ignition[1061]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 22:02:22.739730 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 22:02:22.747078 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 9 22:02:22.751114 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 9 22:02:22.842546 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 9 22:02:22.842684 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 9 22:02:22.869945 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 9 22:02:22.873638 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 9 22:02:22.876426 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 9 22:02:22.880434 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 9 22:02:22.926470 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 22:02:22.928368 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 9 22:02:23.019538 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 9 22:02:23.019740 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 22:02:23.023310 systemd[1]: Stopped target timers.target - Timer Units.
Sep 9 22:02:23.024643 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 9 22:02:23.024837 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 22:02:23.029616 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 9 22:02:23.029794 systemd[1]: Stopped target basic.target - Basic System.
Sep 9 22:02:23.032580 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 9 22:02:23.033472 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 22:02:23.033787 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 9 22:02:23.034121 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 22:02:23.034582 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 9 22:02:23.034903 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 22:02:23.035405 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 9 22:02:23.035709 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 9 22:02:23.036030 systemd[1]: Stopped target swap.target - Swaps.
Sep 9 22:02:23.036477 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 9 22:02:23.036592 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 22:02:23.051693 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 9 22:02:23.052704 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 22:02:23.052981 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 9 22:02:23.053101 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 22:02:23.053467 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 9 22:02:23.053574 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 9 22:02:23.150306 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 9 22:02:23.150494 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 22:02:23.151513 systemd[1]: Stopped target paths.target - Path Units.
Sep 9 22:02:23.153436 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 9 22:02:23.153639 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 22:02:23.155376 systemd[1]: Stopped target slices.target - Slice Units.
Sep 9 22:02:23.155698 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 9 22:02:23.156010 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 9 22:02:23.156164 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 22:02:23.156521 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 9 22:02:23.156625 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 22:02:23.163733 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 9 22:02:23.163874 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 22:02:23.164863 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 9 22:02:23.164991 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 9 22:02:23.169378 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 9 22:02:23.171958 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 9 22:02:23.172151 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 22:02:23.177339 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 9 22:02:23.178347 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 9 22:02:23.178521 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 22:02:23.181575 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 9 22:02:23.181712 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 22:02:23.188217 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 9 22:02:23.188366 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 9 22:02:23.206346 ignition[1085]: INFO : Ignition 2.22.0
Sep 9 22:02:23.206346 ignition[1085]: INFO : Stage: umount
Sep 9 22:02:23.312869 ignition[1085]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 22:02:23.312869 ignition[1085]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 22:02:23.312869 ignition[1085]: INFO : umount: umount passed
Sep 9 22:02:23.312869 ignition[1085]: INFO : Ignition finished successfully
Sep 9 22:02:23.320088 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 9 22:02:23.320272 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 9 22:02:23.321656 systemd[1]: Stopped target network.target - Network.
Sep 9 22:02:23.325011 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 9 22:02:23.325142 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 9 22:02:23.327202 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 9 22:02:23.327256 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 9 22:02:23.329324 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 9 22:02:23.329396 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 9 22:02:23.330393 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 9 22:02:23.330456 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 9 22:02:23.333422 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 9 22:02:23.335388 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 9 22:02:23.343441 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 9 22:02:23.343680 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 9 22:02:23.349112 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 9 22:02:23.349527 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 9 22:02:23.349593 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 22:02:23.443846 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 9 22:02:23.444280 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 9 22:02:23.444439 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 9 22:02:23.451861 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 9 22:02:23.453498 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 9 22:02:23.453725 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 9 22:02:23.453793 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 22:02:23.457973 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 9 22:02:23.459164 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 9 22:02:23.459240 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 22:02:23.520656 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 9 22:02:23.520784 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 9 22:02:23.524156 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 9 22:02:23.524241 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 9 22:02:23.525342 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 22:02:23.527193 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 9 22:02:23.546634 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 9 22:02:23.546935 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 22:02:23.611890 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 9 22:02:23.612049 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 9 22:02:23.616327 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 9 22:02:23.616469 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 9 22:02:23.619043 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 9 22:02:23.619115 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 22:02:23.620143 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 9 22:02:23.620219 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 22:02:23.622780 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 9 22:02:23.622860 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 9 22:02:23.625885 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 9 22:02:23.625961 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 22:02:23.628144 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 9 22:02:23.631655 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 9 22:02:23.631767 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 22:02:23.709339 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 9 22:02:23.709432 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 22:02:23.715530 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 22:02:23.715628 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 22:02:23.722381 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 9 22:02:23.722465 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 9 22:02:23.722529 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 9 22:02:23.732319 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 9 22:02:23.732514 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 9 22:02:23.843767 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 9 22:02:23.843929 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 9 22:02:23.850915 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 9 22:02:23.853893 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 9 22:02:23.854039 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 9 22:02:23.856290 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 9 22:02:23.883803 systemd[1]: Switching root.
Sep 9 22:02:24.041629 systemd-journald[221]: Journal stopped
Sep 9 22:02:31.769860 systemd-journald[221]: Received SIGTERM from PID 1 (systemd).
Sep 9 22:02:31.769965 kernel: SELinux: policy capability network_peer_controls=1
Sep 9 22:02:31.769984 kernel: SELinux: policy capability open_perms=1
Sep 9 22:02:31.770002 kernel: SELinux: policy capability extended_socket_class=1
Sep 9 22:02:31.770017 kernel: SELinux: policy capability always_check_network=0
Sep 9 22:02:31.770035 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 9 22:02:31.770050 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 9 22:02:31.770429 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 9 22:02:31.770447 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 9 22:02:31.770470 kernel: SELinux: policy capability userspace_initial_context=0
Sep 9 22:02:31.770491 kernel: audit: type=1403 audit(1757455346.217:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 9 22:02:31.770507 systemd[1]: Successfully loaded SELinux policy in 156.718ms.
Sep 9 22:02:31.770649 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 8.828ms.
Sep 9 22:02:31.770674 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 22:02:31.770691 systemd[1]: Detected virtualization kvm.
Sep 9 22:02:31.770707 systemd[1]: Detected architecture x86-64.
Sep 9 22:02:31.770722 systemd[1]: Detected first boot.
Sep 9 22:02:31.770738 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 22:02:31.770753 zram_generator::config[1130]: No configuration found.
Sep 9 22:02:31.770769 kernel: Guest personality initialized and is inactive
Sep 9 22:02:31.770783 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Sep 9 22:02:31.770797 kernel: Initialized host personality
Sep 9 22:02:31.770816 kernel: NET: Registered PF_VSOCK protocol family
Sep 9 22:02:31.770830 systemd[1]: Populated /etc with preset unit settings.
Sep 9 22:02:31.770850 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 9 22:02:31.770866 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 9 22:02:31.770881 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 9 22:02:31.770897 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 9 22:02:31.770912 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 9 22:02:31.770928 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 9 22:02:31.770947 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 9 22:02:31.770963 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 9 22:02:31.770979 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 9 22:02:31.774159 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 9 22:02:31.774186 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 9 22:02:31.774202 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 9 22:02:31.774219 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 22:02:31.774236 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 22:02:31.774252 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 9 22:02:31.774276 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 9 22:02:31.774293 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 9 22:02:31.774310 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 22:02:31.774327 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 9 22:02:31.774343 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 22:02:31.774360 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 22:02:31.774376 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 9 22:02:31.774392 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 9 22:02:31.774426 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 9 22:02:31.774442 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 9 22:02:31.774458 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 22:02:31.774474 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 22:02:31.774491 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 22:02:31.774508 systemd[1]: Reached target swap.target - Swaps.
Sep 9 22:02:31.774534 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 9 22:02:31.774552 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 9 22:02:31.774568 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 9 22:02:31.774587 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 22:02:31.774604 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 22:02:31.774620 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 22:02:31.774636 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 9 22:02:31.774652 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 9 22:02:31.774669 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 9 22:02:31.774684 systemd[1]: Mounting media.mount - External Media Directory...
Sep 9 22:02:31.774701 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 22:02:31.774717 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 9 22:02:31.774737 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 9 22:02:31.774753 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 9 22:02:31.774770 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 9 22:02:31.774793 systemd[1]: Reached target machines.target - Containers.
Sep 9 22:02:31.774809 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 9 22:02:31.774833 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 22:02:31.774850 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 22:02:31.774866 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 9 22:02:31.774885 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 22:02:31.774902 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 22:02:31.774919 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 22:02:31.774936 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 9 22:02:31.774953 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 22:02:31.774971 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 9 22:02:31.774987 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 9 22:02:31.775003 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 9 22:02:31.775019 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 9 22:02:31.775038 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 9 22:02:31.775074 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 22:02:31.775094 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 22:02:31.775110 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 22:02:31.775126 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 22:02:31.775141 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 9 22:02:31.775156 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 9 22:02:31.775179 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 22:02:31.775200 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 9 22:02:31.775217 systemd[1]: Stopped verity-setup.service.
Sep 9 22:02:31.775233 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 22:02:31.775250 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 9 22:02:31.775267 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 9 22:02:31.775286 systemd[1]: Mounted media.mount - External Media Directory.
Sep 9 22:02:31.775303 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 9 22:02:31.775321 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 9 22:02:31.775336 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 9 22:02:31.775352 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 22:02:31.775368 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 9 22:02:31.775386 kernel: loop: module loaded
Sep 9 22:02:31.775402 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 9 22:02:31.775417 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 22:02:31.775431 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 22:02:31.775446 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 22:02:31.775461 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 22:02:31.775475 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 22:02:31.775490 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 22:02:31.775508 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 9 22:02:31.775538 kernel: fuse: init (API version 7.41)
Sep 9 22:02:31.775553 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 9 22:02:31.775569 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 9 22:02:31.775583 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 9 22:02:31.775599 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 9 22:02:31.775615 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 9 22:02:31.775630 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 22:02:31.775645 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 9 22:02:31.775664 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 9 22:02:31.775680 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 22:02:31.775744 systemd-journald[1194]: Collecting audit messages is disabled.
Sep 9 22:02:31.775778 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 9 22:02:31.775797 systemd-journald[1194]: Journal started
Sep 9 22:02:31.775826 systemd-journald[1194]: Runtime Journal (/run/log/journal/8dff244a37a24e7388f540e7201cacc9) is 6M, max 48.6M, 42.5M free.
Sep 9 22:02:29.536883 systemd[1]: Queued start job for default target multi-user.target.
Sep 9 22:02:29.576087 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Sep 9 22:02:29.580493 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 9 22:02:29.587977 systemd[1]: systemd-journald.service: Consumed 1.184s CPU time.
Sep 9 22:02:31.785088 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 22:02:31.792091 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 9 22:02:31.797947 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 22:02:31.854093 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 9 22:02:31.882350 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 22:02:31.879528 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 22:02:31.882875 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 22:02:31.943071 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 9 22:02:31.944277 kernel: loop0: detected capacity change from 0 to 221472
Sep 9 22:02:31.944357 kernel: ACPI: bus type drm_connector registered
Sep 9 22:02:31.947872 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 22:02:31.953422 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 22:02:31.953751 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 22:02:31.955557 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 9 22:02:31.959265 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 9 22:02:31.988456 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 22:02:31.993035 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 9 22:02:32.024230 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 22:02:32.170951 systemd-journald[1194]: Time spent on flushing to /var/log/journal/8dff244a37a24e7388f540e7201cacc9 is 21.163ms for 986 entries.
Sep 9 22:02:32.170951 systemd-journald[1194]: System Journal (/var/log/journal/8dff244a37a24e7388f540e7201cacc9) is 8M, max 195.6M, 187.6M free.
Sep 9 22:02:32.484488 systemd-journald[1194]: Received client request to flush runtime journal.
Sep 9 22:02:32.200146 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 22:02:32.308392 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 9 22:02:32.337676 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 9 22:02:32.349730 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 9 22:02:32.491880 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 9 22:02:32.770773 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 9 22:02:33.009285 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 9 22:02:33.356643 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 9 22:02:33.720936 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 9 22:02:33.790327 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 22:02:33.819011 kernel: loop1: detected capacity change from 0 to 110984
Sep 9 22:02:34.241653 kernel: loop2: detected capacity change from 0 to 128016
Sep 9 22:02:34.247994 systemd-tmpfiles[1266]: ACLs are not supported, ignoring.
Sep 9 22:02:34.248598 systemd-tmpfiles[1266]: ACLs are not supported, ignoring.
Sep 9 22:02:34.274571 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 22:02:34.440995 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 9 22:02:34.451890 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 9 22:02:34.558511 kernel: loop3: detected capacity change from 0 to 221472
Sep 9 22:02:34.735286 kernel: loop4: detected capacity change from 0 to 110984
Sep 9 22:02:34.836120 kernel: loop5: detected capacity change from 0 to 128016
Sep 9 22:02:34.890545 (sd-merge)[1272]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
Sep 9 22:02:34.891383 (sd-merge)[1272]: Merged extensions into '/usr'.
Sep 9 22:02:34.907320 systemd[1]: Reload requested from client PID 1215 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 9 22:02:34.907347 systemd[1]: Reloading...
Sep 9 22:02:35.112121 zram_generator::config[1296]: No configuration found.
Sep 9 22:02:35.721841 systemd[1]: Reloading finished in 813 ms.
Sep 9 22:02:35.733555 ldconfig[1212]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 9 22:02:35.824687 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 9 22:02:35.866729 systemd[1]: Starting ensure-sysext.service...
Sep 9 22:02:35.874359 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 22:02:35.972291 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 9 22:02:35.973178 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 9 22:02:35.978711 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 9 22:02:35.979131 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 9 22:02:35.980633 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 9 22:02:35.982513 systemd-tmpfiles[1336]: ACLs are not supported, ignoring.
Sep 9 22:02:35.984520 systemd-tmpfiles[1336]: ACLs are not supported, ignoring.
Sep 9 22:02:36.162595 systemd-tmpfiles[1336]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 22:02:36.162615 systemd-tmpfiles[1336]: Skipping /boot
Sep 9 22:02:36.174449 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 9 22:02:36.196330 systemd[1]: Reload requested from client PID 1335 ('systemctl') (unit ensure-sysext.service)...
Sep 9 22:02:36.198445 systemd[1]: Reloading...
Sep 9 22:02:36.220641 systemd-tmpfiles[1336]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 22:02:36.220664 systemd-tmpfiles[1336]: Skipping /boot
Sep 9 22:02:36.393095 zram_generator::config[1364]: No configuration found.
Sep 9 22:02:36.818471 systemd[1]: Reloading finished in 611 ms.
Sep 9 22:02:36.838144 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 9 22:02:36.842110 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 22:02:36.888097 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 22:02:36.901694 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 9 22:02:36.957687 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 9 22:02:36.969818 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 22:02:36.980918 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 22:02:37.009274 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 9 22:02:37.017701 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 22:02:37.017932 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 22:02:37.030965 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 22:02:37.036833 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 22:02:37.075357 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 22:02:37.077263 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 22:02:37.078483 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 22:02:37.085570 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 9 22:02:37.087833 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 22:02:37.105187 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 9 22:02:37.109684 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 22:02:37.109987 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 22:02:37.117633 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 22:02:37.117963 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 22:02:37.120706 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 22:02:37.121001 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 22:02:37.129944 systemd-udevd[1408]: Using default interface naming scheme 'v255'.
Sep 9 22:02:37.144691 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 9 22:02:37.233830 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 22:02:37.234158 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 22:02:37.246815 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 22:02:37.295999 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 22:02:37.317986 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 22:02:37.326085 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 22:02:37.332693 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 22:02:37.332882 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 22:02:37.336342 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 9 22:02:37.344488 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 22:02:37.357448 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 22:02:37.361322 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 22:02:37.363886 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 22:02:37.364317 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 22:02:37.367042 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 22:02:37.369893 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 22:02:37.385963 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 22:02:37.386325 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 22:02:37.412436 systemd[1]: Finished ensure-sysext.service.
Sep 9 22:02:37.416688 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 22:02:37.417326 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 22:02:37.428572 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 9 22:02:37.432319 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 9 22:02:37.443498 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 22:02:37.452420 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 9 22:02:37.467335 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 22:02:37.519303 augenrules[1480]: No rules
Sep 9 22:02:37.518517 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 22:02:37.520012 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 22:02:37.573047 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 9 22:02:37.575851 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 9 22:02:37.638827 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Sep 9 22:02:37.719653 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 9 22:02:37.735587 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 9 22:02:37.767465 kernel: mousedev: PS/2 mouse device common for all mice
Sep 9 22:02:37.767571 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
Sep 9 22:02:37.777161 kernel: ACPI: button: Power Button [PWRF]
Sep 9 22:02:37.794358 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 9 22:02:37.803607 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Sep 9 22:02:37.803994 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Sep 9 22:02:37.999617 systemd-networkd[1455]: lo: Link UP
Sep 9 22:02:37.999630 systemd-networkd[1455]: lo: Gained carrier
Sep 9 22:02:38.002903 systemd-networkd[1455]: Enumeration completed
Sep 9 22:02:38.003077 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 22:02:38.009525 systemd-networkd[1455]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 22:02:38.009533 systemd-networkd[1455]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 22:02:38.011545 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 9 22:02:38.011896 systemd-networkd[1455]: eth0: Link UP
Sep 9 22:02:38.012177 systemd-networkd[1455]: eth0: Gained carrier
Sep 9 22:02:38.012210 systemd-networkd[1455]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 22:02:38.017272 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 9 22:02:38.061616 systemd-networkd[1455]: eth0: DHCPv4 address 10.0.0.72/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 9 22:02:38.072934 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 9 22:02:38.710735 systemd-timesyncd[1445]: Contacted time server 10.0.0.1:123 (10.0.0.1).
Sep 9 22:02:38.710814 systemd-timesyncd[1445]: Initial clock synchronization to Tue 2025-09-09 22:02:38.710629 UTC.
Sep 9 22:02:38.711513 systemd[1]: Reached target time-set.target - System Time Set.
Sep 9 22:02:38.717567 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 22:02:38.737353 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 9 22:02:38.778907 systemd-resolved[1406]: Positive Trust Anchors:
Sep 9 22:02:38.779382 systemd-resolved[1406]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 22:02:38.779504 systemd-resolved[1406]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 22:02:38.786713 systemd-resolved[1406]: Defaulting to hostname 'linux'.
Sep 9 22:02:38.789457 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 22:02:38.790977 systemd[1]: Reached target network.target - Network.
Sep 9 22:02:38.792552 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 22:02:38.871360 kernel: kvm_amd: TSC scaling supported
Sep 9 22:02:38.871505 kernel: kvm_amd: Nested Virtualization enabled
Sep 9 22:02:38.871527 kernel: kvm_amd: Nested Paging enabled
Sep 9 22:02:38.871880 kernel: kvm_amd: LBR virtualization supported
Sep 9 22:02:38.873080 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
Sep 9 22:02:38.873667 kernel: kvm_amd: Virtual GIF supported
Sep 9 22:02:39.003309 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 22:02:39.034876 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 22:02:39.036395 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 9 22:02:39.047656 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 9 22:02:39.051839 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Sep 9 22:02:39.062237 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 9 22:02:39.063690 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 9 22:02:39.065338 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 9 22:02:39.067349 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 9 22:02:39.067393 systemd[1]: Reached target paths.target - Path Units.
Sep 9 22:02:39.070114 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 22:02:39.088711 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 9 22:02:39.163918 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 9 22:02:39.174228 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 9 22:02:39.186271 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 9 22:02:39.192637 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 9 22:02:39.284718 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 9 22:02:39.365231 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 9 22:02:39.372978 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 9 22:02:39.499400 kernel: EDAC MC: Ver: 3.0.0
Sep 9 22:02:39.516440 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 9 22:02:39.536966 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 22:02:39.545218 systemd[1]: Reached target basic.target - Basic System.
Sep 9 22:02:39.554826 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 9 22:02:39.556821 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 9 22:02:39.565309 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 9 22:02:39.581490 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 9 22:02:39.593075 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 9 22:02:39.604027 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 9 22:02:39.613919 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 9 22:02:39.617616 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 9 22:02:39.628138 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Sep 9 22:02:39.643147 jq[1535]: false
Sep 9 22:02:39.645216 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 9 22:02:39.701768 extend-filesystems[1536]: Found /dev/vda6
Sep 9 22:02:39.703355 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 9 22:02:39.704945 oslogin_cache_refresh[1537]: Refreshing passwd entry cache
Sep 9 22:02:39.706438 google_oslogin_nss_cache[1537]: oslogin_cache_refresh[1537]: Refreshing passwd entry cache
Sep 9 22:02:39.710199 extend-filesystems[1536]: Found /dev/vda9
Sep 9 22:02:39.711240 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 9 22:02:39.718438 extend-filesystems[1536]: Checking size of /dev/vda9
Sep 9 22:02:39.717819 oslogin_cache_refresh[1537]: Failure getting users, quitting
Sep 9 22:02:39.730277 google_oslogin_nss_cache[1537]: oslogin_cache_refresh[1537]: Failure getting users, quitting
Sep 9 22:02:39.730277 google_oslogin_nss_cache[1537]: oslogin_cache_refresh[1537]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 9 22:02:39.730277 google_oslogin_nss_cache[1537]: oslogin_cache_refresh[1537]: Refreshing group entry cache
Sep 9 22:02:39.721546 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 9 22:02:39.717847 oslogin_cache_refresh[1537]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 9 22:02:39.717933 oslogin_cache_refresh[1537]: Refreshing group entry cache
Sep 9 22:02:39.741571 google_oslogin_nss_cache[1537]: oslogin_cache_refresh[1537]: Failure getting groups, quitting
Sep 9 22:02:39.741692 google_oslogin_nss_cache[1537]: oslogin_cache_refresh[1537]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 9 22:02:39.741584 oslogin_cache_refresh[1537]: Failure getting groups, quitting
Sep 9 22:02:39.741612 oslogin_cache_refresh[1537]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 9 22:02:39.744120 systemd-networkd[1455]: eth0: Gained IPv6LL
Sep 9 22:02:39.745519 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 9 22:02:39.749852 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 9 22:02:39.752490 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 9 22:02:39.756741 systemd[1]: Starting update-engine.service - Update Engine...
Sep 9 22:02:39.761785 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 9 22:02:39.767668 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 9 22:02:39.786216 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 9 22:02:39.793840 jq[1559]: true
Sep 9 22:02:39.792951 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 9 22:02:39.793317 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 9 22:02:39.793776 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Sep 9 22:02:39.794083 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Sep 9 22:02:39.842256 systemd[1]: motdgen.service: Deactivated successfully.
Sep 9 22:02:39.842686 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 9 22:02:39.846677 extend-filesystems[1536]: Resized partition /dev/vda9
Sep 9 22:02:39.853002 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 9 22:02:39.854269 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 9 22:02:39.900628 jq[1564]: true
Sep 9 22:02:39.923247 systemd[1]: Reached target network-online.target - Network is Online.
Sep 9 22:02:39.978629 (ntainerd)[1571]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 9 22:02:39.982030 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent...
Sep 9 22:02:40.021042 update_engine[1557]: I20250909 22:02:40.020594  1557 main.cc:92] Flatcar Update Engine starting
Sep 9 22:02:40.027020 extend-filesystems[1597]: resize2fs 1.47.3 (8-Jul-2025)
Sep 9 22:02:40.111118 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 22:02:40.116939 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
Sep 9 22:02:40.134938 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 9 22:02:40.171646 tar[1563]: linux-amd64/helm
Sep 9 22:02:40.208964 dbus-daemon[1533]: [system] SELinux support is enabled
Sep 9 22:02:40.216077 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 9 22:02:40.399082 update_engine[1557]: I20250909 22:02:40.230504  1557 update_check_scheduler.cc:74] Next update check in 5m59s
Sep 9 22:02:40.229536 systemd[1]: coreos-metadata.service: Deactivated successfully.
Sep 9 22:02:40.229904 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent.
Sep 9 22:02:40.273268 systemd[1]: Started update-engine.service - Update Engine.
Sep 9 22:02:40.279146 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 9 22:02:40.279290 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 9 22:02:40.279319 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 9 22:02:40.286595 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 9 22:02:40.286624 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 9 22:02:40.371990 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 9 22:02:40.403246 systemd-logind[1554]: Watching system buttons on /dev/input/event2 (Power Button)
Sep 9 22:02:40.403280 systemd-logind[1554]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Sep 9 22:02:40.414258 systemd-logind[1554]: New seat seat0.
Sep 9 22:02:40.417024 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 9 22:02:40.489427 kernel: EXT4-fs (vda9): resized filesystem to 1864699
Sep 9 22:02:40.495571 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 9 22:02:41.267301 kernel: clocksource: Long readout interval, skipping watchdog check: cs_nsec: 1089336543 wd_nsec: 1089335924
Sep 9 22:02:41.279783 extend-filesystems[1597]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Sep 9 22:02:41.279783 extend-filesystems[1597]: old_desc_blocks = 1, new_desc_blocks = 1
Sep 9 22:02:41.279783 extend-filesystems[1597]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
Sep 9 22:02:41.334066 extend-filesystems[1536]: Resized filesystem in /dev/vda9
Sep 9 22:02:41.341557 sshd_keygen[1558]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 9 22:02:41.341762 bash[1595]: Updated "/home/core/.ssh/authorized_keys"
Sep 9 22:02:41.290289 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 9 22:02:41.297662 locksmithd[1610]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 9 22:02:41.333600 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 9 22:02:41.334851 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 9 22:02:41.350247 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 9 22:02:41.509912 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 9 22:02:41.540688 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 9 22:02:41.583257 systemd[1]: Started sshd@0-10.0.0.72:22-10.0.0.1:58090.service - OpenSSH per-connection server daemon (10.0.0.1:58090).
Sep 9 22:02:41.618296 systemd[1]: issuegen.service: Deactivated successfully.
Sep 9 22:02:41.696389 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 9 22:02:41.710429 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 9 22:02:41.833961 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 9 22:02:41.845952 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 9 22:02:41.855531 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Sep 9 22:02:41.862222 systemd[1]: Reached target getty.target - Login Prompts.
Sep 9 22:02:41.934767 containerd[1571]: time="2025-09-09T22:02:41Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 9 22:02:41.935211 containerd[1571]: time="2025-09-09T22:02:41.935147590Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 9 22:02:41.987765 containerd[1571]: time="2025-09-09T22:02:41.987685297Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="14.778µs"
Sep 9 22:02:41.987962 containerd[1571]: time="2025-09-09T22:02:41.987938882Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 9 22:02:41.988041 containerd[1571]: time="2025-09-09T22:02:41.988024012Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 9 22:02:41.988393 containerd[1571]: time="2025-09-09T22:02:41.988368608Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 9 22:02:41.988558 containerd[1571]: time="2025-09-09T22:02:41.988535211Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 9 22:02:41.988674 containerd[1571]: time="2025-09-09T22:02:41.988642712Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 22:02:41.988849 containerd[1571]: time="2025-09-09T22:02:41.988824062Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 22:02:41.988920 containerd[1571]: time="2025-09-09T22:02:41.988902980Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 22:02:41.989403 containerd[1571]: time="2025-09-09T22:02:41.989376549Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 22:02:41.989504 containerd[1571]: time="2025-09-09T22:02:41.989464564Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 22:02:41.989598 containerd[1571]: time="2025-09-09T22:02:41.989579559Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 22:02:41.989666 containerd[1571]: time="2025-09-09T22:02:41.989639682Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 9 22:02:41.989946 containerd[1571]: time="2025-09-09T22:02:41.989922993Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 9 22:02:41.994142 containerd[1571]: time="2025-09-09T22:02:41.994071754Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 22:02:41.994255 containerd[1571]: time="2025-09-09T22:02:41.994159879Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 22:02:41.994255 containerd[1571]: time="2025-09-09T22:02:41.994177973Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 9 22:02:41.994255 containerd[1571]: time="2025-09-09T22:02:41.994238547Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 9 22:02:41.994601 containerd[1571]: time="2025-09-09T22:02:41.994569016Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 9 22:02:41.994711 containerd[1571]: time="2025-09-09T22:02:41.994682369Z" level=info msg="metadata content store policy set" policy=shared
Sep 9 22:02:42.054933 containerd[1571]: time="2025-09-09T22:02:42.054852389Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 9 22:02:42.055181 containerd[1571]: time="2025-09-09T22:02:42.055156770Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 9 22:02:42.055258 containerd[1571]: time="2025-09-09T22:02:42.055242631Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 9 22:02:42.055324 containerd[1571]: time="2025-09-09T22:02:42.055309156Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 9 22:02:42.055418 containerd[1571]: time="2025-09-09T22:02:42.055401820Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 9 22:02:42.055501 containerd[1571]: time="2025-09-09T22:02:42.055483172Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 9 22:02:42.055578 containerd[1571]: time="2025-09-09T22:02:42.055560497Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 9 22:02:42.055947 containerd[1571]: time="2025-09-09T22:02:42.055639636Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 9 22:02:42.056065 containerd[1571]: time="2025-09-09T22:02:42.056043924Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 9 22:02:42.056140 containerd[1571]: time="2025-09-09T22:02:42.056125106Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 9 22:02:42.056214 containerd[1571]: time="2025-09-09T22:02:42.056198664Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 9 22:02:42.056286 containerd[1571]: time="2025-09-09T22:02:42.056268876Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 9 22:02:42.056592 containerd[1571]: time="2025-09-09T22:02:42.056569660Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 9 22:02:42.056688 containerd[1571]: time="2025-09-09T22:02:42.056670980Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 9 22:02:42.056884 containerd[1571]: time="2025-09-09T22:02:42.056865244Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 9 22:02:42.056956 containerd[1571]: time="2025-09-09T22:02:42.056940515Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 9 22:02:42.057019 containerd[1571]: time="2025-09-09T22:02:42.057004114Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 9 22:02:42.057099 containerd[1571]: time="2025-09-09T22:02:42.057082351Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 9 22:02:42.057169 containerd[1571]: time="2025-09-09T22:02:42.057154396Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 9 22:02:42.057263 containerd[1571]: time="2025-09-09T22:02:42.057243042Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 9 22:02:42.057335 containerd[1571]: time="2025-09-09T22:02:42.057320598Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 9 22:02:42.057408 containerd[1571]: time="2025-09-09T22:02:42.057392423Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 9 22:02:42.057491 containerd[1571]: time="2025-09-09T22:02:42.057459138Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 9 22:02:42.058792 containerd[1571]: time="2025-09-09T22:02:42.058745029Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 9 22:02:42.058901 containerd[1571]: time="2025-09-09T22:02:42.058884992Z" level=info msg="Start snapshots syncer"
Sep 9 22:02:42.059030 containerd[1571]: time="2025-09-09T22:02:42.059009766Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 9 22:02:42.059657 containerd[1571]: time="2025-09-09T22:02:42.059575246Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 9 22:02:42.203447 containerd[1571]: time="2025-09-09T22:02:42.060238630Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 9 22:02:42.203447 containerd[1571]: time="2025-09-09T22:02:42.060370648Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 9 22:02:42.203447 containerd[1571]: time="2025-09-09T22:02:42.060604376Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 9 22:02:42.203447 containerd[1571]: time="2025-09-09T22:02:42.060629403Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 9 22:02:42.203447 containerd[1571]: time="2025-09-09T22:02:42.060641646Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 9 22:02:42.203447 containerd[1571]: time="2025-09-09T22:02:42.060666513Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 9 22:02:42.203447 containerd[1571]: time="2025-09-09T22:02:42.060680659Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 9 22:02:42.203447 containerd[1571]: time="2025-09-09T22:02:42.060691900Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 9 22:02:42.203447 containerd[1571]: time="2025-09-09T22:02:42.060703362Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 9 22:02:42.203447 containerd[1571]: time="2025-09-09T22:02:42.060731996Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 9 22:02:42.203447 containerd[1571]: time="2025-09-09T22:02:42.060746543Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 9 22:02:42.203447 containerd[1571]: time="2025-09-09T22:02:42.060760188Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 9 22:02:42.203447 containerd[1571]: time="2025-09-09T22:02:42.060798380Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 9 22:02:42.203447 containerd[1571]: time="2025-09-09T22:02:42.060823908Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 9 22:02:42.203925 containerd[1571]: time="2025-09-09T22:02:42.060838315Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 9 22:02:42.203925 containerd[1571]: time="2025-09-09T22:02:42.060850949Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 9 22:02:42.203925 containerd[1571]: time="2025-09-09T22:02:42.060861719Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 9 22:02:42.203925 containerd[1571]: time="2025-09-09T22:02:42.060874653Z" level=info msg="loading plugin"
id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 9 22:02:42.203925 containerd[1571]: time="2025-09-09T22:02:42.060887888Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 9 22:02:42.203925 containerd[1571]: time="2025-09-09T22:02:42.060914458Z" level=info msg="runtime interface created" Sep 9 22:02:42.203925 containerd[1571]: time="2025-09-09T22:02:42.060922212Z" level=info msg="created NRI interface" Sep 9 22:02:42.203925 containerd[1571]: time="2025-09-09T22:02:42.060933744Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 9 22:02:42.203925 containerd[1571]: time="2025-09-09T22:02:42.060948702Z" level=info msg="Connect containerd service" Sep 9 22:02:42.203925 containerd[1571]: time="2025-09-09T22:02:42.060979660Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 9 22:02:42.203925 containerd[1571]: time="2025-09-09T22:02:42.128323794Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 22:02:42.236104 sshd[1638]: Accepted publickey for core from 10.0.0.1 port 58090 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo Sep 9 22:02:42.245295 sshd-session[1638]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:02:42.302716 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 9 22:02:42.305933 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 9 22:02:42.336959 systemd-logind[1554]: New session 1 of user core. Sep 9 22:02:42.403989 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 9 22:02:42.424850 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Sep 9 22:02:42.582900 (systemd)[1661]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 9 22:02:42.598409 systemd-logind[1554]: New session c1 of user core.
Sep 9 22:02:42.640332 containerd[1571]: time="2025-09-09T22:02:42.640244719Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 9 22:02:42.640492 containerd[1571]: time="2025-09-09T22:02:42.640350788Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 9 22:02:42.640492 containerd[1571]: time="2025-09-09T22:02:42.640410309Z" level=info msg="Start subscribing containerd event"
Sep 9 22:02:42.640542 containerd[1571]: time="2025-09-09T22:02:42.640455775Z" level=info msg="Start recovering state"
Sep 9 22:02:42.651590 containerd[1571]: time="2025-09-09T22:02:42.640653425Z" level=info msg="Start event monitor"
Sep 9 22:02:42.651590 containerd[1571]: time="2025-09-09T22:02:42.640998392Z" level=info msg="Start cni network conf syncer for default"
Sep 9 22:02:42.651590 containerd[1571]: time="2025-09-09T22:02:42.641017017Z" level=info msg="Start streaming server"
Sep 9 22:02:42.651590 containerd[1571]: time="2025-09-09T22:02:42.641040351Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 9 22:02:42.651590 containerd[1571]: time="2025-09-09T22:02:42.641056912Z" level=info msg="runtime interface starting up..."
Sep 9 22:02:42.651590 containerd[1571]: time="2025-09-09T22:02:42.641068083Z" level=info msg="starting plugins..."
Sep 9 22:02:42.651590 containerd[1571]: time="2025-09-09T22:02:42.641169814Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Sep 9 22:02:42.651590 containerd[1571]: time="2025-09-09T22:02:42.641357476Z" level=info msg="containerd successfully booted in 0.711895s"
Sep 9 22:02:42.641810 systemd[1]: Started containerd.service - containerd container runtime.
Sep 9 22:02:42.769704 tar[1563]: linux-amd64/LICENSE
Sep 9 22:02:42.769704 tar[1563]: linux-amd64/README.md
Sep 9 22:02:42.817245 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 9 22:02:42.939110 systemd[1661]: Queued start job for default target default.target.
Sep 9 22:02:43.028387 systemd[1661]: Created slice app.slice - User Application Slice.
Sep 9 22:02:43.029722 systemd[1661]: Reached target paths.target - Paths.
Sep 9 22:02:43.034827 systemd[1661]: Reached target timers.target - Timers.
Sep 9 22:02:43.042120 systemd[1661]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 9 22:02:43.074876 systemd[1661]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 9 22:02:43.075079 systemd[1661]: Reached target sockets.target - Sockets.
Sep 9 22:02:43.075147 systemd[1661]: Reached target basic.target - Basic System.
Sep 9 22:02:43.075199 systemd[1661]: Reached target default.target - Main User Target.
Sep 9 22:02:43.075246 systemd[1661]: Startup finished in 440ms.
Sep 9 22:02:43.075922 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 9 22:02:43.106175 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 9 22:02:43.230222 systemd[1]: Started sshd@1-10.0.0.72:22-10.0.0.1:33750.service - OpenSSH per-connection server daemon (10.0.0.1:33750).
Sep 9 22:02:43.647128 sshd[1681]: Accepted publickey for core from 10.0.0.1 port 33750 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:02:43.650592 sshd-session[1681]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:02:43.685179 systemd-logind[1554]: New session 2 of user core.
Sep 9 22:02:43.702152 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 9 22:02:43.972166 sshd[1684]: Connection closed by 10.0.0.1 port 33750
Sep 9 22:02:43.972936 sshd-session[1681]: pam_unix(sshd:session): session closed for user core
Sep 9 22:02:44.008202 systemd[1]: sshd@1-10.0.0.72:22-10.0.0.1:33750.service: Deactivated successfully.
Sep 9 22:02:44.014137 systemd[1]: session-2.scope: Deactivated successfully.
Sep 9 22:02:44.023107 systemd-logind[1554]: Session 2 logged out. Waiting for processes to exit.
Sep 9 22:02:44.035766 systemd[1]: Started sshd@2-10.0.0.72:22-10.0.0.1:33752.service - OpenSSH per-connection server daemon (10.0.0.1:33752).
Sep 9 22:02:44.042501 systemd-logind[1554]: Removed session 2.
Sep 9 22:02:44.227594 sshd[1690]: Accepted publickey for core from 10.0.0.1 port 33752 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:02:44.228650 sshd-session[1690]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:02:44.291528 systemd-logind[1554]: New session 3 of user core.
Sep 9 22:02:44.649938 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 9 22:02:44.763748 sshd[1693]: Connection closed by 10.0.0.1 port 33752
Sep 9 22:02:44.764575 sshd-session[1690]: pam_unix(sshd:session): session closed for user core
Sep 9 22:02:44.782953 systemd[1]: sshd@2-10.0.0.72:22-10.0.0.1:33752.service: Deactivated successfully.
Sep 9 22:02:44.789371 systemd[1]: session-3.scope: Deactivated successfully.
Sep 9 22:02:44.800878 systemd-logind[1554]: Session 3 logged out. Waiting for processes to exit.
Sep 9 22:02:44.830738 systemd-logind[1554]: Removed session 3.
Sep 9 22:02:47.302776 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 22:02:47.303733 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 9 22:02:47.310257 systemd[1]: Startup finished in 3.756s (kernel) + 16.478s (initrd) + 20.551s (userspace) = 40.785s.
Sep 9 22:02:47.362212 (kubelet)[1707]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 22:02:51.039778 kubelet[1707]: E0909 22:02:51.031527 1707 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 22:02:51.053835 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 22:02:51.057612 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 22:02:51.058405 systemd[1]: kubelet.service: Consumed 4.631s CPU time, 268.2M memory peak.
Sep 9 22:02:54.815605 systemd[1]: Started sshd@3-10.0.0.72:22-10.0.0.1:52718.service - OpenSSH per-connection server daemon (10.0.0.1:52718).
Sep 9 22:02:54.987679 sshd[1716]: Accepted publickey for core from 10.0.0.1 port 52718 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:02:54.993109 sshd-session[1716]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:02:55.007004 systemd-logind[1554]: New session 4 of user core.
Sep 9 22:02:55.020789 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 9 22:02:55.094764 sshd[1719]: Connection closed by 10.0.0.1 port 52718
Sep 9 22:02:55.094981 sshd-session[1716]: pam_unix(sshd:session): session closed for user core
Sep 9 22:02:55.107722 systemd[1]: sshd@3-10.0.0.72:22-10.0.0.1:52718.service: Deactivated successfully.
Sep 9 22:02:55.120419 systemd[1]: session-4.scope: Deactivated successfully.
Sep 9 22:02:55.122327 systemd-logind[1554]: Session 4 logged out. Waiting for processes to exit.
Sep 9 22:02:55.131666 systemd[1]: Started sshd@4-10.0.0.72:22-10.0.0.1:52734.service - OpenSSH per-connection server daemon (10.0.0.1:52734).
Sep 9 22:02:55.132591 systemd-logind[1554]: Removed session 4.
Sep 9 22:02:55.240925 sshd[1725]: Accepted publickey for core from 10.0.0.1 port 52734 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:02:55.245249 sshd-session[1725]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:02:55.265358 systemd-logind[1554]: New session 5 of user core.
Sep 9 22:02:55.281837 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 9 22:02:55.379024 sshd[1728]: Connection closed by 10.0.0.1 port 52734
Sep 9 22:02:55.379716 sshd-session[1725]: pam_unix(sshd:session): session closed for user core
Sep 9 22:02:55.397102 systemd[1]: sshd@4-10.0.0.72:22-10.0.0.1:52734.service: Deactivated successfully.
Sep 9 22:02:55.401896 systemd[1]: session-5.scope: Deactivated successfully.
Sep 9 22:02:55.408904 systemd-logind[1554]: Session 5 logged out. Waiting for processes to exit.
Sep 9 22:02:55.414050 systemd[1]: Started sshd@5-10.0.0.72:22-10.0.0.1:52746.service - OpenSSH per-connection server daemon (10.0.0.1:52746).
Sep 9 22:02:55.419549 systemd-logind[1554]: Removed session 5.
Sep 9 22:02:55.571802 sshd[1734]: Accepted publickey for core from 10.0.0.1 port 52746 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:02:55.582533 sshd-session[1734]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:02:55.605774 systemd-logind[1554]: New session 6 of user core.
Sep 9 22:02:55.644037 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 9 22:02:55.726318 sshd[1737]: Connection closed by 10.0.0.1 port 52746
Sep 9 22:02:55.725021 sshd-session[1734]: pam_unix(sshd:session): session closed for user core
Sep 9 22:02:55.753311 systemd[1]: sshd@5-10.0.0.72:22-10.0.0.1:52746.service: Deactivated successfully.
Sep 9 22:02:55.765998 systemd[1]: session-6.scope: Deactivated successfully.
Sep 9 22:02:55.777789 systemd-logind[1554]: Session 6 logged out. Waiting for processes to exit.
Sep 9 22:02:55.792642 systemd[1]: Started sshd@6-10.0.0.72:22-10.0.0.1:52748.service - OpenSSH per-connection server daemon (10.0.0.1:52748).
Sep 9 22:02:55.798079 systemd-logind[1554]: Removed session 6.
Sep 9 22:02:55.928700 sshd[1743]: Accepted publickey for core from 10.0.0.1 port 52748 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:02:55.932138 sshd-session[1743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:02:55.950653 systemd-logind[1554]: New session 7 of user core.
Sep 9 22:02:55.974273 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 9 22:02:56.098326 sudo[1747]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 9 22:02:56.101807 sudo[1747]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 22:02:56.161069 sudo[1747]: pam_unix(sudo:session): session closed for user root
Sep 9 22:02:56.178395 sshd[1746]: Connection closed by 10.0.0.1 port 52748
Sep 9 22:02:56.178893 sshd-session[1743]: pam_unix(sshd:session): session closed for user core
Sep 9 22:02:56.221923 systemd[1]: sshd@6-10.0.0.72:22-10.0.0.1:52748.service: Deactivated successfully.
Sep 9 22:02:56.232531 systemd[1]: session-7.scope: Deactivated successfully.
Sep 9 22:02:56.250151 systemd-logind[1554]: Session 7 logged out. Waiting for processes to exit.
Sep 9 22:02:56.273915 systemd[1]: Started sshd@7-10.0.0.72:22-10.0.0.1:52750.service - OpenSSH per-connection server daemon (10.0.0.1:52750).
Sep 9 22:02:56.292947 systemd-logind[1554]: Removed session 7.
Sep 9 22:02:56.462044 sshd[1753]: Accepted publickey for core from 10.0.0.1 port 52750 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:02:56.458109 sshd-session[1753]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:02:56.518214 systemd-logind[1554]: New session 8 of user core.
Sep 9 22:02:56.538828 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 9 22:02:56.630572 sudo[1758]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 9 22:02:56.633924 sudo[1758]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 22:02:56.786829 sudo[1758]: pam_unix(sudo:session): session closed for user root
Sep 9 22:02:56.810355 sudo[1757]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 9 22:02:56.813026 sudo[1757]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 22:02:56.862465 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 22:02:57.060013 augenrules[1780]: No rules
Sep 9 22:02:57.061643 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 22:02:57.061996 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 22:02:57.066746 sudo[1757]: pam_unix(sudo:session): session closed for user root
Sep 9 22:02:57.077088 sshd[1756]: Connection closed by 10.0.0.1 port 52750
Sep 9 22:02:57.075132 sshd-session[1753]: pam_unix(sshd:session): session closed for user core
Sep 9 22:02:57.111808 systemd[1]: sshd@7-10.0.0.72:22-10.0.0.1:52750.service: Deactivated successfully.
Sep 9 22:02:57.122162 systemd[1]: session-8.scope: Deactivated successfully.
Sep 9 22:02:57.126546 systemd-logind[1554]: Session 8 logged out. Waiting for processes to exit.
Sep 9 22:02:57.132619 systemd[1]: Started sshd@8-10.0.0.72:22-10.0.0.1:52762.service - OpenSSH per-connection server daemon (10.0.0.1:52762).
Sep 9 22:02:57.134452 systemd-logind[1554]: Removed session 8.
Sep 9 22:02:57.218829 sshd[1789]: Accepted publickey for core from 10.0.0.1 port 52762 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:02:57.219403 sshd-session[1789]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:02:57.245318 systemd-logind[1554]: New session 9 of user core.
Sep 9 22:02:57.258841 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 9 22:02:57.345587 sudo[1793]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 9 22:02:57.351356 sudo[1793]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 22:03:00.633019 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 9 22:03:00.674206 (dockerd)[1814]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 9 22:03:01.289716 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 9 22:03:01.319256 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 22:03:02.553413 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 22:03:02.580863 (kubelet)[1828]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 22:03:02.670559 dockerd[1814]: time="2025-09-09T22:03:02.669900894Z" level=info msg="Starting up"
Sep 9 22:03:02.671330 dockerd[1814]: time="2025-09-09T22:03:02.671057583Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 9 22:03:02.714710 dockerd[1814]: time="2025-09-09T22:03:02.714625598Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 9 22:03:02.731623 kubelet[1828]: E0909 22:03:02.731537 1828 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 22:03:02.739310 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 22:03:02.739660 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 22:03:02.740174 systemd[1]: kubelet.service: Consumed 642ms CPU time, 110.4M memory peak.
Sep 9 22:03:03.180239 dockerd[1814]: time="2025-09-09T22:03:03.180157367Z" level=info msg="Loading containers: start."
Sep 9 22:03:03.316512 kernel: Initializing XFRM netlink socket
Sep 9 22:03:04.154738 systemd-networkd[1455]: docker0: Link UP
Sep 9 22:03:04.604281 dockerd[1814]: time="2025-09-09T22:03:04.604181971Z" level=info msg="Loading containers: done."
Sep 9 22:03:04.660635 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1868634077-merged.mount: Deactivated successfully.
Sep 9 22:03:04.802498 dockerd[1814]: time="2025-09-09T22:03:04.802390262Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 9 22:03:04.802734 dockerd[1814]: time="2025-09-09T22:03:04.802573296Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 9 22:03:04.802734 dockerd[1814]: time="2025-09-09T22:03:04.802694242Z" level=info msg="Initializing buildkit"
Sep 9 22:03:05.864326 dockerd[1814]: time="2025-09-09T22:03:05.864261069Z" level=info msg="Completed buildkit initialization"
Sep 9 22:03:05.871744 dockerd[1814]: time="2025-09-09T22:03:05.871692496Z" level=info msg="Daemon has completed initialization"
Sep 9 22:03:05.871866 dockerd[1814]: time="2025-09-09T22:03:05.871824994Z" level=info msg="API listen on /run/docker.sock"
Sep 9 22:03:05.871941 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 9 22:03:06.947619 containerd[1571]: time="2025-09-09T22:03:06.947554000Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\""
Sep 9 22:03:10.441299 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount663123136.mount: Deactivated successfully.
Sep 9 22:03:12.786138 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 9 22:03:12.787949 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 22:03:13.027798 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 22:03:13.046900 (kubelet)[2111]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 22:03:13.916500 kubelet[2111]: E0909 22:03:13.916363 2111 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 22:03:13.920804 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 22:03:13.921064 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 22:03:13.921528 systemd[1]: kubelet.service: Consumed 370ms CPU time, 109M memory peak.
Sep 9 22:03:13.931355 containerd[1571]: time="2025-09-09T22:03:13.931274123Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:13.965799 containerd[1571]: time="2025-09-09T22:03:13.965706423Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=28079631"
Sep 9 22:03:14.010756 containerd[1571]: time="2025-09-09T22:03:14.010667253Z" level=info msg="ImageCreate event name:\"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:14.105934 containerd[1571]: time="2025-09-09T22:03:14.105819624Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:14.107362 containerd[1571]: time="2025-09-09T22:03:14.107270737Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"28076431\" in 7.159640583s"
Sep 9 22:03:14.107362 containerd[1571]: time="2025-09-09T22:03:14.107341232Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\""
Sep 9 22:03:14.108320 containerd[1571]: time="2025-09-09T22:03:14.108191082Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\""
Sep 9 22:03:17.890106 containerd[1571]: time="2025-09-09T22:03:17.889993825Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:17.892359 containerd[1571]: time="2025-09-09T22:03:17.892310670Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=24714681"
Sep 9 22:03:17.896396 containerd[1571]: time="2025-09-09T22:03:17.896283470Z" level=info msg="ImageCreate event name:\"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:17.899976 containerd[1571]: time="2025-09-09T22:03:17.899882605Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:17.901565 containerd[1571]: time="2025-09-09T22:03:17.901459126Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"26317875\" in 3.793227547s"
Sep 9 22:03:17.901565 containerd[1571]: time="2025-09-09T22:03:17.901549179Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\""
Sep 9 22:03:17.902914 containerd[1571]: time="2025-09-09T22:03:17.902766835Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\""
Sep 9 22:03:21.404635 containerd[1571]: time="2025-09-09T22:03:21.404556999Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:21.405666 containerd[1571]: time="2025-09-09T22:03:21.405602599Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=18782427"
Sep 9 22:03:21.407498 containerd[1571]: time="2025-09-09T22:03:21.407403323Z" level=info msg="ImageCreate event name:\"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:21.410261 containerd[1571]: time="2025-09-09T22:03:21.410230673Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:21.411590 containerd[1571]: time="2025-09-09T22:03:21.411529523Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"20385639\" in 3.508704668s"
Sep 9 22:03:21.411590 containerd[1571]: time="2025-09-09T22:03:21.411585008Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\""
Sep 9 22:03:21.412169 containerd[1571]: time="2025-09-09T22:03:21.412132640Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\""
Sep 9 22:03:24.036259 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Sep 9 22:03:24.043263 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 22:03:25.388385 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 22:03:25.393371 (kubelet)[2136]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 22:03:25.539577 update_engine[1557]: I20250909 22:03:25.539419 1557 update_attempter.cc:509] Updating boot flags...
Sep 9 22:03:25.560702 kubelet[2136]: E0909 22:03:25.560592 2136 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 22:03:25.565790 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 22:03:25.566116 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 22:03:25.566816 systemd[1]: kubelet.service: Consumed 348ms CPU time, 110.5M memory peak.
Sep 9 22:03:29.281071 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2226735738.mount: Deactivated successfully.
Sep 9 22:03:30.756930 containerd[1571]: time="2025-09-09T22:03:30.756811505Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:30.758027 containerd[1571]: time="2025-09-09T22:03:30.757880175Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=30384255"
Sep 9 22:03:30.759775 containerd[1571]: time="2025-09-09T22:03:30.759665430Z" level=info msg="ImageCreate event name:\"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:30.763183 containerd[1571]: time="2025-09-09T22:03:30.763096988Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:30.763932 containerd[1571]: time="2025-09-09T22:03:30.763846915Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\", repo tag \"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"30383274\" in 9.351677033s"
Sep 9 22:03:30.763932 containerd[1571]: time="2025-09-09T22:03:30.763920876Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\""
Sep 9 22:03:30.764867 containerd[1571]: time="2025-09-09T22:03:30.764777705Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 9 22:03:31.386250 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1792786981.mount: Deactivated successfully.
Sep 9 22:03:33.447740 containerd[1571]: time="2025-09-09T22:03:33.447633840Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:33.448769 containerd[1571]: time="2025-09-09T22:03:33.448697648Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241"
Sep 9 22:03:33.450803 containerd[1571]: time="2025-09-09T22:03:33.450737279Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:33.454275 containerd[1571]: time="2025-09-09T22:03:33.454190708Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:33.455554 containerd[1571]: time="2025-09-09T22:03:33.455448343Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 2.690590876s"
Sep 9 22:03:33.455554 containerd[1571]: time="2025-09-09T22:03:33.455519116Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Sep 9 22:03:33.456100 containerd[1571]: time="2025-09-09T22:03:33.456053454Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 9 22:03:35.734612 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Sep 9 22:03:35.804647 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 22:03:35.806988 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount369108.mount: Deactivated successfully.
Sep 9 22:03:35.827413 containerd[1571]: time="2025-09-09T22:03:35.827321076Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 22:03:35.831287 containerd[1571]: time="2025-09-09T22:03:35.831225641Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Sep 9 22:03:35.836304 containerd[1571]: time="2025-09-09T22:03:35.836244138Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 22:03:35.847166 containerd[1571]: time="2025-09-09T22:03:35.844543122Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 22:03:35.847166 containerd[1571]: time="2025-09-09T22:03:35.846708958Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 2.390612832s"
Sep 9 22:03:35.847166 containerd[1571]: time="2025-09-09T22:03:35.846757368Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 9 22:03:35.847457 containerd[1571]: time="2025-09-09T22:03:35.847293700Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Sep 9 22:03:36.180096 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 22:03:36.202720 (kubelet)[2235]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 22:03:36.603495 kubelet[2235]: E0909 22:03:36.603396 2235 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 22:03:36.618369 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 22:03:36.618697 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 22:03:36.619445 systemd[1]: kubelet.service: Consumed 591ms CPU time, 110.4M memory peak.
Sep 9 22:03:37.598622 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2913904765.mount: Deactivated successfully.
Sep 9 22:03:45.720017 containerd[1571]: time="2025-09-09T22:03:45.719765793Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:45.723997 containerd[1571]: time="2025-09-09T22:03:45.721054958Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910709"
Sep 9 22:03:45.732665 containerd[1571]: time="2025-09-09T22:03:45.729721610Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:45.739742 containerd[1571]: time="2025-09-09T22:03:45.739637623Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:03:45.744711 containerd[1571]: time="2025-09-09T22:03:45.742541476Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 9.895205615s"
Sep 9 22:03:45.744711 containerd[1571]: time="2025-09-09T22:03:45.742599785Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
Sep 9 22:03:46.796614 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Sep 9 22:03:46.806809 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 22:03:47.349904 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 22:03:47.369221 (kubelet)[2330]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 22:03:47.542536 kubelet[2330]: E0909 22:03:47.542282 2330 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 22:03:47.550648 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 22:03:47.551278 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 22:03:47.551928 systemd[1]: kubelet.service: Consumed 483ms CPU time, 112.8M memory peak.
Sep 9 22:03:50.115051 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 22:03:50.115307 systemd[1]: kubelet.service: Consumed 483ms CPU time, 112.8M memory peak.
Sep 9 22:03:50.128546 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 22:03:50.344707 systemd[1]: Reload requested from client PID 2346 ('systemctl') (unit session-9.scope)...
Sep 9 22:03:50.344746 systemd[1]: Reloading...
Sep 9 22:03:50.603588 zram_generator::config[2392]: No configuration found.
Sep 9 22:03:52.558340 systemd[1]: Reloading finished in 2213 ms.
Sep 9 22:03:52.671746 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 9 22:03:52.671905 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 9 22:03:52.674380 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 22:03:52.674457 systemd[1]: kubelet.service: Consumed 480ms CPU time, 98.4M memory peak.
Sep 9 22:03:52.680546 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 22:03:53.254645 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 22:03:53.284714 (kubelet)[2437]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 22:03:53.394912 kubelet[2437]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 22:03:53.394912 kubelet[2437]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 9 22:03:53.394912 kubelet[2437]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 22:03:53.397539 kubelet[2437]: I0909 22:03:53.396061 2437 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 22:03:54.172184 kubelet[2437]: I0909 22:03:54.171706 2437 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 9 22:03:54.172184 kubelet[2437]: I0909 22:03:54.172004 2437 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 22:03:54.174088 kubelet[2437]: I0909 22:03:54.174039 2437 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 9 22:03:54.240440 kubelet[2437]: E0909 22:03:54.240291 2437 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.72:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 22:03:54.244365 kubelet[2437]: I0909 22:03:54.244150 2437 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 22:03:54.288153 kubelet[2437]: I0909 22:03:54.283818 2437 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 22:03:54.318958 kubelet[2437]: I0909 22:03:54.318187 2437 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 22:03:54.318958 kubelet[2437]: I0909 22:03:54.318378 2437 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 9 22:03:54.318958 kubelet[2437]: I0909 22:03:54.318568 2437 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 22:03:54.319925 kubelet[2437]: I0909 22:03:54.318616 2437 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 22:03:54.319925 kubelet[2437]: I0909 22:03:54.319547 2437 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 22:03:54.319925 kubelet[2437]: I0909 22:03:54.319567 2437 container_manager_linux.go:300] "Creating device plugin manager"
Sep 9 22:03:54.319925 kubelet[2437]: I0909 22:03:54.319742 2437 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 22:03:54.350355 kubelet[2437]: I0909 22:03:54.349761 2437 kubelet.go:408] "Attempting to sync node with API server"
Sep 9 22:03:54.350355 kubelet[2437]: I0909 22:03:54.349834 2437 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 22:03:54.350355 kubelet[2437]: I0909 22:03:54.349909 2437 kubelet.go:314] "Adding apiserver pod source"
Sep 9 22:03:54.350355 kubelet[2437]: I0909 22:03:54.349959 2437 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 22:03:54.350681 kubelet[2437]: W0909 22:03:54.350604 2437 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.72:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 22:03:54.350725 kubelet[2437]: E0909 22:03:54.350692 2437 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.72:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 22:03:54.354398 kubelet[2437]: W0909 22:03:54.352585 2437 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.72:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 22:03:54.354398 kubelet[2437]: E0909 22:03:54.352646 2437 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.72:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 22:03:54.360339 kubelet[2437]: I0909 22:03:54.359685 2437 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 22:03:54.360339 kubelet[2437]: I0909 22:03:54.360183 2437 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 22:03:54.360339 kubelet[2437]: W0909 22:03:54.360278 2437 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 9 22:03:54.371929 kubelet[2437]: I0909 22:03:54.371343 2437 server.go:1274] "Started kubelet"
Sep 9 22:03:54.374352 kubelet[2437]: I0909 22:03:54.372869 2437 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 22:03:54.374352 kubelet[2437]: I0909 22:03:54.373346 2437 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 22:03:54.374352 kubelet[2437]: I0909 22:03:54.373431 2437 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 22:03:54.378571 kubelet[2437]: I0909 22:03:54.378534 2437 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 22:03:54.381863 kubelet[2437]: I0909 22:03:54.380359 2437 server.go:449] "Adding debug handlers to kubelet server"
Sep 9 22:03:54.382986 kubelet[2437]: I0909 22:03:54.382949 2437 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 22:03:54.388362 kubelet[2437]: I0909 22:03:54.388248 2437 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 9 22:03:54.388691 kubelet[2437]: E0909 22:03:54.388658 2437 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:03:54.389047 kubelet[2437]: I0909 22:03:54.389019 2437 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 9 22:03:54.389142 kubelet[2437]: I0909 22:03:54.389122 2437 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 22:03:54.389581 kubelet[2437]: W0909 22:03:54.389516 2437 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.72:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 22:03:54.389581 kubelet[2437]: E0909 22:03:54.389577 2437 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.72:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 22:03:54.389788 kubelet[2437]: E0909 22:03:54.389755 2437 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.72:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.72:6443: connect: connection refused" interval="200ms"
Sep 9 22:03:54.393707 kubelet[2437]: I0909 22:03:54.393680 2437 factory.go:221] Registration of the systemd container factory successfully
Sep 9 22:03:54.394087 kubelet[2437]: I0909 22:03:54.394007 2437 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 22:03:54.396320 kubelet[2437]: E0909 22:03:54.396112 2437 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 22:03:54.399441 kubelet[2437]: I0909 22:03:54.398491 2437 factory.go:221] Registration of the containerd container factory successfully
Sep 9 22:03:54.406834 kubelet[2437]: E0909 22:03:54.386701 2437 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.72:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.72:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1863bc67fa4779a1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-09 22:03:54.371266977 +0000 UTC m=+1.071704729,LastTimestamp:2025-09-09 22:03:54.371266977 +0000 UTC m=+1.071704729,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 9 22:03:54.507524 kubelet[2437]: E0909 22:03:54.507263 2437 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:03:54.536436 kubelet[2437]: I0909 22:03:54.535914 2437 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 9 22:03:54.536436 kubelet[2437]: I0909 22:03:54.535947 2437 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 9 22:03:54.536436 kubelet[2437]: I0909 22:03:54.535971 2437 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 22:03:54.562942 kubelet[2437]: I0909 22:03:54.562851 2437 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 22:03:54.568893 kubelet[2437]: I0909 22:03:54.568847 2437 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 22:03:54.569396 kubelet[2437]: I0909 22:03:54.569091 2437 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 9 22:03:54.569396 kubelet[2437]: I0909 22:03:54.569130 2437 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 9 22:03:54.569396 kubelet[2437]: E0909 22:03:54.569184 2437 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 22:03:54.570042 kubelet[2437]: W0909 22:03:54.570000 2437 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.72:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 22:03:54.572350 kubelet[2437]: E0909 22:03:54.571986 2437 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.72:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 22:03:54.590444 kubelet[2437]: E0909 22:03:54.590377 2437 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.72:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.72:6443: connect: connection refused" interval="400ms"
Sep 9 22:03:54.608788 kubelet[2437]: E0909 22:03:54.607726 2437 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:03:54.670175 kubelet[2437]: E0909 22:03:54.669981 2437 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 9 22:03:54.709280 kubelet[2437]: E0909 22:03:54.709034 2437 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:03:54.810800 kubelet[2437]: E0909 22:03:54.810162 2437 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:03:54.871594 kubelet[2437]: E0909 22:03:54.871247 2437 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 9 22:03:54.912362 kubelet[2437]: E0909 22:03:54.911705 2437 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:03:54.991709 kubelet[2437]: E0909 22:03:54.991614 2437 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.72:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.72:6443: connect: connection refused" interval="800ms"
Sep 9 22:03:55.015822 kubelet[2437]: E0909 22:03:55.015726 2437 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:03:55.117580 kubelet[2437]: E0909 22:03:55.116232 2437 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:03:55.216900 kubelet[2437]: E0909 22:03:55.216739 2437 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:03:55.272220 kubelet[2437]: E0909 22:03:55.272081 2437 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 9 22:03:55.317809 kubelet[2437]: E0909 22:03:55.317726 2437 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:03:55.381659 kubelet[2437]: W0909 22:03:55.381396 2437 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.72:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 22:03:55.381659 kubelet[2437]: E0909 22:03:55.381520 2437 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.72:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 22:03:55.418571 kubelet[2437]: E0909 22:03:55.418410 2437 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:03:55.519256 kubelet[2437]: E0909 22:03:55.519106 2437 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:03:55.622552 kubelet[2437]: E0909 22:03:55.622426 2437 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:03:55.701075 kubelet[2437]: I0909 22:03:55.700771 2437 policy_none.go:49] "None policy: Start"
Sep 9 22:03:55.705799 kubelet[2437]: I0909 22:03:55.705736 2437 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 9 22:03:55.705799 kubelet[2437]: I0909 22:03:55.705794 2437 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 22:03:55.723375 kubelet[2437]: E0909 22:03:55.723295 2437 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:03:55.735602 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 9 22:03:55.774091 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 9 22:03:55.792705 kubelet[2437]: E0909 22:03:55.792643 2437 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.72:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.72:6443: connect: connection refused" interval="1.6s"
Sep 9 22:03:55.807213 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 9 22:03:55.817633 kubelet[2437]: I0909 22:03:55.814214 2437 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 9 22:03:55.830440 kubelet[2437]: E0909 22:03:55.828261 2437 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:03:55.830440 kubelet[2437]: I0909 22:03:55.829114 2437 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 22:03:55.840864 kubelet[2437]: I0909 22:03:55.829310 2437 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 22:03:55.840864 kubelet[2437]: I0909 22:03:55.837531 2437 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 9 22:03:55.866583 kubelet[2437]: E0909 22:03:55.865706 2437 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Sep 9 22:03:55.890560 kubelet[2437]: W0909 22:03:55.885146 2437 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.72:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 22:03:55.890560 kubelet[2437]: E0909 22:03:55.890432 2437 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.72:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 22:03:55.907667 kubelet[2437]: W0909 22:03:55.907507 2437 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.72:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 22:03:55.907667 kubelet[2437]: E0909 22:03:55.907583 2437 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.72:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 22:03:55.921060 kubelet[2437]: W0909 22:03:55.918816 2437 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.72:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 22:03:55.921060 kubelet[2437]: E0909 22:03:55.920661 2437 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.72:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 22:03:55.940629 kubelet[2437]: I0909 22:03:55.939978 2437 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 9 22:03:55.940629 kubelet[2437]: E0909 22:03:55.940438 2437 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.72:6443/api/v1/nodes\": dial tcp 10.0.0.72:6443: connect: connection refused" node="localhost"
Sep 9 22:03:56.104069 systemd[1]: Created slice kubepods-burstable-podfec3f691a145cb26ff55e4af388500b7.slice - libcontainer container kubepods-burstable-podfec3f691a145cb26ff55e4af388500b7.slice.
Sep 9 22:03:56.129209 kubelet[2437]: I0909 22:03:56.128167 2437 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 22:03:56.129209 kubelet[2437]: I0909 22:03:56.128230 2437 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 22:03:56.129209 kubelet[2437]: I0909 22:03:56.128322 2437 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 22:03:56.129209 kubelet[2437]: I0909 22:03:56.128410 2437 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/eee96a5d2f569f8950a31211aa4049a7-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"eee96a5d2f569f8950a31211aa4049a7\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 22:03:56.129209 kubelet[2437]: I0909 22:03:56.128493 2437 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/eee96a5d2f569f8950a31211aa4049a7-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"eee96a5d2f569f8950a31211aa4049a7\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 22:03:56.129578 kubelet[2437]: I0909 22:03:56.128530 2437 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/eee96a5d2f569f8950a31211aa4049a7-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"eee96a5d2f569f8950a31211aa4049a7\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 22:03:56.129578 kubelet[2437]: I0909 22:03:56.128566 2437 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 22:03:56.129578 kubelet[2437]: I0909 22:03:56.128634 2437 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 22:03:56.129578 kubelet[2437]: I0909 22:03:56.128669 2437 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5dc878868de11c6196259ae42039f4ff-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"5dc878868de11c6196259ae42039f4ff\") " pod="kube-system/kube-scheduler-localhost"
Sep 9 22:03:56.139988 systemd[1]: Created slice kubepods-burstable-podeee96a5d2f569f8950a31211aa4049a7.slice - libcontainer container kubepods-burstable-podeee96a5d2f569f8950a31211aa4049a7.slice.
Sep 9 22:03:56.145239 kubelet[2437]: I0909 22:03:56.145203 2437 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 9 22:03:56.146837 kubelet[2437]: E0909 22:03:56.146026 2437 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.72:6443/api/v1/nodes\": dial tcp 10.0.0.72:6443: connect: connection refused" node="localhost"
Sep 9 22:03:56.161301 systemd[1]: Created slice kubepods-burstable-pod5dc878868de11c6196259ae42039f4ff.slice - libcontainer container kubepods-burstable-pod5dc878868de11c6196259ae42039f4ff.slice.
Sep 9 22:03:56.324746 kubelet[2437]: E0909 22:03:56.324666 2437 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.72:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 22:03:56.431508 kubelet[2437]: E0909 22:03:56.431168 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:03:56.432681 containerd[1571]: time="2025-09-09T22:03:56.432566286Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:fec3f691a145cb26ff55e4af388500b7,Namespace:kube-system,Attempt:0,}"
Sep 9 22:03:56.451149 kubelet[2437]: E0909 22:03:56.451077 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:03:56.452514 containerd[1571]: time="2025-09-09T22:03:56.452104963Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:eee96a5d2f569f8950a31211aa4049a7,Namespace:kube-system,Attempt:0,}"
Sep 9 22:03:56.466697 kubelet[2437]: E0909 22:03:56.466635 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:03:56.471396 containerd[1571]: time="2025-09-09T22:03:56.470156938Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:5dc878868de11c6196259ae42039f4ff,Namespace:kube-system,Attempt:0,}"
Sep 9 22:03:56.552511 kubelet[2437]: I0909 22:03:56.552382 2437 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 9 22:03:56.559654 kubelet[2437]: E0909 22:03:56.556244 2437 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.72:6443/api/v1/nodes\": dial tcp 10.0.0.72:6443: connect: connection refused" node="localhost"
Sep 9 22:03:56.930190 containerd[1571]: time="2025-09-09T22:03:56.929616374Z" level=info msg="connecting to shim 687890fe290e1afb5dbc99220f1b17b9f64e51d56310293bd0f2d5b50d4eceac" address="unix:///run/containerd/s/ded12dcec1c56713ad6eeeb549647fa77a5866ef8fc3e199061971da068d0996" namespace=k8s.io protocol=ttrpc version=3
Sep 9 22:03:56.943302 containerd[1571]: time="2025-09-09T22:03:56.942864727Z" level=info msg="connecting to shim c9031a0bec6fc485569712ed35284406b80423bcb91f6f7e7d34cb62daf3b1d0" address="unix:///run/containerd/s/1554a301dee7943178f634882913b051a0fcfd85f43a3f1af63068cc2f232c9b" namespace=k8s.io protocol=ttrpc version=3
Sep 9 22:03:57.243709 kubelet[2437]: W0909 22:03:57.238198 2437 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.72:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 22:03:57.243709 kubelet[2437]: E0909 22:03:57.238838 2437 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.72:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 22:03:57.278349 containerd[1571]: time="2025-09-09T22:03:57.275830377Z" level=info msg="connecting to shim 4f7a889a0df5aebd6cbfad6c5e57c3b80a5513ee7b770c05c715c09d8c871804" address="unix:///run/containerd/s/56e3b1e49c84ece116e6eeaf2ea1c2b003e493232a82d028d5ec47ca347529ad" namespace=k8s.io protocol=ttrpc version=3
Sep 9 22:03:57.316350 systemd[1]: Started cri-containerd-687890fe290e1afb5dbc99220f1b17b9f64e51d56310293bd0f2d5b50d4eceac.scope - libcontainer container 687890fe290e1afb5dbc99220f1b17b9f64e51d56310293bd0f2d5b50d4eceac.
Sep 9 22:03:57.363781 kubelet[2437]: I0909 22:03:57.363731 2437 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 9 22:03:57.365647 kubelet[2437]: E0909 22:03:57.365579 2437 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.72:6443/api/v1/nodes\": dial tcp 10.0.0.72:6443: connect: connection refused" node="localhost"
Sep 9 22:03:57.379555 systemd[1]: Started cri-containerd-c9031a0bec6fc485569712ed35284406b80423bcb91f6f7e7d34cb62daf3b1d0.scope - libcontainer container c9031a0bec6fc485569712ed35284406b80423bcb91f6f7e7d34cb62daf3b1d0.
Sep 9 22:03:57.388498 kubelet[2437]: E0909 22:03:57.388262 2437 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.72:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.72:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1863bc67fa4779a1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-09 22:03:54.371266977 +0000 UTC m=+1.071704729,LastTimestamp:2025-09-09 22:03:54.371266977 +0000 UTC m=+1.071704729,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 9 22:03:57.407906 kubelet[2437]: E0909 22:03:57.407770 2437 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.72:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.72:6443: connect: connection refused" interval="3.2s"
Sep 9 22:03:57.553336 systemd[1]: Started cri-containerd-4f7a889a0df5aebd6cbfad6c5e57c3b80a5513ee7b770c05c715c09d8c871804.scope - libcontainer container 4f7a889a0df5aebd6cbfad6c5e57c3b80a5513ee7b770c05c715c09d8c871804.
Sep 9 22:03:58.417501 kubelet[2437]: W0909 22:03:58.415559 2437 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.72:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 22:03:58.417501 kubelet[2437]: E0909 22:03:58.415633 2437 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.72:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 22:03:58.422723 kubelet[2437]: W0909 22:03:58.418185 2437 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.72:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 22:03:58.422723 kubelet[2437]: E0909 22:03:58.418271 2437 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.72:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 22:03:58.454847 containerd[1571]: time="2025-09-09T22:03:58.454731198Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:5dc878868de11c6196259ae42039f4ff,Namespace:kube-system,Attempt:0,} returns sandbox id \"687890fe290e1afb5dbc99220f1b17b9f64e51d56310293bd0f2d5b50d4eceac\""
Sep 9 22:03:58.463343 kubelet[2437]: E0909 22:03:58.460823 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:03:58.482814 containerd[1571]: time="2025-09-09T22:03:58.482757997Z" level=info msg="CreateContainer within sandbox \"687890fe290e1afb5dbc99220f1b17b9f64e51d56310293bd0f2d5b50d4eceac\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Sep 9 22:03:58.541787 containerd[1571]: time="2025-09-09T22:03:58.537399045Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:fec3f691a145cb26ff55e4af388500b7,Namespace:kube-system,Attempt:0,} returns sandbox id \"c9031a0bec6fc485569712ed35284406b80423bcb91f6f7e7d34cb62daf3b1d0\""
Sep 9 22:03:58.540173 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1435254848.mount: Deactivated successfully.
Sep 9 22:03:58.542459 kubelet[2437]: E0909 22:03:58.541921 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:03:58.544938 containerd[1571]: time="2025-09-09T22:03:58.544748095Z" level=info msg="CreateContainer within sandbox \"c9031a0bec6fc485569712ed35284406b80423bcb91f6f7e7d34cb62daf3b1d0\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Sep 9 22:03:58.551653 containerd[1571]: time="2025-09-09T22:03:58.550500336Z" level=info msg="Container d4f817e70c13744ad03b0ee9383ccb01c3ec1354b68eabfb210beb5c5675f07e: CDI devices from CRI Config.CDIDevices: []"
Sep 9 22:03:58.601273 containerd[1571]: time="2025-09-09T22:03:58.600676589Z" level=info msg="CreateContainer within sandbox \"687890fe290e1afb5dbc99220f1b17b9f64e51d56310293bd0f2d5b50d4eceac\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"d4f817e70c13744ad03b0ee9383ccb01c3ec1354b68eabfb210beb5c5675f07e\""
Sep 9 22:03:58.615793 containerd[1571]: time="2025-09-09T22:03:58.615717413Z" level=info msg="StartContainer for \"d4f817e70c13744ad03b0ee9383ccb01c3ec1354b68eabfb210beb5c5675f07e\""
Sep 9 22:03:58.616713 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1402024776.mount: Deactivated successfully.
Sep 9 22:03:58.622665 containerd[1571]: time="2025-09-09T22:03:58.622583997Z" level=info msg="connecting to shim d4f817e70c13744ad03b0ee9383ccb01c3ec1354b68eabfb210beb5c5675f07e" address="unix:///run/containerd/s/ded12dcec1c56713ad6eeeb549647fa77a5866ef8fc3e199061971da068d0996" protocol=ttrpc version=3
Sep 9 22:03:58.662347 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount965461257.mount: Deactivated successfully.
Sep 9 22:03:58.667324 containerd[1571]: time="2025-09-09T22:03:58.667277705Z" level=info msg="Container bed72092ae6c2a08ae736f7bd5e9be4b807acacbc1fd548692c5c8e35ccf9fdf: CDI devices from CRI Config.CDIDevices: []"
Sep 9 22:03:58.695919 systemd[1]: Started cri-containerd-d4f817e70c13744ad03b0ee9383ccb01c3ec1354b68eabfb210beb5c5675f07e.scope - libcontainer container d4f817e70c13744ad03b0ee9383ccb01c3ec1354b68eabfb210beb5c5675f07e.
Sep 9 22:03:58.727336 containerd[1571]: time="2025-09-09T22:03:58.727253620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:eee96a5d2f569f8950a31211aa4049a7,Namespace:kube-system,Attempt:0,} returns sandbox id \"4f7a889a0df5aebd6cbfad6c5e57c3b80a5513ee7b770c05c715c09d8c871804\""
Sep 9 22:03:58.730744 kubelet[2437]: E0909 22:03:58.730501 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:03:58.735953 containerd[1571]: time="2025-09-09T22:03:58.735642703Z" level=info msg="CreateContainer within sandbox \"4f7a889a0df5aebd6cbfad6c5e57c3b80a5513ee7b770c05c715c09d8c871804\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Sep 9 22:03:58.746058 containerd[1571]: time="2025-09-09T22:03:58.746005223Z" level=info msg="CreateContainer within sandbox \"c9031a0bec6fc485569712ed35284406b80423bcb91f6f7e7d34cb62daf3b1d0\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"bed72092ae6c2a08ae736f7bd5e9be4b807acacbc1fd548692c5c8e35ccf9fdf\""
Sep 9 22:03:58.748521 containerd[1571]: time="2025-09-09T22:03:58.747073259Z" level=info msg="StartContainer for \"bed72092ae6c2a08ae736f7bd5e9be4b807acacbc1fd548692c5c8e35ccf9fdf\""
Sep 9 22:03:58.753316 containerd[1571]: time="2025-09-09T22:03:58.751695949Z" level=info msg="connecting to shim bed72092ae6c2a08ae736f7bd5e9be4b807acacbc1fd548692c5c8e35ccf9fdf" address="unix:///run/containerd/s/1554a301dee7943178f634882913b051a0fcfd85f43a3f1af63068cc2f232c9b" protocol=ttrpc version=3
Sep 9 22:03:58.825288 containerd[1571]: time="2025-09-09T22:03:58.823311171Z" level=info msg="Container e1db8d1351a1870928fee3dca892e1ff93354f2989904635c299b16755a2f7d4: CDI devices from CRI Config.CDIDevices: []"
Sep 9 22:03:58.832578 systemd[1]: Started cri-containerd-bed72092ae6c2a08ae736f7bd5e9be4b807acacbc1fd548692c5c8e35ccf9fdf.scope - libcontainer container bed72092ae6c2a08ae736f7bd5e9be4b807acacbc1fd548692c5c8e35ccf9fdf.
Sep 9 22:03:58.848261 containerd[1571]: time="2025-09-09T22:03:58.845398417Z" level=info msg="CreateContainer within sandbox \"4f7a889a0df5aebd6cbfad6c5e57c3b80a5513ee7b770c05c715c09d8c871804\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"e1db8d1351a1870928fee3dca892e1ff93354f2989904635c299b16755a2f7d4\""
Sep 9 22:03:58.855569 containerd[1571]: time="2025-09-09T22:03:58.855500397Z" level=info msg="StartContainer for \"e1db8d1351a1870928fee3dca892e1ff93354f2989904635c299b16755a2f7d4\""
Sep 9 22:03:58.861508 containerd[1571]: time="2025-09-09T22:03:58.861426686Z" level=info msg="connecting to shim e1db8d1351a1870928fee3dca892e1ff93354f2989904635c299b16755a2f7d4" address="unix:///run/containerd/s/56e3b1e49c84ece116e6eeaf2ea1c2b003e493232a82d028d5ec47ca347529ad" protocol=ttrpc version=3
Sep 9 22:03:58.970957 kubelet[2437]: I0909 22:03:58.969742 2437 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 9 22:03:58.970957 kubelet[2437]: E0909 22:03:58.970289 2437 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.72:6443/api/v1/nodes\": dial tcp 10.0.0.72:6443: connect: connection refused" node="localhost"
Sep 9 22:03:58.982503 kubelet[2437]: W0909 22:03:58.982339 2437 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.72:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 22:03:58.982826 kubelet[2437]: E0909 22:03:58.982537 2437 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.72:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 22:03:59.027844 systemd[1]: Started cri-containerd-e1db8d1351a1870928fee3dca892e1ff93354f2989904635c299b16755a2f7d4.scope - libcontainer container e1db8d1351a1870928fee3dca892e1ff93354f2989904635c299b16755a2f7d4.
Sep 9 22:03:59.054906 containerd[1571]: time="2025-09-09T22:03:59.054849313Z" level=info msg="StartContainer for \"d4f817e70c13744ad03b0ee9383ccb01c3ec1354b68eabfb210beb5c5675f07e\" returns successfully"
Sep 9 22:03:59.164274 containerd[1571]: time="2025-09-09T22:03:59.164203425Z" level=info msg="StartContainer for \"bed72092ae6c2a08ae736f7bd5e9be4b807acacbc1fd548692c5c8e35ccf9fdf\" returns successfully"
Sep 9 22:03:59.208362 containerd[1571]: time="2025-09-09T22:03:59.205391693Z" level=info msg="StartContainer for \"e1db8d1351a1870928fee3dca892e1ff93354f2989904635c299b16755a2f7d4\" returns successfully"
Sep 9 22:03:59.443245 kubelet[2437]: E0909 22:03:59.436547 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:03:59.444421 kubelet[2437]: E0909 22:03:59.444388 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:03:59.445707 kubelet[2437]: E0909 22:03:59.445681 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:00.466439 kubelet[2437]: E0909 22:04:00.464077 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:00.475744 kubelet[2437]: E0909 22:04:00.475634 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:02.190417 kubelet[2437]: I0909 22:04:02.189711 2437 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 9 22:04:02.810214 kubelet[2437]: E0909 22:04:02.810155 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:02.822132 kubelet[2437]: E0909 22:04:02.822060 2437 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
Sep 9 22:04:03.418302 kubelet[2437]: I0909 22:04:03.418204 2437 apiserver.go:52] "Watching apiserver"
Sep 9 22:04:03.490085 kubelet[2437]: I0909 22:04:03.489946 2437 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Sep 9 22:04:03.816100 kubelet[2437]: I0909 22:04:03.816005 2437 kubelet_node_status.go:75] "Successfully registered node" node="localhost"
Sep 9 22:04:10.927049 kubelet[2437]: E0909 22:04:10.926967 2437 kubelet_node_status.go:535] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-09T22:04:03Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-09T22:04:03Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-09T22:04:03Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-09T22:04:03Z\\\",\\\"type\\\":\\\"Ready\\\"}]}}\" for node \"localhost\": Patch \"https://10.0.0.72:6443/api/v1/nodes/localhost/status?timeout=10s\": unexpected EOF"
Sep 9 22:04:10.927712 kubelet[2437]: E0909 22:04:10.927192 2437 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.72:6443/api/v1/namespaces/default/events\": unexpected EOF" event="&Event{ObjectMeta:{localhost.1863bc67fa4779a1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-09 22:03:54.371266977 +0000 UTC m=+1.071704729,LastTimestamp:2025-09-09 22:03:54.371266977 +0000 UTC m=+1.071704729,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 9 22:04:10.927712 kubelet[2437]: E0909 22:04:10.927452 2437 kubelet.go:1915] "Failed creating a mirror pod for" err="Post \"https://10.0.0.72:6443/api/v1/namespaces/kube-system/pods\": unexpected EOF" pod="kube-system/kube-apiserver-localhost"
Sep 9 22:04:10.927712 kubelet[2437]: E0909 22:04:10.927652 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:10.927810 kubelet[2437]: E0909 22:04:10.927784 2437 kubelet.go:1915] "Failed creating a mirror pod for" err="Post \"https://10.0.0.72:6443/api/v1/namespaces/kube-system/pods\": unexpected EOF" pod="kube-system/kube-controller-manager-localhost"
Sep 9 22:04:10.927962 kubelet[2437]: E0909 22:04:10.927913 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:10.933357 systemd[1]: cri-containerd-e1db8d1351a1870928fee3dca892e1ff93354f2989904635c299b16755a2f7d4.scope: Deactivated successfully.
Sep 9 22:04:10.933952 systemd[1]: cri-containerd-e1db8d1351a1870928fee3dca892e1ff93354f2989904635c299b16755a2f7d4.scope: Consumed 4.184s CPU time, 158.1M memory peak.
Sep 9 22:04:10.935464 containerd[1571]: time="2025-09-09T22:04:10.935418633Z" level=info msg="received exit event container_id:\"e1db8d1351a1870928fee3dca892e1ff93354f2989904635c299b16755a2f7d4\" id:\"e1db8d1351a1870928fee3dca892e1ff93354f2989904635c299b16755a2f7d4\" pid:2667 exit_status:255 exited_at:{seconds:1757455450 nanos:934684285}"
Sep 9 22:04:10.936094 containerd[1571]: time="2025-09-09T22:04:10.935510455Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e1db8d1351a1870928fee3dca892e1ff93354f2989904635c299b16755a2f7d4\" id:\"e1db8d1351a1870928fee3dca892e1ff93354f2989904635c299b16755a2f7d4\" pid:2667 exit_status:255 exited_at:{seconds:1757455450 nanos:934684285}"
Sep 9 22:04:10.958724 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e1db8d1351a1870928fee3dca892e1ff93354f2989904635c299b16755a2f7d4-rootfs.mount: Deactivated successfully.
Sep 9 22:04:11.257406 kubelet[2437]: E0909 22:04:11.257206 2437 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: Get "https://10.0.0.72:6443/apis/storage.k8s.io/v1/csinodes/localhost": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 22:04:11.590593 kubelet[2437]: E0909 22:04:11.590536 2437 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: Get "https://10.0.0.72:6443/apis/storage.k8s.io/v1/csinodes/localhost": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 22:04:11.934034 kubelet[2437]: E0909 22:04:11.933887 2437 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": Get \"https://10.0.0.72:6443/api/v1/nodes/localhost?timeout=10s\": dial tcp 10.0.0.72:6443: connect: connection refused - error from a previous attempt: read tcp 10.0.0.72:55612->10.0.0.72:6443: read: connection reset by peer"
Sep 9 22:04:11.934533 kubelet[2437]: E0909 22:04:11.934099 2437 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": Get \"https://10.0.0.72:6443/api/v1/nodes/localhost?timeout=10s\": dial tcp 10.0.0.72:6443: connect: connection refused"
Sep 9 22:04:11.934533 kubelet[2437]: E0909 22:04:11.934263 2437 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": Get \"https://10.0.0.72:6443/api/v1/nodes/localhost?timeout=10s\": dial tcp 10.0.0.72:6443: connect: connection refused"
Sep 9 22:04:11.934533 kubelet[2437]: E0909 22:04:11.934413 2437 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": Get \"https://10.0.0.72:6443/api/v1/nodes/localhost?timeout=10s\": dial tcp 10.0.0.72:6443: connect: connection refused"
Sep 9 22:04:11.934533 kubelet[2437]: E0909 22:04:11.934426 2437 kubelet_node_status.go:522] "Unable to update node status" err="update node status exceeds retry count"
Sep 9 22:04:12.008868 kubelet[2437]: E0909 22:04:12.008793 2437 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: Get "https://10.0.0.72:6443/apis/storage.k8s.io/v1/csinodes/localhost": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 22:04:12.484729 kubelet[2437]: E0909 22:04:12.484656 2437 kubelet.go:1915] "Failed creating a mirror pod for" err="Post \"https://10.0.0.72:6443/api/v1/namespaces/kube-system/pods\": dial tcp 10.0.0.72:6443: connect: connection refused" pod="kube-system/kube-apiserver-localhost"
Sep 9 22:04:12.484933 kubelet[2437]: I0909 22:04:12.484796 2437 scope.go:117] "RemoveContainer" containerID="e1db8d1351a1870928fee3dca892e1ff93354f2989904635c299b16755a2f7d4"
Sep 9 22:04:12.484933 kubelet[2437]: E0909 22:04:12.484918 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:12.486982 containerd[1571]: time="2025-09-09T22:04:12.486935194Z" level=info msg="CreateContainer within sandbox \"4f7a889a0df5aebd6cbfad6c5e57c3b80a5513ee7b770c05c715c09d8c871804\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:1,}"
Sep 9 22:04:12.505536 containerd[1571]: time="2025-09-09T22:04:12.505453883Z" level=info msg="Container 4b8d4232a157658663f7e14fa588953a0fbde67439b00f1ec6f3beddb1901914: CDI devices from CRI Config.CDIDevices: []"
Sep 9 22:04:12.519034 containerd[1571]: time="2025-09-09T22:04:12.518962679Z" level=info msg="CreateContainer within sandbox \"4f7a889a0df5aebd6cbfad6c5e57c3b80a5513ee7b770c05c715c09d8c871804\" for &ContainerMetadata{Name:kube-apiserver,Attempt:1,} returns container id \"4b8d4232a157658663f7e14fa588953a0fbde67439b00f1ec6f3beddb1901914\""
Sep 9 22:04:12.519687 containerd[1571]: time="2025-09-09T22:04:12.519656702Z" level=info msg="StartContainer for \"4b8d4232a157658663f7e14fa588953a0fbde67439b00f1ec6f3beddb1901914\""
Sep 9 22:04:12.521237 containerd[1571]: time="2025-09-09T22:04:12.521194286Z" level=info msg="connecting to shim 4b8d4232a157658663f7e14fa588953a0fbde67439b00f1ec6f3beddb1901914" address="unix:///run/containerd/s/56e3b1e49c84ece116e6eeaf2ea1c2b003e493232a82d028d5ec47ca347529ad" protocol=ttrpc version=3
Sep 9 22:04:12.552873 systemd[1]: Started cri-containerd-4b8d4232a157658663f7e14fa588953a0fbde67439b00f1ec6f3beddb1901914.scope - libcontainer container 4b8d4232a157658663f7e14fa588953a0fbde67439b00f1ec6f3beddb1901914.
Sep 9 22:04:12.632331 containerd[1571]: time="2025-09-09T22:04:12.632255237Z" level=info msg="StartContainer for \"4b8d4232a157658663f7e14fa588953a0fbde67439b00f1ec6f3beddb1901914\" returns successfully"
Sep 9 22:04:13.935291 kubelet[2437]: E0909 22:04:13.935189 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:13.958410 kubelet[2437]: E0909 22:04:13.958309 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:13.997892 kubelet[2437]: E0909 22:04:13.997853 2437 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-node-lease\" not found" interval="200ms"
Sep 9 22:04:14.545219 kubelet[2437]: E0909 22:04:14.545162 2437 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Sep 9 22:04:14.545435 kubelet[2437]: E0909 22:04:14.545408 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:15.700173 kubelet[2437]: E0909 22:04:15.700068 2437 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Sep 9 22:04:15.700956 kubelet[2437]: E0909 22:04:15.700375 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:16.573950 kubelet[2437]: E0909 22:04:16.573572 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:16.573950 kubelet[2437]: E0909 22:04:16.573899 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:17.071678 systemd[1]: Reload requested from client PID 2757 ('systemctl') (unit session-9.scope)...
Sep 9 22:04:17.072804 systemd[1]: Reloading...
Sep 9 22:04:17.341789 kubelet[2437]: E0909 22:04:17.341343 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:17.415517 zram_generator::config[2800]: No configuration found.
Sep 9 22:04:17.607637 kubelet[2437]: E0909 22:04:17.508941 2437 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:17.744908 kubelet[2437]: I0909 22:04:17.744622 2437 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=4.744589363 podStartE2EDuration="4.744589363s" podCreationTimestamp="2025-09-09 22:04:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 22:04:17.740917635 +0000 UTC m=+24.441355417" watchObservedRunningTime="2025-09-09 22:04:17.744589363 +0000 UTC m=+24.445027115"
Sep 9 22:04:18.059280 kubelet[2437]: I0909 22:04:18.059173 2437 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=5.059146063 podStartE2EDuration="5.059146063s" podCreationTimestamp="2025-09-09 22:04:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 22:04:18.035050827 +0000 UTC m=+24.735488579" watchObservedRunningTime="2025-09-09 22:04:18.059146063 +0000 UTC m=+24.759583825"
Sep 9 22:04:18.059280 kubelet[2437]: I0909 22:04:18.059262 2437 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.059256507 podStartE2EDuration="1.059256507s" podCreationTimestamp="2025-09-09 22:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 22:04:18.058929504 +0000 UTC m=+24.759367266" watchObservedRunningTime="2025-09-09 22:04:18.059256507 +0000 UTC m=+24.759694260"
Sep 9 22:04:18.198827 systemd[1]: Reloading finished in 1119 ms.
Sep 9 22:04:18.241509 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 22:04:18.275409 systemd[1]: kubelet.service: Deactivated successfully.
Sep 9 22:04:18.275858 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 22:04:18.275931 systemd[1]: kubelet.service: Consumed 2.158s CPU time, 135.4M memory peak.
Sep 9 22:04:18.283973 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 22:04:18.825102 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 22:04:18.852796 (kubelet)[2845]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 22:04:18.931191 kubelet[2845]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 22:04:18.931855 kubelet[2845]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 9 22:04:18.931855 kubelet[2845]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 22:04:18.932737 kubelet[2845]: I0909 22:04:18.931958 2845 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 22:04:18.946176 kubelet[2845]: I0909 22:04:18.946016 2845 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 9 22:04:18.946176 kubelet[2845]: I0909 22:04:18.946078 2845 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 22:04:18.946583 kubelet[2845]: I0909 22:04:18.946553 2845 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 9 22:04:18.948718 kubelet[2845]: I0909 22:04:18.948642 2845 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Sep 9 22:04:18.952725 kubelet[2845]: I0909 22:04:18.952552 2845 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 22:04:18.980772 kubelet[2845]: I0909 22:04:18.980025 2845 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 22:04:18.994421 kubelet[2845]: I0909 22:04:18.994216 2845 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 22:04:18.994606 kubelet[2845]: I0909 22:04:18.994456 2845 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 9 22:04:18.994817 kubelet[2845]: I0909 22:04:18.994700 2845 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 22:04:18.995003 kubelet[2845]: I0909 22:04:18.994742 2845 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 22:04:18.995206 kubelet[2845]: I0909 22:04:18.995003 2845 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 22:04:18.995206 kubelet[2845]: I0909 22:04:18.995019 2845 container_manager_linux.go:300] "Creating device plugin manager"
Sep 9 22:04:18.995206 kubelet[2845]: I0909 22:04:18.995061 2845 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 22:04:18.996064 kubelet[2845]: I0909 22:04:18.995229 2845 kubelet.go:408] "Attempting to sync node with API server"
Sep 9 22:04:18.996064 kubelet[2845]: I0909 22:04:18.996040 2845 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 22:04:18.996311 kubelet[2845]: I0909 22:04:18.996105 2845 kubelet.go:314] "Adding apiserver pod source"
Sep 9 22:04:18.996311 kubelet[2845]: I0909 22:04:18.996123 2845 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 22:04:18.998075 kubelet[2845]: I0909 22:04:18.997982 2845 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 22:04:18.998603 kubelet[2845]: I0909 22:04:18.998553 2845 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 22:04:19.000863 kubelet[2845]: I0909 22:04:18.999493 2845 server.go:1274] "Started kubelet"
Sep 9 22:04:19.007862 kubelet[2845]: I0909 22:04:19.003649 2845 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 22:04:19.007862 kubelet[2845]: I0909 22:04:19.004376 2845 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 22:04:19.007862 kubelet[2845]: I0909 22:04:19.004439 2845 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 22:04:19.011579 kubelet[2845]: I0909 22:04:19.010395 2845 server.go:449] "Adding debug handlers to kubelet server"
Sep 9 22:04:19.012491 kubelet[2845]: I0909 22:04:19.011926 2845 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 22:04:19.015970 kubelet[2845]: I0909 22:04:19.015262 2845 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 22:04:19.015970 kubelet[2845]: I0909 22:04:19.015883 2845 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 9 22:04:19.016952 kubelet[2845]: I0909 22:04:19.016025 2845 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 9 22:04:19.016952 kubelet[2845]: I0909 22:04:19.016196 2845 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 22:04:19.017187 kubelet[2845]: E0909 22:04:19.016975 2845 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 22:04:19.022503 kubelet[2845]: E0909 22:04:19.020741 2845 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 22:04:19.039353 kubelet[2845]: I0909 22:04:19.039315 2845 factory.go:221] Registration of the containerd container factory successfully
Sep 9 22:04:19.039743 kubelet[2845]: I0909 22:04:19.039729 2845 factory.go:221] Registration of the systemd container factory successfully
Sep 9 22:04:19.039954 kubelet[2845]: I0909 22:04:19.039929 2845 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 22:04:19.048998 kubelet[2845]: I0909 22:04:19.048914 2845 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 22:04:19.054136 kubelet[2845]: I0909 22:04:19.054072 2845 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 22:04:19.054136 kubelet[2845]: I0909 22:04:19.054121 2845 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 9 22:04:19.054136 kubelet[2845]: I0909 22:04:19.054148 2845 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 9 22:04:19.054375 kubelet[2845]: E0909 22:04:19.054223 2845 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 22:04:19.130414 kubelet[2845]: I0909 22:04:19.130247 2845 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 9 22:04:19.130414 kubelet[2845]: I0909 22:04:19.130270 2845 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 9 22:04:19.130414 kubelet[2845]: I0909 22:04:19.130307 2845 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 22:04:19.130694 kubelet[2845]: I0909 22:04:19.130553 2845 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 9 22:04:19.130694 kubelet[2845]: I0909 22:04:19.130569 2845 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 9 22:04:19.130694 kubelet[2845]: I0909 22:04:19.130593 2845 policy_none.go:49] "None policy: Start"
Sep 9 22:04:19.131790 kubelet[2845]: I0909 22:04:19.131695 2845 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 9 22:04:19.131790 kubelet[2845]: I0909 22:04:19.131727 2845 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 22:04:19.132107 kubelet[2845]: I0909 22:04:19.131934 2845 state_mem.go:75] "Updated machine memory state"
Sep 9 22:04:19.146077 kubelet[2845]: I0909 22:04:19.145347 2845 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 9 22:04:19.146077 kubelet[2845]: I0909 22:04:19.145682 2845 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 22:04:19.146077 kubelet[2845]: I0909 22:04:19.145694 2845 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 22:04:19.146614 kubelet[2845]: I0909 22:04:19.146574 2845 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 9 22:04:19.196403 kubelet[2845]: E0909 22:04:19.196321 2845 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Sep 9 22:04:19.196403 kubelet[2845]: E0909 22:04:19.196373 2845 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost"
Sep 9 22:04:19.196695 kubelet[2845]: E0909 22:04:19.196321 2845 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost"
Sep 9 22:04:19.217106 kubelet[2845]: I0909 22:04:19.217033 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/eee96a5d2f569f8950a31211aa4049a7-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"eee96a5d2f569f8950a31211aa4049a7\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 22:04:19.217106 kubelet[2845]: I0909 22:04:19.217091 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/eee96a5d2f569f8950a31211aa4049a7-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"eee96a5d2f569f8950a31211aa4049a7\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 22:04:19.217106 kubelet[2845]: I0909 22:04:19.217118 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/eee96a5d2f569f8950a31211aa4049a7-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"eee96a5d2f569f8950a31211aa4049a7\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 22:04:19.217631 kubelet[2845]: I0909 22:04:19.217144 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 22:04:19.217631 kubelet[2845]: I0909 22:04:19.217164 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 22:04:19.217631 kubelet[2845]: I0909 22:04:19.217186 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 22:04:19.217631 kubelet[2845]: I0909 22:04:19.217206 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5dc878868de11c6196259ae42039f4ff-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"5dc878868de11c6196259ae42039f4ff\") " pod="kube-system/kube-scheduler-localhost"
Sep 9 22:04:19.217631 kubelet[2845]: I0909 22:04:19.217224 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 22:04:19.217809 kubelet[2845]: I0909 22:04:19.217244 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 22:04:19.261060 kubelet[2845]: I0909 22:04:19.260216 2845 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 9 22:04:19.285369 kubelet[2845]: I0909 22:04:19.285288 2845 kubelet_node_status.go:111] "Node was previously registered" node="localhost"
Sep 9 22:04:19.285605 kubelet[2845]: I0909 22:04:19.285441 2845 kubelet_node_status.go:75] "Successfully registered node" node="localhost"
Sep 9 22:04:19.497315 kubelet[2845]: E0909 22:04:19.497089 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:19.497315 kubelet[2845]: E0909 22:04:19.497098 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:19.497315 kubelet[2845]: E0909 22:04:19.497265 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:19.997014 kubelet[2845]: I0909 22:04:19.996924 2845 apiserver.go:52] "Watching apiserver"
Sep 9 22:04:20.017136 kubelet[2845]: I0909 22:04:20.017050 2845 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Sep 9 22:04:20.089163 kubelet[2845]: E0909 22:04:20.087919 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:20.089163 kubelet[2845]: E0909 22:04:20.088117 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:20.089163 kubelet[2845]: E0909 22:04:20.088171 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:21.089669 kubelet[2845]: E0909 22:04:21.089495 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:21.089669 kubelet[2845]: E0909 22:04:21.089570 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:21.778624 kubelet[2845]: I0909 22:04:21.778571 2845 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 9 22:04:21.779038 containerd[1571]: time="2025-09-09T22:04:21.778926422Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 9 22:04:21.779532 kubelet[2845]: I0909 22:04:21.779215 2845 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 9 22:04:22.090562 kubelet[2845]: E0909 22:04:22.090503 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:22.954529 kubelet[2845]: E0909 22:04:22.954451 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:23.092129 kubelet[2845]: E0909 22:04:23.092091 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:24.987609 systemd[1]: Created slice kubepods-besteffort-podb1b4c2eb_d8ad_4af5_965a_2f9434e4d186.slice - libcontainer container kubepods-besteffort-podb1b4c2eb_d8ad_4af5_965a_2f9434e4d186.slice.
Sep 9 22:04:25.051439 kubelet[2845]: I0909 22:04:25.051345 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/b1b4c2eb-d8ad-4af5-965a-2f9434e4d186-kube-proxy\") pod \"kube-proxy-97qfw\" (UID: \"b1b4c2eb-d8ad-4af5-965a-2f9434e4d186\") " pod="kube-system/kube-proxy-97qfw"
Sep 9 22:04:25.051439 kubelet[2845]: I0909 22:04:25.051404 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b1b4c2eb-d8ad-4af5-965a-2f9434e4d186-xtables-lock\") pod \"kube-proxy-97qfw\" (UID: \"b1b4c2eb-d8ad-4af5-965a-2f9434e4d186\") " pod="kube-system/kube-proxy-97qfw"
Sep 9 22:04:25.051439 kubelet[2845]: I0909 22:04:25.051429 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b1b4c2eb-d8ad-4af5-965a-2f9434e4d186-lib-modules\") pod \"kube-proxy-97qfw\" (UID: \"b1b4c2eb-d8ad-4af5-965a-2f9434e4d186\") " pod="kube-system/kube-proxy-97qfw"
Sep 9 22:04:25.051439 kubelet[2845]: I0909 22:04:25.051448 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxj9d\" (UniqueName: \"kubernetes.io/projected/b1b4c2eb-d8ad-4af5-965a-2f9434e4d186-kube-api-access-mxj9d\") pod \"kube-proxy-97qfw\" (UID: \"b1b4c2eb-d8ad-4af5-965a-2f9434e4d186\") " pod="kube-system/kube-proxy-97qfw"
Sep 9 22:04:25.272866 systemd[1]: Created slice kubepods-besteffort-podcae04bfc_424f_4b63_bd99_8900a6535d8e.slice - libcontainer container kubepods-besteffort-podcae04bfc_424f_4b63_bd99_8900a6535d8e.slice.
Sep 9 22:04:25.305921 kubelet[2845]: E0909 22:04:25.305864 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:25.307178 containerd[1571]: time="2025-09-09T22:04:25.307027981Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-97qfw,Uid:b1b4c2eb-d8ad-4af5-965a-2f9434e4d186,Namespace:kube-system,Attempt:0,}"
Sep 9 22:04:25.355180 kubelet[2845]: I0909 22:04:25.355078 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/cae04bfc-424f-4b63-bd99-8900a6535d8e-var-lib-calico\") pod \"tigera-operator-58fc44c59b-zhkdf\" (UID: \"cae04bfc-424f-4b63-bd99-8900a6535d8e\") " pod="tigera-operator/tigera-operator-58fc44c59b-zhkdf"
Sep 9 22:04:25.355180 kubelet[2845]: I0909 22:04:25.355135 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwjpf\" (UniqueName: \"kubernetes.io/projected/cae04bfc-424f-4b63-bd99-8900a6535d8e-kube-api-access-cwjpf\") pod \"tigera-operator-58fc44c59b-zhkdf\" (UID: \"cae04bfc-424f-4b63-bd99-8900a6535d8e\") " pod="tigera-operator/tigera-operator-58fc44c59b-zhkdf"
Sep 9 22:04:25.577351 containerd[1571]: time="2025-09-09T22:04:25.577283925Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-zhkdf,Uid:cae04bfc-424f-4b63-bd99-8900a6535d8e,Namespace:tigera-operator,Attempt:0,}"
Sep 9 22:04:27.101837 containerd[1571]: time="2025-09-09T22:04:27.101768818Z" level=info msg="connecting to shim 8748ebb71e71292ec90a3d894e8838ff6052a1cee1a30a50c9a33d683d2a451f" address="unix:///run/containerd/s/a2896ead00d51595b3ed04fbd6f87a97835bbd5b4e2418a9469b071273f43975" namespace=k8s.io protocol=ttrpc version=3
Sep 9 22:04:27.191158 systemd[1]: Started cri-containerd-8748ebb71e71292ec90a3d894e8838ff6052a1cee1a30a50c9a33d683d2a451f.scope - libcontainer container 8748ebb71e71292ec90a3d894e8838ff6052a1cee1a30a50c9a33d683d2a451f.
Sep 9 22:04:27.260193 containerd[1571]: time="2025-09-09T22:04:27.260123740Z" level=info msg="connecting to shim 5ed1006f94e615858f5e37948b1b2c8484ddf715bb9c3164bd15d79947b752e3" address="unix:///run/containerd/s/8ec838784621ea684cdadf3e483594ee1dc11e0eff23a8bfe99b81b6f4267147" namespace=k8s.io protocol=ttrpc version=3
Sep 9 22:04:27.300638 systemd[1]: Started cri-containerd-5ed1006f94e615858f5e37948b1b2c8484ddf715bb9c3164bd15d79947b752e3.scope - libcontainer container 5ed1006f94e615858f5e37948b1b2c8484ddf715bb9c3164bd15d79947b752e3.
Sep 9 22:04:27.337233 containerd[1571]: time="2025-09-09T22:04:27.337150927Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-97qfw,Uid:b1b4c2eb-d8ad-4af5-965a-2f9434e4d186,Namespace:kube-system,Attempt:0,} returns sandbox id \"8748ebb71e71292ec90a3d894e8838ff6052a1cee1a30a50c9a33d683d2a451f\""
Sep 9 22:04:27.338347 kubelet[2845]: E0909 22:04:27.338322 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:27.343234 containerd[1571]: time="2025-09-09T22:04:27.343130167Z" level=info msg="CreateContainer within sandbox \"8748ebb71e71292ec90a3d894e8838ff6052a1cee1a30a50c9a33d683d2a451f\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 9 22:04:27.467703 containerd[1571]: time="2025-09-09T22:04:27.467515616Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-zhkdf,Uid:cae04bfc-424f-4b63-bd99-8900a6535d8e,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"5ed1006f94e615858f5e37948b1b2c8484ddf715bb9c3164bd15d79947b752e3\""
Sep 9 22:04:27.469866 containerd[1571]: time="2025-09-09T22:04:27.469672305Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 9 22:04:27.761883 kubelet[2845]: E0909 22:04:27.761731 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:28.102843 kubelet[2845]: E0909 22:04:28.102808 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:28.569024 containerd[1571]: time="2025-09-09T22:04:28.568970807Z" level=info msg="Container c33e0009d513b8e1245386d7e7164ea4d6110445c3c677c3ddda115780a4275d: CDI devices from CRI Config.CDIDevices: []"
Sep 9 22:04:29.104722 kubelet[2845]: E0909 22:04:29.104685 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:30.284903 containerd[1571]: time="2025-09-09T22:04:30.284834995Z" level=info msg="CreateContainer within sandbox \"8748ebb71e71292ec90a3d894e8838ff6052a1cee1a30a50c9a33d683d2a451f\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"c33e0009d513b8e1245386d7e7164ea4d6110445c3c677c3ddda115780a4275d\""
Sep 9 22:04:30.286504 containerd[1571]: time="2025-09-09T22:04:30.285669076Z" level=info msg="StartContainer for \"c33e0009d513b8e1245386d7e7164ea4d6110445c3c677c3ddda115780a4275d\""
Sep 9 22:04:30.287499 containerd[1571]: time="2025-09-09T22:04:30.287434105Z" level=info msg="connecting to shim c33e0009d513b8e1245386d7e7164ea4d6110445c3c677c3ddda115780a4275d" address="unix:///run/containerd/s/a2896ead00d51595b3ed04fbd6f87a97835bbd5b4e2418a9469b071273f43975" protocol=ttrpc version=3
Sep 9 22:04:30.312791 systemd[1]: Started cri-containerd-c33e0009d513b8e1245386d7e7164ea4d6110445c3c677c3ddda115780a4275d.scope - libcontainer container c33e0009d513b8e1245386d7e7164ea4d6110445c3c677c3ddda115780a4275d.
Sep 9 22:04:30.898938 containerd[1571]: time="2025-09-09T22:04:30.898871277Z" level=info msg="StartContainer for \"c33e0009d513b8e1245386d7e7164ea4d6110445c3c677c3ddda115780a4275d\" returns successfully"
Sep 9 22:04:31.112405 kubelet[2845]: E0909 22:04:31.112339 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:31.257874 kubelet[2845]: I0909 22:04:31.257638 2845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-97qfw" podStartSLOduration=9.257613164 podStartE2EDuration="9.257613164s" podCreationTimestamp="2025-09-09 22:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 22:04:31.257537219 +0000 UTC m=+12.396225520" watchObservedRunningTime="2025-09-09 22:04:31.257613164 +0000 UTC m=+12.396301465"
Sep 9 22:04:32.115920 kubelet[2845]: E0909 22:04:32.115711 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:32.800427 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3799455702.mount: Deactivated successfully.
Sep 9 22:04:35.002528 containerd[1571]: time="2025-09-09T22:04:35.001548059Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:04:35.006213 containerd[1571]: time="2025-09-09T22:04:35.004393456Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609"
Sep 9 22:04:35.007690 containerd[1571]: time="2025-09-09T22:04:35.007642745Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:04:35.013772 containerd[1571]: time="2025-09-09T22:04:35.013701403Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 7.543979822s"
Sep 9 22:04:35.013772 containerd[1571]: time="2025-09-09T22:04:35.013758101Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\""
Sep 9 22:04:35.014195 containerd[1571]: time="2025-09-09T22:04:35.014163558Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:04:35.024388 containerd[1571]: time="2025-09-09T22:04:35.024315350Z" level=info msg="CreateContainer within sandbox \"5ed1006f94e615858f5e37948b1b2c8484ddf715bb9c3164bd15d79947b752e3\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 9 22:04:35.071524 containerd[1571]: time="2025-09-09T22:04:35.071432991Z" level=info msg="Container 2f255bf9cdcab26c287f5cd1f6f5385a80702e4bc774fec76bb3e825ad18637a: CDI devices from CRI Config.CDIDevices: []"
Sep 9 22:04:35.098379 containerd[1571]: time="2025-09-09T22:04:35.094695609Z" level=info msg="CreateContainer within sandbox \"5ed1006f94e615858f5e37948b1b2c8484ddf715bb9c3164bd15d79947b752e3\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"2f255bf9cdcab26c287f5cd1f6f5385a80702e4bc774fec76bb3e825ad18637a\""
Sep 9 22:04:35.104641 containerd[1571]: time="2025-09-09T22:04:35.101549067Z" level=info msg="StartContainer for \"2f255bf9cdcab26c287f5cd1f6f5385a80702e4bc774fec76bb3e825ad18637a\""
Sep 9 22:04:35.104641 containerd[1571]: time="2025-09-09T22:04:35.102774403Z" level=info msg="connecting to shim 2f255bf9cdcab26c287f5cd1f6f5385a80702e4bc774fec76bb3e825ad18637a" address="unix:///run/containerd/s/8ec838784621ea684cdadf3e483594ee1dc11e0eff23a8bfe99b81b6f4267147" protocol=ttrpc version=3
Sep 9 22:04:35.246814 systemd[1]: Started cri-containerd-2f255bf9cdcab26c287f5cd1f6f5385a80702e4bc774fec76bb3e825ad18637a.scope - libcontainer container 2f255bf9cdcab26c287f5cd1f6f5385a80702e4bc774fec76bb3e825ad18637a.
Sep 9 22:04:35.416970 containerd[1571]: time="2025-09-09T22:04:35.416562655Z" level=info msg="StartContainer for \"2f255bf9cdcab26c287f5cd1f6f5385a80702e4bc774fec76bb3e825ad18637a\" returns successfully"
Sep 9 22:04:40.514957 systemd[1]: cri-containerd-2f255bf9cdcab26c287f5cd1f6f5385a80702e4bc774fec76bb3e825ad18637a.scope: Deactivated successfully.
Sep 9 22:04:40.517729 containerd[1571]: time="2025-09-09T22:04:40.517679022Z" level=info msg="received exit event container_id:\"2f255bf9cdcab26c287f5cd1f6f5385a80702e4bc774fec76bb3e825ad18637a\" id:\"2f255bf9cdcab26c287f5cd1f6f5385a80702e4bc774fec76bb3e825ad18637a\" pid:3171 exit_status:1 exited_at:{seconds:1757455480 nanos:516634537}" Sep 9 22:04:40.518764 containerd[1571]: time="2025-09-09T22:04:40.518431429Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2f255bf9cdcab26c287f5cd1f6f5385a80702e4bc774fec76bb3e825ad18637a\" id:\"2f255bf9cdcab26c287f5cd1f6f5385a80702e4bc774fec76bb3e825ad18637a\" pid:3171 exit_status:1 exited_at:{seconds:1757455480 nanos:516634537}" Sep 9 22:04:40.576888 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2f255bf9cdcab26c287f5cd1f6f5385a80702e4bc774fec76bb3e825ad18637a-rootfs.mount: Deactivated successfully. Sep 9 22:04:42.229501 kubelet[2845]: I0909 22:04:42.228602 2845 scope.go:117] "RemoveContainer" containerID="2f255bf9cdcab26c287f5cd1f6f5385a80702e4bc774fec76bb3e825ad18637a" Sep 9 22:04:42.237722 containerd[1571]: time="2025-09-09T22:04:42.237502235Z" level=info msg="CreateContainer within sandbox \"5ed1006f94e615858f5e37948b1b2c8484ddf715bb9c3164bd15d79947b752e3\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Sep 9 22:04:42.439623 containerd[1571]: time="2025-09-09T22:04:42.439537973Z" level=info msg="Container 219e3819204d4ca345277aed9aadf48365c4079558f92f1a2cdbdfe1fe4b397b: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:04:42.775927 containerd[1571]: time="2025-09-09T22:04:42.775357692Z" level=info msg="CreateContainer within sandbox \"5ed1006f94e615858f5e37948b1b2c8484ddf715bb9c3164bd15d79947b752e3\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"219e3819204d4ca345277aed9aadf48365c4079558f92f1a2cdbdfe1fe4b397b\"" Sep 9 22:04:42.784902 containerd[1571]: time="2025-09-09T22:04:42.777434144Z" level=info msg="StartContainer for 
\"219e3819204d4ca345277aed9aadf48365c4079558f92f1a2cdbdfe1fe4b397b\"" Sep 9 22:04:42.784902 containerd[1571]: time="2025-09-09T22:04:42.778603986Z" level=info msg="connecting to shim 219e3819204d4ca345277aed9aadf48365c4079558f92f1a2cdbdfe1fe4b397b" address="unix:///run/containerd/s/8ec838784621ea684cdadf3e483594ee1dc11e0eff23a8bfe99b81b6f4267147" protocol=ttrpc version=3 Sep 9 22:04:42.856980 systemd[1]: Started cri-containerd-219e3819204d4ca345277aed9aadf48365c4079558f92f1a2cdbdfe1fe4b397b.scope - libcontainer container 219e3819204d4ca345277aed9aadf48365c4079558f92f1a2cdbdfe1fe4b397b. Sep 9 22:04:42.991619 containerd[1571]: time="2025-09-09T22:04:42.991536373Z" level=info msg="StartContainer for \"219e3819204d4ca345277aed9aadf48365c4079558f92f1a2cdbdfe1fe4b397b\" returns successfully" Sep 9 22:04:43.283420 kubelet[2845]: I0909 22:04:43.282939 2845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-zhkdf" podStartSLOduration=12.734657614 podStartE2EDuration="20.282913979s" podCreationTimestamp="2025-09-09 22:04:23 +0000 UTC" firstStartedPulling="2025-09-09 22:04:27.469100174 +0000 UTC m=+8.607788475" lastFinishedPulling="2025-09-09 22:04:35.017356539 +0000 UTC m=+16.156044840" observedRunningTime="2025-09-09 22:04:36.223037188 +0000 UTC m=+17.361725489" watchObservedRunningTime="2025-09-09 22:04:43.282913979 +0000 UTC m=+24.421602280" Sep 9 22:04:46.445824 sudo[1793]: pam_unix(sudo:session): session closed for user root Sep 9 22:04:46.459834 sshd[1792]: Connection closed by 10.0.0.1 port 52762 Sep 9 22:04:46.466460 sshd-session[1789]: pam_unix(sshd:session): session closed for user core Sep 9 22:04:46.490546 systemd[1]: sshd@8-10.0.0.72:22-10.0.0.1:52762.service: Deactivated successfully. Sep 9 22:04:46.501354 systemd[1]: session-9.scope: Deactivated successfully. Sep 9 22:04:46.501724 systemd[1]: session-9.scope: Consumed 9.012s CPU time, 227.1M memory peak. 
Sep 9 22:04:46.517680 systemd-logind[1554]: Session 9 logged out. Waiting for processes to exit. Sep 9 22:04:46.521813 systemd-logind[1554]: Removed session 9. Sep 9 22:04:58.294250 kubelet[2845]: I0909 22:04:58.293050 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/723f461c-c224-4d4a-b9de-0425531fbcf9-tigera-ca-bundle\") pod \"calico-typha-599fc75b9d-2ts5n\" (UID: \"723f461c-c224-4d4a-b9de-0425531fbcf9\") " pod="calico-system/calico-typha-599fc75b9d-2ts5n" Sep 9 22:04:58.294250 kubelet[2845]: I0909 22:04:58.293966 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w8kn\" (UniqueName: \"kubernetes.io/projected/723f461c-c224-4d4a-b9de-0425531fbcf9-kube-api-access-4w8kn\") pod \"calico-typha-599fc75b9d-2ts5n\" (UID: \"723f461c-c224-4d4a-b9de-0425531fbcf9\") " pod="calico-system/calico-typha-599fc75b9d-2ts5n" Sep 9 22:04:58.294250 kubelet[2845]: I0909 22:04:58.293994 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/723f461c-c224-4d4a-b9de-0425531fbcf9-typha-certs\") pod \"calico-typha-599fc75b9d-2ts5n\" (UID: \"723f461c-c224-4d4a-b9de-0425531fbcf9\") " pod="calico-system/calico-typha-599fc75b9d-2ts5n" Sep 9 22:04:58.316999 systemd[1]: Created slice kubepods-besteffort-pod723f461c_c224_4d4a_b9de_0425531fbcf9.slice - libcontainer container kubepods-besteffort-pod723f461c_c224_4d4a_b9de_0425531fbcf9.slice. Sep 9 22:04:58.610177 systemd[1]: Created slice kubepods-besteffort-pod7707f6bd_2d0a_4be0_8df8_d87d9c2abd58.slice - libcontainer container kubepods-besteffort-pod7707f6bd_2d0a_4be0_8df8_d87d9c2abd58.slice. 
Sep 9 22:04:58.624222 kubelet[2845]: E0909 22:04:58.621290 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 9 22:04:58.624451 containerd[1571]: time="2025-09-09T22:04:58.623239952Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-599fc75b9d-2ts5n,Uid:723f461c-c224-4d4a-b9de-0425531fbcf9,Namespace:calico-system,Attempt:0,}" Sep 9 22:04:58.708942 kubelet[2845]: I0909 22:04:58.708383 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/7707f6bd-2d0a-4be0-8df8-d87d9c2abd58-flexvol-driver-host\") pod \"calico-node-xxbpp\" (UID: \"7707f6bd-2d0a-4be0-8df8-d87d9c2abd58\") " pod="calico-system/calico-node-xxbpp" Sep 9 22:04:58.708942 kubelet[2845]: I0909 22:04:58.708453 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/7707f6bd-2d0a-4be0-8df8-d87d9c2abd58-policysync\") pod \"calico-node-xxbpp\" (UID: \"7707f6bd-2d0a-4be0-8df8-d87d9c2abd58\") " pod="calico-system/calico-node-xxbpp" Sep 9 22:04:58.708942 kubelet[2845]: I0909 22:04:58.708516 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/7707f6bd-2d0a-4be0-8df8-d87d9c2abd58-var-run-calico\") pod \"calico-node-xxbpp\" (UID: \"7707f6bd-2d0a-4be0-8df8-d87d9c2abd58\") " pod="calico-system/calico-node-xxbpp" Sep 9 22:04:58.708942 kubelet[2845]: I0909 22:04:58.708548 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/7707f6bd-2d0a-4be0-8df8-d87d9c2abd58-cni-log-dir\") pod \"calico-node-xxbpp\" (UID: \"7707f6bd-2d0a-4be0-8df8-d87d9c2abd58\") " 
pod="calico-system/calico-node-xxbpp" Sep 9 22:04:58.708942 kubelet[2845]: I0909 22:04:58.708574 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/7707f6bd-2d0a-4be0-8df8-d87d9c2abd58-cni-net-dir\") pod \"calico-node-xxbpp\" (UID: \"7707f6bd-2d0a-4be0-8df8-d87d9c2abd58\") " pod="calico-system/calico-node-xxbpp" Sep 9 22:04:58.709306 kubelet[2845]: I0909 22:04:58.708598 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7707f6bd-2d0a-4be0-8df8-d87d9c2abd58-tigera-ca-bundle\") pod \"calico-node-xxbpp\" (UID: \"7707f6bd-2d0a-4be0-8df8-d87d9c2abd58\") " pod="calico-system/calico-node-xxbpp" Sep 9 22:04:58.709306 kubelet[2845]: I0909 22:04:58.708624 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7707f6bd-2d0a-4be0-8df8-d87d9c2abd58-xtables-lock\") pod \"calico-node-xxbpp\" (UID: \"7707f6bd-2d0a-4be0-8df8-d87d9c2abd58\") " pod="calico-system/calico-node-xxbpp" Sep 9 22:04:58.709306 kubelet[2845]: I0909 22:04:58.708653 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/7707f6bd-2d0a-4be0-8df8-d87d9c2abd58-node-certs\") pod \"calico-node-xxbpp\" (UID: \"7707f6bd-2d0a-4be0-8df8-d87d9c2abd58\") " pod="calico-system/calico-node-xxbpp" Sep 9 22:04:58.709306 kubelet[2845]: I0909 22:04:58.708677 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/7707f6bd-2d0a-4be0-8df8-d87d9c2abd58-cni-bin-dir\") pod \"calico-node-xxbpp\" (UID: \"7707f6bd-2d0a-4be0-8df8-d87d9c2abd58\") " pod="calico-system/calico-node-xxbpp" Sep 9 22:04:58.709306 kubelet[2845]: I0909 
22:04:58.708698 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7707f6bd-2d0a-4be0-8df8-d87d9c2abd58-var-lib-calico\") pod \"calico-node-xxbpp\" (UID: \"7707f6bd-2d0a-4be0-8df8-d87d9c2abd58\") " pod="calico-system/calico-node-xxbpp" Sep 9 22:04:58.709517 kubelet[2845]: I0909 22:04:58.708719 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7707f6bd-2d0a-4be0-8df8-d87d9c2abd58-lib-modules\") pod \"calico-node-xxbpp\" (UID: \"7707f6bd-2d0a-4be0-8df8-d87d9c2abd58\") " pod="calico-system/calico-node-xxbpp" Sep 9 22:04:58.709517 kubelet[2845]: I0909 22:04:58.708740 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdwzw\" (UniqueName: \"kubernetes.io/projected/7707f6bd-2d0a-4be0-8df8-d87d9c2abd58-kube-api-access-vdwzw\") pod \"calico-node-xxbpp\" (UID: \"7707f6bd-2d0a-4be0-8df8-d87d9c2abd58\") " pod="calico-system/calico-node-xxbpp" Sep 9 22:04:58.911643 kubelet[2845]: E0909 22:04:58.905240 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:58.911643 kubelet[2845]: W0909 22:04:58.909636 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:58.918456 kubelet[2845]: E0909 22:04:58.917863 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:04:58.922304 kubelet[2845]: E0909 22:04:58.921883 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:58.922304 kubelet[2845]: W0909 22:04:58.921922 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:58.922304 kubelet[2845]: E0909 22:04:58.921957 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:04:58.923376 kubelet[2845]: E0909 22:04:58.923029 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:58.923376 kubelet[2845]: W0909 22:04:58.923047 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:58.923376 kubelet[2845]: E0909 22:04:58.923066 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:04:58.923376 kubelet[2845]: E0909 22:04:58.923260 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:58.923376 kubelet[2845]: W0909 22:04:58.923269 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:58.923376 kubelet[2845]: E0909 22:04:58.923278 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:04:58.923784 kubelet[2845]: E0909 22:04:58.923768 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:58.923862 kubelet[2845]: W0909 22:04:58.923845 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:58.923943 kubelet[2845]: E0909 22:04:58.923927 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:04:58.925938 kubelet[2845]: E0909 22:04:58.925871 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:58.925938 kubelet[2845]: W0909 22:04:58.925903 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:58.925938 kubelet[2845]: E0909 22:04:58.925933 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:04:58.926935 kubelet[2845]: E0909 22:04:58.926661 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:58.926935 kubelet[2845]: W0909 22:04:58.926676 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:58.926935 kubelet[2845]: E0909 22:04:58.926690 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:04:58.932366 kubelet[2845]: E0909 22:04:58.932277 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6mwcf" podUID="d2140908-6680-44bb-ab1a-8d40f2e95451" Sep 9 22:04:58.992661 kubelet[2845]: E0909 22:04:58.991107 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:58.992661 kubelet[2845]: W0909 22:04:58.991179 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:58.992661 kubelet[2845]: E0909 22:04:58.991213 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:04:58.992661 kubelet[2845]: E0909 22:04:58.992046 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:58.992661 kubelet[2845]: W0909 22:04:58.992083 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:58.992661 kubelet[2845]: E0909 22:04:58.992101 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:04:58.992661 kubelet[2845]: E0909 22:04:58.992444 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:58.992661 kubelet[2845]: W0909 22:04:58.992459 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:58.992661 kubelet[2845]: E0909 22:04:58.992509 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:04:58.994913 kubelet[2845]: E0909 22:04:58.994850 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:58.994913 kubelet[2845]: W0909 22:04:58.994871 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:58.994913 kubelet[2845]: E0909 22:04:58.994891 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:04:59.001422 kubelet[2845]: E0909 22:04:59.001200 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.001422 kubelet[2845]: W0909 22:04:59.001239 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.001422 kubelet[2845]: E0909 22:04:59.001272 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:04:59.004239 kubelet[2845]: E0909 22:04:59.003777 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.004239 kubelet[2845]: W0909 22:04:59.003801 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.004239 kubelet[2845]: E0909 22:04:59.003826 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:04:59.004526 containerd[1571]: time="2025-09-09T22:04:59.004205054Z" level=info msg="connecting to shim 7164ad25c3195f243c2eb97d927709bde4ae2e55cd4d169ca0f7d9b3492462b9" address="unix:///run/containerd/s/781e5461f2ed203f55ae5c97a4ef9c27338b83febe7f68c36065dd9741913245" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:04:59.005452 kubelet[2845]: E0909 22:04:59.005429 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.005452 kubelet[2845]: W0909 22:04:59.005448 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.005716 kubelet[2845]: E0909 22:04:59.005481 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:04:59.005883 kubelet[2845]: E0909 22:04:59.005773 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.005883 kubelet[2845]: W0909 22:04:59.005784 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.005883 kubelet[2845]: E0909 22:04:59.005797 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:04:59.007277 kubelet[2845]: E0909 22:04:59.006008 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.007277 kubelet[2845]: W0909 22:04:59.006019 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.007277 kubelet[2845]: E0909 22:04:59.006030 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:04:59.007277 kubelet[2845]: E0909 22:04:59.006235 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.007277 kubelet[2845]: W0909 22:04:59.006247 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.007277 kubelet[2845]: E0909 22:04:59.006258 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:04:59.007277 kubelet[2845]: E0909 22:04:59.006448 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.007277 kubelet[2845]: W0909 22:04:59.006458 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.007277 kubelet[2845]: E0909 22:04:59.006507 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:04:59.007929 kubelet[2845]: E0909 22:04:59.007877 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.007929 kubelet[2845]: W0909 22:04:59.007896 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.007929 kubelet[2845]: E0909 22:04:59.007911 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:04:59.013494 kubelet[2845]: E0909 22:04:59.011602 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.013494 kubelet[2845]: W0909 22:04:59.011632 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.013494 kubelet[2845]: E0909 22:04:59.011660 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:04:59.013494 kubelet[2845]: E0909 22:04:59.011889 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.013494 kubelet[2845]: W0909 22:04:59.011898 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.013494 kubelet[2845]: E0909 22:04:59.011908 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:04:59.013494 kubelet[2845]: E0909 22:04:59.012112 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.013494 kubelet[2845]: W0909 22:04:59.012121 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.013494 kubelet[2845]: E0909 22:04:59.012130 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:04:59.013494 kubelet[2845]: E0909 22:04:59.012324 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.014219 kubelet[2845]: W0909 22:04:59.012335 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.014219 kubelet[2845]: E0909 22:04:59.012344 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:04:59.014219 kubelet[2845]: E0909 22:04:59.012555 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.014219 kubelet[2845]: W0909 22:04:59.012569 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.014219 kubelet[2845]: E0909 22:04:59.012581 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:04:59.014219 kubelet[2845]: E0909 22:04:59.012793 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.014219 kubelet[2845]: W0909 22:04:59.012804 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.014219 kubelet[2845]: E0909 22:04:59.012814 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:04:59.014219 kubelet[2845]: E0909 22:04:59.013072 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.014219 kubelet[2845]: W0909 22:04:59.013082 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.014947 kubelet[2845]: E0909 22:04:59.013093 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:04:59.014947 kubelet[2845]: E0909 22:04:59.013404 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:04:59.014947 kubelet[2845]: W0909 22:04:59.013415 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:04:59.014947 kubelet[2845]: E0909 22:04:59.013425 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 22:04:59.023087 kubelet[2845]: E0909 22:04:59.023044 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 22:04:59.023323 kubelet[2845]: W0909 22:04:59.023304 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 22:04:59.023422 kubelet[2845]: E0909 22:04:59.023406 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 22:04:59.023533 kubelet[2845]: I0909 22:04:59.023512 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2140908-6680-44bb-ab1a-8d40f2e95451-kubelet-dir\") pod \"csi-node-driver-6mwcf\" (UID: \"d2140908-6680-44bb-ab1a-8d40f2e95451\") " pod="calico-system/csi-node-driver-6mwcf"
Sep 9 22:04:59.027648 kubelet[2845]: I0909 22:04:59.027621 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d2140908-6680-44bb-ab1a-8d40f2e95451-registration-dir\") pod \"csi-node-driver-6mwcf\" (UID: \"d2140908-6680-44bb-ab1a-8d40f2e95451\") " pod="calico-system/csi-node-driver-6mwcf"
Sep 9 22:04:59.038550 kubelet[2845]: I0909 22:04:59.038295 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d2140908-6680-44bb-ab1a-8d40f2e95451-socket-dir\") pod \"csi-node-driver-6mwcf\" (UID: \"d2140908-6680-44bb-ab1a-8d40f2e95451\") " pod="calico-system/csi-node-driver-6mwcf"
Sep 9 22:04:59.044715 kubelet[2845]: I0909 22:04:59.044675 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/d2140908-6680-44bb-ab1a-8d40f2e95451-varrun\") pod \"csi-node-driver-6mwcf\" (UID: \"d2140908-6680-44bb-ab1a-8d40f2e95451\") " pod="calico-system/csi-node-driver-6mwcf"
Sep 9 22:04:59.048312 kubelet[2845]: I0909 22:04:59.046926 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnrcl\" (UniqueName: \"kubernetes.io/projected/d2140908-6680-44bb-ab1a-8d40f2e95451-kube-api-access-rnrcl\") pod \"csi-node-driver-6mwcf\" (UID: \"d2140908-6680-44bb-ab1a-8d40f2e95451\") " pod="calico-system/csi-node-driver-6mwcf"
Sep 9 22:04:59.085431 systemd[1]: Started cri-containerd-7164ad25c3195f243c2eb97d927709bde4ae2e55cd4d169ca0f7d9b3492462b9.scope - libcontainer container 7164ad25c3195f243c2eb97d927709bde4ae2e55cd4d169ca0f7d9b3492462b9.
Sep 9 22:04:59.157944 kubelet[2845]: E0909 22:04:59.153436 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 22:04:59.157944 kubelet[2845]: W0909 22:04:59.154822 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 22:04:59.157944 kubelet[2845]: E0909 22:04:59.155865 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 22:04:59.210335 kubelet[2845]: E0909 22:04:59.207935 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 22:04:59.210335 kubelet[2845]: W0909 22:04:59.207964 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 22:04:59.210335 kubelet[2845]: E0909 22:04:59.207997 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 22:04:59.215716 containerd[1571]: time="2025-09-09T22:04:59.215646446Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xxbpp,Uid:7707f6bd-2d0a-4be0-8df8-d87d9c2abd58,Namespace:calico-system,Attempt:0,}"
Sep 9 22:04:59.297418 containerd[1571]: time="2025-09-09T22:04:59.296564283Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-599fc75b9d-2ts5n,Uid:723f461c-c224-4d4a-b9de-0425531fbcf9,Namespace:calico-system,Attempt:0,} returns sandbox id \"7164ad25c3195f243c2eb97d927709bde4ae2e55cd4d169ca0f7d9b3492462b9\""
Sep 9 22:04:59.312128 kubelet[2845]: E0909 22:04:59.308550 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:04:59.323691 containerd[1571]: time="2025-09-09T22:04:59.323485120Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 9 22:04:59.362827 containerd[1571]: time="2025-09-09T22:04:59.362757783Z" level=info msg="connecting to shim aecffa234f9d209b207cef34f7f944c347170e13222b84f44ec7eff8b19854a3" address="unix:///run/containerd/s/fc8221fe3c8a0891bffc56138712abaed1a982c1378cc736586f8329c8771a1e" namespace=k8s.io protocol=ttrpc version=3
Sep 9 22:04:59.433821 systemd[1]: Started cri-containerd-aecffa234f9d209b207cef34f7f944c347170e13222b84f44ec7eff8b19854a3.scope - libcontainer container aecffa234f9d209b207cef34f7f944c347170e13222b84f44ec7eff8b19854a3.
Sep 9 22:04:59.567588 containerd[1571]: time="2025-09-09T22:04:59.567452304Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xxbpp,Uid:7707f6bd-2d0a-4be0-8df8-d87d9c2abd58,Namespace:calico-system,Attempt:0,} returns sandbox id \"aecffa234f9d209b207cef34f7f944c347170e13222b84f44ec7eff8b19854a3\""
Sep 9 22:05:00.063746 kubelet[2845]: E0909 22:05:00.063557 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6mwcf" podUID="d2140908-6680-44bb-ab1a-8d40f2e95451"
Sep 9 22:05:01.903310 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4068322777.mount: Deactivated successfully.
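[Editor's note] The repeated "unexpected end of JSON input" entries above come from the kubelet's FlexVolume plugin probe: it executes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument init and parses the command's stdout as JSON. Since the executable is missing ("executable file not found in $PATH"), stdout is empty and the JSON unmarshal fails. A minimal sketch of a driver stub that would satisfy the probe; the JSON shape follows the documented FlexVolume call convention, but this stub itself is hypothetical, not part of the deployment being logged:

```shell
# driver_call sketches the entry point of a FlexVolume driver binary.
# The kubelet invokes "<driver> init" while probing the plugin directory
# and expects a JSON status object on stdout; an empty stdout is exactly
# what produces the "unexpected end of JSON input" errors in this log.
driver_call() {
  case "$1" in
    init)
      # Report success and declare that this driver needs no attach/detach phase.
      echo '{"status":"Success","capabilities":{"attach":false}}'
      ;;
    *)
      # Verbs the driver does not implement must still return valid JSON.
      echo '{"status":"Not supported"}'
      return 1
      ;;
  esac
}

driver_call init
```

Installing such a stub at the probed path (or removing the empty nodeagent~uds plugin directory) would be expected to stop the probe-failure loop; the errors are otherwise harmless noise while the CSI node driver starts up.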
Sep 9 22:05:02.104674 kubelet[2845]: E0909 22:05:02.101205 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6mwcf" podUID="d2140908-6680-44bb-ab1a-8d40f2e95451"
Sep 9 22:05:04.055556 kubelet[2845]: E0909 22:05:04.055457 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6mwcf" podUID="d2140908-6680-44bb-ab1a-8d40f2e95451"
Sep 9 22:05:04.371769 containerd[1571]: time="2025-09-09T22:05:04.371561692Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:05:04.476296 containerd[1571]: time="2025-09-09T22:05:04.476181313Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 9 22:05:04.488845 containerd[1571]: time="2025-09-09T22:05:04.488768357Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:05:04.500435 containerd[1571]: time="2025-09-09T22:05:04.500317284Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:05:04.501218 containerd[1571]: time="2025-09-09T22:05:04.501113643Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 5.177559103s"
Sep 9 22:05:04.501218 containerd[1571]: time="2025-09-09T22:05:04.501157657Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 9 22:05:04.502928 containerd[1571]: time="2025-09-09T22:05:04.502868800Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 9 22:05:04.514133 containerd[1571]: time="2025-09-09T22:05:04.514059027Z" level=info msg="CreateContainer within sandbox \"7164ad25c3195f243c2eb97d927709bde4ae2e55cd4d169ca0f7d9b3492462b9\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 9 22:05:04.528039 containerd[1571]: time="2025-09-09T22:05:04.527823603Z" level=info msg="Container 47bdbffd76b4437739cc629bc9722228c9d070ba8d632728bb099e81dc155a50: CDI devices from CRI Config.CDIDevices: []"
Sep 9 22:05:04.542663 containerd[1571]: time="2025-09-09T22:05:04.542587042Z" level=info msg="CreateContainer within sandbox \"7164ad25c3195f243c2eb97d927709bde4ae2e55cd4d169ca0f7d9b3492462b9\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"47bdbffd76b4437739cc629bc9722228c9d070ba8d632728bb099e81dc155a50\""
Sep 9 22:05:04.543817 containerd[1571]: time="2025-09-09T22:05:04.543718396Z" level=info msg="StartContainer for \"47bdbffd76b4437739cc629bc9722228c9d070ba8d632728bb099e81dc155a50\""
Sep 9 22:05:04.546824 containerd[1571]: time="2025-09-09T22:05:04.546766694Z" level=info msg="connecting to shim 47bdbffd76b4437739cc629bc9722228c9d070ba8d632728bb099e81dc155a50" address="unix:///run/containerd/s/781e5461f2ed203f55ae5c97a4ef9c27338b83febe7f68c36065dd9741913245" protocol=ttrpc version=3
Sep 9 22:05:04.579727 systemd[1]: Started cri-containerd-47bdbffd76b4437739cc629bc9722228c9d070ba8d632728bb099e81dc155a50.scope - libcontainer container 47bdbffd76b4437739cc629bc9722228c9d070ba8d632728bb099e81dc155a50.
Sep 9 22:05:04.642980 containerd[1571]: time="2025-09-09T22:05:04.642853553Z" level=info msg="StartContainer for \"47bdbffd76b4437739cc629bc9722228c9d070ba8d632728bb099e81dc155a50\" returns successfully"
Sep 9 22:05:05.447500 kubelet[2845]: E0909 22:05:05.447424 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:05:05.512781 kubelet[2845]: E0909 22:05:05.512721 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 22:05:05.512781 kubelet[2845]: W0909 22:05:05.512756 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 22:05:05.512781 kubelet[2845]: E0909 22:05:05.512782 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 22:05:05.513979 kubelet[2845]: E0909 22:05:05.513961 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 22:05:05.513979 kubelet[2845]: W0909 22:05:05.513975 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 22:05:05.514045 kubelet[2845]: E0909 22:05:05.513985 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:05.514277 kubelet[2845]: E0909 22:05:05.514253 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.514277 kubelet[2845]: W0909 22:05:05.514269 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.514277 kubelet[2845]: E0909 22:05:05.514281 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:05.514477 kubelet[2845]: E0909 22:05:05.514456 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.514518 kubelet[2845]: W0909 22:05:05.514495 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.514518 kubelet[2845]: E0909 22:05:05.514505 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:05.514680 kubelet[2845]: E0909 22:05:05.514665 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.514680 kubelet[2845]: W0909 22:05:05.514675 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.514680 kubelet[2845]: E0909 22:05:05.514683 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:05.514894 kubelet[2845]: E0909 22:05:05.514876 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.514894 kubelet[2845]: W0909 22:05:05.514886 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.514894 kubelet[2845]: E0909 22:05:05.514894 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:05.515088 kubelet[2845]: E0909 22:05:05.515071 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.515088 kubelet[2845]: W0909 22:05:05.515081 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.515155 kubelet[2845]: E0909 22:05:05.515089 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:05.515269 kubelet[2845]: E0909 22:05:05.515252 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.515269 kubelet[2845]: W0909 22:05:05.515262 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.515269 kubelet[2845]: E0909 22:05:05.515269 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:05.515541 kubelet[2845]: E0909 22:05:05.515515 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.515541 kubelet[2845]: W0909 22:05:05.515535 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.515628 kubelet[2845]: E0909 22:05:05.515552 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:05.515793 kubelet[2845]: E0909 22:05:05.515776 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.515793 kubelet[2845]: W0909 22:05:05.515788 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.515852 kubelet[2845]: E0909 22:05:05.515797 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:05.515986 kubelet[2845]: E0909 22:05:05.515970 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.515986 kubelet[2845]: W0909 22:05:05.515981 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.516037 kubelet[2845]: E0909 22:05:05.515989 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:05.516187 kubelet[2845]: E0909 22:05:05.516171 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.516187 kubelet[2845]: W0909 22:05:05.516182 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.516246 kubelet[2845]: E0909 22:05:05.516191 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:05.516364 kubelet[2845]: E0909 22:05:05.516348 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.516364 kubelet[2845]: W0909 22:05:05.516358 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.516433 kubelet[2845]: E0909 22:05:05.516368 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:05.607319 kubelet[2845]: E0909 22:05:05.607265 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.607319 kubelet[2845]: W0909 22:05:05.607294 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.607319 kubelet[2845]: E0909 22:05:05.607317 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:05.607639 kubelet[2845]: E0909 22:05:05.607535 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.607639 kubelet[2845]: W0909 22:05:05.607545 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.607639 kubelet[2845]: E0909 22:05:05.607560 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:05.607761 kubelet[2845]: E0909 22:05:05.607744 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.607761 kubelet[2845]: W0909 22:05:05.607754 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.607820 kubelet[2845]: E0909 22:05:05.607771 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:05.608002 kubelet[2845]: E0909 22:05:05.607983 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.608002 kubelet[2845]: W0909 22:05:05.607994 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.608057 kubelet[2845]: E0909 22:05:05.608006 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:05.608352 kubelet[2845]: E0909 22:05:05.608319 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.608352 kubelet[2845]: W0909 22:05:05.608339 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.608414 kubelet[2845]: E0909 22:05:05.608359 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:05.608612 kubelet[2845]: E0909 22:05:05.608589 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.608612 kubelet[2845]: W0909 22:05:05.608606 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.608682 kubelet[2845]: E0909 22:05:05.608625 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:05.608894 kubelet[2845]: E0909 22:05:05.608872 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.608894 kubelet[2845]: W0909 22:05:05.608886 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.608947 kubelet[2845]: E0909 22:05:05.608924 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:05.609152 kubelet[2845]: E0909 22:05:05.609129 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.609152 kubelet[2845]: W0909 22:05:05.609145 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.609281 kubelet[2845]: E0909 22:05:05.609217 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:05.609524 kubelet[2845]: E0909 22:05:05.609460 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.609585 kubelet[2845]: W0909 22:05:05.609522 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.609585 kubelet[2845]: E0909 22:05:05.609566 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:05.609955 kubelet[2845]: E0909 22:05:05.609923 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.609955 kubelet[2845]: W0909 22:05:05.609937 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.609955 kubelet[2845]: E0909 22:05:05.609953 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:05.610209 kubelet[2845]: E0909 22:05:05.610182 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.610209 kubelet[2845]: W0909 22:05:05.610195 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.610209 kubelet[2845]: E0909 22:05:05.610209 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:05.610457 kubelet[2845]: E0909 22:05:05.610428 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.610457 kubelet[2845]: W0909 22:05:05.610442 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.610554 kubelet[2845]: E0909 22:05:05.610507 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:05.610672 kubelet[2845]: E0909 22:05:05.610643 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.610672 kubelet[2845]: W0909 22:05:05.610663 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.610764 kubelet[2845]: E0909 22:05:05.610705 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:05.610886 kubelet[2845]: E0909 22:05:05.610865 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.610886 kubelet[2845]: W0909 22:05:05.610877 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.610958 kubelet[2845]: E0909 22:05:05.610894 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:05.611112 kubelet[2845]: E0909 22:05:05.611079 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.611112 kubelet[2845]: W0909 22:05:05.611102 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.611186 kubelet[2845]: E0909 22:05:05.611115 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:05.611353 kubelet[2845]: E0909 22:05:05.611333 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.611353 kubelet[2845]: W0909 22:05:05.611344 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.611423 kubelet[2845]: E0909 22:05:05.611357 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:05.611783 kubelet[2845]: E0909 22:05:05.611733 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:05.611783 kubelet[2845]: W0909 22:05:05.611755 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:05.611783 kubelet[2845]: E0909 22:05:05.611778 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" 
Sep 9 22:05:05.612106 kubelet[2845]: E0909 22:05:05.612025 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input 
Sep 9 22:05:05.612106 kubelet[2845]: W0909 22:05:05.612043 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" 
Sep 9 22:05:05.612106 kubelet[2845]: E0909 22:05:05.612057 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" 
Sep 9 22:05:05.924291 kubelet[2845]: I0909 22:05:05.924064 2845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-599fc75b9d-2ts5n" podStartSLOduration=3.731097411 podStartE2EDuration="8.924041951s" podCreationTimestamp="2025-09-09 22:04:57 +0000 UTC" firstStartedPulling="2025-09-09 22:04:59.309632647 +0000 UTC m=+40.448320948" lastFinishedPulling="2025-09-09 22:05:04.502577187 +0000 UTC m=+45.641265488" observedRunningTime="2025-09-09 22:05:05.921369848 +0000 UTC m=+47.060058139" watchObservedRunningTime="2025-09-09 22:05:05.924041951 +0000 UTC m=+47.062730252" 
Sep 9 22:05:06.055535 kubelet[2845]: E0909 22:05:06.055439 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6mwcf" podUID="d2140908-6680-44bb-ab1a-8d40f2e95451" 
Sep 9 22:05:06.448927 kubelet[2845]: E0909 22:05:06.448874 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" 
Sep 9 22:05:06.523821 kubelet[2845]: E0909 
22:05:06.523778 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.523821 kubelet[2845]: W0909 22:05:06.523808 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.523999 kubelet[2845]: E0909 22:05:06.523839 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.524153 kubelet[2845]: E0909 22:05:06.524128 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.524153 kubelet[2845]: W0909 22:05:06.524142 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.524245 kubelet[2845]: E0909 22:05:06.524154 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.524412 kubelet[2845]: E0909 22:05:06.524389 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.524412 kubelet[2845]: W0909 22:05:06.524404 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.524514 kubelet[2845]: E0909 22:05:06.524415 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:06.524668 kubelet[2845]: E0909 22:05:06.524652 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.524668 kubelet[2845]: W0909 22:05:06.524666 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.524764 kubelet[2845]: E0909 22:05:06.524679 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.524931 kubelet[2845]: E0909 22:05:06.524915 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.524931 kubelet[2845]: W0909 22:05:06.524928 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.524995 kubelet[2845]: E0909 22:05:06.524939 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:06.525157 kubelet[2845]: E0909 22:05:06.525142 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.525157 kubelet[2845]: W0909 22:05:06.525155 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.525230 kubelet[2845]: E0909 22:05:06.525166 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.525378 kubelet[2845]: E0909 22:05:06.525362 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.525411 kubelet[2845]: W0909 22:05:06.525375 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.525411 kubelet[2845]: E0909 22:05:06.525388 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:06.525622 kubelet[2845]: E0909 22:05:06.525606 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.525622 kubelet[2845]: W0909 22:05:06.525619 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.525717 kubelet[2845]: E0909 22:05:06.525631 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.525847 kubelet[2845]: E0909 22:05:06.525832 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.525847 kubelet[2845]: W0909 22:05:06.525845 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.525907 kubelet[2845]: E0909 22:05:06.525855 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:06.526066 kubelet[2845]: E0909 22:05:06.526051 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.526066 kubelet[2845]: W0909 22:05:06.526064 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.526170 kubelet[2845]: E0909 22:05:06.526074 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.526319 kubelet[2845]: E0909 22:05:06.526291 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.526319 kubelet[2845]: W0909 22:05:06.526303 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.526385 kubelet[2845]: E0909 22:05:06.526318 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:06.526555 kubelet[2845]: E0909 22:05:06.526540 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.526555 kubelet[2845]: W0909 22:05:06.526553 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.526555 kubelet[2845]: E0909 22:05:06.526564 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.526776 kubelet[2845]: E0909 22:05:06.526760 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.526776 kubelet[2845]: W0909 22:05:06.526773 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.526836 kubelet[2845]: E0909 22:05:06.526786 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:06.527009 kubelet[2845]: E0909 22:05:06.526994 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.527009 kubelet[2845]: W0909 22:05:06.527007 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.527110 kubelet[2845]: E0909 22:05:06.527018 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.527253 kubelet[2845]: E0909 22:05:06.527238 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.527253 kubelet[2845]: W0909 22:05:06.527250 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.527317 kubelet[2845]: E0909 22:05:06.527261 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:06.615336 kubelet[2845]: E0909 22:05:06.615269 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.615336 kubelet[2845]: W0909 22:05:06.615305 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.615336 kubelet[2845]: E0909 22:05:06.615334 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.615713 kubelet[2845]: E0909 22:05:06.615614 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.615713 kubelet[2845]: W0909 22:05:06.615627 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.615713 kubelet[2845]: E0909 22:05:06.615644 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:06.615883 kubelet[2845]: E0909 22:05:06.615861 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.615883 kubelet[2845]: W0909 22:05:06.615877 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.615994 kubelet[2845]: E0909 22:05:06.615889 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.616357 kubelet[2845]: E0909 22:05:06.616320 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.616425 kubelet[2845]: W0909 22:05:06.616355 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.616425 kubelet[2845]: E0909 22:05:06.616388 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:06.616637 kubelet[2845]: E0909 22:05:06.616615 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.616637 kubelet[2845]: W0909 22:05:06.616632 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.616715 kubelet[2845]: E0909 22:05:06.616654 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.616904 kubelet[2845]: E0909 22:05:06.616882 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.616904 kubelet[2845]: W0909 22:05:06.616895 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.616951 kubelet[2845]: E0909 22:05:06.616915 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:06.617268 kubelet[2845]: E0909 22:05:06.617244 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.617268 kubelet[2845]: W0909 22:05:06.617258 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.617348 kubelet[2845]: E0909 22:05:06.617298 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.617502 kubelet[2845]: E0909 22:05:06.617487 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.617502 kubelet[2845]: W0909 22:05:06.617498 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.617560 kubelet[2845]: E0909 22:05:06.617535 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:06.617719 kubelet[2845]: E0909 22:05:06.617704 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.617719 kubelet[2845]: W0909 22:05:06.617715 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.617785 kubelet[2845]: E0909 22:05:06.617729 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.618013 kubelet[2845]: E0909 22:05:06.617980 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.618013 kubelet[2845]: W0909 22:05:06.618001 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.618078 kubelet[2845]: E0909 22:05:06.618023 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:06.618266 kubelet[2845]: E0909 22:05:06.618246 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.618266 kubelet[2845]: W0909 22:05:06.618262 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.618340 kubelet[2845]: E0909 22:05:06.618280 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.618541 kubelet[2845]: E0909 22:05:06.618514 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.618541 kubelet[2845]: W0909 22:05:06.618528 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.618592 kubelet[2845]: E0909 22:05:06.618546 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:06.618765 kubelet[2845]: E0909 22:05:06.618751 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.618788 kubelet[2845]: W0909 22:05:06.618763 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.618788 kubelet[2845]: E0909 22:05:06.618780 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.619022 kubelet[2845]: E0909 22:05:06.619009 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.619052 kubelet[2845]: W0909 22:05:06.619023 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.619092 kubelet[2845]: E0909 22:05:06.619039 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:06.619355 kubelet[2845]: E0909 22:05:06.619334 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.619355 kubelet[2845]: W0909 22:05:06.619350 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.619427 kubelet[2845]: E0909 22:05:06.619368 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.619632 kubelet[2845]: E0909 22:05:06.619614 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.619632 kubelet[2845]: W0909 22:05:06.619628 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.619686 kubelet[2845]: E0909 22:05:06.619645 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:06.619917 kubelet[2845]: E0909 22:05:06.619897 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.619968 kubelet[2845]: W0909 22:05:06.619917 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.619968 kubelet[2845]: E0909 22:05:06.619940 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 22:05:06.620194 kubelet[2845]: E0909 22:05:06.620176 2845 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 22:05:06.620194 kubelet[2845]: W0909 22:05:06.620190 2845 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 22:05:06.620242 kubelet[2845]: E0909 22:05:06.620201 2845 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 22:05:07.235901 containerd[1571]: time="2025-09-09T22:05:07.235784402Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:05:07.237013 containerd[1571]: time="2025-09-09T22:05:07.236897471Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 9 22:05:07.239866 containerd[1571]: time="2025-09-09T22:05:07.239301925Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:05:07.244324 containerd[1571]: time="2025-09-09T22:05:07.244188620Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:05:07.245401 containerd[1571]: time="2025-09-09T22:05:07.245284396Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 2.742121137s" Sep 9 22:05:07.245401 containerd[1571]: time="2025-09-09T22:05:07.245360169Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 9 22:05:07.248396 containerd[1571]: time="2025-09-09T22:05:07.248345684Z" level=info msg="CreateContainer within sandbox \"aecffa234f9d209b207cef34f7f944c347170e13222b84f44ec7eff8b19854a3\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 22:05:07.268894 containerd[1571]: time="2025-09-09T22:05:07.268806196Z" level=info msg="Container a2b6a3eceed6fe62592979c8530ff9be3bb92c803b8b7b0b1dc29e7576f56092: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:05:07.288756 containerd[1571]: time="2025-09-09T22:05:07.288611497Z" level=info msg="CreateContainer within sandbox \"aecffa234f9d209b207cef34f7f944c347170e13222b84f44ec7eff8b19854a3\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a2b6a3eceed6fe62592979c8530ff9be3bb92c803b8b7b0b1dc29e7576f56092\"" Sep 9 22:05:07.292512 containerd[1571]: time="2025-09-09T22:05:07.289434637Z" level=info msg="StartContainer for \"a2b6a3eceed6fe62592979c8530ff9be3bb92c803b8b7b0b1dc29e7576f56092\"" Sep 9 22:05:07.293552 containerd[1571]: time="2025-09-09T22:05:07.293492042Z" level=info msg="connecting to shim a2b6a3eceed6fe62592979c8530ff9be3bb92c803b8b7b0b1dc29e7576f56092" address="unix:///run/containerd/s/fc8221fe3c8a0891bffc56138712abaed1a982c1378cc736586f8329c8771a1e" protocol=ttrpc version=3 Sep 9 22:05:07.331920 systemd[1]: Started cri-containerd-a2b6a3eceed6fe62592979c8530ff9be3bb92c803b8b7b0b1dc29e7576f56092.scope - libcontainer container a2b6a3eceed6fe62592979c8530ff9be3bb92c803b8b7b0b1dc29e7576f56092. Sep 9 22:05:07.396724 containerd[1571]: time="2025-09-09T22:05:07.396674099Z" level=info msg="StartContainer for \"a2b6a3eceed6fe62592979c8530ff9be3bb92c803b8b7b0b1dc29e7576f56092\" returns successfully" Sep 9 22:05:07.407995 systemd[1]: cri-containerd-a2b6a3eceed6fe62592979c8530ff9be3bb92c803b8b7b0b1dc29e7576f56092.scope: Deactivated successfully. 
Sep 9 22:05:07.413700 containerd[1571]: time="2025-09-09T22:05:07.413638560Z" level=info msg="received exit event container_id:\"a2b6a3eceed6fe62592979c8530ff9be3bb92c803b8b7b0b1dc29e7576f56092\" id:\"a2b6a3eceed6fe62592979c8530ff9be3bb92c803b8b7b0b1dc29e7576f56092\" pid:3638 exited_at:{seconds:1757455507 nanos:413001924}" Sep 9 22:05:07.414036 containerd[1571]: time="2025-09-09T22:05:07.413790538Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a2b6a3eceed6fe62592979c8530ff9be3bb92c803b8b7b0b1dc29e7576f56092\" id:\"a2b6a3eceed6fe62592979c8530ff9be3bb92c803b8b7b0b1dc29e7576f56092\" pid:3638 exited_at:{seconds:1757455507 nanos:413001924}" Sep 9 22:05:07.453045 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a2b6a3eceed6fe62592979c8530ff9be3bb92c803b8b7b0b1dc29e7576f56092-rootfs.mount: Deactivated successfully. Sep 9 22:05:07.460911 kubelet[2845]: E0909 22:05:07.460866 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 9 22:05:08.055650 kubelet[2845]: E0909 22:05:08.055535 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6mwcf" podUID="d2140908-6680-44bb-ab1a-8d40f2e95451" Sep 9 22:05:09.468215 containerd[1571]: time="2025-09-09T22:05:09.468156064Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 22:05:10.055194 kubelet[2845]: E0909 22:05:10.055042 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6mwcf" podUID="d2140908-6680-44bb-ab1a-8d40f2e95451" 
Sep 9 22:05:12.055354 kubelet[2845]: E0909 22:05:12.054674 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6mwcf" podUID="d2140908-6680-44bb-ab1a-8d40f2e95451" Sep 9 22:05:14.056111 kubelet[2845]: E0909 22:05:14.055969 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6mwcf" podUID="d2140908-6680-44bb-ab1a-8d40f2e95451" Sep 9 22:05:16.055760 kubelet[2845]: E0909 22:05:16.055695 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6mwcf" podUID="d2140908-6680-44bb-ab1a-8d40f2e95451" Sep 9 22:05:16.700095 containerd[1571]: time="2025-09-09T22:05:16.699923184Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:05:16.702651 containerd[1571]: time="2025-09-09T22:05:16.702618180Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 9 22:05:16.704579 containerd[1571]: time="2025-09-09T22:05:16.704532201Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:05:16.708808 containerd[1571]: time="2025-09-09T22:05:16.708722455Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:05:16.709834 containerd[1571]: time="2025-09-09T22:05:16.709753043Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 7.241543317s" Sep 9 22:05:16.709834 containerd[1571]: time="2025-09-09T22:05:16.709822406Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 9 22:05:16.712682 containerd[1571]: time="2025-09-09T22:05:16.712609997Z" level=info msg="CreateContainer within sandbox \"aecffa234f9d209b207cef34f7f944c347170e13222b84f44ec7eff8b19854a3\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 22:05:16.729892 containerd[1571]: time="2025-09-09T22:05:16.729840934Z" level=info msg="Container 4fac949e5ec713092e089cf08c319729f19b2c283e321d04f791761d5221ff2b: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:05:16.749409 containerd[1571]: time="2025-09-09T22:05:16.749313950Z" level=info msg="CreateContainer within sandbox \"aecffa234f9d209b207cef34f7f944c347170e13222b84f44ec7eff8b19854a3\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"4fac949e5ec713092e089cf08c319729f19b2c283e321d04f791761d5221ff2b\"" Sep 9 22:05:16.758035 containerd[1571]: time="2025-09-09T22:05:16.757937329Z" level=info msg="StartContainer for \"4fac949e5ec713092e089cf08c319729f19b2c283e321d04f791761d5221ff2b\"" Sep 9 22:05:16.760115 containerd[1571]: time="2025-09-09T22:05:16.760057789Z" level=info msg="connecting to shim 
4fac949e5ec713092e089cf08c319729f19b2c283e321d04f791761d5221ff2b" address="unix:///run/containerd/s/fc8221fe3c8a0891bffc56138712abaed1a982c1378cc736586f8329c8771a1e" protocol=ttrpc version=3 Sep 9 22:05:16.790852 systemd[1]: Started cri-containerd-4fac949e5ec713092e089cf08c319729f19b2c283e321d04f791761d5221ff2b.scope - libcontainer container 4fac949e5ec713092e089cf08c319729f19b2c283e321d04f791761d5221ff2b. Sep 9 22:05:16.851580 containerd[1571]: time="2025-09-09T22:05:16.851389824Z" level=info msg="StartContainer for \"4fac949e5ec713092e089cf08c319729f19b2c283e321d04f791761d5221ff2b\" returns successfully" Sep 9 22:05:18.055994 kubelet[2845]: E0909 22:05:18.055753 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6mwcf" podUID="d2140908-6680-44bb-ab1a-8d40f2e95451" Sep 9 22:05:18.408174 containerd[1571]: time="2025-09-09T22:05:18.407677385Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 22:05:18.412414 systemd[1]: cri-containerd-4fac949e5ec713092e089cf08c319729f19b2c283e321d04f791761d5221ff2b.scope: Deactivated successfully. Sep 9 22:05:18.412875 systemd[1]: cri-containerd-4fac949e5ec713092e089cf08c319729f19b2c283e321d04f791761d5221ff2b.scope: Consumed 792ms CPU time, 177.8M memory peak, 3.1M read from disk, 171.3M written to disk. 
Sep 9 22:05:18.413016 kubelet[2845]: I0909 22:05:18.412875 2845 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 9 22:05:18.413786 containerd[1571]: time="2025-09-09T22:05:18.413742803Z" level=info msg="received exit event container_id:\"4fac949e5ec713092e089cf08c319729f19b2c283e321d04f791761d5221ff2b\" id:\"4fac949e5ec713092e089cf08c319729f19b2c283e321d04f791761d5221ff2b\" pid:3697 exited_at:{seconds:1757455518 nanos:413401398}" Sep 9 22:05:18.414649 containerd[1571]: time="2025-09-09T22:05:18.414579986Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4fac949e5ec713092e089cf08c319729f19b2c283e321d04f791761d5221ff2b\" id:\"4fac949e5ec713092e089cf08c319729f19b2c283e321d04f791761d5221ff2b\" pid:3697 exited_at:{seconds:1757455518 nanos:413401398}" Sep 9 22:05:18.456300 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4fac949e5ec713092e089cf08c319729f19b2c283e321d04f791761d5221ff2b-rootfs.mount: Deactivated successfully. Sep 9 22:05:18.513869 systemd[1]: Created slice kubepods-besteffort-podc8d9b567_61ce_4f43_b430_38feae93a0e4.slice - libcontainer container kubepods-besteffort-podc8d9b567_61ce_4f43_b430_38feae93a0e4.slice. 
Sep 9 22:05:18.714401 kubelet[2845]: I0909 22:05:18.714180 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9r8z\" (UniqueName: \"kubernetes.io/projected/38023746-1010-4fa8-a0d6-8863807eb181-kube-api-access-m9r8z\") pod \"calico-apiserver-7f6c6b6d-9txl4\" (UID: \"38023746-1010-4fa8-a0d6-8863807eb181\") " pod="calico-apiserver/calico-apiserver-7f6c6b6d-9txl4" Sep 9 22:05:18.714401 kubelet[2845]: I0909 22:05:18.714253 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r96hj\" (UniqueName: \"kubernetes.io/projected/cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9-kube-api-access-r96hj\") pod \"whisker-5bbc555c75-7252t\" (UID: \"cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9\") " pod="calico-system/whisker-5bbc555c75-7252t" Sep 9 22:05:18.714401 kubelet[2845]: I0909 22:05:18.714283 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4d9g\" (UniqueName: \"kubernetes.io/projected/593c5a36-33f4-4cef-ac0c-f4a95824b3e7-kube-api-access-x4d9g\") pod \"coredns-7c65d6cfc9-zqjqr\" (UID: \"593c5a36-33f4-4cef-ac0c-f4a95824b3e7\") " pod="kube-system/coredns-7c65d6cfc9-zqjqr" Sep 9 22:05:18.714401 kubelet[2845]: I0909 22:05:18.714307 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/26331079-5184-44f3-8131-980ec1b1f932-goldmane-key-pair\") pod \"goldmane-7988f88666-svg5l\" (UID: \"26331079-5184-44f3-8131-980ec1b1f932\") " pod="calico-system/goldmane-7988f88666-svg5l" Sep 9 22:05:18.714401 kubelet[2845]: I0909 22:05:18.714332 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6471ca4-fc51-48a3-aa84-42a6f5df3c60-config-volume\") pod \"coredns-7c65d6cfc9-w7dk5\" (UID: 
\"c6471ca4-fc51-48a3-aa84-42a6f5df3c60\") " pod="kube-system/coredns-7c65d6cfc9-w7dk5" Sep 9 22:05:18.714782 kubelet[2845]: I0909 22:05:18.714364 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsg9g\" (UniqueName: \"kubernetes.io/projected/b81662e5-c9f0-4152-9bb6-b076bea88390-kube-api-access-qsg9g\") pod \"calico-apiserver-6787546b8c-mbr2m\" (UID: \"b81662e5-c9f0-4152-9bb6-b076bea88390\") " pod="calico-apiserver/calico-apiserver-6787546b8c-mbr2m" Sep 9 22:05:18.714782 kubelet[2845]: I0909 22:05:18.714389 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9-whisker-backend-key-pair\") pod \"whisker-5bbc555c75-7252t\" (UID: \"cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9\") " pod="calico-system/whisker-5bbc555c75-7252t" Sep 9 22:05:18.714782 kubelet[2845]: I0909 22:05:18.714413 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26331079-5184-44f3-8131-980ec1b1f932-config\") pod \"goldmane-7988f88666-svg5l\" (UID: \"26331079-5184-44f3-8131-980ec1b1f932\") " pod="calico-system/goldmane-7988f88666-svg5l" Sep 9 22:05:18.714782 kubelet[2845]: I0909 22:05:18.714489 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8d9b567-61ce-4f43-b430-38feae93a0e4-tigera-ca-bundle\") pod \"calico-kube-controllers-57f6cff959-8vhb8\" (UID: \"c8d9b567-61ce-4f43-b430-38feae93a0e4\") " pod="calico-system/calico-kube-controllers-57f6cff959-8vhb8" Sep 9 22:05:18.714782 kubelet[2845]: I0909 22:05:18.714520 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv4x2\" (UniqueName: 
\"kubernetes.io/projected/c6471ca4-fc51-48a3-aa84-42a6f5df3c60-kube-api-access-sv4x2\") pod \"coredns-7c65d6cfc9-w7dk5\" (UID: \"c6471ca4-fc51-48a3-aa84-42a6f5df3c60\") " pod="kube-system/coredns-7c65d6cfc9-w7dk5" Sep 9 22:05:18.714948 kubelet[2845]: I0909 22:05:18.714554 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmqqp\" (UniqueName: \"kubernetes.io/projected/c8d9b567-61ce-4f43-b430-38feae93a0e4-kube-api-access-qmqqp\") pod \"calico-kube-controllers-57f6cff959-8vhb8\" (UID: \"c8d9b567-61ce-4f43-b430-38feae93a0e4\") " pod="calico-system/calico-kube-controllers-57f6cff959-8vhb8" Sep 9 22:05:18.714948 kubelet[2845]: I0909 22:05:18.714589 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/38023746-1010-4fa8-a0d6-8863807eb181-calico-apiserver-certs\") pod \"calico-apiserver-7f6c6b6d-9txl4\" (UID: \"38023746-1010-4fa8-a0d6-8863807eb181\") " pod="calico-apiserver/calico-apiserver-7f6c6b6d-9txl4" Sep 9 22:05:18.714948 kubelet[2845]: I0909 22:05:18.714615 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czscx\" (UniqueName: \"kubernetes.io/projected/26331079-5184-44f3-8131-980ec1b1f932-kube-api-access-czscx\") pod \"goldmane-7988f88666-svg5l\" (UID: \"26331079-5184-44f3-8131-980ec1b1f932\") " pod="calico-system/goldmane-7988f88666-svg5l" Sep 9 22:05:18.714948 kubelet[2845]: I0909 22:05:18.714646 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b81662e5-c9f0-4152-9bb6-b076bea88390-calico-apiserver-certs\") pod \"calico-apiserver-6787546b8c-mbr2m\" (UID: \"b81662e5-c9f0-4152-9bb6-b076bea88390\") " pod="calico-apiserver/calico-apiserver-6787546b8c-mbr2m" Sep 9 22:05:18.714948 kubelet[2845]: I0909 22:05:18.714677 
2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26331079-5184-44f3-8131-980ec1b1f932-goldmane-ca-bundle\") pod \"goldmane-7988f88666-svg5l\" (UID: \"26331079-5184-44f3-8131-980ec1b1f932\") " pod="calico-system/goldmane-7988f88666-svg5l" Sep 9 22:05:18.715104 kubelet[2845]: I0909 22:05:18.714704 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9-whisker-ca-bundle\") pod \"whisker-5bbc555c75-7252t\" (UID: \"cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9\") " pod="calico-system/whisker-5bbc555c75-7252t" Sep 9 22:05:18.715104 kubelet[2845]: I0909 22:05:18.714738 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/593c5a36-33f4-4cef-ac0c-f4a95824b3e7-config-volume\") pod \"coredns-7c65d6cfc9-zqjqr\" (UID: \"593c5a36-33f4-4cef-ac0c-f4a95824b3e7\") " pod="kube-system/coredns-7c65d6cfc9-zqjqr" Sep 9 22:05:18.715104 kubelet[2845]: I0909 22:05:18.714770 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cghrn\" (UniqueName: \"kubernetes.io/projected/dd79b509-695c-4e9d-acec-db0b4753f41e-kube-api-access-cghrn\") pod \"calico-apiserver-7f6c6b6d-td8gs\" (UID: \"dd79b509-695c-4e9d-acec-db0b4753f41e\") " pod="calico-apiserver/calico-apiserver-7f6c6b6d-td8gs" Sep 9 22:05:18.715104 kubelet[2845]: I0909 22:05:18.714801 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dd79b509-695c-4e9d-acec-db0b4753f41e-calico-apiserver-certs\") pod \"calico-apiserver-7f6c6b6d-td8gs\" (UID: \"dd79b509-695c-4e9d-acec-db0b4753f41e\") " 
pod="calico-apiserver/calico-apiserver-7f6c6b6d-td8gs" Sep 9 22:05:18.721723 systemd[1]: Created slice kubepods-besteffort-podcc8dac7f_68b7_41f2_aa4a_c1e355a3ccc9.slice - libcontainer container kubepods-besteffort-podcc8dac7f_68b7_41f2_aa4a_c1e355a3ccc9.slice. Sep 9 22:05:18.731874 systemd[1]: Created slice kubepods-burstable-pod593c5a36_33f4_4cef_ac0c_f4a95824b3e7.slice - libcontainer container kubepods-burstable-pod593c5a36_33f4_4cef_ac0c_f4a95824b3e7.slice. Sep 9 22:05:18.742817 systemd[1]: Created slice kubepods-besteffort-podb81662e5_c9f0_4152_9bb6_b076bea88390.slice - libcontainer container kubepods-besteffort-podb81662e5_c9f0_4152_9bb6_b076bea88390.slice. Sep 9 22:05:18.757849 systemd[1]: Created slice kubepods-besteffort-pod26331079_5184_44f3_8131_980ec1b1f932.slice - libcontainer container kubepods-besteffort-pod26331079_5184_44f3_8131_980ec1b1f932.slice. Sep 9 22:05:18.765531 systemd[1]: Created slice kubepods-besteffort-poddd79b509_695c_4e9d_acec_db0b4753f41e.slice - libcontainer container kubepods-besteffort-poddd79b509_695c_4e9d_acec_db0b4753f41e.slice. Sep 9 22:05:18.771823 systemd[1]: Created slice kubepods-besteffort-pod38023746_1010_4fa8_a0d6_8863807eb181.slice - libcontainer container kubepods-besteffort-pod38023746_1010_4fa8_a0d6_8863807eb181.slice. Sep 9 22:05:18.777607 systemd[1]: Created slice kubepods-burstable-podc6471ca4_fc51_48a3_aa84_42a6f5df3c60.slice - libcontainer container kubepods-burstable-podc6471ca4_fc51_48a3_aa84_42a6f5df3c60.slice. 
Sep 9 22:05:19.027421 containerd[1571]: time="2025-09-09T22:05:19.027243605Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5bbc555c75-7252t,Uid:cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9,Namespace:calico-system,Attempt:0,}" Sep 9 22:05:19.036765 kubelet[2845]: E0909 22:05:19.036671 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 9 22:05:19.037728 containerd[1571]: time="2025-09-09T22:05:19.037611905Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-zqjqr,Uid:593c5a36-33f4-4cef-ac0c-f4a95824b3e7,Namespace:kube-system,Attempt:0,}" Sep 9 22:05:19.053753 containerd[1571]: time="2025-09-09T22:05:19.053660383Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6787546b8c-mbr2m,Uid:b81662e5-c9f0-4152-9bb6-b076bea88390,Namespace:calico-apiserver,Attempt:0,}" Sep 9 22:05:19.062816 containerd[1571]: time="2025-09-09T22:05:19.062771777Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-svg5l,Uid:26331079-5184-44f3-8131-980ec1b1f932,Namespace:calico-system,Attempt:0,}" Sep 9 22:05:19.070192 containerd[1571]: time="2025-09-09T22:05:19.070130809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6c6b6d-td8gs,Uid:dd79b509-695c-4e9d-acec-db0b4753f41e,Namespace:calico-apiserver,Attempt:0,}" Sep 9 22:05:19.090289 kubelet[2845]: E0909 22:05:19.089408 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 9 22:05:19.091563 containerd[1571]: time="2025-09-09T22:05:19.091499885Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-w7dk5,Uid:c6471ca4-fc51-48a3-aa84-42a6f5df3c60,Namespace:kube-system,Attempt:0,}" Sep 9 22:05:19.091755 containerd[1571]: 
time="2025-09-09T22:05:19.091730019Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6c6b6d-9txl4,Uid:38023746-1010-4fa8-a0d6-8863807eb181,Namespace:calico-apiserver,Attempt:0,}" Sep 9 22:05:19.127269 containerd[1571]: time="2025-09-09T22:05:19.127208278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-57f6cff959-8vhb8,Uid:c8d9b567-61ce-4f43-b430-38feae93a0e4,Namespace:calico-system,Attempt:0,}" Sep 9 22:05:19.330002 containerd[1571]: time="2025-09-09T22:05:19.329902468Z" level=error msg="Failed to destroy network for sandbox \"6c74bb70ec222b79ec3968ad84001f138bb40ce3681e3ad16282c48344273fc6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.334690 containerd[1571]: time="2025-09-09T22:05:19.334510940Z" level=error msg="Failed to destroy network for sandbox \"76a6a114c947795fbcd78762716db6d31c097caaa20a7e71790d939edbcf7c31\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.345216 containerd[1571]: time="2025-09-09T22:05:19.345155563Z" level=error msg="Failed to destroy network for sandbox \"bf59ccd8237eed2158a5ddddc517700b2429eaf588411398bccb32f90507bea7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.345597 containerd[1571]: time="2025-09-09T22:05:19.345546712Z" level=error msg="Failed to destroy network for sandbox \"3e6a75f4562045f26eb14274eb48609b9d5d6c66b791851bfe7f4ac09ee7dfc5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Sep 9 22:05:19.346184 containerd[1571]: time="2025-09-09T22:05:19.346088105Z" level=error msg="Failed to destroy network for sandbox \"fdad5f29e0e6532be4fc676169ee24f271bc692eb4f0e9f951e36a27c5671da1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.346908 containerd[1571]: time="2025-09-09T22:05:19.346851148Z" level=error msg="Failed to destroy network for sandbox \"91d1203e581e41384917190887e00ec905dfe754f66924e5133bba2804540ac0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.348611 containerd[1571]: time="2025-09-09T22:05:19.348304395Z" level=error msg="Failed to destroy network for sandbox \"cf6a89a3b7b442a299b19da8008fccf878f67f90ceb2adfcc55ae5dec5ef8da8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.368181 containerd[1571]: time="2025-09-09T22:05:19.368053809Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6c6b6d-9txl4,Uid:38023746-1010-4fa8-a0d6-8863807eb181,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c74bb70ec222b79ec3968ad84001f138bb40ce3681e3ad16282c48344273fc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.368580 containerd[1571]: time="2025-09-09T22:05:19.368107160Z" level=error msg="Failed to destroy network for sandbox \"573700d1c8ef06d02eff5ee3397b549480698a4dc08e33366b39471a1a96249b\"" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.368740 containerd[1571]: time="2025-09-09T22:05:19.368131546Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-57f6cff959-8vhb8,Uid:c8d9b567-61ce-4f43-b430-38feae93a0e4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e6a75f4562045f26eb14274eb48609b9d5d6c66b791851bfe7f4ac09ee7dfc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.368833 containerd[1571]: time="2025-09-09T22:05:19.368146925Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-zqjqr,Uid:593c5a36-33f4-4cef-ac0c-f4a95824b3e7,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"76a6a114c947795fbcd78762716db6d31c097caaa20a7e71790d939edbcf7c31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.368833 containerd[1571]: time="2025-09-09T22:05:19.368246223Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-svg5l,Uid:26331079-5184-44f3-8131-980ec1b1f932,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf59ccd8237eed2158a5ddddc517700b2429eaf588411398bccb32f90507bea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.369738 containerd[1571]: 
time="2025-09-09T22:05:19.369652632Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-w7dk5,Uid:c6471ca4-fc51-48a3-aa84-42a6f5df3c60,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fdad5f29e0e6532be4fc676169ee24f271bc692eb4f0e9f951e36a27c5671da1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.371125 containerd[1571]: time="2025-09-09T22:05:19.371060473Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6787546b8c-mbr2m,Uid:b81662e5-c9f0-4152-9bb6-b076bea88390,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"91d1203e581e41384917190887e00ec905dfe754f66924e5133bba2804540ac0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.372332 containerd[1571]: time="2025-09-09T22:05:19.372272734Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5bbc555c75-7252t,Uid:cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf6a89a3b7b442a299b19da8008fccf878f67f90ceb2adfcc55ae5dec5ef8da8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.373833 containerd[1571]: time="2025-09-09T22:05:19.373741040Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6c6b6d-td8gs,Uid:dd79b509-695c-4e9d-acec-db0b4753f41e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"573700d1c8ef06d02eff5ee3397b549480698a4dc08e33366b39471a1a96249b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.380734 kubelet[2845]: E0909 22:05:19.380584 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c74bb70ec222b79ec3968ad84001f138bb40ce3681e3ad16282c48344273fc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.380734 kubelet[2845]: E0909 22:05:19.380644 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fdad5f29e0e6532be4fc676169ee24f271bc692eb4f0e9f951e36a27c5671da1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.380734 kubelet[2845]: E0909 22:05:19.380675 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"91d1203e581e41384917190887e00ec905dfe754f66924e5133bba2804540ac0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.380734 kubelet[2845]: E0909 22:05:19.380694 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf6a89a3b7b442a299b19da8008fccf878f67f90ceb2adfcc55ae5dec5ef8da8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.380734 kubelet[2845]: E0909 22:05:19.380595 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"573700d1c8ef06d02eff5ee3397b549480698a4dc08e33366b39471a1a96249b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.381045 kubelet[2845]: E0909 22:05:19.380728 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"573700d1c8ef06d02eff5ee3397b549480698a4dc08e33366b39471a1a96249b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f6c6b6d-td8gs" Sep 9 22:05:19.381045 kubelet[2845]: E0909 22:05:19.380698 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c74bb70ec222b79ec3968ad84001f138bb40ce3681e3ad16282c48344273fc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f6c6b6d-9txl4" Sep 9 22:05:19.381045 kubelet[2845]: E0909 22:05:19.380777 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c74bb70ec222b79ec3968ad84001f138bb40ce3681e3ad16282c48344273fc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-7f6c6b6d-9txl4" Sep 9 22:05:19.381045 kubelet[2845]: E0909 22:05:19.380603 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76a6a114c947795fbcd78762716db6d31c097caaa20a7e71790d939edbcf7c31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.381202 kubelet[2845]: E0909 22:05:19.380869 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7f6c6b6d-9txl4_calico-apiserver(38023746-1010-4fa8-a0d6-8863807eb181)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7f6c6b6d-9txl4_calico-apiserver(38023746-1010-4fa8-a0d6-8863807eb181)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6c74bb70ec222b79ec3968ad84001f138bb40ce3681e3ad16282c48344273fc6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7f6c6b6d-9txl4" podUID="38023746-1010-4fa8-a0d6-8863807eb181" Sep 9 22:05:19.381202 kubelet[2845]: E0909 22:05:19.380892 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76a6a114c947795fbcd78762716db6d31c097caaa20a7e71790d939edbcf7c31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-zqjqr" Sep 9 22:05:19.381202 kubelet[2845]: E0909 22:05:19.380918 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"76a6a114c947795fbcd78762716db6d31c097caaa20a7e71790d939edbcf7c31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-zqjqr" Sep 9 22:05:19.381373 kubelet[2845]: E0909 22:05:19.380745 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"91d1203e581e41384917190887e00ec905dfe754f66924e5133bba2804540ac0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6787546b8c-mbr2m" Sep 9 22:05:19.381373 kubelet[2845]: E0909 22:05:19.380955 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"91d1203e581e41384917190887e00ec905dfe754f66924e5133bba2804540ac0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6787546b8c-mbr2m" Sep 9 22:05:19.381373 kubelet[2845]: E0909 22:05:19.380965 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-zqjqr_kube-system(593c5a36-33f4-4cef-ac0c-f4a95824b3e7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-zqjqr_kube-system(593c5a36-33f4-4cef-ac0c-f4a95824b3e7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"76a6a114c947795fbcd78762716db6d31c097caaa20a7e71790d939edbcf7c31\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-zqjqr" podUID="593c5a36-33f4-4cef-ac0c-f4a95824b3e7" Sep 9 22:05:19.381534 kubelet[2845]: E0909 22:05:19.380756 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"573700d1c8ef06d02eff5ee3397b549480698a4dc08e33366b39471a1a96249b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f6c6b6d-td8gs" Sep 9 22:05:19.381534 kubelet[2845]: E0909 22:05:19.381004 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7f6c6b6d-td8gs_calico-apiserver(dd79b509-695c-4e9d-acec-db0b4753f41e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7f6c6b6d-td8gs_calico-apiserver(dd79b509-695c-4e9d-acec-db0b4753f41e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"573700d1c8ef06d02eff5ee3397b549480698a4dc08e33366b39471a1a96249b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7f6c6b6d-td8gs" podUID="dd79b509-695c-4e9d-acec-db0b4753f41e" Sep 9 22:05:19.381534 kubelet[2845]: E0909 22:05:19.380650 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf59ccd8237eed2158a5ddddc517700b2429eaf588411398bccb32f90507bea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.381652 kubelet[2845]: E0909 22:05:19.380994 2845 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6787546b8c-mbr2m_calico-apiserver(b81662e5-c9f0-4152-9bb6-b076bea88390)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6787546b8c-mbr2m_calico-apiserver(b81662e5-c9f0-4152-9bb6-b076bea88390)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"91d1203e581e41384917190887e00ec905dfe754f66924e5133bba2804540ac0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6787546b8c-mbr2m" podUID="b81662e5-c9f0-4152-9bb6-b076bea88390" Sep 9 22:05:19.381652 kubelet[2845]: E0909 22:05:19.380698 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fdad5f29e0e6532be4fc676169ee24f271bc692eb4f0e9f951e36a27c5671da1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-w7dk5" Sep 9 22:05:19.381652 kubelet[2845]: E0909 22:05:19.381046 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fdad5f29e0e6532be4fc676169ee24f271bc692eb4f0e9f951e36a27c5671da1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-w7dk5" Sep 9 22:05:19.381776 kubelet[2845]: E0909 22:05:19.381080 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-w7dk5_kube-system(c6471ca4-fc51-48a3-aa84-42a6f5df3c60)\" with CreatePodSandboxError: \"Failed to create 
sandbox for pod \\\"coredns-7c65d6cfc9-w7dk5_kube-system(c6471ca4-fc51-48a3-aa84-42a6f5df3c60)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fdad5f29e0e6532be4fc676169ee24f271bc692eb4f0e9f951e36a27c5671da1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-w7dk5" podUID="c6471ca4-fc51-48a3-aa84-42a6f5df3c60" Sep 9 22:05:19.381776 kubelet[2845]: E0909 22:05:19.380730 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf6a89a3b7b442a299b19da8008fccf878f67f90ceb2adfcc55ae5dec5ef8da8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5bbc555c75-7252t" Sep 9 22:05:19.381776 kubelet[2845]: E0909 22:05:19.381131 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf6a89a3b7b442a299b19da8008fccf878f67f90ceb2adfcc55ae5dec5ef8da8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5bbc555c75-7252t" Sep 9 22:05:19.381914 kubelet[2845]: E0909 22:05:19.381179 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5bbc555c75-7252t_calico-system(cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5bbc555c75-7252t_calico-system(cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"cf6a89a3b7b442a299b19da8008fccf878f67f90ceb2adfcc55ae5dec5ef8da8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5bbc555c75-7252t" podUID="cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9" Sep 9 22:05:19.381914 kubelet[2845]: E0909 22:05:19.381031 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf59ccd8237eed2158a5ddddc517700b2429eaf588411398bccb32f90507bea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-svg5l" Sep 9 22:05:19.381914 kubelet[2845]: E0909 22:05:19.381222 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf59ccd8237eed2158a5ddddc517700b2429eaf588411398bccb32f90507bea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-svg5l" Sep 9 22:05:19.382040 kubelet[2845]: E0909 22:05:19.381253 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-svg5l_calico-system(26331079-5184-44f3-8131-980ec1b1f932)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-svg5l_calico-system(26331079-5184-44f3-8131-980ec1b1f932)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bf59ccd8237eed2158a5ddddc517700b2429eaf588411398bccb32f90507bea7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-svg5l" podUID="26331079-5184-44f3-8131-980ec1b1f932" Sep 9 22:05:19.382040 kubelet[2845]: E0909 22:05:19.380610 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e6a75f4562045f26eb14274eb48609b9d5d6c66b791851bfe7f4ac09ee7dfc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:19.382040 kubelet[2845]: E0909 22:05:19.381308 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e6a75f4562045f26eb14274eb48609b9d5d6c66b791851bfe7f4ac09ee7dfc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-57f6cff959-8vhb8" Sep 9 22:05:19.382176 kubelet[2845]: E0909 22:05:19.381325 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e6a75f4562045f26eb14274eb48609b9d5d6c66b791851bfe7f4ac09ee7dfc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-57f6cff959-8vhb8" Sep 9 22:05:19.382176 kubelet[2845]: E0909 22:05:19.381362 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-57f6cff959-8vhb8_calico-system(c8d9b567-61ce-4f43-b430-38feae93a0e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-kube-controllers-57f6cff959-8vhb8_calico-system(c8d9b567-61ce-4f43-b430-38feae93a0e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3e6a75f4562045f26eb14274eb48609b9d5d6c66b791851bfe7f4ac09ee7dfc5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-57f6cff959-8vhb8" podUID="c8d9b567-61ce-4f43-b430-38feae93a0e4" Sep 9 22:05:19.535751 containerd[1571]: time="2025-09-09T22:05:19.535362749Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 22:05:20.062120 systemd[1]: Created slice kubepods-besteffort-podd2140908_6680_44bb_ab1a_8d40f2e95451.slice - libcontainer container kubepods-besteffort-podd2140908_6680_44bb_ab1a_8d40f2e95451.slice. Sep 9 22:05:20.070223 containerd[1571]: time="2025-09-09T22:05:20.070162550Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6mwcf,Uid:d2140908-6680-44bb-ab1a-8d40f2e95451,Namespace:calico-system,Attempt:0,}" Sep 9 22:05:20.494662 containerd[1571]: time="2025-09-09T22:05:20.494377990Z" level=error msg="Failed to destroy network for sandbox \"47ae05a2bb7530d459addbe78d5689513d387589def5136dc255b679cab6b29e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 22:05:20.497546 systemd[1]: run-netns-cni\x2dec9b3de5\x2dfbbc\x2d10ee\x2d59a7\x2de44bcfdae40a.mount: Deactivated successfully. 
Sep 9 22:05:20.565375 containerd[1571]: time="2025-09-09T22:05:20.565212146Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6mwcf,Uid:d2140908-6680-44bb-ab1a-8d40f2e95451,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"47ae05a2bb7530d459addbe78d5689513d387589def5136dc255b679cab6b29e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:20.566007 kubelet[2845]: E0909 22:05:20.565634 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47ae05a2bb7530d459addbe78d5689513d387589def5136dc255b679cab6b29e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:20.566007 kubelet[2845]: E0909 22:05:20.565714 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47ae05a2bb7530d459addbe78d5689513d387589def5136dc255b679cab6b29e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-6mwcf"
Sep 9 22:05:20.566007 kubelet[2845]: E0909 22:05:20.565736 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47ae05a2bb7530d459addbe78d5689513d387589def5136dc255b679cab6b29e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-6mwcf"
Sep 9 22:05:20.566309 kubelet[2845]: E0909 22:05:20.565791 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-6mwcf_calico-system(d2140908-6680-44bb-ab1a-8d40f2e95451)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-6mwcf_calico-system(d2140908-6680-44bb-ab1a-8d40f2e95451)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"47ae05a2bb7530d459addbe78d5689513d387589def5136dc255b679cab6b29e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-6mwcf" podUID="d2140908-6680-44bb-ab1a-8d40f2e95451"
Sep 9 22:05:21.558459 kernel: hrtimer: interrupt took 10610342 ns
Sep 9 22:05:29.055596 kubelet[2845]: E0909 22:05:29.055545 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:05:31.228992 kubelet[2845]: E0909 22:05:31.228793 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:05:31.229904 kubelet[2845]: E0909 22:05:31.229860 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:05:31.237657 containerd[1571]: time="2025-09-09T22:05:31.237578752Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-svg5l,Uid:26331079-5184-44f3-8131-980ec1b1f932,Namespace:calico-system,Attempt:0,}"
Sep 9 22:05:31.238234 containerd[1571]: time="2025-09-09T22:05:31.238055642Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6c6b6d-td8gs,Uid:dd79b509-695c-4e9d-acec-db0b4753f41e,Namespace:calico-apiserver,Attempt:0,}"
Sep 9 22:05:31.238300 containerd[1571]: time="2025-09-09T22:05:31.238217317Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6c6b6d-9txl4,Uid:38023746-1010-4fa8-a0d6-8863807eb181,Namespace:calico-apiserver,Attempt:0,}"
Sep 9 22:05:31.238300 containerd[1571]: time="2025-09-09T22:05:31.238273743Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-w7dk5,Uid:c6471ca4-fc51-48a3-aa84-42a6f5df3c60,Namespace:kube-system,Attempt:0,}"
Sep 9 22:05:31.238729 containerd[1571]: time="2025-09-09T22:05:31.238241693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-57f6cff959-8vhb8,Uid:c8d9b567-61ce-4f43-b430-38feae93a0e4,Namespace:calico-system,Attempt:0,}"
Sep 9 22:05:32.126263 containerd[1571]: time="2025-09-09T22:05:32.125936292Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5bbc555c75-7252t,Uid:cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9,Namespace:calico-system,Attempt:0,}"
Sep 9 22:05:32.126263 containerd[1571]: time="2025-09-09T22:05:32.126148643Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6787546b8c-mbr2m,Uid:b81662e5-c9f0-4152-9bb6-b076bea88390,Namespace:calico-apiserver,Attempt:0,}"
Sep 9 22:05:32.314955 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount443079182.mount: Deactivated successfully.
Sep 9 22:05:33.055901 kubelet[2845]: E0909 22:05:33.055843 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:05:33.057021 containerd[1571]: time="2025-09-09T22:05:33.056182644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6mwcf,Uid:d2140908-6680-44bb-ab1a-8d40f2e95451,Namespace:calico-system,Attempt:0,}"
Sep 9 22:05:33.057021 containerd[1571]: time="2025-09-09T22:05:33.056454306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-zqjqr,Uid:593c5a36-33f4-4cef-ac0c-f4a95824b3e7,Namespace:kube-system,Attempt:0,}"
Sep 9 22:05:36.948662 containerd[1571]: time="2025-09-09T22:05:36.948591619Z" level=error msg="Failed to destroy network for sandbox \"9606439a008cbd6705f6ec4d5d905116359eb33d5967081ad141649a709f0248\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:36.951109 systemd[1]: run-netns-cni\x2dcc52c3a3\x2d7e15\x2dd17e\x2d937e\x2dac51c7b42bdc.mount: Deactivated successfully.
Sep 9 22:05:37.849660 containerd[1571]: time="2025-09-09T22:05:37.849579519Z" level=error msg="Failed to destroy network for sandbox \"274e8fb2063d7d316a0e4ee928026848762f9c52817098c510ff111ffccd020a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:37.852169 systemd[1]: run-netns-cni\x2d68bf5290\x2db4ef\x2de660\x2df1f5\x2df9bd81f22c9c.mount: Deactivated successfully.
Sep 9 22:05:37.992080 containerd[1571]: time="2025-09-09T22:05:37.992004985Z" level=error msg="Failed to destroy network for sandbox \"d698be75f02d90eeb3aadd2842086cf9ebfd2ea14aabd4f5624ecbc041166cd3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:37.996325 systemd[1]: run-netns-cni\x2dbfc0c3a0\x2df990\x2d3e7f\x2dabb0\x2d4def41e03f1a.mount: Deactivated successfully.
Sep 9 22:05:38.035959 containerd[1571]: time="2025-09-09T22:05:38.035884309Z" level=error msg="Failed to destroy network for sandbox \"10fa5f68e6efd563f13a3bc4ce6448d51c33bf4b11cc3d4b4683843f5450b789\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:38.229658 containerd[1571]: time="2025-09-09T22:05:38.229460158Z" level=error msg="Failed to destroy network for sandbox \"ba1e38c16269f9f39d7b12482d343169c916ab856bc4d6ac38233453f56dba08\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:38.259539 containerd[1571]: time="2025-09-09T22:05:38.259397656Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-svg5l,Uid:26331079-5184-44f3-8131-980ec1b1f932,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9606439a008cbd6705f6ec4d5d905116359eb33d5967081ad141649a709f0248\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:38.259890 kubelet[2845]: E0909 22:05:38.259809 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9606439a008cbd6705f6ec4d5d905116359eb33d5967081ad141649a709f0248\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:38.260358 kubelet[2845]: E0909 22:05:38.259895 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9606439a008cbd6705f6ec4d5d905116359eb33d5967081ad141649a709f0248\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-svg5l"
Sep 9 22:05:38.260358 kubelet[2845]: E0909 22:05:38.259920 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9606439a008cbd6705f6ec4d5d905116359eb33d5967081ad141649a709f0248\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-svg5l"
Sep 9 22:05:38.260358 kubelet[2845]: E0909 22:05:38.259974 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-svg5l_calico-system(26331079-5184-44f3-8131-980ec1b1f932)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-svg5l_calico-system(26331079-5184-44f3-8131-980ec1b1f932)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9606439a008cbd6705f6ec4d5d905116359eb33d5967081ad141649a709f0248\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-svg5l" podUID="26331079-5184-44f3-8131-980ec1b1f932"
Sep 9 22:05:38.460741 containerd[1571]: time="2025-09-09T22:05:38.460650680Z" level=error msg="Failed to destroy network for sandbox \"e7f997d3e5a9ad732753dac0aff489be1d66c44c301dc2200f36d92de3e7e444\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:38.517174 systemd[1]: run-netns-cni\x2d2a9f1c45\x2d4525\x2d23bb\x2d4f3f\x2d26a08329694b.mount: Deactivated successfully.
Sep 9 22:05:38.517322 systemd[1]: run-netns-cni\x2d55b7ac6a\x2d1a2e\x2da516\x2d20f6\x2df5d490349216.mount: Deactivated successfully.
Sep 9 22:05:38.517422 systemd[1]: run-netns-cni\x2dc7e61af2\x2d21ec\x2d2c95\x2d9bce\x2dc61f775a37b4.mount: Deactivated successfully.
Sep 9 22:05:38.631360 containerd[1571]: time="2025-09-09T22:05:38.631280593Z" level=error msg="Failed to destroy network for sandbox \"43fd881ceb34b88cd8cf9baa1b645bec3028ae8dc09c1271d9a11768b359bead\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:38.634509 systemd[1]: run-netns-cni\x2d92ae68da\x2d870a\x2d8123\x2d421e\x2d840f6d9fe799.mount: Deactivated successfully.
Sep 9 22:05:39.057377 containerd[1571]: time="2025-09-09T22:05:39.057127114Z" level=error msg="Failed to destroy network for sandbox \"635bde580b27822c2a57107f3862c31bba33daef44e3421aed72589e1f6b87ce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.064223 systemd[1]: run-netns-cni\x2d31d602db\x2d2112\x2dae23\x2da0eb\x2d25de34ecd09b.mount: Deactivated successfully.
Sep 9 22:05:39.110217 containerd[1571]: time="2025-09-09T22:05:39.110069190Z" level=error msg="Failed to destroy network for sandbox \"32b76b6d72847cb72daea8de2f3c02f3b7b734635f0e59cafbec6a904a5dbbb7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.113650 systemd[1]: run-netns-cni\x2d9b98419d\x2d6adb\x2d6bbe\x2d72fd\x2dccbbbbd44551.mount: Deactivated successfully.
Sep 9 22:05:39.552610 systemd[1]: Started sshd@9-10.0.0.72:22-10.0.0.1:50440.service - OpenSSH per-connection server daemon (10.0.0.1:50440).
Sep 9 22:05:39.690925 containerd[1571]: time="2025-09-09T22:05:39.690828808Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6c6b6d-td8gs,Uid:dd79b509-695c-4e9d-acec-db0b4753f41e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"274e8fb2063d7d316a0e4ee928026848762f9c52817098c510ff111ffccd020a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.691302 kubelet[2845]: E0909 22:05:39.691220 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"274e8fb2063d7d316a0e4ee928026848762f9c52817098c510ff111ffccd020a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.691302 kubelet[2845]: E0909 22:05:39.691314 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"274e8fb2063d7d316a0e4ee928026848762f9c52817098c510ff111ffccd020a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f6c6b6d-td8gs"
Sep 9 22:05:39.691989 kubelet[2845]: E0909 22:05:39.691341 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"274e8fb2063d7d316a0e4ee928026848762f9c52817098c510ff111ffccd020a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f6c6b6d-td8gs"
Sep 9 22:05:39.691989 kubelet[2845]: E0909 22:05:39.691410 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7f6c6b6d-td8gs_calico-apiserver(dd79b509-695c-4e9d-acec-db0b4753f41e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7f6c6b6d-td8gs_calico-apiserver(dd79b509-695c-4e9d-acec-db0b4753f41e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"274e8fb2063d7d316a0e4ee928026848762f9c52817098c510ff111ffccd020a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7f6c6b6d-td8gs" podUID="dd79b509-695c-4e9d-acec-db0b4753f41e"
Sep 9 22:05:39.711230 containerd[1571]: time="2025-09-09T22:05:39.711145150Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-w7dk5,Uid:c6471ca4-fc51-48a3-aa84-42a6f5df3c60,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d698be75f02d90eeb3aadd2842086cf9ebfd2ea14aabd4f5624ecbc041166cd3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.711624 kubelet[2845]: E0909 22:05:39.711511 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d698be75f02d90eeb3aadd2842086cf9ebfd2ea14aabd4f5624ecbc041166cd3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.711694 kubelet[2845]: E0909 22:05:39.711630 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d698be75f02d90eeb3aadd2842086cf9ebfd2ea14aabd4f5624ecbc041166cd3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-w7dk5"
Sep 9 22:05:39.711694 kubelet[2845]: E0909 22:05:39.711656 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d698be75f02d90eeb3aadd2842086cf9ebfd2ea14aabd4f5624ecbc041166cd3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-w7dk5"
Sep 9 22:05:39.711755 kubelet[2845]: E0909 22:05:39.711714 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-w7dk5_kube-system(c6471ca4-fc51-48a3-aa84-42a6f5df3c60)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-w7dk5_kube-system(c6471ca4-fc51-48a3-aa84-42a6f5df3c60)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d698be75f02d90eeb3aadd2842086cf9ebfd2ea14aabd4f5624ecbc041166cd3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-w7dk5" podUID="c6471ca4-fc51-48a3-aa84-42a6f5df3c60"
Sep 9 22:05:39.776689 containerd[1571]: time="2025-09-09T22:05:39.776402261Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6c6b6d-9txl4,Uid:38023746-1010-4fa8-a0d6-8863807eb181,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"10fa5f68e6efd563f13a3bc4ce6448d51c33bf4b11cc3d4b4683843f5450b789\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.776934 kubelet[2845]: E0909 22:05:39.776842 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10fa5f68e6efd563f13a3bc4ce6448d51c33bf4b11cc3d4b4683843f5450b789\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.776934 kubelet[2845]: E0909 22:05:39.776909 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10fa5f68e6efd563f13a3bc4ce6448d51c33bf4b11cc3d4b4683843f5450b789\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f6c6b6d-9txl4"
Sep 9 22:05:39.776934 kubelet[2845]: E0909 22:05:39.776928 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10fa5f68e6efd563f13a3bc4ce6448d51c33bf4b11cc3d4b4683843f5450b789\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f6c6b6d-9txl4"
Sep 9 22:05:39.777068 kubelet[2845]: E0909 22:05:39.776979 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7f6c6b6d-9txl4_calico-apiserver(38023746-1010-4fa8-a0d6-8863807eb181)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7f6c6b6d-9txl4_calico-apiserver(38023746-1010-4fa8-a0d6-8863807eb181)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"10fa5f68e6efd563f13a3bc4ce6448d51c33bf4b11cc3d4b4683843f5450b789\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7f6c6b6d-9txl4" podUID="38023746-1010-4fa8-a0d6-8863807eb181"
Sep 9 22:05:39.847394 sshd[4337]: Accepted publickey for core from 10.0.0.1 port 50440 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:05:39.848005 containerd[1571]: time="2025-09-09T22:05:39.847925024Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-57f6cff959-8vhb8,Uid:c8d9b567-61ce-4f43-b430-38feae93a0e4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba1e38c16269f9f39d7b12482d343169c916ab856bc4d6ac38233453f56dba08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.848569 kubelet[2845]: E0909 22:05:39.848460 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba1e38c16269f9f39d7b12482d343169c916ab856bc4d6ac38233453f56dba08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.848748 kubelet[2845]: E0909 22:05:39.848661 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba1e38c16269f9f39d7b12482d343169c916ab856bc4d6ac38233453f56dba08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-57f6cff959-8vhb8"
Sep 9 22:05:39.848748 kubelet[2845]: E0909 22:05:39.848694 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba1e38c16269f9f39d7b12482d343169c916ab856bc4d6ac38233453f56dba08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-57f6cff959-8vhb8"
Sep 9 22:05:39.848817 kubelet[2845]: E0909 22:05:39.848756 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-57f6cff959-8vhb8_calico-system(c8d9b567-61ce-4f43-b430-38feae93a0e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-57f6cff959-8vhb8_calico-system(c8d9b567-61ce-4f43-b430-38feae93a0e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ba1e38c16269f9f39d7b12482d343169c916ab856bc4d6ac38233453f56dba08\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-57f6cff959-8vhb8" podUID="c8d9b567-61ce-4f43-b430-38feae93a0e4"
Sep 9 22:05:39.849496 sshd-session[4337]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:05:39.859886 systemd-logind[1554]: New session 10 of user core.
Sep 9 22:05:39.869632 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 9 22:05:39.899073 containerd[1571]: time="2025-09-09T22:05:39.898972456Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5bbc555c75-7252t,Uid:cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7f997d3e5a9ad732753dac0aff489be1d66c44c301dc2200f36d92de3e7e444\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.900656 kubelet[2845]: E0909 22:05:39.899367 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7f997d3e5a9ad732753dac0aff489be1d66c44c301dc2200f36d92de3e7e444\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.900656 kubelet[2845]: E0909 22:05:39.899449 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7f997d3e5a9ad732753dac0aff489be1d66c44c301dc2200f36d92de3e7e444\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5bbc555c75-7252t"
Sep 9 22:05:39.900656 kubelet[2845]: E0909 22:05:39.899499 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e7f997d3e5a9ad732753dac0aff489be1d66c44c301dc2200f36d92de3e7e444\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5bbc555c75-7252t"
Sep 9 22:05:39.900779 kubelet[2845]: E0909 22:05:39.899553 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5bbc555c75-7252t_calico-system(cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5bbc555c75-7252t_calico-system(cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e7f997d3e5a9ad732753dac0aff489be1d66c44c301dc2200f36d92de3e7e444\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5bbc555c75-7252t" podUID="cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9"
Sep 9 22:05:39.945006 containerd[1571]: time="2025-09-09T22:05:39.944883717Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6787546b8c-mbr2m,Uid:b81662e5-c9f0-4152-9bb6-b076bea88390,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"43fd881ceb34b88cd8cf9baa1b645bec3028ae8dc09c1271d9a11768b359bead\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.945506 kubelet[2845]: E0909 22:05:39.945442 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43fd881ceb34b88cd8cf9baa1b645bec3028ae8dc09c1271d9a11768b359bead\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.945631 kubelet[2845]: E0909 22:05:39.945540 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43fd881ceb34b88cd8cf9baa1b645bec3028ae8dc09c1271d9a11768b359bead\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6787546b8c-mbr2m"
Sep 9 22:05:39.945631 kubelet[2845]: E0909 22:05:39.945599 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43fd881ceb34b88cd8cf9baa1b645bec3028ae8dc09c1271d9a11768b359bead\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6787546b8c-mbr2m"
Sep 9 22:05:39.945763 kubelet[2845]: E0909 22:05:39.945697 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6787546b8c-mbr2m_calico-apiserver(b81662e5-c9f0-4152-9bb6-b076bea88390)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6787546b8c-mbr2m_calico-apiserver(b81662e5-c9f0-4152-9bb6-b076bea88390)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"43fd881ceb34b88cd8cf9baa1b645bec3028ae8dc09c1271d9a11768b359bead\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6787546b8c-mbr2m" podUID="b81662e5-c9f0-4152-9bb6-b076bea88390"
Sep 9 22:05:39.951449 containerd[1571]: time="2025-09-09T22:05:39.951239390Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:05:39.968053 containerd[1571]: time="2025-09-09T22:05:39.967949467Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6mwcf,Uid:d2140908-6680-44bb-ab1a-8d40f2e95451,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"635bde580b27822c2a57107f3862c31bba33daef44e3421aed72589e1f6b87ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.969362 kubelet[2845]: E0909 22:05:39.968875 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"635bde580b27822c2a57107f3862c31bba33daef44e3421aed72589e1f6b87ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.969765 kubelet[2845]: E0909 22:05:39.969650 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"635bde580b27822c2a57107f3862c31bba33daef44e3421aed72589e1f6b87ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-6mwcf"
Sep 9 22:05:39.969955 kubelet[2845]: E0909 22:05:39.969725 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"635bde580b27822c2a57107f3862c31bba33daef44e3421aed72589e1f6b87ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-6mwcf"
Sep 9 22:05:39.970546 kubelet[2845]: E0909 22:05:39.970153 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-6mwcf_calico-system(d2140908-6680-44bb-ab1a-8d40f2e95451)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-6mwcf_calico-system(d2140908-6680-44bb-ab1a-8d40f2e95451)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"635bde580b27822c2a57107f3862c31bba33daef44e3421aed72589e1f6b87ce\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-6mwcf" podUID="d2140908-6680-44bb-ab1a-8d40f2e95451"
Sep 9 22:05:39.974739 containerd[1571]: time="2025-09-09T22:05:39.974384247Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-zqjqr,Uid:593c5a36-33f4-4cef-ac0c-f4a95824b3e7,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"32b76b6d72847cb72daea8de2f3c02f3b7b734635f0e59cafbec6a904a5dbbb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.976066 kubelet[2845]: E0909 22:05:39.975828 2845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32b76b6d72847cb72daea8de2f3c02f3b7b734635f0e59cafbec6a904a5dbbb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 22:05:39.976521 kubelet[2845]: E0909 22:05:39.976307 2845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32b76b6d72847cb72daea8de2f3c02f3b7b734635f0e59cafbec6a904a5dbbb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-zqjqr"
Sep 9 22:05:39.977015 kubelet[2845]: E0909 22:05:39.976691 2845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32b76b6d72847cb72daea8de2f3c02f3b7b734635f0e59cafbec6a904a5dbbb7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-zqjqr"
Sep 9 22:05:39.977343 kubelet[2845]: E0909 22:05:39.977205 2845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-zqjqr_kube-system(593c5a36-33f4-4cef-ac0c-f4a95824b3e7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-zqjqr_kube-system(593c5a36-33f4-4cef-ac0c-f4a95824b3e7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"32b76b6d72847cb72daea8de2f3c02f3b7b734635f0e59cafbec6a904a5dbbb7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-zqjqr" podUID="593c5a36-33f4-4cef-ac0c-f4a95824b3e7"
Sep 9 22:05:39.978934 containerd[1571]: time="2025-09-09T22:05:39.978875921Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339"
Sep 9 22:05:39.983440 containerd[1571]: time="2025-09-09T22:05:39.983259422Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:05:39.998309 containerd[1571]: time="2025-09-09T22:05:39.998211292Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:05:40.000176 containerd[1571]: time="2025-09-09T22:05:40.000002151Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 20.464587264s"
Sep 9 22:05:40.000176 containerd[1571]: time="2025-09-09T22:05:40.000036495Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\""
Sep 9 22:05:40.019146 containerd[1571]: time="2025-09-09T22:05:40.019076596Z" level=info msg="CreateContainer within sandbox \"aecffa234f9d209b207cef34f7f944c347170e13222b84f44ec7eff8b19854a3\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Sep 9 22:05:40.130121 containerd[1571]: time="2025-09-09T22:05:40.129858086Z" level=info msg="Container ab5e758026bbb6257e27a31e5da491bf2e45409d5f5e47664a8b430c2a783c2b: CDI devices from CRI Config.CDIDevices: []"
Sep 9 22:05:40.130092
systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2696843305.mount: Deactivated successfully. Sep 9 22:05:40.136435 sshd[4340]: Connection closed by 10.0.0.1 port 50440 Sep 9 22:05:40.136869 sshd-session[4337]: pam_unix(sshd:session): session closed for user core Sep 9 22:05:40.147444 systemd[1]: sshd@9-10.0.0.72:22-10.0.0.1:50440.service: Deactivated successfully. Sep 9 22:05:40.152050 systemd[1]: session-10.scope: Deactivated successfully. Sep 9 22:05:40.154198 systemd-logind[1554]: Session 10 logged out. Waiting for processes to exit. Sep 9 22:05:40.157945 systemd-logind[1554]: Removed session 10. Sep 9 22:05:40.185536 containerd[1571]: time="2025-09-09T22:05:40.185436689Z" level=info msg="CreateContainer within sandbox \"aecffa234f9d209b207cef34f7f944c347170e13222b84f44ec7eff8b19854a3\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ab5e758026bbb6257e27a31e5da491bf2e45409d5f5e47664a8b430c2a783c2b\"" Sep 9 22:05:40.186292 containerd[1571]: time="2025-09-09T22:05:40.186215519Z" level=info msg="StartContainer for \"ab5e758026bbb6257e27a31e5da491bf2e45409d5f5e47664a8b430c2a783c2b\"" Sep 9 22:05:40.198976 containerd[1571]: time="2025-09-09T22:05:40.198903134Z" level=info msg="connecting to shim ab5e758026bbb6257e27a31e5da491bf2e45409d5f5e47664a8b430c2a783c2b" address="unix:///run/containerd/s/fc8221fe3c8a0891bffc56138712abaed1a982c1378cc736586f8329c8771a1e" protocol=ttrpc version=3 Sep 9 22:05:40.248964 systemd[1]: Started cri-containerd-ab5e758026bbb6257e27a31e5da491bf2e45409d5f5e47664a8b430c2a783c2b.scope - libcontainer container ab5e758026bbb6257e27a31e5da491bf2e45409d5f5e47664a8b430c2a783c2b. Sep 9 22:05:40.320159 containerd[1571]: time="2025-09-09T22:05:40.320013149Z" level=info msg="StartContainer for \"ab5e758026bbb6257e27a31e5da491bf2e45409d5f5e47664a8b430c2a783c2b\" returns successfully" Sep 9 22:05:40.437570 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. 
Sep 9 22:05:40.437790 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 9 22:05:40.670837 kubelet[2845]: I0909 22:05:40.670452 2845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-xxbpp" podStartSLOduration=2.244703105 podStartE2EDuration="42.670427376s" podCreationTimestamp="2025-09-09 22:04:58 +0000 UTC" firstStartedPulling="2025-09-09 22:04:59.576702214 +0000 UTC m=+40.715390525" lastFinishedPulling="2025-09-09 22:05:40.002426495 +0000 UTC m=+81.141114796" observedRunningTime="2025-09-09 22:05:40.667131718 +0000 UTC m=+81.805820019" watchObservedRunningTime="2025-09-09 22:05:40.670427376 +0000 UTC m=+81.809115667" Sep 9 22:05:40.794453 kubelet[2845]: I0909 22:05:40.794230 2845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9-whisker-ca-bundle\") pod \"cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9\" (UID: \"cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9\") " Sep 9 22:05:40.794453 kubelet[2845]: I0909 22:05:40.794354 2845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r96hj\" (UniqueName: \"kubernetes.io/projected/cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9-kube-api-access-r96hj\") pod \"cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9\" (UID: \"cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9\") " Sep 9 22:05:40.798436 kubelet[2845]: I0909 22:05:40.798385 2845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9" (UID: "cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 9 22:05:40.802280 kubelet[2845]: I0909 22:05:40.802199 2845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9-whisker-backend-key-pair\") pod \"cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9\" (UID: \"cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9\") " Sep 9 22:05:40.803497 kubelet[2845]: I0909 22:05:40.803444 2845 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 9 22:05:40.809391 kubelet[2845]: I0909 22:05:40.809317 2845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9-kube-api-access-r96hj" (OuterVolumeSpecName: "kube-api-access-r96hj") pod "cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9" (UID: "cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9"). InnerVolumeSpecName "kube-api-access-r96hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 9 22:05:40.813768 kubelet[2845]: I0909 22:05:40.813549 2845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9" (UID: "cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 9 22:05:40.904389 kubelet[2845]: I0909 22:05:40.904303 2845 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 9 22:05:40.904389 kubelet[2845]: I0909 22:05:40.904357 2845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r96hj\" (UniqueName: \"kubernetes.io/projected/cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9-kube-api-access-r96hj\") on node \"localhost\" DevicePath \"\"" Sep 9 22:05:40.936379 containerd[1571]: time="2025-09-09T22:05:40.936193827Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ab5e758026bbb6257e27a31e5da491bf2e45409d5f5e47664a8b430c2a783c2b\" id:\"6452d3fcc7c8cc9ededc0fb8c6a29e4d0e5f4311b4af2f4db8f990e55d252e38\" pid:4424 exit_status:1 exited_at:{seconds:1757455540 nanos:935738809}" Sep 9 22:05:41.010970 systemd[1]: var-lib-kubelet-pods-cc8dac7f\x2d68b7\x2d41f2\x2daa4a\x2dc1e355a3ccc9-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dr96hj.mount: Deactivated successfully. Sep 9 22:05:41.011125 systemd[1]: var-lib-kubelet-pods-cc8dac7f\x2d68b7\x2d41f2\x2daa4a\x2dc1e355a3ccc9-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 22:05:41.066651 systemd[1]: Removed slice kubepods-besteffort-podcc8dac7f_68b7_41f2_aa4a_c1e355a3ccc9.slice - libcontainer container kubepods-besteffort-podcc8dac7f_68b7_41f2_aa4a_c1e355a3ccc9.slice. Sep 9 22:05:41.699845 systemd[1]: Created slice kubepods-besteffort-poddc9ed8e4_88c8_4f4c_abde_059855814e9d.slice - libcontainer container kubepods-besteffort-poddc9ed8e4_88c8_4f4c_abde_059855814e9d.slice. 
Sep 9 22:05:41.733794 containerd[1571]: time="2025-09-09T22:05:41.733738673Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ab5e758026bbb6257e27a31e5da491bf2e45409d5f5e47664a8b430c2a783c2b\" id:\"08b9b6aa2a58f328c89bec44f4f1e870f32a5cea50eb4b42be74a22aff261dfc\" pid:4460 exit_status:1 exited_at:{seconds:1757455541 nanos:733354760}" Sep 9 22:05:41.812390 kubelet[2845]: I0909 22:05:41.812317 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4zx4\" (UniqueName: \"kubernetes.io/projected/dc9ed8e4-88c8-4f4c-abde-059855814e9d-kube-api-access-d4zx4\") pod \"whisker-694b9f4766-tzttj\" (UID: \"dc9ed8e4-88c8-4f4c-abde-059855814e9d\") " pod="calico-system/whisker-694b9f4766-tzttj" Sep 9 22:05:41.812390 kubelet[2845]: I0909 22:05:41.812416 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/dc9ed8e4-88c8-4f4c-abde-059855814e9d-whisker-backend-key-pair\") pod \"whisker-694b9f4766-tzttj\" (UID: \"dc9ed8e4-88c8-4f4c-abde-059855814e9d\") " pod="calico-system/whisker-694b9f4766-tzttj" Sep 9 22:05:41.813011 kubelet[2845]: I0909 22:05:41.812443 2845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc9ed8e4-88c8-4f4c-abde-059855814e9d-whisker-ca-bundle\") pod \"whisker-694b9f4766-tzttj\" (UID: \"dc9ed8e4-88c8-4f4c-abde-059855814e9d\") " pod="calico-system/whisker-694b9f4766-tzttj" Sep 9 22:05:42.305337 containerd[1571]: time="2025-09-09T22:05:42.305280297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-694b9f4766-tzttj,Uid:dc9ed8e4-88c8-4f4c-abde-059855814e9d,Namespace:calico-system,Attempt:0,}" Sep 9 22:05:42.971803 systemd-networkd[1455]: vxlan.calico: Link UP Sep 9 22:05:42.971817 systemd-networkd[1455]: vxlan.calico: Gained carrier Sep 9 22:05:43.058410 
kubelet[2845]: I0909 22:05:43.058133 2845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9" path="/var/lib/kubelet/pods/cc8dac7f-68b7-41f2-aa4a-c1e355a3ccc9/volumes" Sep 9 22:05:44.164333 systemd-networkd[1455]: cali3d26e64e205: Link UP Sep 9 22:05:44.165302 systemd-networkd[1455]: cali3d26e64e205: Gained carrier Sep 9 22:05:44.189865 containerd[1571]: 2025-09-09 22:05:43.818 [INFO][4654] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--694b9f4766--tzttj-eth0 whisker-694b9f4766- calico-system dc9ed8e4-88c8-4f4c-abde-059855814e9d 1046 0 2025-09-09 22:05:41 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:694b9f4766 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-694b9f4766-tzttj eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali3d26e64e205 [] [] }} ContainerID="b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" Namespace="calico-system" Pod="whisker-694b9f4766-tzttj" WorkloadEndpoint="localhost-k8s-whisker--694b9f4766--tzttj-" Sep 9 22:05:44.189865 containerd[1571]: 2025-09-09 22:05:43.818 [INFO][4654] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" Namespace="calico-system" Pod="whisker-694b9f4766-tzttj" WorkloadEndpoint="localhost-k8s-whisker--694b9f4766--tzttj-eth0" Sep 9 22:05:44.189865 containerd[1571]: 2025-09-09 22:05:44.111 [INFO][4684] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" HandleID="k8s-pod-network.b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" Workload="localhost-k8s-whisker--694b9f4766--tzttj-eth0" Sep 9 22:05:44.190740 
containerd[1571]: 2025-09-09 22:05:44.112 [INFO][4684] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" HandleID="k8s-pod-network.b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" Workload="localhost-k8s-whisker--694b9f4766--tzttj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00011bb30), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-694b9f4766-tzttj", "timestamp":"2025-09-09 22:05:44.111683024 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 22:05:44.190740 containerd[1571]: 2025-09-09 22:05:44.113 [INFO][4684] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 22:05:44.190740 containerd[1571]: 2025-09-09 22:05:44.113 [INFO][4684] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 22:05:44.190740 containerd[1571]: 2025-09-09 22:05:44.113 [INFO][4684] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 22:05:44.190740 containerd[1571]: 2025-09-09 22:05:44.124 [INFO][4684] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" host="localhost" Sep 9 22:05:44.190740 containerd[1571]: 2025-09-09 22:05:44.133 [INFO][4684] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 22:05:44.190740 containerd[1571]: 2025-09-09 22:05:44.138 [INFO][4684] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 22:05:44.190740 containerd[1571]: 2025-09-09 22:05:44.140 [INFO][4684] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 22:05:44.190740 containerd[1571]: 2025-09-09 22:05:44.142 [INFO][4684] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 22:05:44.190740 containerd[1571]: 2025-09-09 22:05:44.142 [INFO][4684] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" host="localhost" Sep 9 22:05:44.191065 containerd[1571]: 2025-09-09 22:05:44.144 [INFO][4684] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab Sep 9 22:05:44.191065 containerd[1571]: 2025-09-09 22:05:44.148 [INFO][4684] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" host="localhost" Sep 9 22:05:44.191065 containerd[1571]: 2025-09-09 22:05:44.155 [INFO][4684] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" host="localhost" Sep 9 22:05:44.191065 containerd[1571]: 2025-09-09 22:05:44.155 [INFO][4684] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" host="localhost" Sep 9 22:05:44.191065 containerd[1571]: 2025-09-09 22:05:44.155 [INFO][4684] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 22:05:44.191065 containerd[1571]: 2025-09-09 22:05:44.155 [INFO][4684] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" HandleID="k8s-pod-network.b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" Workload="localhost-k8s-whisker--694b9f4766--tzttj-eth0" Sep 9 22:05:44.191215 containerd[1571]: 2025-09-09 22:05:44.159 [INFO][4654] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" Namespace="calico-system" Pod="whisker-694b9f4766-tzttj" WorkloadEndpoint="localhost-k8s-whisker--694b9f4766--tzttj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--694b9f4766--tzttj-eth0", GenerateName:"whisker-694b9f4766-", Namespace:"calico-system", SelfLink:"", UID:"dc9ed8e4-88c8-4f4c-abde-059855814e9d", ResourceVersion:"1046", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 5, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"694b9f4766", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-694b9f4766-tzttj", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3d26e64e205", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:05:44.191215 containerd[1571]: 2025-09-09 22:05:44.159 [INFO][4654] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" Namespace="calico-system" Pod="whisker-694b9f4766-tzttj" WorkloadEndpoint="localhost-k8s-whisker--694b9f4766--tzttj-eth0" Sep 9 22:05:44.191306 containerd[1571]: 2025-09-09 22:05:44.159 [INFO][4654] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3d26e64e205 ContainerID="b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" Namespace="calico-system" Pod="whisker-694b9f4766-tzttj" WorkloadEndpoint="localhost-k8s-whisker--694b9f4766--tzttj-eth0" Sep 9 22:05:44.191306 containerd[1571]: 2025-09-09 22:05:44.165 [INFO][4654] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" Namespace="calico-system" Pod="whisker-694b9f4766-tzttj" WorkloadEndpoint="localhost-k8s-whisker--694b9f4766--tzttj-eth0" Sep 9 22:05:44.191359 containerd[1571]: 2025-09-09 22:05:44.168 [INFO][4654] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" Namespace="calico-system" Pod="whisker-694b9f4766-tzttj" 
WorkloadEndpoint="localhost-k8s-whisker--694b9f4766--tzttj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--694b9f4766--tzttj-eth0", GenerateName:"whisker-694b9f4766-", Namespace:"calico-system", SelfLink:"", UID:"dc9ed8e4-88c8-4f4c-abde-059855814e9d", ResourceVersion:"1046", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 5, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"694b9f4766", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab", Pod:"whisker-694b9f4766-tzttj", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3d26e64e205", MAC:"5e:8a:f3:ee:7c:90", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:05:44.191420 containerd[1571]: 2025-09-09 22:05:44.185 [INFO][4654] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" Namespace="calico-system" Pod="whisker-694b9f4766-tzttj" WorkloadEndpoint="localhost-k8s-whisker--694b9f4766--tzttj-eth0" Sep 9 22:05:44.381981 containerd[1571]: time="2025-09-09T22:05:44.381925210Z" level=info msg="connecting to shim 
b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab" address="unix:///run/containerd/s/6989fd5fcd1decc74e94f05dc1e4a8fe326a07fddbe0c07e7e35f62407012a64" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:05:44.462697 systemd[1]: Started cri-containerd-b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab.scope - libcontainer container b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab. Sep 9 22:05:44.482970 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 22:05:44.595712 containerd[1571]: time="2025-09-09T22:05:44.595600574Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-694b9f4766-tzttj,Uid:dc9ed8e4-88c8-4f4c-abde-059855814e9d,Namespace:calico-system,Attempt:0,} returns sandbox id \"b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab\"" Sep 9 22:05:44.598054 containerd[1571]: time="2025-09-09T22:05:44.597990372Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 22:05:44.895752 systemd-networkd[1455]: vxlan.calico: Gained IPv6LL Sep 9 22:05:45.154189 systemd[1]: Started sshd@10-10.0.0.72:22-10.0.0.1:39056.service - OpenSSH per-connection server daemon (10.0.0.1:39056). Sep 9 22:05:45.328013 sshd[4753]: Accepted publickey for core from 10.0.0.1 port 39056 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo Sep 9 22:05:45.331591 sshd-session[4753]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:05:45.350058 systemd-logind[1554]: New session 11 of user core. Sep 9 22:05:45.366233 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 9 22:05:45.735036 sshd[4756]: Connection closed by 10.0.0.1 port 39056 Sep 9 22:05:45.735584 sshd-session[4753]: pam_unix(sshd:session): session closed for user core Sep 9 22:05:45.741042 systemd[1]: sshd@10-10.0.0.72:22-10.0.0.1:39056.service: Deactivated successfully. 
Sep 9 22:05:45.743450 systemd[1]: session-11.scope: Deactivated successfully. Sep 9 22:05:45.744683 systemd-logind[1554]: Session 11 logged out. Waiting for processes to exit. Sep 9 22:05:45.746370 systemd-logind[1554]: Removed session 11. Sep 9 22:05:45.983752 systemd-networkd[1455]: cali3d26e64e205: Gained IPv6LL Sep 9 22:05:46.930673 containerd[1571]: time="2025-09-09T22:05:46.930586548Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:05:46.949039 containerd[1571]: time="2025-09-09T22:05:46.948973484Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 9 22:05:46.950736 containerd[1571]: time="2025-09-09T22:05:46.950628385Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:05:46.953619 containerd[1571]: time="2025-09-09T22:05:46.953529526Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:05:46.954210 containerd[1571]: time="2025-09-09T22:05:46.954162360Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 2.356115081s" Sep 9 22:05:46.954210 containerd[1571]: time="2025-09-09T22:05:46.954200671Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 9 22:05:46.956412 
containerd[1571]: time="2025-09-09T22:05:46.956380342Z" level=info msg="CreateContainer within sandbox \"b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 22:05:46.967082 containerd[1571]: time="2025-09-09T22:05:46.967019016Z" level=info msg="Container baee12aae4778067debe1c6ce8a7d51da0b042debcd57b989389cc05cdcf4971: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:05:46.976170 containerd[1571]: time="2025-09-09T22:05:46.976121382Z" level=info msg="CreateContainer within sandbox \"b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"baee12aae4778067debe1c6ce8a7d51da0b042debcd57b989389cc05cdcf4971\"" Sep 9 22:05:46.976735 containerd[1571]: time="2025-09-09T22:05:46.976693160Z" level=info msg="StartContainer for \"baee12aae4778067debe1c6ce8a7d51da0b042debcd57b989389cc05cdcf4971\"" Sep 9 22:05:46.977788 containerd[1571]: time="2025-09-09T22:05:46.977760283Z" level=info msg="connecting to shim baee12aae4778067debe1c6ce8a7d51da0b042debcd57b989389cc05cdcf4971" address="unix:///run/containerd/s/6989fd5fcd1decc74e94f05dc1e4a8fe326a07fddbe0c07e7e35f62407012a64" protocol=ttrpc version=3 Sep 9 22:05:47.010627 systemd[1]: Started cri-containerd-baee12aae4778067debe1c6ce8a7d51da0b042debcd57b989389cc05cdcf4971.scope - libcontainer container baee12aae4778067debe1c6ce8a7d51da0b042debcd57b989389cc05cdcf4971. 
Sep 9 22:05:47.057960 kubelet[2845]: E0909 22:05:47.056666 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:05:47.067378 containerd[1571]: time="2025-09-09T22:05:47.067335855Z" level=info msg="StartContainer for \"baee12aae4778067debe1c6ce8a7d51da0b042debcd57b989389cc05cdcf4971\" returns successfully"
Sep 9 22:05:47.069481 containerd[1571]: time="2025-09-09T22:05:47.068695139Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\""
Sep 9 22:05:48.055273 kubelet[2845]: E0909 22:05:48.055206 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:05:49.064215 containerd[1571]: time="2025-09-09T22:05:49.063853149Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-svg5l,Uid:26331079-5184-44f3-8131-980ec1b1f932,Namespace:calico-system,Attempt:0,}"
Sep 9 22:05:50.055808 kubelet[2845]: E0909 22:05:50.055760 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:05:50.056288 containerd[1571]: time="2025-09-09T22:05:50.056075277Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6c6b6d-9txl4,Uid:38023746-1010-4fa8-a0d6-8863807eb181,Namespace:calico-apiserver,Attempt:0,}"
Sep 9 22:05:50.056288 containerd[1571]: time="2025-09-09T22:05:50.056075257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-w7dk5,Uid:c6471ca4-fc51-48a3-aa84-42a6f5df3c60,Namespace:kube-system,Attempt:0,}"
Sep 9 22:05:50.593879 systemd-networkd[1455]: cali6b93512223b: Link UP
Sep 9 22:05:50.594856 systemd-networkd[1455]: cali6b93512223b: Gained carrier
Sep 9 22:05:50.692148 containerd[1571]: 2025-09-09 22:05:49.928 [INFO][4817] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--7988f88666--svg5l-eth0 goldmane-7988f88666- calico-system 26331079-5184-44f3-8131-980ec1b1f932 907 0 2025-09-09 22:04:57 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-7988f88666-svg5l eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali6b93512223b [] [] }} ContainerID="92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" Namespace="calico-system" Pod="goldmane-7988f88666-svg5l" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--svg5l-"
Sep 9 22:05:50.692148 containerd[1571]: 2025-09-09 22:05:49.928 [INFO][4817] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" Namespace="calico-system" Pod="goldmane-7988f88666-svg5l" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--svg5l-eth0"
Sep 9 22:05:50.692148 containerd[1571]: 2025-09-09 22:05:50.310 [INFO][4832] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" HandleID="k8s-pod-network.92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" Workload="localhost-k8s-goldmane--7988f88666--svg5l-eth0"
Sep 9 22:05:50.692982 containerd[1571]: 2025-09-09 22:05:50.311 [INFO][4832] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" HandleID="k8s-pod-network.92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" Workload="localhost-k8s-goldmane--7988f88666--svg5l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c7600), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-7988f88666-svg5l", "timestamp":"2025-09-09 22:05:50.31093284 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 9 22:05:50.692982 containerd[1571]: 2025-09-09 22:05:50.311 [INFO][4832] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 9 22:05:50.692982 containerd[1571]: 2025-09-09 22:05:50.311 [INFO][4832] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 9 22:05:50.692982 containerd[1571]: 2025-09-09 22:05:50.311 [INFO][4832] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
Sep 9 22:05:50.692982 containerd[1571]: 2025-09-09 22:05:50.318 [INFO][4832] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" host="localhost"
Sep 9 22:05:50.692982 containerd[1571]: 2025-09-09 22:05:50.324 [INFO][4832] ipam/ipam.go 394: Looking up existing affinities for host host="localhost"
Sep 9 22:05:50.692982 containerd[1571]: 2025-09-09 22:05:50.329 [INFO][4832] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost"
Sep 9 22:05:50.692982 containerd[1571]: 2025-09-09 22:05:50.332 [INFO][4832] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost"
Sep 9 22:05:50.692982 containerd[1571]: 2025-09-09 22:05:50.334 [INFO][4832] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
Sep 9 22:05:50.692982 containerd[1571]: 2025-09-09 22:05:50.334 [INFO][4832] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" host="localhost"
Sep 9 22:05:50.698535 containerd[1571]: 2025-09-09 22:05:50.336 [INFO][4832] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632
Sep 9 22:05:50.698535 containerd[1571]: 2025-09-09 22:05:50.499 [INFO][4832] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" host="localhost"
Sep 9 22:05:50.698535 containerd[1571]: 2025-09-09 22:05:50.583 [INFO][4832] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" host="localhost"
Sep 9 22:05:50.698535 containerd[1571]: 2025-09-09 22:05:50.583 [INFO][4832] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" host="localhost"
Sep 9 22:05:50.698535 containerd[1571]: 2025-09-09 22:05:50.583 [INFO][4832] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 9 22:05:50.698535 containerd[1571]: 2025-09-09 22:05:50.583 [INFO][4832] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" HandleID="k8s-pod-network.92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" Workload="localhost-k8s-goldmane--7988f88666--svg5l-eth0"
Sep 9 22:05:50.698825 containerd[1571]: 2025-09-09 22:05:50.587 [INFO][4817] cni-plugin/k8s.go 418: Populated endpoint ContainerID="92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" Namespace="calico-system" Pod="goldmane-7988f88666-svg5l" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--svg5l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--svg5l-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"26331079-5184-44f3-8131-980ec1b1f932", ResourceVersion:"907", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-7988f88666-svg5l", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali6b93512223b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 9 22:05:50.698825 containerd[1571]: 2025-09-09 22:05:50.587 [INFO][4817] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" Namespace="calico-system" Pod="goldmane-7988f88666-svg5l" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--svg5l-eth0"
Sep 9 22:05:50.700574 containerd[1571]: 2025-09-09 22:05:50.587 [INFO][4817] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6b93512223b ContainerID="92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" Namespace="calico-system" Pod="goldmane-7988f88666-svg5l" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--svg5l-eth0"
Sep 9 22:05:50.700574 containerd[1571]: 2025-09-09 22:05:50.595 [INFO][4817] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" Namespace="calico-system" Pod="goldmane-7988f88666-svg5l" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--svg5l-eth0"
Sep 9 22:05:50.700665 containerd[1571]: 2025-09-09 22:05:50.595 [INFO][4817] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" Namespace="calico-system" Pod="goldmane-7988f88666-svg5l" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--svg5l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--svg5l-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"26331079-5184-44f3-8131-980ec1b1f932", ResourceVersion:"907", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632", Pod:"goldmane-7988f88666-svg5l", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali6b93512223b", MAC:"be:1c:1d:65:59:0b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 9 22:05:50.700759 containerd[1571]: 2025-09-09 22:05:50.681 [INFO][4817] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" Namespace="calico-system" Pod="goldmane-7988f88666-svg5l" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--svg5l-eth0"
Sep 9 22:05:50.780583 systemd[1]: Started sshd@11-10.0.0.72:22-10.0.0.1:44738.service - OpenSSH per-connection server daemon (10.0.0.1:44738).
Sep 9 22:05:51.056333 containerd[1571]: time="2025-09-09T22:05:51.056279573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6c6b6d-td8gs,Uid:dd79b509-695c-4e9d-acec-db0b4753f41e,Namespace:calico-apiserver,Attempt:0,}"
Sep 9 22:05:51.126251 sshd[4874]: Accepted publickey for core from 10.0.0.1 port 44738 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:05:51.129068 sshd-session[4874]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:05:51.137847 systemd-logind[1554]: New session 12 of user core.
Sep 9 22:05:51.144790 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 9 22:05:51.209807 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2866736748.mount: Deactivated successfully.
Sep 9 22:05:51.361114 systemd-networkd[1455]: cali99fea5fc422: Link UP
Sep 9 22:05:51.362997 systemd-networkd[1455]: cali99fea5fc422: Gained carrier
Sep 9 22:05:51.451148 containerd[1571]: 2025-09-09 22:05:50.892 [INFO][4876] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--w7dk5-eth0 coredns-7c65d6cfc9- kube-system c6471ca4-fc51-48a3-aa84-42a6f5df3c60 911 0 2025-09-09 22:04:22 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-w7dk5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali99fea5fc422 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-w7dk5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--w7dk5-"
Sep 9 22:05:51.451148 containerd[1571]: 2025-09-09 22:05:50.893 [INFO][4876] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-w7dk5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--w7dk5-eth0"
Sep 9 22:05:51.451148 containerd[1571]: 2025-09-09 22:05:50.931 [INFO][4900] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" HandleID="k8s-pod-network.06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" Workload="localhost-k8s-coredns--7c65d6cfc9--w7dk5-eth0"
Sep 9 22:05:51.451448 containerd[1571]: 2025-09-09 22:05:50.932 [INFO][4900] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" HandleID="k8s-pod-network.06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" Workload="localhost-k8s-coredns--7c65d6cfc9--w7dk5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e6b0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-w7dk5", "timestamp":"2025-09-09 22:05:50.931454862 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 9 22:05:51.451448 containerd[1571]: 2025-09-09 22:05:50.932 [INFO][4900] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 9 22:05:51.451448 containerd[1571]: 2025-09-09 22:05:50.932 [INFO][4900] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 9 22:05:51.451448 containerd[1571]: 2025-09-09 22:05:50.932 [INFO][4900] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
Sep 9 22:05:51.451448 containerd[1571]: 2025-09-09 22:05:51.083 [INFO][4900] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" host="localhost"
Sep 9 22:05:51.451448 containerd[1571]: 2025-09-09 22:05:51.091 [INFO][4900] ipam/ipam.go 394: Looking up existing affinities for host host="localhost"
Sep 9 22:05:51.451448 containerd[1571]: 2025-09-09 22:05:51.095 [INFO][4900] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost"
Sep 9 22:05:51.451448 containerd[1571]: 2025-09-09 22:05:51.098 [INFO][4900] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost"
Sep 9 22:05:51.451448 containerd[1571]: 2025-09-09 22:05:51.101 [INFO][4900] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
Sep 9 22:05:51.451448 containerd[1571]: 2025-09-09 22:05:51.101 [INFO][4900] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" host="localhost"
Sep 9 22:05:51.451793 containerd[1571]: 2025-09-09 22:05:51.102 [INFO][4900] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf
Sep 9 22:05:51.451793 containerd[1571]: 2025-09-09 22:05:51.146 [INFO][4900] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" host="localhost"
Sep 9 22:05:51.451793 containerd[1571]: 2025-09-09 22:05:51.352 [INFO][4900] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" host="localhost"
Sep 9 22:05:51.451793 containerd[1571]: 2025-09-09 22:05:51.352 [INFO][4900] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" host="localhost"
Sep 9 22:05:51.451793 containerd[1571]: 2025-09-09 22:05:51.352 [INFO][4900] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 9 22:05:51.451793 containerd[1571]: 2025-09-09 22:05:51.352 [INFO][4900] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" HandleID="k8s-pod-network.06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" Workload="localhost-k8s-coredns--7c65d6cfc9--w7dk5-eth0"
Sep 9 22:05:51.451973 containerd[1571]: 2025-09-09 22:05:51.356 [INFO][4876] cni-plugin/k8s.go 418: Populated endpoint ContainerID="06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-w7dk5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--w7dk5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--w7dk5-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"c6471ca4-fc51-48a3-aa84-42a6f5df3c60", ResourceVersion:"911", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-w7dk5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali99fea5fc422", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 9 22:05:51.452078 containerd[1571]: 2025-09-09 22:05:51.356 [INFO][4876] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-w7dk5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--w7dk5-eth0"
Sep 9 22:05:51.452078 containerd[1571]: 2025-09-09 22:05:51.356 [INFO][4876] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali99fea5fc422 ContainerID="06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-w7dk5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--w7dk5-eth0"
Sep 9 22:05:51.452078 containerd[1571]: 2025-09-09 22:05:51.362 [INFO][4876] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-w7dk5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--w7dk5-eth0"
Sep 9 22:05:51.452174 containerd[1571]: 2025-09-09 22:05:51.363 [INFO][4876] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-w7dk5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--w7dk5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--w7dk5-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"c6471ca4-fc51-48a3-aa84-42a6f5df3c60", ResourceVersion:"911", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf", Pod:"coredns-7c65d6cfc9-w7dk5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali99fea5fc422", MAC:"46:c3:52:20:c4:22", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 9 22:05:51.452174 containerd[1571]: 2025-09-09 22:05:51.448 [INFO][4876] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-w7dk5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--w7dk5-eth0"
Sep 9 22:05:51.535092 sshd[4910]: Connection closed by 10.0.0.1 port 44738
Sep 9 22:05:51.535883 sshd-session[4874]: pam_unix(sshd:session): session closed for user core
Sep 9 22:05:51.547000 systemd[1]: sshd@11-10.0.0.72:22-10.0.0.1:44738.service: Deactivated successfully.
Sep 9 22:05:51.550290 systemd[1]: session-12.scope: Deactivated successfully.
Sep 9 22:05:51.551878 systemd-logind[1554]: Session 12 logged out. Waiting for processes to exit.
Sep 9 22:05:51.555686 systemd-logind[1554]: Removed session 12.
Sep 9 22:05:51.574820 systemd-networkd[1455]: cali73996c91ad7: Link UP
Sep 9 22:05:51.577984 systemd-networkd[1455]: cali73996c91ad7: Gained carrier
Sep 9 22:05:51.593125 containerd[1571]: time="2025-09-09T22:05:51.593008585Z" level=info msg="connecting to shim 92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632" address="unix:///run/containerd/s/714e0d4a082d9c990285b5acbad9a6aa6c704c41f2043b78366deb0f489f25f2" namespace=k8s.io protocol=ttrpc version=3
Sep 9 22:05:51.619264 containerd[1571]: time="2025-09-09T22:05:51.619110968Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:05:51.621442 containerd[1571]: time="2025-09-09T22:05:51.621369626Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545"
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:50.828 [INFO][4852] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0 calico-apiserver-7f6c6b6d- calico-apiserver 38023746-1010-4fa8-a0d6-8863807eb181 914 0 2025-09-09 22:04:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7f6c6b6d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7f6c6b6d-9txl4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali73996c91ad7 [] [] }} ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-9txl4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-"
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:50.830 [INFO][4852] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-9txl4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0"
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.111 [INFO][4890] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" HandleID="k8s-pod-network.4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" Workload="localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0"
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.112 [INFO][4890] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" HandleID="k8s-pod-network.4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" Workload="localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000116fc0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7f6c6b6d-9txl4", "timestamp":"2025-09-09 22:05:51.11196907 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.112 [INFO][4890] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.352 [INFO][4890] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.353 [INFO][4890] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.376 [INFO][4890] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" host="localhost"
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.381 [INFO][4890] ipam/ipam.go 394: Looking up existing affinities for host host="localhost"
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.523 [INFO][4890] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost"
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.527 [INFO][4890] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost"
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.531 [INFO][4890] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.531 [INFO][4890] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" host="localhost"
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.534 [INFO][4890] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.542 [INFO][4890] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" host="localhost"
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.555 [INFO][4890] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" host="localhost"
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.555 [INFO][4890] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" host="localhost"
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.555 [INFO][4890] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 9 22:05:51.621682 containerd[1571]: 2025-09-09 22:05:51.555 [INFO][4890] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" HandleID="k8s-pod-network.4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" Workload="localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0"
Sep 9 22:05:51.622232 containerd[1571]: 2025-09-09 22:05:51.561 [INFO][4852] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-9txl4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0", GenerateName:"calico-apiserver-7f6c6b6d-", Namespace:"calico-apiserver", SelfLink:"", UID:"38023746-1010-4fa8-a0d6-8863807eb181", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7f6c6b6d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7f6c6b6d-9txl4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali73996c91ad7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 9 22:05:51.622232 containerd[1571]: 2025-09-09 22:05:51.561 [INFO][4852] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-9txl4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0"
Sep 9 22:05:51.622232 containerd[1571]: 2025-09-09 22:05:51.561 [INFO][4852] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali73996c91ad7 ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-9txl4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0"
Sep 9 22:05:51.622232 containerd[1571]: 2025-09-09 22:05:51.580 [INFO][4852] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-9txl4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0"
Sep 9 22:05:51.622232 containerd[1571]: 2025-09-09 22:05:51.582 [INFO][4852] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-9txl4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0", GenerateName:"calico-apiserver-7f6c6b6d-", Namespace:"calico-apiserver", SelfLink:"", UID:"38023746-1010-4fa8-a0d6-8863807eb181", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7f6c6b6d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979", Pod:"calico-apiserver-7f6c6b6d-9txl4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali73996c91ad7", MAC:"72:2a:74:d5:14:87", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 9 22:05:51.622232 containerd[1571]: 2025-09-09 22:05:51.606 [INFO][4852] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-9txl4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0"
Sep 9 22:05:51.624072 containerd[1571]: time="2025-09-09T22:05:51.623819696Z" level=info msg="connecting to shim 06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf" address="unix:///run/containerd/s/db2eee85306f765b5b59b3c336b2b7dbe141f5e0f14e58d5b9ce0c753c6215c0" namespace=k8s.io protocol=ttrpc version=3
Sep 9 22:05:51.624313 containerd[1571]: time="2025-09-09T22:05:51.624180025Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:05:51.632719 containerd[1571]: time="2025-09-09T22:05:51.632659923Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:05:51.633289 containerd[1571]: time="2025-09-09T22:05:51.633219328Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 4.564490956s"
Sep 9 22:05:51.633289 containerd[1571]: time="2025-09-09T22:05:51.633279582Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\""
Sep 9 22:05:51.639649 containerd[1571]: time="2025-09-09T22:05:51.639026086Z" level=info msg="CreateContainer within sandbox \"b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}"
Sep 9 22:05:51.666811 containerd[1571]: time="2025-09-09T22:05:51.666608029Z" level=info msg="Container 717a69424fd6a6de3748653fb93dc15b1acbeb800abc151768445565284ce262: CDI devices from CRI Config.CDIDevices: []"
Sep 9 22:05:51.685579 containerd[1571]: time="2025-09-09T22:05:51.685056963Z" level=info msg="connecting to shim 4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" address="unix:///run/containerd/s/cae175c28ec302ddf910784a3d42d236331897465540154d023339480aa0fce4" namespace=k8s.io protocol=ttrpc version=3
Sep 9 22:05:51.698421 containerd[1571]: time="2025-09-09T22:05:51.698277739Z" level=info msg="CreateContainer within sandbox \"b2781f19923dd0ab76303d07d71f3e9a20c33e19a6310cdf4b97df8c627c65ab\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"717a69424fd6a6de3748653fb93dc15b1acbeb800abc151768445565284ce262\""
Sep 9 22:05:51.700497 containerd[1571]: time="2025-09-09T22:05:51.699650878Z" level=info msg="StartContainer for \"717a69424fd6a6de3748653fb93dc15b1acbeb800abc151768445565284ce262\""
Sep 9 22:05:51.701047 containerd[1571]: time="2025-09-09T22:05:51.700992949Z" level=info msg="connecting to shim 717a69424fd6a6de3748653fb93dc15b1acbeb800abc151768445565284ce262" address="unix:///run/containerd/s/6989fd5fcd1decc74e94f05dc1e4a8fe326a07fddbe0c07e7e35f62407012a64" protocol=ttrpc version=3
Sep 9 22:05:51.715919 systemd[1]: Started cri-containerd-92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632.scope - libcontainer container 92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632.
Sep 9 22:05:51.731757 systemd[1]: Started cri-containerd-06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf.scope - libcontainer container 06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf.
Sep 9 22:05:51.738420 systemd[1]: Started cri-containerd-4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979.scope - libcontainer container 4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979. Sep 9 22:05:51.755038 systemd[1]: Started cri-containerd-717a69424fd6a6de3748653fb93dc15b1acbeb800abc151768445565284ce262.scope - libcontainer container 717a69424fd6a6de3748653fb93dc15b1acbeb800abc151768445565284ce262. Sep 9 22:05:51.769388 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 22:05:51.771064 systemd-networkd[1455]: cali10706d7a5cd: Link UP Sep 9 22:05:51.772044 systemd-networkd[1455]: cali10706d7a5cd: Gained carrier Sep 9 22:05:51.780079 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 22:05:51.784404 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.600 [INFO][4931] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0 calico-apiserver-7f6c6b6d- calico-apiserver dd79b509-695c-4e9d-acec-db0b4753f41e 901 0 2025-09-09 22:04:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7f6c6b6d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7f6c6b6d-td8gs eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali10706d7a5cd [] [] }} ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-td8gs" 
WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-" Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.600 [INFO][4931] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-td8gs" WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0" Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.673 [INFO][4983] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" HandleID="k8s-pod-network.1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" Workload="localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0" Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.673 [INFO][4983] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" HandleID="k8s-pod-network.1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" Workload="localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00047c2a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7f6c6b6d-td8gs", "timestamp":"2025-09-09 22:05:51.673041268 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.673 [INFO][4983] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.673 [INFO][4983] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.673 [INFO][4983] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.689 [INFO][4983] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" host="localhost" Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.702 [INFO][4983] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.724 [INFO][4983] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.727 [INFO][4983] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.733 [INFO][4983] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.733 [INFO][4983] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" host="localhost" Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.737 [INFO][4983] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.747 [INFO][4983] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" host="localhost" Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.757 [INFO][4983] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" host="localhost" Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.757 [INFO][4983] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" host="localhost" Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.757 [INFO][4983] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 22:05:51.797693 containerd[1571]: 2025-09-09 22:05:51.758 [INFO][4983] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" HandleID="k8s-pod-network.1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" Workload="localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0" Sep 9 22:05:51.798647 containerd[1571]: 2025-09-09 22:05:51.767 [INFO][4931] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-td8gs" WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0", GenerateName:"calico-apiserver-7f6c6b6d-", Namespace:"calico-apiserver", SelfLink:"", UID:"dd79b509-695c-4e9d-acec-db0b4753f41e", ResourceVersion:"901", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7f6c6b6d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7f6c6b6d-td8gs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali10706d7a5cd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:05:51.798647 containerd[1571]: 2025-09-09 22:05:51.767 [INFO][4931] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-td8gs" WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0" Sep 9 22:05:51.798647 containerd[1571]: 2025-09-09 22:05:51.767 [INFO][4931] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali10706d7a5cd ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-td8gs" WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0" Sep 9 22:05:51.798647 containerd[1571]: 2025-09-09 22:05:51.772 [INFO][4931] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-td8gs" WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0" Sep 9 22:05:51.798647 containerd[1571]: 2025-09-09 22:05:51.773 [INFO][4931] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-td8gs" WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0", GenerateName:"calico-apiserver-7f6c6b6d-", Namespace:"calico-apiserver", SelfLink:"", UID:"dd79b509-695c-4e9d-acec-db0b4753f41e", ResourceVersion:"901", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7f6c6b6d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e", Pod:"calico-apiserver-7f6c6b6d-td8gs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali10706d7a5cd", MAC:"42:16:e6:a3:91:a0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:05:51.798647 containerd[1571]: 2025-09-09 22:05:51.792 [INFO][4931] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" Namespace="calico-apiserver" Pod="calico-apiserver-7f6c6b6d-td8gs" WorkloadEndpoint="localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0" Sep 9 22:05:51.850861 containerd[1571]: time="2025-09-09T22:05:51.850783384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-w7dk5,Uid:c6471ca4-fc51-48a3-aa84-42a6f5df3c60,Namespace:kube-system,Attempt:0,} returns sandbox id \"06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf\"" Sep 9 22:05:51.855493 kubelet[2845]: E0909 22:05:51.854507 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 9 22:05:51.864850 containerd[1571]: time="2025-09-09T22:05:51.864799399Z" level=info msg="CreateContainer within sandbox \"06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 22:05:51.874290 systemd-networkd[1455]: cali6b93512223b: Gained IPv6LL Sep 9 22:05:51.901728 containerd[1571]: time="2025-09-09T22:05:51.901664445Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-svg5l,Uid:26331079-5184-44f3-8131-980ec1b1f932,Namespace:calico-system,Attempt:0,} returns sandbox id \"92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632\"" Sep 9 22:05:51.906691 containerd[1571]: time="2025-09-09T22:05:51.906027611Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 22:05:51.921197 containerd[1571]: time="2025-09-09T22:05:51.921137059Z" level=info msg="StartContainer for \"717a69424fd6a6de3748653fb93dc15b1acbeb800abc151768445565284ce262\" returns successfully" Sep 9 22:05:51.921866 containerd[1571]: time="2025-09-09T22:05:51.921823413Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-7f6c6b6d-9txl4,Uid:38023746-1010-4fa8-a0d6-8863807eb181,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979\"" Sep 9 22:05:51.933412 containerd[1571]: time="2025-09-09T22:05:51.933107118Z" level=info msg="connecting to shim 1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" address="unix:///run/containerd/s/e493c6d6a5772300759e3ced2d2f86e7206d754617d234f2558fd70134f5e446" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:05:51.946157 containerd[1571]: time="2025-09-09T22:05:51.946062243Z" level=info msg="Container 9e47be04c963c557e6a8b7606ccfe080dfd788ddc93115295d9c1e31ffc3fa89: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:05:51.959003 containerd[1571]: time="2025-09-09T22:05:51.958925315Z" level=info msg="CreateContainer within sandbox \"06b8fbc958ff4447b54f8c270a59ef7b3c973cbae351aca2b6017e9ca55b6dbf\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9e47be04c963c557e6a8b7606ccfe080dfd788ddc93115295d9c1e31ffc3fa89\"" Sep 9 22:05:51.961048 containerd[1571]: time="2025-09-09T22:05:51.961005387Z" level=info msg="StartContainer for \"9e47be04c963c557e6a8b7606ccfe080dfd788ddc93115295d9c1e31ffc3fa89\"" Sep 9 22:05:51.962557 containerd[1571]: time="2025-09-09T22:05:51.962305328Z" level=info msg="connecting to shim 9e47be04c963c557e6a8b7606ccfe080dfd788ddc93115295d9c1e31ffc3fa89" address="unix:///run/containerd/s/db2eee85306f765b5b59b3c336b2b7dbe141f5e0f14e58d5b9ce0c753c6215c0" protocol=ttrpc version=3 Sep 9 22:05:51.975111 systemd[1]: Started cri-containerd-1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e.scope - libcontainer container 1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e. 
Sep 9 22:05:51.984651 systemd[1]: Started cri-containerd-9e47be04c963c557e6a8b7606ccfe080dfd788ddc93115295d9c1e31ffc3fa89.scope - libcontainer container 9e47be04c963c557e6a8b7606ccfe080dfd788ddc93115295d9c1e31ffc3fa89. Sep 9 22:05:52.005649 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 22:05:52.046047 containerd[1571]: time="2025-09-09T22:05:52.045978271Z" level=info msg="StartContainer for \"9e47be04c963c557e6a8b7606ccfe080dfd788ddc93115295d9c1e31ffc3fa89\" returns successfully" Sep 9 22:05:52.057505 kubelet[2845]: E0909 22:05:52.056451 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 9 22:05:52.058201 containerd[1571]: time="2025-09-09T22:05:52.057899476Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6mwcf,Uid:d2140908-6680-44bb-ab1a-8d40f2e95451,Namespace:calico-system,Attempt:0,}" Sep 9 22:05:52.058772 containerd[1571]: time="2025-09-09T22:05:52.058035823Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-57f6cff959-8vhb8,Uid:c8d9b567-61ce-4f43-b430-38feae93a0e4,Namespace:calico-system,Attempt:0,}" Sep 9 22:05:52.059488 containerd[1571]: time="2025-09-09T22:05:52.059235695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-zqjqr,Uid:593c5a36-33f4-4cef-ac0c-f4a95824b3e7,Namespace:kube-system,Attempt:0,}" Sep 9 22:05:52.078929 containerd[1571]: time="2025-09-09T22:05:52.078845526Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f6c6b6d-td8gs,Uid:dd79b509-695c-4e9d-acec-db0b4753f41e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e\"" Sep 9 22:05:52.315534 systemd-networkd[1455]: cali152a36c0586: Link UP Sep 9 22:05:52.316219 systemd-networkd[1455]: 
cali152a36c0586: Gained carrier Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.150 [INFO][5246] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--6mwcf-eth0 csi-node-driver- calico-system d2140908-6680-44bb-ab1a-8d40f2e95451 774 0 2025-09-09 22:04:58 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-6mwcf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali152a36c0586 [] [] }} ContainerID="9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" Namespace="calico-system" Pod="csi-node-driver-6mwcf" WorkloadEndpoint="localhost-k8s-csi--node--driver--6mwcf-" Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.150 [INFO][5246] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" Namespace="calico-system" Pod="csi-node-driver-6mwcf" WorkloadEndpoint="localhost-k8s-csi--node--driver--6mwcf-eth0" Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.209 [INFO][5270] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" HandleID="k8s-pod-network.9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" Workload="localhost-k8s-csi--node--driver--6mwcf-eth0" Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.210 [INFO][5270] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" HandleID="k8s-pod-network.9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" 
Workload="localhost-k8s-csi--node--driver--6mwcf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00049b8a0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-6mwcf", "timestamp":"2025-09-09 22:05:52.209283013 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.210 [INFO][5270] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.210 [INFO][5270] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.210 [INFO][5270] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.228 [INFO][5270] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" host="localhost" Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.245 [INFO][5270] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.264 [INFO][5270] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.269 [INFO][5270] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.273 [INFO][5270] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.273 [INFO][5270] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 
handle="k8s-pod-network.9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" host="localhost" Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.276 [INFO][5270] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955 Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.287 [INFO][5270] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" host="localhost" Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.305 [INFO][5270] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" host="localhost" Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.305 [INFO][5270] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" host="localhost" Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.306 [INFO][5270] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 22:05:52.424053 containerd[1571]: 2025-09-09 22:05:52.306 [INFO][5270] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" HandleID="k8s-pod-network.9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" Workload="localhost-k8s-csi--node--driver--6mwcf-eth0" Sep 9 22:05:52.424983 containerd[1571]: 2025-09-09 22:05:52.310 [INFO][5246] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" Namespace="calico-system" Pod="csi-node-driver-6mwcf" WorkloadEndpoint="localhost-k8s-csi--node--driver--6mwcf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--6mwcf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d2140908-6680-44bb-ab1a-8d40f2e95451", ResourceVersion:"774", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-6mwcf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"cali152a36c0586", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:05:52.424983 containerd[1571]: 2025-09-09 22:05:52.311 [INFO][5246] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" Namespace="calico-system" Pod="csi-node-driver-6mwcf" WorkloadEndpoint="localhost-k8s-csi--node--driver--6mwcf-eth0" Sep 9 22:05:52.424983 containerd[1571]: 2025-09-09 22:05:52.311 [INFO][5246] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali152a36c0586 ContainerID="9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" Namespace="calico-system" Pod="csi-node-driver-6mwcf" WorkloadEndpoint="localhost-k8s-csi--node--driver--6mwcf-eth0" Sep 9 22:05:52.424983 containerd[1571]: 2025-09-09 22:05:52.316 [INFO][5246] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" Namespace="calico-system" Pod="csi-node-driver-6mwcf" WorkloadEndpoint="localhost-k8s-csi--node--driver--6mwcf-eth0" Sep 9 22:05:52.424983 containerd[1571]: 2025-09-09 22:05:52.318 [INFO][5246] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" Namespace="calico-system" Pod="csi-node-driver-6mwcf" WorkloadEndpoint="localhost-k8s-csi--node--driver--6mwcf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--6mwcf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d2140908-6680-44bb-ab1a-8d40f2e95451", ResourceVersion:"774", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 58, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955", Pod:"csi-node-driver-6mwcf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali152a36c0586", MAC:"ee:48:b2:a1:66:36", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:05:52.424983 containerd[1571]: 2025-09-09 22:05:52.419 [INFO][5246] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" Namespace="calico-system" Pod="csi-node-driver-6mwcf" WorkloadEndpoint="localhost-k8s-csi--node--driver--6mwcf-eth0" Sep 9 22:05:52.471492 containerd[1571]: time="2025-09-09T22:05:52.469632473Z" level=info msg="connecting to shim 9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955" address="unix:///run/containerd/s/d14c87b801ce636fe0967a0066d90769d30ceea19d9a929c3da0e0dc84c80d32" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:05:52.475352 systemd-networkd[1455]: cali35dd1d86ebc: Link UP Sep 9 22:05:52.478199 systemd-networkd[1455]: cali35dd1d86ebc: Gained carrier Sep 9 22:05:52.500328 containerd[1571]: 
2025-09-09 22:05:52.150 [INFO][5225] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--57f6cff959--8vhb8-eth0 calico-kube-controllers-57f6cff959- calico-system c8d9b567-61ce-4f43-b430-38feae93a0e4 896 0 2025-09-09 22:04:59 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:57f6cff959 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-57f6cff959-8vhb8 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali35dd1d86ebc [] [] }} ContainerID="61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" Namespace="calico-system" Pod="calico-kube-controllers-57f6cff959-8vhb8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57f6cff959--8vhb8-" Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.151 [INFO][5225] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" Namespace="calico-system" Pod="calico-kube-controllers-57f6cff959-8vhb8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57f6cff959--8vhb8-eth0" Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.225 [INFO][5278] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" HandleID="k8s-pod-network.61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" Workload="localhost-k8s-calico--kube--controllers--57f6cff959--8vhb8-eth0" Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.226 [INFO][5278] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" 
HandleID="k8s-pod-network.61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" Workload="localhost-k8s-calico--kube--controllers--57f6cff959--8vhb8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000491f20), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-57f6cff959-8vhb8", "timestamp":"2025-09-09 22:05:52.22555492 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.226 [INFO][5278] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.306 [INFO][5278] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.306 [INFO][5278] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.327 [INFO][5278] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" host="localhost" Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.424 [INFO][5278] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.431 [INFO][5278] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.433 [INFO][5278] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.436 [INFO][5278] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 22:05:52.500328 containerd[1571]: 
2025-09-09 22:05:52.436 [INFO][5278] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" host="localhost" Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.438 [INFO][5278] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0 Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.445 [INFO][5278] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" host="localhost" Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.455 [INFO][5278] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" host="localhost" Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.455 [INFO][5278] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" host="localhost" Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.455 [INFO][5278] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 22:05:52.500328 containerd[1571]: 2025-09-09 22:05:52.455 [INFO][5278] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" HandleID="k8s-pod-network.61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" Workload="localhost-k8s-calico--kube--controllers--57f6cff959--8vhb8-eth0" Sep 9 22:05:52.501013 containerd[1571]: 2025-09-09 22:05:52.465 [INFO][5225] cni-plugin/k8s.go 418: Populated endpoint ContainerID="61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" Namespace="calico-system" Pod="calico-kube-controllers-57f6cff959-8vhb8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57f6cff959--8vhb8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--57f6cff959--8vhb8-eth0", GenerateName:"calico-kube-controllers-57f6cff959-", Namespace:"calico-system", SelfLink:"", UID:"c8d9b567-61ce-4f43-b430-38feae93a0e4", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"57f6cff959", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-57f6cff959-8vhb8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali35dd1d86ebc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:05:52.501013 containerd[1571]: 2025-09-09 22:05:52.466 [INFO][5225] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" Namespace="calico-system" Pod="calico-kube-controllers-57f6cff959-8vhb8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57f6cff959--8vhb8-eth0" Sep 9 22:05:52.501013 containerd[1571]: 2025-09-09 22:05:52.466 [INFO][5225] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali35dd1d86ebc ContainerID="61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" Namespace="calico-system" Pod="calico-kube-controllers-57f6cff959-8vhb8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57f6cff959--8vhb8-eth0" Sep 9 22:05:52.501013 containerd[1571]: 2025-09-09 22:05:52.477 [INFO][5225] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" Namespace="calico-system" Pod="calico-kube-controllers-57f6cff959-8vhb8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57f6cff959--8vhb8-eth0" Sep 9 22:05:52.501013 containerd[1571]: 2025-09-09 22:05:52.482 [INFO][5225] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" Namespace="calico-system" Pod="calico-kube-controllers-57f6cff959-8vhb8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57f6cff959--8vhb8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--57f6cff959--8vhb8-eth0", GenerateName:"calico-kube-controllers-57f6cff959-", Namespace:"calico-system", SelfLink:"", UID:"c8d9b567-61ce-4f43-b430-38feae93a0e4", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"57f6cff959", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0", Pod:"calico-kube-controllers-57f6cff959-8vhb8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali35dd1d86ebc", MAC:"6e:56:17:a1:ca:6c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:05:52.501013 containerd[1571]: 2025-09-09 22:05:52.496 [INFO][5225] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" Namespace="calico-system" Pod="calico-kube-controllers-57f6cff959-8vhb8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--57f6cff959--8vhb8-eth0" Sep 9 22:05:52.521994 systemd[1]: Started cri-containerd-9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955.scope - 
libcontainer container 9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955. Sep 9 22:05:52.564849 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 22:05:52.585548 systemd-networkd[1455]: cali8feef04d51a: Link UP Sep 9 22:05:52.586824 containerd[1571]: time="2025-09-09T22:05:52.586545931Z" level=info msg="connecting to shim 61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0" address="unix:///run/containerd/s/34d3a2818043f217b609b267aae7869343f0344e355b86854c55761d1645463a" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:05:52.587209 systemd-networkd[1455]: cali8feef04d51a: Gained carrier Sep 9 22:05:52.601349 containerd[1571]: time="2025-09-09T22:05:52.601233452Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6mwcf,Uid:d2140908-6680-44bb-ab1a-8d40f2e95451,Namespace:calico-system,Attempt:0,} returns sandbox id \"9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955\"" Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.169 [INFO][5234] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--zqjqr-eth0 coredns-7c65d6cfc9- kube-system 593c5a36-33f4-4cef-ac0c-f4a95824b3e7 913 0 2025-09-09 22:04:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-zqjqr eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8feef04d51a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zqjqr" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--zqjqr-" Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.169 [INFO][5234] 
cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zqjqr" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--zqjqr-eth0" Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.237 [INFO][5285] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" HandleID="k8s-pod-network.40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" Workload="localhost-k8s-coredns--7c65d6cfc9--zqjqr-eth0" Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.238 [INFO][5285] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" HandleID="k8s-pod-network.40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" Workload="localhost-k8s-coredns--7c65d6cfc9--zqjqr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003b76d0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-zqjqr", "timestamp":"2025-09-09 22:05:52.237869828 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.238 [INFO][5285] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.455 [INFO][5285] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.455 [INFO][5285] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.473 [INFO][5285] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" host="localhost" Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.527 [INFO][5285] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.536 [INFO][5285] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.539 [INFO][5285] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.542 [INFO][5285] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.542 [INFO][5285] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" host="localhost" Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.545 [INFO][5285] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.553 [INFO][5285] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" host="localhost" Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.569 [INFO][5285] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" host="localhost" Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.569 [INFO][5285] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" host="localhost" Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.570 [INFO][5285] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 22:05:52.618084 containerd[1571]: 2025-09-09 22:05:52.570 [INFO][5285] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" HandleID="k8s-pod-network.40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" Workload="localhost-k8s-coredns--7c65d6cfc9--zqjqr-eth0" Sep 9 22:05:52.619913 containerd[1571]: 2025-09-09 22:05:52.581 [INFO][5234] cni-plugin/k8s.go 418: Populated endpoint ContainerID="40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zqjqr" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--zqjqr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--zqjqr-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"593c5a36-33f4-4cef-ac0c-f4a95824b3e7", ResourceVersion:"913", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-zqjqr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8feef04d51a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:05:52.619913 containerd[1571]: 2025-09-09 22:05:52.581 [INFO][5234] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zqjqr" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--zqjqr-eth0" Sep 9 22:05:52.619913 containerd[1571]: 2025-09-09 22:05:52.581 [INFO][5234] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8feef04d51a ContainerID="40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zqjqr" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--zqjqr-eth0" Sep 9 22:05:52.619913 containerd[1571]: 2025-09-09 22:05:52.588 [INFO][5234] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zqjqr" 
WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--zqjqr-eth0" Sep 9 22:05:52.619913 containerd[1571]: 2025-09-09 22:05:52.590 [INFO][5234] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zqjqr" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--zqjqr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--zqjqr-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"593c5a36-33f4-4cef-ac0c-f4a95824b3e7", ResourceVersion:"913", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b", Pod:"coredns-7c65d6cfc9-zqjqr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8feef04d51a", MAC:"9a:8d:14:4a:83:9f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:05:52.619913 containerd[1571]: 2025-09-09 22:05:52.603 [INFO][5234] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zqjqr" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--zqjqr-eth0" Sep 9 22:05:52.642913 systemd[1]: Started cri-containerd-61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0.scope - libcontainer container 61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0. Sep 9 22:05:52.657158 containerd[1571]: time="2025-09-09T22:05:52.657089178Z" level=info msg="connecting to shim 40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b" address="unix:///run/containerd/s/a878863aa64eed4d7ffa27988649ede6a53489266b84dfb920a28d4d8f90326f" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:05:52.664365 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 22:05:52.694005 systemd[1]: Started cri-containerd-40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b.scope - libcontainer container 40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b. 
Sep 9 22:05:52.711511 kubelet[2845]: E0909 22:05:52.711191 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 9 22:05:52.718644 containerd[1571]: time="2025-09-09T22:05:52.718393206Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-57f6cff959-8vhb8,Uid:c8d9b567-61ce-4f43-b430-38feae93a0e4,Namespace:calico-system,Attempt:0,} returns sandbox id \"61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0\"" Sep 9 22:05:52.721860 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 22:05:52.746385 kubelet[2845]: I0909 22:05:52.746240 2845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-694b9f4766-tzttj" podStartSLOduration=4.708659136 podStartE2EDuration="11.746218756s" podCreationTimestamp="2025-09-09 22:05:41 +0000 UTC" firstStartedPulling="2025-09-09 22:05:44.597578977 +0000 UTC m=+85.736267278" lastFinishedPulling="2025-09-09 22:05:51.635138607 +0000 UTC m=+92.773826898" observedRunningTime="2025-09-09 22:05:52.704812281 +0000 UTC m=+93.843500602" watchObservedRunningTime="2025-09-09 22:05:52.746218756 +0000 UTC m=+93.884907057" Sep 9 22:05:52.746385 kubelet[2845]: I0909 22:05:52.746374 2845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-w7dk5" podStartSLOduration=90.746368768 podStartE2EDuration="1m30.746368768s" podCreationTimestamp="2025-09-09 22:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 22:05:52.745087092 +0000 UTC m=+93.883775403" watchObservedRunningTime="2025-09-09 22:05:52.746368768 +0000 UTC m=+93.885057069" Sep 9 22:05:52.776279 containerd[1571]: time="2025-09-09T22:05:52.776097517Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-zqjqr,Uid:593c5a36-33f4-4cef-ac0c-f4a95824b3e7,Namespace:kube-system,Attempt:0,} returns sandbox id \"40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b\"" Sep 9 22:05:52.777320 kubelet[2845]: E0909 22:05:52.777249 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 9 22:05:52.784787 containerd[1571]: time="2025-09-09T22:05:52.784686108Z" level=info msg="CreateContainer within sandbox \"40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 22:05:52.822336 containerd[1571]: time="2025-09-09T22:05:52.822280866Z" level=info msg="Container 46c198ea17311d143df7066e8a2e4dc58f981d90bfdd0cc669ae2ca2ede966cf: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:05:52.832676 systemd-networkd[1455]: cali10706d7a5cd: Gained IPv6LL Sep 9 22:05:52.833710 systemd-networkd[1455]: cali99fea5fc422: Gained IPv6LL Sep 9 22:05:52.836686 containerd[1571]: time="2025-09-09T22:05:52.836643435Z" level=info msg="CreateContainer within sandbox \"40ec0336086938bed825f604afc8737d832d2b5aedf13ac1764b6e1864f4f09b\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"46c198ea17311d143df7066e8a2e4dc58f981d90bfdd0cc669ae2ca2ede966cf\"" Sep 9 22:05:52.837981 containerd[1571]: time="2025-09-09T22:05:52.837875427Z" level=info msg="StartContainer for \"46c198ea17311d143df7066e8a2e4dc58f981d90bfdd0cc669ae2ca2ede966cf\"" Sep 9 22:05:52.842302 containerd[1571]: time="2025-09-09T22:05:52.842192696Z" level=info msg="connecting to shim 46c198ea17311d143df7066e8a2e4dc58f981d90bfdd0cc669ae2ca2ede966cf" address="unix:///run/containerd/s/a878863aa64eed4d7ffa27988649ede6a53489266b84dfb920a28d4d8f90326f" protocol=ttrpc version=3 Sep 9 22:05:52.883846 systemd[1]: Started 
cri-containerd-46c198ea17311d143df7066e8a2e4dc58f981d90bfdd0cc669ae2ca2ede966cf.scope - libcontainer container 46c198ea17311d143df7066e8a2e4dc58f981d90bfdd0cc669ae2ca2ede966cf. Sep 9 22:05:52.939876 containerd[1571]: time="2025-09-09T22:05:52.939799576Z" level=info msg="StartContainer for \"46c198ea17311d143df7066e8a2e4dc58f981d90bfdd0cc669ae2ca2ede966cf\" returns successfully" Sep 9 22:05:53.471826 systemd-networkd[1455]: cali73996c91ad7: Gained IPv6LL Sep 9 22:05:53.718393 kubelet[2845]: E0909 22:05:53.718083 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 9 22:05:53.719025 kubelet[2845]: E0909 22:05:53.718653 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 9 22:05:53.736091 kubelet[2845]: I0909 22:05:53.735537 2845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-zqjqr" podStartSLOduration=90.735508185 podStartE2EDuration="1m30.735508185s" podCreationTimestamp="2025-09-09 22:04:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 22:05:53.735050181 +0000 UTC m=+94.873738482" watchObservedRunningTime="2025-09-09 22:05:53.735508185 +0000 UTC m=+94.874196496" Sep 9 22:05:54.056127 containerd[1571]: time="2025-09-09T22:05:54.055710346Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6787546b8c-mbr2m,Uid:b81662e5-c9f0-4152-9bb6-b076bea88390,Namespace:calico-apiserver,Attempt:0,}" Sep 9 22:05:54.111664 systemd-networkd[1455]: cali152a36c0586: Gained IPv6LL Sep 9 22:05:54.431678 systemd-networkd[1455]: cali35dd1d86ebc: Gained IPv6LL Sep 9 22:05:54.559704 systemd-networkd[1455]: cali8feef04d51a: Gained IPv6LL Sep 9 22:05:54.713747 
systemd-networkd[1455]: cali670b5df3fe9: Link UP Sep 9 22:05:54.715444 systemd-networkd[1455]: cali670b5df3fe9: Gained carrier Sep 9 22:05:54.721879 kubelet[2845]: E0909 22:05:54.721827 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 9 22:05:54.722875 kubelet[2845]: E0909 22:05:54.722785 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.272 [INFO][5512] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6787546b8c--mbr2m-eth0 calico-apiserver-6787546b8c- calico-apiserver b81662e5-c9f0-4152-9bb6-b076bea88390 906 0 2025-09-09 22:04:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6787546b8c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6787546b8c-mbr2m eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali670b5df3fe9 [] [] }} ContainerID="b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" Namespace="calico-apiserver" Pod="calico-apiserver-6787546b8c-mbr2m" WorkloadEndpoint="localhost-k8s-calico--apiserver--6787546b8c--mbr2m-" Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.272 [INFO][5512] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" Namespace="calico-apiserver" Pod="calico-apiserver-6787546b8c-mbr2m" WorkloadEndpoint="localhost-k8s-calico--apiserver--6787546b8c--mbr2m-eth0" Sep 9 22:05:54.972760 containerd[1571]: 
2025-09-09 22:05:54.312 [INFO][5528] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" HandleID="k8s-pod-network.b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" Workload="localhost-k8s-calico--apiserver--6787546b8c--mbr2m-eth0" Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.312 [INFO][5528] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" HandleID="k8s-pod-network.b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" Workload="localhost-k8s-calico--apiserver--6787546b8c--mbr2m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001356b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6787546b8c-mbr2m", "timestamp":"2025-09-09 22:05:54.311998015 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.312 [INFO][5528] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.312 [INFO][5528] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.312 [INFO][5528] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.327 [INFO][5528] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" host="localhost" Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.336 [INFO][5528] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.344 [INFO][5528] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.348 [INFO][5528] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.355 [INFO][5528] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.355 [INFO][5528] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" host="localhost" Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.357 [INFO][5528] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784 Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.427 [INFO][5528] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" host="localhost" Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.702 [INFO][5528] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.137/26] block=192.168.88.128/26 
handle="k8s-pod-network.b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" host="localhost" Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.702 [INFO][5528] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.137/26] handle="k8s-pod-network.b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" host="localhost" Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.702 [INFO][5528] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 22:05:54.972760 containerd[1571]: 2025-09-09 22:05:54.702 [INFO][5528] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.137/26] IPv6=[] ContainerID="b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" HandleID="k8s-pod-network.b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" Workload="localhost-k8s-calico--apiserver--6787546b8c--mbr2m-eth0" Sep 9 22:05:55.231715 containerd[1571]: 2025-09-09 22:05:54.709 [INFO][5512] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" Namespace="calico-apiserver" Pod="calico-apiserver-6787546b8c-mbr2m" WorkloadEndpoint="localhost-k8s-calico--apiserver--6787546b8c--mbr2m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6787546b8c--mbr2m-eth0", GenerateName:"calico-apiserver-6787546b8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"b81662e5-c9f0-4152-9bb6-b076bea88390", ResourceVersion:"906", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6787546b8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6787546b8c-mbr2m", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali670b5df3fe9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:05:55.231715 containerd[1571]: 2025-09-09 22:05:54.709 [INFO][5512] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.137/32] ContainerID="b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" Namespace="calico-apiserver" Pod="calico-apiserver-6787546b8c-mbr2m" WorkloadEndpoint="localhost-k8s-calico--apiserver--6787546b8c--mbr2m-eth0" Sep 9 22:05:55.231715 containerd[1571]: 2025-09-09 22:05:54.709 [INFO][5512] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali670b5df3fe9 ContainerID="b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" Namespace="calico-apiserver" Pod="calico-apiserver-6787546b8c-mbr2m" WorkloadEndpoint="localhost-k8s-calico--apiserver--6787546b8c--mbr2m-eth0" Sep 9 22:05:55.231715 containerd[1571]: 2025-09-09 22:05:54.712 [INFO][5512] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" Namespace="calico-apiserver" Pod="calico-apiserver-6787546b8c-mbr2m" WorkloadEndpoint="localhost-k8s-calico--apiserver--6787546b8c--mbr2m-eth0" Sep 9 22:05:55.231715 containerd[1571]: 2025-09-09 22:05:54.713 [INFO][5512] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" Namespace="calico-apiserver" Pod="calico-apiserver-6787546b8c-mbr2m" WorkloadEndpoint="localhost-k8s-calico--apiserver--6787546b8c--mbr2m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6787546b8c--mbr2m-eth0", GenerateName:"calico-apiserver-6787546b8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"b81662e5-c9f0-4152-9bb6-b076bea88390", ResourceVersion:"906", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 22, 4, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6787546b8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784", Pod:"calico-apiserver-6787546b8c-mbr2m", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali670b5df3fe9", MAC:"72:7f:4b:78:5f:4c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 22:05:55.231715 containerd[1571]: 2025-09-09 22:05:54.967 [INFO][5512] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" Namespace="calico-apiserver" Pod="calico-apiserver-6787546b8c-mbr2m" WorkloadEndpoint="localhost-k8s-calico--apiserver--6787546b8c--mbr2m-eth0" Sep 9 22:05:55.723929 kubelet[2845]: E0909 22:05:55.723875 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Sep 9 22:05:55.801814 containerd[1571]: time="2025-09-09T22:05:55.801733442Z" level=info msg="connecting to shim b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784" address="unix:///run/containerd/s/2aa02d25b8246c968ab1d166b5ddd16331fc78ea2b321c23051c2f43bde07e0e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 22:05:55.857789 systemd[1]: Started cri-containerd-b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784.scope - libcontainer container b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784. Sep 9 22:05:55.889531 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 22:05:55.905508 systemd-journald[1194]: Under memory pressure, flushing caches. Sep 9 22:05:55.935038 containerd[1571]: time="2025-09-09T22:05:55.934964141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6787546b8c-mbr2m,Uid:b81662e5-c9f0-4152-9bb6-b076bea88390,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784\"" Sep 9 22:05:56.067481 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount76452088.mount: Deactivated successfully. Sep 9 22:05:56.479779 systemd-networkd[1455]: cali670b5df3fe9: Gained IPv6LL Sep 9 22:05:56.559837 systemd[1]: Started sshd@12-10.0.0.72:22-10.0.0.1:44748.service - OpenSSH per-connection server daemon (10.0.0.1:44748). 
Sep 9 22:05:56.705397 sshd[5595]: Accepted publickey for core from 10.0.0.1 port 44748 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo Sep 9 22:05:56.813388 sshd-session[5595]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:05:56.818732 systemd-logind[1554]: New session 13 of user core. Sep 9 22:05:56.826643 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 9 22:05:57.303942 sshd[5598]: Connection closed by 10.0.0.1 port 44748 Sep 9 22:05:57.304334 sshd-session[5595]: pam_unix(sshd:session): session closed for user core Sep 9 22:05:57.309189 systemd[1]: sshd@12-10.0.0.72:22-10.0.0.1:44748.service: Deactivated successfully. Sep 9 22:05:57.311768 systemd[1]: session-13.scope: Deactivated successfully. Sep 9 22:05:57.312876 systemd-logind[1554]: Session 13 logged out. Waiting for processes to exit. Sep 9 22:05:57.314596 systemd-logind[1554]: Removed session 13. Sep 9 22:05:58.645726 containerd[1571]: time="2025-09-09T22:05:58.645618498Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:05:58.650168 containerd[1571]: time="2025-09-09T22:05:58.650114412Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 9 22:05:58.653250 containerd[1571]: time="2025-09-09T22:05:58.653130908Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:05:58.658461 containerd[1571]: time="2025-09-09T22:05:58.658384259Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:05:58.658461 containerd[1571]: time="2025-09-09T22:05:58.658450554Z" level=info msg="Pulled 
image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 6.752359954s" Sep 9 22:05:58.658684 containerd[1571]: time="2025-09-09T22:05:58.658517290Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 9 22:05:58.659932 containerd[1571]: time="2025-09-09T22:05:58.659884848Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 22:05:58.661581 containerd[1571]: time="2025-09-09T22:05:58.661522274Z" level=info msg="CreateContainer within sandbox \"92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 22:05:58.679656 containerd[1571]: time="2025-09-09T22:05:58.679581072Z" level=info msg="Container 7f2f471f0864daabc92dc14878d9371881bee0fc4914ecdb3eab44f2d2ce52e5: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:05:58.697739 containerd[1571]: time="2025-09-09T22:05:58.697683833Z" level=info msg="CreateContainer within sandbox \"92a9af0dd3bb8945c65e83b0a9eb041a25ee93f1af88869e09ba1b292ee73632\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"7f2f471f0864daabc92dc14878d9371881bee0fc4914ecdb3eab44f2d2ce52e5\"" Sep 9 22:05:58.698401 containerd[1571]: time="2025-09-09T22:05:58.698356160Z" level=info msg="StartContainer for \"7f2f471f0864daabc92dc14878d9371881bee0fc4914ecdb3eab44f2d2ce52e5\"" Sep 9 22:05:58.700257 containerd[1571]: time="2025-09-09T22:05:58.700197731Z" level=info msg="connecting to shim 7f2f471f0864daabc92dc14878d9371881bee0fc4914ecdb3eab44f2d2ce52e5" 
address="unix:///run/containerd/s/714e0d4a082d9c990285b5acbad9a6aa6c704c41f2043b78366deb0f489f25f2" protocol=ttrpc version=3 Sep 9 22:05:58.759827 systemd[1]: Started cri-containerd-7f2f471f0864daabc92dc14878d9371881bee0fc4914ecdb3eab44f2d2ce52e5.scope - libcontainer container 7f2f471f0864daabc92dc14878d9371881bee0fc4914ecdb3eab44f2d2ce52e5. Sep 9 22:05:59.004788 containerd[1571]: time="2025-09-09T22:05:59.004565595Z" level=info msg="StartContainer for \"7f2f471f0864daabc92dc14878d9371881bee0fc4914ecdb3eab44f2d2ce52e5\" returns successfully" Sep 9 22:05:59.400235 containerd[1571]: time="2025-09-09T22:05:59.400126977Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ab5e758026bbb6257e27a31e5da491bf2e45409d5f5e47664a8b430c2a783c2b\" id:\"f18a62b668bc622ba6065ea2ea14746957da2d77f01759da7e56e6123ce9ed1a\" pid:5667 exited_at:{seconds:1757455559 nanos:399594313}" Sep 9 22:05:59.837894 containerd[1571]: time="2025-09-09T22:05:59.837840516Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7f2f471f0864daabc92dc14878d9371881bee0fc4914ecdb3eab44f2d2ce52e5\" id:\"bfa5156ea5835ba7053e1ce1d24ceafaf7d89da2206aa3fc5450257648b58ae3\" pid:5694 exit_status:1 exited_at:{seconds:1757455559 nanos:837260644}" Sep 9 22:06:00.207296 kubelet[2845]: I0909 22:06:00.207037 2845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-svg5l" podStartSLOduration=56.452435935 podStartE2EDuration="1m3.207012797s" podCreationTimestamp="2025-09-09 22:04:57 +0000 UTC" firstStartedPulling="2025-09-09 22:05:51.905173161 +0000 UTC m=+93.043861462" lastFinishedPulling="2025-09-09 22:05:58.659750023 +0000 UTC m=+99.798438324" observedRunningTime="2025-09-09 22:06:00.206750272 +0000 UTC m=+101.345438573" watchObservedRunningTime="2025-09-09 22:06:00.207012797 +0000 UTC m=+101.345701098" Sep 9 22:06:00.910906 containerd[1571]: time="2025-09-09T22:06:00.910828929Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"7f2f471f0864daabc92dc14878d9371881bee0fc4914ecdb3eab44f2d2ce52e5\" id:\"445e95bb138a2ed9891c68741a24807c3d44fac544b57d8e842c0bf64ebf1c37\" pid:5726 exited_at:{seconds:1757455560 nanos:910174075}" Sep 9 22:06:02.320729 systemd[1]: Started sshd@13-10.0.0.72:22-10.0.0.1:37244.service - OpenSSH per-connection server daemon (10.0.0.1:37244). Sep 9 22:06:02.968455 sshd[5741]: Accepted publickey for core from 10.0.0.1 port 37244 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo Sep 9 22:06:02.971031 sshd-session[5741]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:06:02.977919 systemd-logind[1554]: New session 14 of user core. Sep 9 22:06:02.990303 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 9 22:06:03.331187 sshd[5744]: Connection closed by 10.0.0.1 port 37244 Sep 9 22:06:03.336042 sshd-session[5741]: pam_unix(sshd:session): session closed for user core Sep 9 22:06:03.350845 systemd[1]: Started sshd@14-10.0.0.72:22-10.0.0.1:37260.service - OpenSSH per-connection server daemon (10.0.0.1:37260). Sep 9 22:06:03.351922 systemd[1]: sshd@13-10.0.0.72:22-10.0.0.1:37244.service: Deactivated successfully. Sep 9 22:06:03.358613 systemd[1]: session-14.scope: Deactivated successfully. Sep 9 22:06:03.369871 systemd-logind[1554]: Session 14 logged out. Waiting for processes to exit. Sep 9 22:06:03.377418 systemd-logind[1554]: Removed session 14. Sep 9 22:06:03.441698 sshd[5762]: Accepted publickey for core from 10.0.0.1 port 37260 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo Sep 9 22:06:03.442907 sshd-session[5762]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:06:03.452832 systemd-logind[1554]: New session 15 of user core. Sep 9 22:06:03.462830 systemd[1]: Started session-15.scope - Session 15 of User core. 
Sep 9 22:06:03.683440 sshd[5774]: Connection closed by 10.0.0.1 port 37260 Sep 9 22:06:03.684492 sshd-session[5762]: pam_unix(sshd:session): session closed for user core Sep 9 22:06:03.696312 systemd[1]: sshd@14-10.0.0.72:22-10.0.0.1:37260.service: Deactivated successfully. Sep 9 22:06:03.699952 systemd[1]: session-15.scope: Deactivated successfully. Sep 9 22:06:03.703283 systemd-logind[1554]: Session 15 logged out. Waiting for processes to exit. Sep 9 22:06:03.707441 systemd-logind[1554]: Removed session 15. Sep 9 22:06:03.710042 systemd[1]: Started sshd@15-10.0.0.72:22-10.0.0.1:37276.service - OpenSSH per-connection server daemon (10.0.0.1:37276). Sep 9 22:06:03.778814 sshd[5785]: Accepted publickey for core from 10.0.0.1 port 37276 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo Sep 9 22:06:03.782079 sshd-session[5785]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:06:03.790358 systemd-logind[1554]: New session 16 of user core. Sep 9 22:06:03.802177 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 9 22:06:03.979595 sshd[5788]: Connection closed by 10.0.0.1 port 37276 Sep 9 22:06:03.981104 sshd-session[5785]: pam_unix(sshd:session): session closed for user core Sep 9 22:06:03.986962 systemd-logind[1554]: Session 16 logged out. Waiting for processes to exit. Sep 9 22:06:03.988142 systemd[1]: sshd@15-10.0.0.72:22-10.0.0.1:37276.service: Deactivated successfully. Sep 9 22:06:03.991987 systemd[1]: session-16.scope: Deactivated successfully. Sep 9 22:06:03.995323 systemd-logind[1554]: Removed session 16. 
Sep 9 22:06:04.105052 containerd[1571]: time="2025-09-09T22:06:04.104958432Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:06:04.106675 containerd[1571]: time="2025-09-09T22:06:04.106632477Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 9 22:06:04.108636 containerd[1571]: time="2025-09-09T22:06:04.108552144Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:06:04.111815 containerd[1571]: time="2025-09-09T22:06:04.111769798Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:06:04.112625 containerd[1571]: time="2025-09-09T22:06:04.112555829Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 5.452600529s" Sep 9 22:06:04.112625 containerd[1571]: time="2025-09-09T22:06:04.112623306Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 22:06:04.114369 containerd[1571]: time="2025-09-09T22:06:04.114327347Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 22:06:04.115381 containerd[1571]: time="2025-09-09T22:06:04.115324276Z" level=info msg="CreateContainer within sandbox 
\"4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 22:06:04.133187 containerd[1571]: time="2025-09-09T22:06:04.133100284Z" level=info msg="Container 802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:06:04.153382 containerd[1571]: time="2025-09-09T22:06:04.153310038Z" level=info msg="CreateContainer within sandbox \"4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0\"" Sep 9 22:06:04.154020 containerd[1571]: time="2025-09-09T22:06:04.153995850Z" level=info msg="StartContainer for \"802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0\"" Sep 9 22:06:04.155409 containerd[1571]: time="2025-09-09T22:06:04.155348710Z" level=info msg="connecting to shim 802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0" address="unix:///run/containerd/s/cae175c28ec302ddf910784a3d42d236331897465540154d023339480aa0fce4" protocol=ttrpc version=3 Sep 9 22:06:04.185727 systemd[1]: Started cri-containerd-802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0.scope - libcontainer container 802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0. 
Sep 9 22:06:04.248007 containerd[1571]: time="2025-09-09T22:06:04.247833673Z" level=info msg="StartContainer for \"802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0\" returns successfully" Sep 9 22:06:04.643879 containerd[1571]: time="2025-09-09T22:06:04.643797506Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:06:04.666752 containerd[1571]: time="2025-09-09T22:06:04.666669427Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 22:06:04.669137 containerd[1571]: time="2025-09-09T22:06:04.668988758Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 554.630814ms" Sep 9 22:06:04.669137 containerd[1571]: time="2025-09-09T22:06:04.669021840Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 22:06:04.670518 containerd[1571]: time="2025-09-09T22:06:04.670239595Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 22:06:04.671916 containerd[1571]: time="2025-09-09T22:06:04.671848016Z" level=info msg="CreateContainer within sandbox \"1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 22:06:04.688600 containerd[1571]: time="2025-09-09T22:06:04.688510144Z" level=info msg="Container e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:06:04.703764 containerd[1571]: 
time="2025-09-09T22:06:04.703687907Z" level=info msg="CreateContainer within sandbox \"1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85\"" Sep 9 22:06:04.704987 containerd[1571]: time="2025-09-09T22:06:04.704958971Z" level=info msg="StartContainer for \"e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85\"" Sep 9 22:06:04.706391 containerd[1571]: time="2025-09-09T22:06:04.706362396Z" level=info msg="connecting to shim e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85" address="unix:///run/containerd/s/e493c6d6a5772300759e3ced2d2f86e7206d754617d234f2558fd70134f5e446" protocol=ttrpc version=3 Sep 9 22:06:04.742844 systemd[1]: Started cri-containerd-e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85.scope - libcontainer container e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85. 
Sep 9 22:06:04.784267 kubelet[2845]: I0909 22:06:04.783682 2845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7f6c6b6d-9txl4" podStartSLOduration=62.600341021 podStartE2EDuration="1m14.7836604s" podCreationTimestamp="2025-09-09 22:04:50 +0000 UTC" firstStartedPulling="2025-09-09 22:05:51.930177423 +0000 UTC m=+93.068865724" lastFinishedPulling="2025-09-09 22:06:04.113496792 +0000 UTC m=+105.252185103" observedRunningTime="2025-09-09 22:06:04.782694189 +0000 UTC m=+105.921382490" watchObservedRunningTime="2025-09-09 22:06:04.7836604 +0000 UTC m=+105.922348701" Sep 9 22:06:04.829086 containerd[1571]: time="2025-09-09T22:06:04.829025829Z" level=info msg="StartContainer for \"e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85\" returns successfully" Sep 9 22:06:06.145498 kubelet[2845]: I0909 22:06:06.145381 2845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7f6c6b6d-td8gs" podStartSLOduration=63.556530793 podStartE2EDuration="1m16.145296156s" podCreationTimestamp="2025-09-09 22:04:50 +0000 UTC" firstStartedPulling="2025-09-09 22:05:52.081334639 +0000 UTC m=+93.220022950" lastFinishedPulling="2025-09-09 22:06:04.670100012 +0000 UTC m=+105.808788313" observedRunningTime="2025-09-09 22:06:06.143688477 +0000 UTC m=+107.282376778" watchObservedRunningTime="2025-09-09 22:06:06.145296156 +0000 UTC m=+107.283984477" Sep 9 22:06:06.772584 kubelet[2845]: I0909 22:06:06.772532 2845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 22:06:08.232388 containerd[1571]: time="2025-09-09T22:06:08.232258497Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:06:08.235689 containerd[1571]: time="2025-09-09T22:06:08.235646560Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes 
read=8760527" Sep 9 22:06:08.258806 containerd[1571]: time="2025-09-09T22:06:08.258745223Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:06:08.310527 containerd[1571]: time="2025-09-09T22:06:08.310394051Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 22:06:08.311460 containerd[1571]: time="2025-09-09T22:06:08.311418001Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 3.641151365s" Sep 9 22:06:08.311755 containerd[1571]: time="2025-09-09T22:06:08.311696114Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 9 22:06:08.312831 containerd[1571]: time="2025-09-09T22:06:08.312797179Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 22:06:08.314695 containerd[1571]: time="2025-09-09T22:06:08.314644090Z" level=info msg="CreateContainer within sandbox \"9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 22:06:08.780884 containerd[1571]: time="2025-09-09T22:06:08.780825850Z" level=info msg="Container cd71370bb398c765c9a58fa7e107c3d1cdf578d57b0a5cab5c1ecea1274d3feb: CDI devices from CRI Config.CDIDevices: []" Sep 9 22:06:08.786966 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4011719262.mount: Deactivated successfully. 
Sep 9 22:06:08.993234 systemd[1]: Started sshd@16-10.0.0.72:22-10.0.0.1:37292.service - OpenSSH per-connection server daemon (10.0.0.1:37292). Sep 9 22:06:09.143564 sshd[5891]: Accepted publickey for core from 10.0.0.1 port 37292 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo Sep 9 22:06:09.145352 sshd-session[5891]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 22:06:09.151366 systemd-logind[1554]: New session 17 of user core. Sep 9 22:06:09.163662 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 9 22:06:09.332413 containerd[1571]: time="2025-09-09T22:06:09.332331483Z" level=info msg="CreateContainer within sandbox \"9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"cd71370bb398c765c9a58fa7e107c3d1cdf578d57b0a5cab5c1ecea1274d3feb\"" Sep 9 22:06:09.333402 containerd[1571]: time="2025-09-09T22:06:09.333337829Z" level=info msg="StartContainer for \"cd71370bb398c765c9a58fa7e107c3d1cdf578d57b0a5cab5c1ecea1274d3feb\"" Sep 9 22:06:09.335322 containerd[1571]: time="2025-09-09T22:06:09.335266232Z" level=info msg="connecting to shim cd71370bb398c765c9a58fa7e107c3d1cdf578d57b0a5cab5c1ecea1274d3feb" address="unix:///run/containerd/s/d14c87b801ce636fe0967a0066d90769d30ceea19d9a929c3da0e0dc84c80d32" protocol=ttrpc version=3 Sep 9 22:06:09.358646 systemd[1]: Started cri-containerd-cd71370bb398c765c9a58fa7e107c3d1cdf578d57b0a5cab5c1ecea1274d3feb.scope - libcontainer container cd71370bb398c765c9a58fa7e107c3d1cdf578d57b0a5cab5c1ecea1274d3feb. Sep 9 22:06:09.454045 sshd[5895]: Connection closed by 10.0.0.1 port 37292 Sep 9 22:06:09.434625 systemd[1]: sshd@16-10.0.0.72:22-10.0.0.1:37292.service: Deactivated successfully. Sep 9 22:06:09.429140 sshd-session[5891]: pam_unix(sshd:session): session closed for user core Sep 9 22:06:09.436946 systemd[1]: session-17.scope: Deactivated successfully. 
Sep 9 22:06:09.437902 systemd-logind[1554]: Session 17 logged out. Waiting for processes to exit.
Sep 9 22:06:09.439933 systemd-logind[1554]: Removed session 17.
Sep 9 22:06:09.504962 containerd[1571]: time="2025-09-09T22:06:09.504893409Z" level=info msg="StartContainer for \"cd71370bb398c765c9a58fa7e107c3d1cdf578d57b0a5cab5c1ecea1274d3feb\" returns successfully"
Sep 9 22:06:14.055859 kubelet[2845]: E0909 22:06:14.055790 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:06:14.446809 systemd[1]: Started sshd@17-10.0.0.72:22-10.0.0.1:49842.service - OpenSSH per-connection server daemon (10.0.0.1:49842).
Sep 9 22:06:15.185277 containerd[1571]: time="2025-09-09T22:06:15.185189526Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:06:15.187301 containerd[1571]: time="2025-09-09T22:06:15.187228287Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746"
Sep 9 22:06:15.192997 sshd[5944]: Accepted publickey for core from 10.0.0.1 port 49842 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:06:15.195748 sshd-session[5944]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:06:15.197367 containerd[1571]: time="2025-09-09T22:06:15.197322594Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:06:15.205907 systemd-logind[1554]: New session 18 of user core.
Sep 9 22:06:15.211787 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 9 22:06:15.232598 containerd[1571]: time="2025-09-09T22:06:15.232407841Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:06:15.237406 containerd[1571]: time="2025-09-09T22:06:15.237344962Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 6.924515703s"
Sep 9 22:06:15.237406 containerd[1571]: time="2025-09-09T22:06:15.237392241Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\""
Sep 9 22:06:15.239027 containerd[1571]: time="2025-09-09T22:06:15.238990451Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 9 22:06:15.253053 containerd[1571]: time="2025-09-09T22:06:15.252911096Z" level=info msg="CreateContainer within sandbox \"61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Sep 9 22:06:15.457983 containerd[1571]: time="2025-09-09T22:06:15.457833550Z" level=info msg="Container af1d71f510f12b2c00f92ddce64334f0d4e9eceb23ae87129f9df8db16f92f20: CDI devices from CRI Config.CDIDevices: []"
Sep 9 22:06:15.508012 containerd[1571]: time="2025-09-09T22:06:15.507952348Z" level=info msg="CreateContainer within sandbox \"61124e219529d9d8e6fa3e413c50d1385eb5b81a2b9a4c077643c51be4b754e0\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"af1d71f510f12b2c00f92ddce64334f0d4e9eceb23ae87129f9df8db16f92f20\""
Sep 9 22:06:15.508840 containerd[1571]: time="2025-09-09T22:06:15.508763987Z" level=info msg="StartContainer for \"af1d71f510f12b2c00f92ddce64334f0d4e9eceb23ae87129f9df8db16f92f20\""
Sep 9 22:06:15.510381 containerd[1571]: time="2025-09-09T22:06:15.510346338Z" level=info msg="connecting to shim af1d71f510f12b2c00f92ddce64334f0d4e9eceb23ae87129f9df8db16f92f20" address="unix:///run/containerd/s/34d3a2818043f217b609b267aae7869343f0344e355b86854c55761d1645463a" protocol=ttrpc version=3
Sep 9 22:06:15.564795 systemd[1]: Started cri-containerd-af1d71f510f12b2c00f92ddce64334f0d4e9eceb23ae87129f9df8db16f92f20.scope - libcontainer container af1d71f510f12b2c00f92ddce64334f0d4e9eceb23ae87129f9df8db16f92f20.
Sep 9 22:06:15.663242 containerd[1571]: time="2025-09-09T22:06:15.663171314Z" level=info msg="StartContainer for \"af1d71f510f12b2c00f92ddce64334f0d4e9eceb23ae87129f9df8db16f92f20\" returns successfully"
Sep 9 22:06:15.725661 sshd[5950]: Connection closed by 10.0.0.1 port 49842
Sep 9 22:06:15.726701 sshd-session[5944]: pam_unix(sshd:session): session closed for user core
Sep 9 22:06:15.733139 systemd-logind[1554]: Session 18 logged out. Waiting for processes to exit.
Sep 9 22:06:15.733802 systemd[1]: sshd@17-10.0.0.72:22-10.0.0.1:49842.service: Deactivated successfully.
Sep 9 22:06:15.736716 systemd[1]: session-18.scope: Deactivated successfully.
Sep 9 22:06:15.742876 systemd-logind[1554]: Removed session 18.
Sep 9 22:06:15.793444 containerd[1571]: time="2025-09-09T22:06:15.793376036Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:06:15.795559 containerd[1571]: time="2025-09-09T22:06:15.795496010Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77"
Sep 9 22:06:15.798426 containerd[1571]: time="2025-09-09T22:06:15.798364404Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 559.332244ms"
Sep 9 22:06:15.798426 containerd[1571]: time="2025-09-09T22:06:15.798415380Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\""
Sep 9 22:06:15.799769 containerd[1571]: time="2025-09-09T22:06:15.799732661Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 9 22:06:15.801247 containerd[1571]: time="2025-09-09T22:06:15.801185819Z" level=info msg="CreateContainer within sandbox \"b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 9 22:06:15.842638 containerd[1571]: time="2025-09-09T22:06:15.842582717Z" level=info msg="Container 92354cb87d58c743a57a2ff14340265bd6c9ba5aad7abb0b3a78ad92d0da355c: CDI devices from CRI Config.CDIDevices: []"
Sep 9 22:06:15.864973 containerd[1571]: time="2025-09-09T22:06:15.864860335Z" level=info msg="TaskExit event in podsandbox handler container_id:\"af1d71f510f12b2c00f92ddce64334f0d4e9eceb23ae87129f9df8db16f92f20\" id:\"72a7dfa351787712ba91acc0d829a34baa7dcae8bb3f678e6355d0e52cf4ef58\" pid:6027 exit_status:1 exited_at:{seconds:1757455575 nanos:864485900}"
Sep 9 22:06:15.884332 kubelet[2845]: I0909 22:06:15.884115 2845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-57f6cff959-8vhb8" podStartSLOduration=54.369391355 podStartE2EDuration="1m16.8840892s" podCreationTimestamp="2025-09-09 22:04:59 +0000 UTC" firstStartedPulling="2025-09-09 22:05:52.724027848 +0000 UTC m=+93.862716149" lastFinishedPulling="2025-09-09 22:06:15.238725693 +0000 UTC m=+116.377413994" observedRunningTime="2025-09-09 22:06:15.880260147 +0000 UTC m=+117.018948468" watchObservedRunningTime="2025-09-09 22:06:15.8840892 +0000 UTC m=+117.022777512"
Sep 9 22:06:15.895643 containerd[1571]: time="2025-09-09T22:06:15.895569568Z" level=info msg="CreateContainer within sandbox \"b2d1e3cd38e84f68e7a371588ef9f1c964ce41bb615c437fe3de66b85b937784\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"92354cb87d58c743a57a2ff14340265bd6c9ba5aad7abb0b3a78ad92d0da355c\""
Sep 9 22:06:15.896422 containerd[1571]: time="2025-09-09T22:06:15.896373342Z" level=info msg="StartContainer for \"92354cb87d58c743a57a2ff14340265bd6c9ba5aad7abb0b3a78ad92d0da355c\""
Sep 9 22:06:15.897706 containerd[1571]: time="2025-09-09T22:06:15.897666879Z" level=info msg="connecting to shim 92354cb87d58c743a57a2ff14340265bd6c9ba5aad7abb0b3a78ad92d0da355c" address="unix:///run/containerd/s/2aa02d25b8246c968ab1d166b5ddd16331fc78ea2b321c23051c2f43bde07e0e" protocol=ttrpc version=3
Sep 9 22:06:15.928835 systemd[1]: Started cri-containerd-92354cb87d58c743a57a2ff14340265bd6c9ba5aad7abb0b3a78ad92d0da355c.scope - libcontainer container 92354cb87d58c743a57a2ff14340265bd6c9ba5aad7abb0b3a78ad92d0da355c.
Sep 9 22:06:16.065515 systemd-journald[1194]: Under memory pressure, flushing caches.
Sep 9 22:06:16.149328 containerd[1571]: time="2025-09-09T22:06:16.149270578Z" level=info msg="StartContainer for \"92354cb87d58c743a57a2ff14340265bd6c9ba5aad7abb0b3a78ad92d0da355c\" returns successfully"
Sep 9 22:06:16.860287 containerd[1571]: time="2025-09-09T22:06:16.860226335Z" level=info msg="TaskExit event in podsandbox handler container_id:\"af1d71f510f12b2c00f92ddce64334f0d4e9eceb23ae87129f9df8db16f92f20\" id:\"13dbe91b0b2d5c8bb7d49befcf78db1bc0405372bd185ad5ddcf1eb39fc78434\" pid:6087 exited_at:{seconds:1757455576 nanos:859981174}"
Sep 9 22:06:17.046615 kubelet[2845]: I0909 22:06:17.046396 2845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6787546b8c-mbr2m" podStartSLOduration=66.183746956 podStartE2EDuration="1m26.046362367s" podCreationTimestamp="2025-09-09 22:04:51 +0000 UTC" firstStartedPulling="2025-09-09 22:05:55.936880504 +0000 UTC m=+97.075568805" lastFinishedPulling="2025-09-09 22:06:15.799495915 +0000 UTC m=+116.938184216" observedRunningTime="2025-09-09 22:06:17.044340558 +0000 UTC m=+118.183028859" watchObservedRunningTime="2025-09-09 22:06:17.046362367 +0000 UTC m=+118.185050668"
Sep 9 22:06:18.246671 containerd[1571]: time="2025-09-09T22:06:18.246593870Z" level=info msg="StopContainer for \"e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85\" with timeout 30 (s)"
Sep 9 22:06:18.253715 containerd[1571]: time="2025-09-09T22:06:18.253667306Z" level=info msg="Stop container \"e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85\" with signal terminated"
Sep 9 22:06:18.279868 systemd[1]: cri-containerd-e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85.scope: Deactivated successfully.
Sep 9 22:06:18.281053 systemd[1]: cri-containerd-e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85.scope: Consumed 1.073s CPU time, 43.9M memory peak.
Sep 9 22:06:18.291891 containerd[1571]: time="2025-09-09T22:06:18.283291872Z" level=info msg="received exit event container_id:\"e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85\" id:\"e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85\" pid:5852 exit_status:1 exited_at:{seconds:1757455578 nanos:282612361}"
Sep 9 22:06:18.291891 containerd[1571]: time="2025-09-09T22:06:18.283760825Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85\" id:\"e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85\" pid:5852 exit_status:1 exited_at:{seconds:1757455578 nanos:282612361}"
Sep 9 22:06:18.326557 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85-rootfs.mount: Deactivated successfully.
Sep 9 22:06:18.934839 containerd[1571]: time="2025-09-09T22:06:18.934726658Z" level=info msg="StopContainer for \"e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85\" returns successfully"
Sep 9 22:06:18.938493 containerd[1571]: time="2025-09-09T22:06:18.938400469Z" level=info msg="StopPodSandbox for \"1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e\""
Sep 9 22:06:18.947446 containerd[1571]: time="2025-09-09T22:06:18.947349006Z" level=info msg="Container to stop \"e7cad2f78ea570cdaf39d8dbe6883c50fd77f0fed2897d631808cc1171d13a85\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Sep 9 22:06:18.954298 containerd[1571]: time="2025-09-09T22:06:18.954201465Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:06:18.965924 systemd[1]: cri-containerd-1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e.scope: Deactivated successfully.
Sep 9 22:06:18.969416 containerd[1571]: time="2025-09-09T22:06:18.969347146Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e\" id:\"1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e\" pid:5195 exit_status:137 exited_at:{seconds:1757455578 nanos:968813390}"
Sep 9 22:06:18.979752 containerd[1571]: time="2025-09-09T22:06:18.979693186Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542"
Sep 9 22:06:19.012724 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e-rootfs.mount: Deactivated successfully.
Sep 9 22:06:19.012899 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e-shm.mount: Deactivated successfully.
Sep 9 22:06:19.024965 containerd[1571]: time="2025-09-09T22:06:19.023418975Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:06:19.028381 containerd[1571]: time="2025-09-09T22:06:19.028268300Z" level=info msg="shim disconnected" id=1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e namespace=k8s.io
Sep 9 22:06:19.028381 containerd[1571]: time="2025-09-09T22:06:19.028318554Z" level=warning msg="cleaning up after shim disconnected" id=1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e namespace=k8s.io
Sep 9 22:06:19.028381 containerd[1571]: time="2025-09-09T22:06:19.028329294Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 9 22:06:19.048857 containerd[1571]: time="2025-09-09T22:06:19.048763928Z" level=info msg="received exit event sandbox_id:\"1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e\" exit_status:137 exited_at:{seconds:1757455578 nanos:968813390}"
Sep 9 22:06:19.093913 containerd[1571]: time="2025-09-09T22:06:19.093774366Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 22:06:19.096103 containerd[1571]: time="2025-09-09T22:06:19.096027741Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 3.296254752s"
Sep 9 22:06:19.096325 containerd[1571]: time="2025-09-09T22:06:19.096284855Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\""
Sep 9 22:06:19.104370 containerd[1571]: time="2025-09-09T22:06:19.104294212Z" level=info msg="CreateContainer within sandbox \"9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 9 22:06:19.217117 containerd[1571]: time="2025-09-09T22:06:19.216982878Z" level=info msg="TaskExit event in podsandbox handler container_id:\"af1d71f510f12b2c00f92ddce64334f0d4e9eceb23ae87129f9df8db16f92f20\" id:\"cc0729ab3739c4a88c371356eaae09a0d5b85e9a14a674c3806bb65997654d95\" pid:6249 exited_at:{seconds:1757455579 nanos:216095126}"
Sep 9 22:06:19.226263 containerd[1571]: time="2025-09-09T22:06:19.225654203Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7f2f471f0864daabc92dc14878d9371881bee0fc4914ecdb3eab44f2d2ce52e5\" id:\"e2ddbc1c289169bbd16dac941ce0ee50a6b4ff56f493519edf53c7c8d0b2b81c\" pid:6215 exited_at:{seconds:1757455579 nanos:225062127}"
Sep 9 22:06:19.229583 containerd[1571]: time="2025-09-09T22:06:19.229458499Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7f2f471f0864daabc92dc14878d9371881bee0fc4914ecdb3eab44f2d2ce52e5\" id:\"393d9dd687eb8dfffe5ea60b61943d644238f8b10bd731ff9eeaa0c64f86d702\" pid:6206 exited_at:{seconds:1757455579 nanos:229175687}"
Sep 9 22:06:19.313587 containerd[1571]: time="2025-09-09T22:06:19.310732698Z" level=info msg="Container eda3fa7c0e4851e3271ef111ecb395983b3596ae4340a41d43351e35577fe051: CDI devices from CRI Config.CDIDevices: []"
Sep 9 22:06:19.341665 containerd[1571]: time="2025-09-09T22:06:19.341599083Z" level=info msg="CreateContainer within sandbox \"9573047dacfe074b70b00b08eec774587d0572b09bb3cb2fc93864e8538d3955\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"eda3fa7c0e4851e3271ef111ecb395983b3596ae4340a41d43351e35577fe051\""
Sep 9 22:06:19.342386 containerd[1571]: time="2025-09-09T22:06:19.342355257Z" level=info msg="StartContainer for \"eda3fa7c0e4851e3271ef111ecb395983b3596ae4340a41d43351e35577fe051\""
Sep 9 22:06:19.345088 containerd[1571]: time="2025-09-09T22:06:19.345023643Z" level=info msg="connecting to shim eda3fa7c0e4851e3271ef111ecb395983b3596ae4340a41d43351e35577fe051" address="unix:///run/containerd/s/d14c87b801ce636fe0967a0066d90769d30ceea19d9a929c3da0e0dc84c80d32" protocol=ttrpc version=3
Sep 9 22:06:19.387857 systemd[1]: Started cri-containerd-eda3fa7c0e4851e3271ef111ecb395983b3596ae4340a41d43351e35577fe051.scope - libcontainer container eda3fa7c0e4851e3271ef111ecb395983b3596ae4340a41d43351e35577fe051.
Sep 9 22:06:19.634652 containerd[1571]: time="2025-09-09T22:06:19.634573559Z" level=info msg="StartContainer for \"eda3fa7c0e4851e3271ef111ecb395983b3596ae4340a41d43351e35577fe051\" returns successfully"
Sep 9 22:06:19.830357 kubelet[2845]: I0909 22:06:19.830300 2845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e"
Sep 9 22:06:20.291963 kubelet[2845]: I0909 22:06:20.291891 2845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-6mwcf" podStartSLOduration=55.798675548 podStartE2EDuration="1m22.291869603s" podCreationTimestamp="2025-09-09 22:04:58 +0000 UTC" firstStartedPulling="2025-09-09 22:05:52.606694729 +0000 UTC m=+93.745383030" lastFinishedPulling="2025-09-09 22:06:19.099888773 +0000 UTC m=+120.238577085" observedRunningTime="2025-09-09 22:06:20.291266387 +0000 UTC m=+121.429954698" watchObservedRunningTime="2025-09-09 22:06:20.291869603 +0000 UTC m=+121.430557904"
Sep 9 22:06:20.322693 kubelet[2845]: I0909 22:06:20.322592 2845 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 9 22:06:20.323319 kubelet[2845]: I0909 22:06:20.322803 2845 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 9 22:06:20.362879 systemd-networkd[1455]: cali10706d7a5cd: Link DOWN
Sep 9 22:06:20.362890 systemd-networkd[1455]: cali10706d7a5cd: Lost carrier
Sep 9 22:06:20.742963 systemd[1]: Started sshd@18-10.0.0.72:22-10.0.0.1:39698.service - OpenSSH per-connection server daemon (10.0.0.1:39698).
Sep 9 22:06:20.784222 containerd[1571]: 2025-09-09 22:06:20.324 [INFO][6256] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e"
Sep 9 22:06:20.784222 containerd[1571]: 2025-09-09 22:06:20.360 [INFO][6256] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" iface="eth0" netns="/var/run/netns/cni-1931fb4e-bfce-cae7-71b1-8c829afbfc41"
Sep 9 22:06:20.784222 containerd[1571]: 2025-09-09 22:06:20.360 [INFO][6256] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" iface="eth0" netns="/var/run/netns/cni-1931fb4e-bfce-cae7-71b1-8c829afbfc41"
Sep 9 22:06:20.784222 containerd[1571]: 2025-09-09 22:06:20.373 [INFO][6256] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" after=13.094817ms iface="eth0" netns="/var/run/netns/cni-1931fb4e-bfce-cae7-71b1-8c829afbfc41"
Sep 9 22:06:20.784222 containerd[1571]: 2025-09-09 22:06:20.373 [INFO][6256] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e"
Sep 9 22:06:20.784222 containerd[1571]: 2025-09-09 22:06:20.373 [INFO][6256] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e"
Sep 9 22:06:20.784222 containerd[1571]: 2025-09-09 22:06:20.397 [INFO][6311] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" HandleID="k8s-pod-network.1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" Workload="localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0"
Sep 9 22:06:20.784222 containerd[1571]: 2025-09-09 22:06:20.398 [INFO][6311] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 9 22:06:20.784222 containerd[1571]: 2025-09-09 22:06:20.398 [INFO][6311] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 9 22:06:20.784222 containerd[1571]: 2025-09-09 22:06:20.769 [INFO][6311] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" HandleID="k8s-pod-network.1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" Workload="localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0"
Sep 9 22:06:20.784222 containerd[1571]: 2025-09-09 22:06:20.770 [INFO][6311] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" HandleID="k8s-pod-network.1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e" Workload="localhost-k8s-calico--apiserver--7f6c6b6d--td8gs-eth0"
Sep 9 22:06:20.784222 containerd[1571]: 2025-09-09 22:06:20.772 [INFO][6311] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 9 22:06:20.784222 containerd[1571]: 2025-09-09 22:06:20.778 [INFO][6256] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e"
Sep 9 22:06:20.789060 systemd[1]: run-netns-cni\x2d1931fb4e\x2dbfce\x2dcae7\x2d71b1\x2d8c829afbfc41.mount: Deactivated successfully.
Sep 9 22:06:20.790945 containerd[1571]: time="2025-09-09T22:06:20.790886912Z" level=info msg="TearDown network for sandbox \"1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e\" successfully"
Sep 9 22:06:20.790945 containerd[1571]: time="2025-09-09T22:06:20.790943179Z" level=info msg="StopPodSandbox for \"1892dea2c1576fd8c9ee1f21a6adc7557ce220f5efae1cc5a3ae92f7d91a211e\" returns successfully"
Sep 9 22:06:20.861175 sshd[6319]: Accepted publickey for core from 10.0.0.1 port 39698 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:06:20.863737 sshd-session[6319]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:06:20.870195 systemd-logind[1554]: New session 19 of user core.
Sep 9 22:06:20.876770 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 9 22:06:20.940090 kubelet[2845]: I0909 22:06:20.940020 2845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dd79b509-695c-4e9d-acec-db0b4753f41e-calico-apiserver-certs\") pod \"dd79b509-695c-4e9d-acec-db0b4753f41e\" (UID: \"dd79b509-695c-4e9d-acec-db0b4753f41e\") "
Sep 9 22:06:20.940090 kubelet[2845]: I0909 22:06:20.940076 2845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cghrn\" (UniqueName: \"kubernetes.io/projected/dd79b509-695c-4e9d-acec-db0b4753f41e-kube-api-access-cghrn\") pod \"dd79b509-695c-4e9d-acec-db0b4753f41e\" (UID: \"dd79b509-695c-4e9d-acec-db0b4753f41e\") "
Sep 9 22:06:20.944144 kubelet[2845]: I0909 22:06:20.944082 2845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd79b509-695c-4e9d-acec-db0b4753f41e-kube-api-access-cghrn" (OuterVolumeSpecName: "kube-api-access-cghrn") pod "dd79b509-695c-4e9d-acec-db0b4753f41e" (UID: "dd79b509-695c-4e9d-acec-db0b4753f41e"). InnerVolumeSpecName "kube-api-access-cghrn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 9 22:06:20.945612 kubelet[2845]: I0909 22:06:20.945584 2845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd79b509-695c-4e9d-acec-db0b4753f41e-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "dd79b509-695c-4e9d-acec-db0b4753f41e" (UID: "dd79b509-695c-4e9d-acec-db0b4753f41e"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 9 22:06:20.946392 systemd[1]: var-lib-kubelet-pods-dd79b509\x2d695c\x2d4e9d\x2dacec\x2ddb0b4753f41e-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dcghrn.mount: Deactivated successfully.
Sep 9 22:06:20.946557 systemd[1]: var-lib-kubelet-pods-dd79b509\x2d695c\x2d4e9d\x2dacec\x2ddb0b4753f41e-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully.
Sep 9 22:06:21.040961 kubelet[2845]: I0909 22:06:21.040857 2845 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dd79b509-695c-4e9d-acec-db0b4753f41e-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\""
Sep 9 22:06:21.040961 kubelet[2845]: I0909 22:06:21.040925 2845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cghrn\" (UniqueName: \"kubernetes.io/projected/dd79b509-695c-4e9d-acec-db0b4753f41e-kube-api-access-cghrn\") on node \"localhost\" DevicePath \"\""
Sep 9 22:06:21.065151 systemd[1]: Removed slice kubepods-besteffort-poddd79b509_695c_4e9d_acec_db0b4753f41e.slice - libcontainer container kubepods-besteffort-poddd79b509_695c_4e9d_acec_db0b4753f41e.slice.
Sep 9 22:06:21.065271 systemd[1]: kubepods-besteffort-poddd79b509_695c_4e9d_acec_db0b4753f41e.slice: Consumed 1.112s CPU time, 44.1M memory peak.
Sep 9 22:06:21.362484 sshd[6324]: Connection closed by 10.0.0.1 port 39698
Sep 9 22:06:21.363023 sshd-session[6319]: pam_unix(sshd:session): session closed for user core
Sep 9 22:06:21.384480 systemd[1]: sshd@18-10.0.0.72:22-10.0.0.1:39698.service: Deactivated successfully.
Sep 9 22:06:21.386957 systemd[1]: session-19.scope: Deactivated successfully.
Sep 9 22:06:21.387864 systemd-logind[1554]: Session 19 logged out. Waiting for processes to exit.
Sep 9 22:06:21.389221 systemd-logind[1554]: Removed session 19.
Sep 9 22:06:23.060649 kubelet[2845]: I0909 22:06:23.060581 2845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd79b509-695c-4e9d-acec-db0b4753f41e" path="/var/lib/kubelet/pods/dd79b509-695c-4e9d-acec-db0b4753f41e/volumes"
Sep 9 22:06:26.386727 systemd[1]: Started sshd@19-10.0.0.72:22-10.0.0.1:39704.service - OpenSSH per-connection server daemon (10.0.0.1:39704).
Sep 9 22:06:26.447287 sshd[6348]: Accepted publickey for core from 10.0.0.1 port 39704 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:06:26.449643 sshd-session[6348]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:06:26.458431 systemd-logind[1554]: New session 20 of user core.
Sep 9 22:06:26.468627 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 9 22:06:26.658274 sshd[6351]: Connection closed by 10.0.0.1 port 39704
Sep 9 22:06:26.658639 sshd-session[6348]: pam_unix(sshd:session): session closed for user core
Sep 9 22:06:26.665189 systemd[1]: sshd@19-10.0.0.72:22-10.0.0.1:39704.service: Deactivated successfully.
Sep 9 22:06:26.669094 systemd[1]: session-20.scope: Deactivated successfully.
Sep 9 22:06:26.670151 systemd-logind[1554]: Session 20 logged out. Waiting for processes to exit.
Sep 9 22:06:26.671938 systemd-logind[1554]: Removed session 20.
Sep 9 22:06:29.492141 containerd[1571]: time="2025-09-09T22:06:29.492062541Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ab5e758026bbb6257e27a31e5da491bf2e45409d5f5e47664a8b430c2a783c2b\" id:\"2d94b6e128f829aa4f7ab8e7ac8cc35fb20290ebb3789a8157a61e5c1962fc39\" pid:6376 exited_at:{seconds:1757455589 nanos:491633113}"
Sep 9 22:06:31.677247 systemd[1]: Started sshd@20-10.0.0.72:22-10.0.0.1:57238.service - OpenSSH per-connection server daemon (10.0.0.1:57238).
Sep 9 22:06:31.760892 sshd[6394]: Accepted publickey for core from 10.0.0.1 port 57238 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:06:31.762900 sshd-session[6394]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:06:31.767682 systemd-logind[1554]: New session 21 of user core.
Sep 9 22:06:31.777633 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 9 22:06:31.945191 sshd[6397]: Connection closed by 10.0.0.1 port 57238
Sep 9 22:06:31.945444 sshd-session[6394]: pam_unix(sshd:session): session closed for user core
Sep 9 22:06:31.950648 systemd[1]: sshd@20-10.0.0.72:22-10.0.0.1:57238.service: Deactivated successfully.
Sep 9 22:06:31.953520 systemd[1]: session-21.scope: Deactivated successfully.
Sep 9 22:06:31.954451 systemd-logind[1554]: Session 21 logged out. Waiting for processes to exit.
Sep 9 22:06:31.956104 systemd-logind[1554]: Removed session 21.
Sep 9 22:06:36.972861 systemd[1]: Started sshd@21-10.0.0.72:22-10.0.0.1:57246.service - OpenSSH per-connection server daemon (10.0.0.1:57246).
Sep 9 22:06:37.050103 sshd[6410]: Accepted publickey for core from 10.0.0.1 port 57246 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:06:37.052694 sshd-session[6410]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:06:37.063237 systemd-logind[1554]: New session 22 of user core.
Sep 9 22:06:37.081944 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 9 22:06:37.256215 sshd[6413]: Connection closed by 10.0.0.1 port 57246
Sep 9 22:06:37.256862 sshd-session[6410]: pam_unix(sshd:session): session closed for user core
Sep 9 22:06:37.268325 systemd[1]: sshd@21-10.0.0.72:22-10.0.0.1:57246.service: Deactivated successfully.
Sep 9 22:06:37.272286 systemd[1]: session-22.scope: Deactivated successfully.
Sep 9 22:06:37.274535 systemd-logind[1554]: Session 22 logged out. Waiting for processes to exit.
Sep 9 22:06:37.279831 systemd[1]: Started sshd@22-10.0.0.72:22-10.0.0.1:57260.service - OpenSSH per-connection server daemon (10.0.0.1:57260).
Sep 9 22:06:37.281164 systemd-logind[1554]: Removed session 22.
Sep 9 22:06:37.355677 sshd[6427]: Accepted publickey for core from 10.0.0.1 port 57260 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:06:37.357936 sshd-session[6427]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:06:37.364761 systemd-logind[1554]: New session 23 of user core.
Sep 9 22:06:37.375812 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 9 22:06:38.286730 sshd[6430]: Connection closed by 10.0.0.1 port 57260
Sep 9 22:06:38.288212 sshd-session[6427]: pam_unix(sshd:session): session closed for user core
Sep 9 22:06:38.301665 systemd[1]: sshd@22-10.0.0.72:22-10.0.0.1:57260.service: Deactivated successfully.
Sep 9 22:06:38.305155 systemd[1]: session-23.scope: Deactivated successfully.
Sep 9 22:06:38.306532 systemd-logind[1554]: Session 23 logged out. Waiting for processes to exit.
Sep 9 22:06:38.311797 systemd[1]: Started sshd@23-10.0.0.72:22-10.0.0.1:57268.service - OpenSSH per-connection server daemon (10.0.0.1:57268).
Sep 9 22:06:38.313705 systemd-logind[1554]: Removed session 23.
Sep 9 22:06:38.404791 sshd[6441]: Accepted publickey for core from 10.0.0.1 port 57268 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:06:38.406743 sshd-session[6441]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:06:38.412825 systemd-logind[1554]: New session 24 of user core.
Sep 9 22:06:38.432938 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 9 22:06:41.055032 kubelet[2845]: E0909 22:06:41.054970 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:06:41.365379 sshd[6444]: Connection closed by 10.0.0.1 port 57268
Sep 9 22:06:41.366790 sshd-session[6441]: pam_unix(sshd:session): session closed for user core
Sep 9 22:06:41.381652 systemd[1]: sshd@23-10.0.0.72:22-10.0.0.1:57268.service: Deactivated successfully.
Sep 9 22:06:41.385313 systemd[1]: session-24.scope: Deactivated successfully.
Sep 9 22:06:41.386611 systemd[1]: session-24.scope: Consumed 784ms CPU time, 79.8M memory peak.
Sep 9 22:06:41.388328 systemd-logind[1554]: Session 24 logged out. Waiting for processes to exit.
Sep 9 22:06:41.395494 systemd-logind[1554]: Removed session 24.
Sep 9 22:06:41.397567 systemd[1]: Started sshd@24-10.0.0.72:22-10.0.0.1:55800.service - OpenSSH per-connection server daemon (10.0.0.1:55800).
Sep 9 22:06:41.498166 sshd[6475]: Accepted publickey for core from 10.0.0.1 port 55800 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:06:41.497909 sshd-session[6475]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:06:41.509337 systemd-logind[1554]: New session 25 of user core.
Sep 9 22:06:41.515017 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 9 22:06:42.217390 sshd[6478]: Connection closed by 10.0.0.1 port 55800
Sep 9 22:06:42.219713 sshd-session[6475]: pam_unix(sshd:session): session closed for user core
Sep 9 22:06:42.232065 systemd[1]: sshd@24-10.0.0.72:22-10.0.0.1:55800.service: Deactivated successfully.
Sep 9 22:06:42.235428 systemd[1]: session-25.scope: Deactivated successfully.
Sep 9 22:06:42.239189 systemd-logind[1554]: Session 25 logged out. Waiting for processes to exit.
Sep 9 22:06:42.244209 systemd[1]: Started sshd@25-10.0.0.72:22-10.0.0.1:55806.service - OpenSSH per-connection server daemon (10.0.0.1:55806).
Sep 9 22:06:42.246606 systemd-logind[1554]: Removed session 25.
Sep 9 22:06:42.311142 sshd[6489]: Accepted publickey for core from 10.0.0.1 port 55806 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:06:42.315345 sshd-session[6489]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:06:42.329863 systemd-logind[1554]: New session 26 of user core.
Sep 9 22:06:42.339928 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 9 22:06:42.485029 sshd[6494]: Connection closed by 10.0.0.1 port 55806
Sep 9 22:06:42.491977 systemd[1]: sshd@25-10.0.0.72:22-10.0.0.1:55806.service: Deactivated successfully.
Sep 9 22:06:42.485342 sshd-session[6489]: pam_unix(sshd:session): session closed for user core
Sep 9 22:06:42.495291 systemd[1]: session-26.scope: Deactivated successfully.
Sep 9 22:06:42.496509 systemd-logind[1554]: Session 26 logged out. Waiting for processes to exit.
Sep 9 22:06:42.499173 systemd-logind[1554]: Removed session 26.
Sep 9 22:06:43.971576 containerd[1571]: time="2025-09-09T22:06:43.971425131Z" level=info msg="StopContainer for \"802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0\" with timeout 30 (s)"
Sep 9 22:06:43.972892 containerd[1571]: time="2025-09-09T22:06:43.972841604Z" level=info msg="Stop container \"802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0\" with signal terminated"
Sep 9 22:06:43.989251 systemd[1]: cri-containerd-802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0.scope: Deactivated successfully.
Sep 9 22:06:43.990289 systemd[1]: cri-containerd-802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0.scope: Consumed 1.800s CPU time, 53.6M memory peak, 724K read from disk.
Sep 9 22:06:44.020211 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0-rootfs.mount: Deactivated successfully.
Sep 9 22:06:44.022297 containerd[1571]: time="2025-09-09T22:06:43.992008029Z" level=info msg="TaskExit event in podsandbox handler container_id:\"802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0\" id:\"802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0\" pid:5816 exit_status:1 exited_at:{seconds:1757455603 nanos:991264691}"
Sep 9 22:06:44.022297 containerd[1571]: time="2025-09-09T22:06:43.992075538Z" level=info msg="received exit event container_id:\"802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0\" id:\"802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0\" pid:5816 exit_status:1 exited_at:{seconds:1757455603 nanos:991264691}"
Sep 9 22:06:44.630195 containerd[1571]: time="2025-09-09T22:06:44.630109826Z" level=info msg="StopContainer for \"802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0\" returns successfully"
Sep 9 22:06:44.632454 containerd[1571]: time="2025-09-09T22:06:44.632402250Z" level=info msg="StopPodSandbox for \"4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979\""
Sep 9 22:06:44.632564 containerd[1571]: time="2025-09-09T22:06:44.632537129Z" level=info msg="Container to stop \"802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Sep 9 22:06:44.642758 systemd[1]: cri-containerd-4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979.scope: Deactivated successfully.
Sep 9 22:06:44.647260 containerd[1571]: time="2025-09-09T22:06:44.647200473Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979\" id:\"4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979\" pid:5094 exit_status:137 exited_at:{seconds:1757455604 nanos:646745749}"
Sep 9 22:06:44.681165 containerd[1571]: time="2025-09-09T22:06:44.681016003Z" level=info msg="shim disconnected" id=4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979 namespace=k8s.io
Sep 9 22:06:44.681165 containerd[1571]: time="2025-09-09T22:06:44.681156945Z" level=warning msg="cleaning up after shim disconnected" id=4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979 namespace=k8s.io
Sep 9 22:06:44.681526 containerd[1571]: time="2025-09-09T22:06:44.681166462Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 9 22:06:44.683111 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979-rootfs.mount: Deactivated successfully.
Sep 9 22:06:44.956117 containerd[1571]: time="2025-09-09T22:06:44.955920645Z" level=info msg="received exit event sandbox_id:\"4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979\" exit_status:137 exited_at:{seconds:1757455604 nanos:646745749}"
Sep 9 22:06:44.959651 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979-shm.mount: Deactivated successfully.
Sep 9 22:06:45.269808 systemd-networkd[1455]: cali73996c91ad7: Link DOWN
Sep 9 22:06:45.269822 systemd-networkd[1455]: cali73996c91ad7: Lost carrier
Sep 9 22:06:45.645381 containerd[1571]: 2025-09-09 22:06:45.267 [INFO][6578] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979"
Sep 9 22:06:45.645381 containerd[1571]: 2025-09-09 22:06:45.268 [INFO][6578] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" iface="eth0" netns="/var/run/netns/cni-cb5c851f-8208-e043-0dbc-b6180b7ea6f7"
Sep 9 22:06:45.645381 containerd[1571]: 2025-09-09 22:06:45.268 [INFO][6578] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" iface="eth0" netns="/var/run/netns/cni-cb5c851f-8208-e043-0dbc-b6180b7ea6f7"
Sep 9 22:06:45.645381 containerd[1571]: 2025-09-09 22:06:45.277 [INFO][6578] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" after=9.300404ms iface="eth0" netns="/var/run/netns/cni-cb5c851f-8208-e043-0dbc-b6180b7ea6f7"
Sep 9 22:06:45.645381 containerd[1571]: 2025-09-09 22:06:45.278 [INFO][6578] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979"
Sep 9 22:06:45.645381 containerd[1571]: 2025-09-09 22:06:45.278 [INFO][6578] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979"
Sep 9 22:06:45.645381 containerd[1571]: 2025-09-09 22:06:45.356 [INFO][6590] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" HandleID="k8s-pod-network.4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" Workload="localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0"
Sep 9 22:06:45.645381 containerd[1571]: 2025-09-09 22:06:45.356 [INFO][6590] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 9 22:06:45.645381 containerd[1571]: 2025-09-09 22:06:45.357 [INFO][6590] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 9 22:06:45.645381 containerd[1571]: 2025-09-09 22:06:45.634 [INFO][6590] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" HandleID="k8s-pod-network.4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" Workload="localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0"
Sep 9 22:06:45.645381 containerd[1571]: 2025-09-09 22:06:45.634 [INFO][6590] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" HandleID="k8s-pod-network.4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979" Workload="localhost-k8s-calico--apiserver--7f6c6b6d--9txl4-eth0"
Sep 9 22:06:45.645381 containerd[1571]: 2025-09-09 22:06:45.636 [INFO][6590] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 9 22:06:45.645381 containerd[1571]: 2025-09-09 22:06:45.640 [INFO][6578] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979"
Sep 9 22:06:45.649513 containerd[1571]: time="2025-09-09T22:06:45.646792071Z" level=info msg="TearDown network for sandbox \"4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979\" successfully"
Sep 9 22:06:45.649513 containerd[1571]: time="2025-09-09T22:06:45.646841787Z" level=info msg="StopPodSandbox for \"4362e379c0ddca3b6988f5f707a72f5ca8ab40ad114f8cedd88ae27aedd18979\" returns successfully"
Sep 9 22:06:45.649247 systemd[1]: run-netns-cni\x2dcb5c851f\x2d8208\x2de043\x2d0dbc\x2db6180b7ea6f7.mount: Deactivated successfully.
Sep 9 22:06:45.825783 kubelet[2845]: I0909 22:06:45.825721 2845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/38023746-1010-4fa8-a0d6-8863807eb181-calico-apiserver-certs\") pod \"38023746-1010-4fa8-a0d6-8863807eb181\" (UID: \"38023746-1010-4fa8-a0d6-8863807eb181\") "
Sep 9 22:06:45.825783 kubelet[2845]: I0909 22:06:45.825777 2845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9r8z\" (UniqueName: \"kubernetes.io/projected/38023746-1010-4fa8-a0d6-8863807eb181-kube-api-access-m9r8z\") pod \"38023746-1010-4fa8-a0d6-8863807eb181\" (UID: \"38023746-1010-4fa8-a0d6-8863807eb181\") "
Sep 9 22:06:45.833659 kubelet[2845]: I0909 22:06:45.833589 2845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38023746-1010-4fa8-a0d6-8863807eb181-kube-api-access-m9r8z" (OuterVolumeSpecName: "kube-api-access-m9r8z") pod "38023746-1010-4fa8-a0d6-8863807eb181" (UID: "38023746-1010-4fa8-a0d6-8863807eb181"). InnerVolumeSpecName "kube-api-access-m9r8z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 9 22:06:45.833793 kubelet[2845]: I0909 22:06:45.833685 2845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38023746-1010-4fa8-a0d6-8863807eb181-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "38023746-1010-4fa8-a0d6-8863807eb181" (UID: "38023746-1010-4fa8-a0d6-8863807eb181"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 9 22:06:45.833943 systemd[1]: var-lib-kubelet-pods-38023746\x2d1010\x2d4fa8\x2da0d6\x2d8863807eb181-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dm9r8z.mount: Deactivated successfully.
Sep 9 22:06:45.834085 systemd[1]: var-lib-kubelet-pods-38023746\x2d1010\x2d4fa8\x2da0d6\x2d8863807eb181-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully.
Sep 9 22:06:45.920518 kubelet[2845]: I0909 22:06:45.920365 2845 scope.go:117] "RemoveContainer" containerID="802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0"
Sep 9 22:06:45.922549 containerd[1571]: time="2025-09-09T22:06:45.922510194Z" level=info msg="RemoveContainer for \"802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0\""
Sep 9 22:06:45.926204 kubelet[2845]: I0909 22:06:45.926170 2845 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/38023746-1010-4fa8-a0d6-8863807eb181-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\""
Sep 9 22:06:45.926204 kubelet[2845]: I0909 22:06:45.926201 2845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9r8z\" (UniqueName: \"kubernetes.io/projected/38023746-1010-4fa8-a0d6-8863807eb181-kube-api-access-m9r8z\") on node \"localhost\" DevicePath \"\""
Sep 9 22:06:45.927378 systemd[1]: Removed slice kubepods-besteffort-pod38023746_1010_4fa8_a0d6_8863807eb181.slice - libcontainer container kubepods-besteffort-pod38023746_1010_4fa8_a0d6_8863807eb181.slice.
Sep 9 22:06:45.927518 systemd[1]: kubepods-besteffort-pod38023746_1010_4fa8_a0d6_8863807eb181.slice: Consumed 1.839s CPU time, 54.1M memory peak, 857K read from disk.
Sep 9 22:06:46.013149 containerd[1571]: time="2025-09-09T22:06:46.013067030Z" level=info msg="RemoveContainer for \"802521755275e0ebfd1c294a9841c79b3112e1d6ffc70415f4c9bd5eeb22a2b0\" returns successfully"
Sep 9 22:06:47.058631 kubelet[2845]: I0909 22:06:47.058571 2845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38023746-1010-4fa8-a0d6-8863807eb181" path="/var/lib/kubelet/pods/38023746-1010-4fa8-a0d6-8863807eb181/volumes"
Sep 9 22:06:47.498813 systemd[1]: Started sshd@26-10.0.0.72:22-10.0.0.1:55820.service - OpenSSH per-connection server daemon (10.0.0.1:55820).
Sep 9 22:06:47.556060 sshd[6606]: Accepted publickey for core from 10.0.0.1 port 55820 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:06:47.558092 sshd-session[6606]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:06:47.563431 systemd-logind[1554]: New session 27 of user core.
Sep 9 22:06:47.573781 systemd[1]: Started session-27.scope - Session 27 of User core.
Sep 9 22:06:47.689166 sshd[6609]: Connection closed by 10.0.0.1 port 55820
Sep 9 22:06:47.689669 sshd-session[6606]: pam_unix(sshd:session): session closed for user core
Sep 9 22:06:47.694557 systemd[1]: sshd@26-10.0.0.72:22-10.0.0.1:55820.service: Deactivated successfully.
Sep 9 22:06:47.697104 systemd[1]: session-27.scope: Deactivated successfully.
Sep 9 22:06:47.697884 systemd-logind[1554]: Session 27 logged out. Waiting for processes to exit.
Sep 9 22:06:47.700014 systemd-logind[1554]: Removed session 27.
Sep 9 22:06:49.177041 containerd[1571]: time="2025-09-09T22:06:49.176754712Z" level=info msg="TaskExit event in podsandbox handler container_id:\"af1d71f510f12b2c00f92ddce64334f0d4e9eceb23ae87129f9df8db16f92f20\" id:\"5700a2c920ff2c54348fb4d7965f3dfaa23ff1308e3ba9a1ba0cbd2185e0f048\" pid:6657 exited_at:{seconds:1757455609 nanos:175687574}"
Sep 9 22:06:49.205699 containerd[1571]: time="2025-09-09T22:06:49.205626111Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7f2f471f0864daabc92dc14878d9371881bee0fc4914ecdb3eab44f2d2ce52e5\" id:\"5a5e9b6722823a684c6c647daa0aa0872cf6a3dea2189bc14179cf813f42318b\" pid:6634 exited_at:{seconds:1757455609 nanos:204879067}"
Sep 9 22:06:52.055658 kubelet[2845]: E0909 22:06:52.055574 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:06:52.707183 systemd[1]: Started sshd@27-10.0.0.72:22-10.0.0.1:34128.service - OpenSSH per-connection server daemon (10.0.0.1:34128).
Sep 9 22:06:52.773589 sshd[6671]: Accepted publickey for core from 10.0.0.1 port 34128 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:06:52.776384 sshd-session[6671]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:06:52.784675 systemd-logind[1554]: New session 28 of user core.
Sep 9 22:06:52.793878 systemd[1]: Started session-28.scope - Session 28 of User core.
Sep 9 22:06:52.937387 sshd[6674]: Connection closed by 10.0.0.1 port 34128
Sep 9 22:06:52.937842 sshd-session[6671]: pam_unix(sshd:session): session closed for user core
Sep 9 22:06:52.945182 systemd[1]: sshd@27-10.0.0.72:22-10.0.0.1:34128.service: Deactivated successfully.
Sep 9 22:06:52.948008 systemd[1]: session-28.scope: Deactivated successfully.
Sep 9 22:06:52.949413 systemd-logind[1554]: Session 28 logged out. Waiting for processes to exit.
Sep 9 22:06:52.953999 systemd-logind[1554]: Removed session 28.
Sep 9 22:06:57.954158 systemd[1]: Started sshd@28-10.0.0.72:22-10.0.0.1:34144.service - OpenSSH per-connection server daemon (10.0.0.1:34144).
Sep 9 22:06:58.041290 sshd[6692]: Accepted publickey for core from 10.0.0.1 port 34144 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:06:58.043408 sshd-session[6692]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:06:58.049315 systemd-logind[1554]: New session 29 of user core.
Sep 9 22:06:58.061778 systemd[1]: Started session-29.scope - Session 29 of User core.
Sep 9 22:06:58.272784 sshd[6695]: Connection closed by 10.0.0.1 port 34144
Sep 9 22:06:58.280653 systemd-logind[1554]: Session 29 logged out. Waiting for processes to exit.
Sep 9 22:06:58.274764 sshd-session[6692]: pam_unix(sshd:session): session closed for user core
Sep 9 22:06:58.280967 systemd[1]: sshd@28-10.0.0.72:22-10.0.0.1:34144.service: Deactivated successfully.
Sep 9 22:06:58.283521 systemd[1]: session-29.scope: Deactivated successfully.
Sep 9 22:06:58.285962 systemd-logind[1554]: Removed session 29.
Sep 9 22:06:59.383750 containerd[1571]: time="2025-09-09T22:06:59.383642399Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ab5e758026bbb6257e27a31e5da491bf2e45409d5f5e47664a8b430c2a783c2b\" id:\"cf0af44150d6251d49d028b443e7ae72410ed1078b0046bc06077cf9a021ed8d\" pid:6720 exited_at:{seconds:1757455619 nanos:382659198}"
Sep 9 22:07:03.292231 systemd[1]: Started sshd@29-10.0.0.72:22-10.0.0.1:60884.service - OpenSSH per-connection server daemon (10.0.0.1:60884).
Sep 9 22:07:03.378063 sshd[6736]: Accepted publickey for core from 10.0.0.1 port 60884 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:07:03.380349 sshd-session[6736]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:07:03.386253 systemd-logind[1554]: New session 30 of user core.
Sep 9 22:07:03.394614 systemd[1]: Started session-30.scope - Session 30 of User core.
Sep 9 22:07:03.750106 sshd[6745]: Connection closed by 10.0.0.1 port 60884
Sep 9 22:07:03.750389 sshd-session[6736]: pam_unix(sshd:session): session closed for user core
Sep 9 22:07:03.755160 systemd[1]: sshd@29-10.0.0.72:22-10.0.0.1:60884.service: Deactivated successfully.
Sep 9 22:07:03.757507 systemd[1]: session-30.scope: Deactivated successfully.
Sep 9 22:07:03.758286 systemd-logind[1554]: Session 30 logged out. Waiting for processes to exit.
Sep 9 22:07:03.759519 systemd-logind[1554]: Removed session 30.
Sep 9 22:07:08.056092 kubelet[2845]: E0909 22:07:08.056026 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:07:08.765499 systemd[1]: Started sshd@30-10.0.0.72:22-10.0.0.1:60894.service - OpenSSH per-connection server daemon (10.0.0.1:60894).
Sep 9 22:07:08.892739 sshd[6761]: Accepted publickey for core from 10.0.0.1 port 60894 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:07:08.894857 sshd-session[6761]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:07:08.900489 systemd-logind[1554]: New session 31 of user core.
Sep 9 22:07:08.908783 systemd[1]: Started session-31.scope - Session 31 of User core.
Sep 9 22:07:09.411602 sshd[6764]: Connection closed by 10.0.0.1 port 60894
Sep 9 22:07:09.409896 sshd-session[6761]: pam_unix(sshd:session): session closed for user core
Sep 9 22:07:09.414969 systemd[1]: sshd@30-10.0.0.72:22-10.0.0.1:60894.service: Deactivated successfully.
Sep 9 22:07:09.418052 systemd[1]: session-31.scope: Deactivated successfully.
Sep 9 22:07:09.418977 systemd-logind[1554]: Session 31 logged out. Waiting for processes to exit.
Sep 9 22:07:09.422680 systemd-logind[1554]: Removed session 31.
Sep 9 22:07:12.055847 kubelet[2845]: E0909 22:07:12.055797 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Sep 9 22:07:14.422985 systemd[1]: Started sshd@31-10.0.0.72:22-10.0.0.1:58784.service - OpenSSH per-connection server daemon (10.0.0.1:58784).
Sep 9 22:07:14.499287 sshd[6778]: Accepted publickey for core from 10.0.0.1 port 58784 ssh2: RSA SHA256:A2CJI2QL6ueQzwzJUDumHRmawTN/BqpJNEZzUqxCWKo
Sep 9 22:07:14.501254 sshd-session[6778]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 22:07:14.506712 systemd-logind[1554]: New session 32 of user core.
Sep 9 22:07:14.511970 systemd[1]: Started session-32.scope - Session 32 of User core.
Sep 9 22:07:14.790716 sshd[6781]: Connection closed by 10.0.0.1 port 58784
Sep 9 22:07:14.791183 sshd-session[6778]: pam_unix(sshd:session): session closed for user core
Sep 9 22:07:14.796923 systemd[1]: sshd@31-10.0.0.72:22-10.0.0.1:58784.service: Deactivated successfully.
Sep 9 22:07:14.799310 systemd[1]: session-32.scope: Deactivated successfully.
Sep 9 22:07:14.800149 systemd-logind[1554]: Session 32 logged out. Waiting for processes to exit.
Sep 9 22:07:14.801730 systemd-logind[1554]: Removed session 32.
Sep 9 22:07:15.055297 kubelet[2845]: E0909 22:07:15.055137 2845 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"