Aug 19 08:16:34.845623 kernel: Linux version 6.12.41-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Mon Aug 18 22:19:37 -00 2025 Aug 19 08:16:34.845660 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=cc23dd01793203541561c15ffc568736bb5dae0d652141296dd11bf777bdf42f Aug 19 08:16:34.845670 kernel: BIOS-provided physical RAM map: Aug 19 08:16:34.845678 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable Aug 19 08:16:34.845686 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved Aug 19 08:16:34.845693 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved Aug 19 08:16:34.845701 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000009cfdbfff] usable Aug 19 08:16:34.845707 kernel: BIOS-e820: [mem 0x000000009cfdc000-0x000000009cffffff] reserved Aug 19 08:16:34.845719 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved Aug 19 08:16:34.845726 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved Aug 19 08:16:34.845733 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Aug 19 08:16:34.845739 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved Aug 19 08:16:34.845746 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Aug 19 08:16:34.845752 kernel: NX (Execute Disable) protection: active Aug 19 08:16:34.845763 kernel: APIC: Static calls initialized Aug 19 08:16:34.845771 kernel: SMBIOS 2.8 present. 
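The BIOS-e820 lines above are the firmware's physical RAM map; only the ranges marked "usable" become general-purpose memory. As an illustration, here is a minimal Python sketch (not part of the boot log; the helper name and the assumption that the ranges are inclusive are mine) that parses e820-style lines and totals the usable bytes:

    import re

    E820 = re.compile(r"BIOS-e820: \[mem (0x[0-9a-f]+)-(0x[0-9a-f]+)\] (\w+)")

    def usable_bytes(lines):
        # Sum the sizes of all ranges the firmware marked "usable".
        total = 0
        for line in lines:
            m = E820.search(line)
            if m and m.group(3) == "usable":
                start, end = int(m.group(1), 16), int(m.group(2), 16)
                total += end - start + 1  # e820 ranges are inclusive
        return total

    # The two usable ranges reported above:
    sample = [
        "BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable",
        "BIOS-e820: [mem 0x0000000000100000-0x000000009cfdbfff] usable",
    ]
    print(usable_bytes(sample) // 1024, "KiB")  # ~2.5 GiB, roughly the 2571752K total the kernel reports later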
Aug 19 08:16:34.845780 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 1.16.2-debian-1.16.2-1 04/01/2014 Aug 19 08:16:34.845788 kernel: DMI: Memory slots populated: 1/1 Aug 19 08:16:34.845795 kernel: Hypervisor detected: KVM Aug 19 08:16:34.845802 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Aug 19 08:16:34.845809 kernel: kvm-clock: using sched offset of 4864424530 cycles Aug 19 08:16:34.845818 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Aug 19 08:16:34.845841 kernel: tsc: Detected 2794.748 MHz processor Aug 19 08:16:34.845851 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Aug 19 08:16:34.845859 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Aug 19 08:16:34.845867 kernel: last_pfn = 0x9cfdc max_arch_pfn = 0x400000000 Aug 19 08:16:34.845874 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs Aug 19 08:16:34.845882 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Aug 19 08:16:34.845889 kernel: Using GB pages for direct mapping Aug 19 08:16:34.845897 kernel: ACPI: Early table checksum verification disabled Aug 19 08:16:34.845904 kernel: ACPI: RSDP 0x00000000000F59D0 000014 (v00 BOCHS ) Aug 19 08:16:34.845912 kernel: ACPI: RSDT 0x000000009CFE241A 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Aug 19 08:16:34.845921 kernel: ACPI: FACP 0x000000009CFE21FA 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Aug 19 08:16:34.845929 kernel: ACPI: DSDT 0x000000009CFE0040 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001) Aug 19 08:16:34.845936 kernel: ACPI: FACS 0x000000009CFE0000 000040 Aug 19 08:16:34.845944 kernel: ACPI: APIC 0x000000009CFE22EE 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001) Aug 19 08:16:34.845955 kernel: ACPI: HPET 0x000000009CFE237E 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Aug 19 08:16:34.845969 kernel: ACPI: MCFG 0x000000009CFE23B6 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Aug 19 08:16:34.845979 kernel: ACPI: WAET 0x000000009CFE23F2 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Aug 19 08:16:34.845989 kernel: ACPI: Reserving FACP table memory at [mem 0x9cfe21fa-0x9cfe22ed] Aug 19 08:16:34.846007 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cfe0040-0x9cfe21f9] Aug 19 08:16:34.846016 kernel: ACPI: Reserving FACS table memory at [mem 0x9cfe0000-0x9cfe003f] Aug 19 08:16:34.846023 kernel: ACPI: Reserving APIC table memory at [mem 0x9cfe22ee-0x9cfe237d] Aug 19 08:16:34.846031 kernel: ACPI: Reserving HPET table memory at [mem 0x9cfe237e-0x9cfe23b5] Aug 19 08:16:34.846038 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cfe23b6-0x9cfe23f1] Aug 19 08:16:34.846046 kernel: ACPI: Reserving WAET table memory at [mem 0x9cfe23f2-0x9cfe2419] Aug 19 08:16:34.846056 kernel: No NUMA configuration found Aug 19 08:16:34.846064 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cfdbfff] Aug 19 08:16:34.846071 kernel: NODE_DATA(0) allocated [mem 0x9cfd4dc0-0x9cfdbfff] Aug 19 08:16:34.846079 kernel: Zone ranges: Aug 19 08:16:34.846086 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Aug 19 08:16:34.846094 kernel: DMA32 [mem 0x0000000001000000-0x000000009cfdbfff] Aug 19 08:16:34.846101 kernel: Normal empty Aug 19 08:16:34.846109 kernel: Device empty Aug 19 08:16:34.846116 kernel: Movable zone start for each node Aug 19 08:16:34.846124 kernel: Early memory node ranges Aug 19 08:16:34.846133 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] Aug 19 08:16:34.846141 kernel: node 0: [mem 
0x0000000000100000-0x000000009cfdbfff] Aug 19 08:16:34.846149 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cfdbfff] Aug 19 08:16:34.846156 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Aug 19 08:16:34.846163 kernel: On node 0, zone DMA: 97 pages in unavailable ranges Aug 19 08:16:34.846171 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges Aug 19 08:16:34.846178 kernel: ACPI: PM-Timer IO Port: 0x608 Aug 19 08:16:34.846190 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Aug 19 08:16:34.846197 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Aug 19 08:16:34.846207 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Aug 19 08:16:34.846215 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Aug 19 08:16:34.846224 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Aug 19 08:16:34.846232 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Aug 19 08:16:34.846241 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Aug 19 08:16:34.846255 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Aug 19 08:16:34.846266 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Aug 19 08:16:34.846276 kernel: TSC deadline timer available Aug 19 08:16:34.846285 kernel: CPU topo: Max. logical packages: 1 Aug 19 08:16:34.846299 kernel: CPU topo: Max. logical dies: 1 Aug 19 08:16:34.846308 kernel: CPU topo: Max. dies per package: 1 Aug 19 08:16:34.846317 kernel: CPU topo: Max. threads per core: 1 Aug 19 08:16:34.846326 kernel: CPU topo: Num. cores per package: 4 Aug 19 08:16:34.846336 kernel: CPU topo: Num. threads per package: 4 Aug 19 08:16:34.846345 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs Aug 19 08:16:34.846355 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Aug 19 08:16:34.846375 kernel: kvm-guest: KVM setup pv remote TLB flush Aug 19 08:16:34.846384 kernel: kvm-guest: setup PV sched yield Aug 19 08:16:34.846394 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices Aug 19 08:16:34.846407 kernel: Booting paravirtualized kernel on KVM Aug 19 08:16:34.846417 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Aug 19 08:16:34.846427 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1 Aug 19 08:16:34.846437 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288 Aug 19 08:16:34.846446 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152 Aug 19 08:16:34.846456 kernel: pcpu-alloc: [0] 0 1 2 3 Aug 19 08:16:34.846465 kernel: kvm-guest: PV spinlocks enabled Aug 19 08:16:34.846475 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Aug 19 08:16:34.846487 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=cc23dd01793203541561c15ffc568736bb5dae0d652141296dd11bf777bdf42f Aug 19 08:16:34.846501 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
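The command line logged above carries Flatcar's dm-verity /usr setup (mount.usr, verity.usr, verity.usrhash) alongside the root= and console= settings. A small sketch, assuming nothing beyond ordinary space-separated key=value tokens (the parse_cmdline helper is made up here, not something from this boot), shows how such a string splits into parameters:

    def parse_cmdline(cmdline: str) -> dict:
        # Tokens without '=' map to True; later duplicates overwrite earlier ones.
        params = {}
        for token in cmdline.split():
            key, sep, value = token.partition("=")
            params[key] = value if sep else True
        return params

    cmdline = ("BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr "
               "verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 "
               "rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT "
               "console=ttyS0,115200 flatcar.first_boot=detected "
               "verity.usrhash=cc23dd01793203541561c15ffc568736bb5dae0d652141296dd11bf777bdf42f")
    print(parse_cmdline(cmdline)["verity.usr"])  # PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132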
Aug 19 08:16:34.846511 kernel: random: crng init done Aug 19 08:16:34.846520 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Aug 19 08:16:34.846530 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Aug 19 08:16:34.846540 kernel: Fallback order for Node 0: 0 Aug 19 08:16:34.846549 kernel: Built 1 zonelists, mobility grouping on. Total pages: 642938 Aug 19 08:16:34.846559 kernel: Policy zone: DMA32 Aug 19 08:16:34.846569 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Aug 19 08:16:34.846582 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Aug 19 08:16:34.846592 kernel: ftrace: allocating 40101 entries in 157 pages Aug 19 08:16:34.846602 kernel: ftrace: allocated 157 pages with 5 groups Aug 19 08:16:34.846611 kernel: Dynamic Preempt: voluntary Aug 19 08:16:34.846621 kernel: rcu: Preemptible hierarchical RCU implementation. Aug 19 08:16:34.846631 kernel: rcu: RCU event tracing is enabled. Aug 19 08:16:34.846641 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Aug 19 08:16:34.846651 kernel: Trampoline variant of Tasks RCU enabled. Aug 19 08:16:34.846666 kernel: Rude variant of Tasks RCU enabled. Aug 19 08:16:34.846679 kernel: Tracing variant of Tasks RCU enabled. Aug 19 08:16:34.846690 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Aug 19 08:16:34.846700 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Aug 19 08:16:34.846710 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Aug 19 08:16:34.846720 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Aug 19 08:16:34.846730 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Aug 19 08:16:34.846739 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16 Aug 19 08:16:34.846749 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Aug 19 08:16:34.846772 kernel: Console: colour VGA+ 80x25 Aug 19 08:16:34.846783 kernel: printk: legacy console [ttyS0] enabled Aug 19 08:16:34.846793 kernel: ACPI: Core revision 20240827 Aug 19 08:16:34.846804 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns Aug 19 08:16:34.846818 kernel: APIC: Switch to symmetric I/O mode setup Aug 19 08:16:34.846848 kernel: x2apic enabled Aug 19 08:16:34.846859 kernel: APIC: Switched APIC routing to: physical x2apic Aug 19 08:16:34.846874 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask() Aug 19 08:16:34.846886 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself() Aug 19 08:16:34.846901 kernel: kvm-guest: setup PV IPIs Aug 19 08:16:34.846912 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Aug 19 08:16:34.846924 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns Aug 19 08:16:34.846936 kernel: Calibrating delay loop (skipped) preset value.. 
5589.49 BogoMIPS (lpj=2794748) Aug 19 08:16:34.846947 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Aug 19 08:16:34.846959 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 Aug 19 08:16:34.846970 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 Aug 19 08:16:34.846982 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Aug 19 08:16:34.846996 kernel: Spectre V2 : Mitigation: Retpolines Aug 19 08:16:34.847007 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Aug 19 08:16:34.847019 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls Aug 19 08:16:34.847030 kernel: RETBleed: Mitigation: untrained return thunk Aug 19 08:16:34.847042 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Aug 19 08:16:34.847054 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Aug 19 08:16:34.847065 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied! Aug 19 08:16:34.847077 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options. Aug 19 08:16:34.847089 kernel: x86/bugs: return thunk changed Aug 19 08:16:34.847102 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode Aug 19 08:16:34.847114 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Aug 19 08:16:34.847125 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Aug 19 08:16:34.847137 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Aug 19 08:16:34.847148 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Aug 19 08:16:34.847160 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Aug 19 08:16:34.847171 kernel: Freeing SMP alternatives memory: 32K Aug 19 08:16:34.847182 kernel: pid_max: default: 32768 minimum: 301 Aug 19 08:16:34.847194 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Aug 19 08:16:34.847207 kernel: landlock: Up and running. Aug 19 08:16:34.847219 kernel: SELinux: Initializing. Aug 19 08:16:34.847230 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Aug 19 08:16:34.847245 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Aug 19 08:16:34.847256 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0) Aug 19 08:16:34.847268 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. Aug 19 08:16:34.847279 kernel: ... version: 0 Aug 19 08:16:34.847291 kernel: ... bit width: 48 Aug 19 08:16:34.847302 kernel: ... generic registers: 6 Aug 19 08:16:34.847316 kernel: ... value mask: 0000ffffffffffff Aug 19 08:16:34.847327 kernel: ... max period: 00007fffffffffff Aug 19 08:16:34.847338 kernel: ... fixed-purpose events: 0 Aug 19 08:16:34.847350 kernel: ... event mask: 000000000000003f Aug 19 08:16:34.847371 kernel: signal: max sigframe size: 1776 Aug 19 08:16:34.847383 kernel: rcu: Hierarchical SRCU implementation. Aug 19 08:16:34.847394 kernel: rcu: Max phase no-delay instances is 400. Aug 19 08:16:34.847406 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Aug 19 08:16:34.847418 kernel: smp: Bringing up secondary CPUs ... Aug 19 08:16:34.847432 kernel: smpboot: x86: Booting SMP configuration: Aug 19 08:16:34.847444 kernel: .... 
node #0, CPUs: #1 #2 #3 Aug 19 08:16:34.847455 kernel: smp: Brought up 1 node, 4 CPUs Aug 19 08:16:34.847466 kernel: smpboot: Total of 4 processors activated (22357.98 BogoMIPS) Aug 19 08:16:34.847477 kernel: Memory: 2428912K/2571752K available (14336K kernel code, 2430K rwdata, 9960K rodata, 54040K init, 2928K bss, 136904K reserved, 0K cma-reserved) Aug 19 08:16:34.847486 kernel: devtmpfs: initialized Aug 19 08:16:34.847496 kernel: x86/mm: Memory block size: 128MB Aug 19 08:16:34.847506 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Aug 19 08:16:34.847516 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Aug 19 08:16:34.847530 kernel: pinctrl core: initialized pinctrl subsystem Aug 19 08:16:34.847540 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Aug 19 08:16:34.847550 kernel: audit: initializing netlink subsys (disabled) Aug 19 08:16:34.847560 kernel: audit: type=2000 audit(1755591391.725:1): state=initialized audit_enabled=0 res=1 Aug 19 08:16:34.847570 kernel: thermal_sys: Registered thermal governor 'step_wise' Aug 19 08:16:34.847581 kernel: thermal_sys: Registered thermal governor 'user_space' Aug 19 08:16:34.847593 kernel: cpuidle: using governor menu Aug 19 08:16:34.847605 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Aug 19 08:16:34.847616 kernel: dca service started, version 1.12.1 Aug 19 08:16:34.847631 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff] Aug 19 08:16:34.847643 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry Aug 19 08:16:34.847653 kernel: PCI: Using configuration type 1 for base access Aug 19 08:16:34.847665 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
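The per-CPU calibration above (5589.49 BogoMIPS, lpj=2794748) and the SMP summary (22357.98 BogoMIPS for 4 CPUs) are mutually consistent: BogoMIPS is loops_per_jiffy / (500000 / HZ). A quick check, under the assumption that this kernel runs with HZ=1000 (the tick rate is not printed in the log):

    lpj = 2794748          # loops per jiffy, from the calibration line above
    HZ = 1000              # assumed tick rate; not stated in this log
    per_cpu = lpj / (500000 / HZ)
    print(round(per_cpu, 2))      # 5589.5  (the kernel prints the truncated 5589.49)
    print(round(4 * per_cpu, 2))  # 22357.98, matching the SMP summary above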
Aug 19 08:16:34.847677 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Aug 19 08:16:34.847688 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Aug 19 08:16:34.847700 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Aug 19 08:16:34.847712 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Aug 19 08:16:34.847723 kernel: ACPI: Added _OSI(Module Device) Aug 19 08:16:34.847738 kernel: ACPI: Added _OSI(Processor Device) Aug 19 08:16:34.847750 kernel: ACPI: Added _OSI(Processor Aggregator Device) Aug 19 08:16:34.847761 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Aug 19 08:16:34.847772 kernel: ACPI: Interpreter enabled Aug 19 08:16:34.847784 kernel: ACPI: PM: (supports S0 S3 S5) Aug 19 08:16:34.847795 kernel: ACPI: Using IOAPIC for interrupt routing Aug 19 08:16:34.847806 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Aug 19 08:16:34.847817 kernel: PCI: Using E820 reservations for host bridge windows Aug 19 08:16:34.847863 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Aug 19 08:16:34.847880 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Aug 19 08:16:34.848224 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Aug 19 08:16:34.848450 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Aug 19 08:16:34.848615 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Aug 19 08:16:34.848632 kernel: PCI host bridge to bus 0000:00 Aug 19 08:16:34.848807 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Aug 19 08:16:34.848983 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Aug 19 08:16:34.849118 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Aug 19 08:16:34.849255 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xafffffff window] Aug 19 08:16:34.849404 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Aug 19 08:16:34.849543 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window] Aug 19 08:16:34.849679 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Aug 19 08:16:34.849907 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Aug 19 08:16:34.850071 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint Aug 19 08:16:34.850198 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfd000000-0xfdffffff pref] Aug 19 08:16:34.850322 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfebd0000-0xfebd0fff] Aug 19 08:16:34.850455 kernel: pci 0000:00:01.0: ROM [mem 0xfebc0000-0xfebcffff pref] Aug 19 08:16:34.850592 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Aug 19 08:16:34.850767 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint Aug 19 08:16:34.850957 kernel: pci 0000:00:02.0: BAR 0 [io 0xc0c0-0xc0df] Aug 19 08:16:34.851112 kernel: pci 0000:00:02.0: BAR 1 [mem 0xfebd1000-0xfebd1fff] Aug 19 08:16:34.851295 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfe000000-0xfe003fff 64bit pref] Aug 19 08:16:34.851508 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint Aug 19 08:16:34.851652 kernel: pci 0000:00:03.0: BAR 0 [io 0xc000-0xc07f] Aug 19 08:16:34.851779 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfebd2000-0xfebd2fff] Aug 19 08:16:34.851927 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe004000-0xfe007fff 64bit 
pref] Aug 19 08:16:34.852071 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint Aug 19 08:16:34.852202 kernel: pci 0000:00:04.0: BAR 0 [io 0xc0e0-0xc0ff] Aug 19 08:16:34.852324 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfebd3000-0xfebd3fff] Aug 19 08:16:34.852495 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe008000-0xfe00bfff 64bit pref] Aug 19 08:16:34.852647 kernel: pci 0000:00:04.0: ROM [mem 0xfeb80000-0xfebbffff pref] Aug 19 08:16:34.852844 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Aug 19 08:16:34.853003 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Aug 19 08:16:34.853165 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Aug 19 08:16:34.853403 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc100-0xc11f] Aug 19 08:16:34.853561 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfebd4000-0xfebd4fff] Aug 19 08:16:34.853728 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Aug 19 08:16:34.853886 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f] Aug 19 08:16:34.853898 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Aug 19 08:16:34.853906 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Aug 19 08:16:34.853919 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Aug 19 08:16:34.853927 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Aug 19 08:16:34.853935 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Aug 19 08:16:34.853943 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Aug 19 08:16:34.853951 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Aug 19 08:16:34.853959 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Aug 19 08:16:34.853967 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Aug 19 08:16:34.853975 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Aug 19 08:16:34.853983 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Aug 19 08:16:34.853994 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Aug 19 08:16:34.854002 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Aug 19 08:16:34.854009 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Aug 19 08:16:34.854017 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Aug 19 08:16:34.854025 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Aug 19 08:16:34.854034 kernel: iommu: Default domain type: Translated Aug 19 08:16:34.854042 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Aug 19 08:16:34.854049 kernel: PCI: Using ACPI for IRQ routing Aug 19 08:16:34.854057 kernel: PCI: pci_cache_line_size set to 64 bytes Aug 19 08:16:34.854067 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] Aug 19 08:16:34.854075 kernel: e820: reserve RAM buffer [mem 0x9cfdc000-0x9fffffff] Aug 19 08:16:34.854198 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Aug 19 08:16:34.854329 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Aug 19 08:16:34.854468 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Aug 19 08:16:34.854484 kernel: vgaarb: loaded Aug 19 08:16:34.854494 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 Aug 19 08:16:34.854503 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter Aug 19 08:16:34.854511 kernel: clocksource: Switched to clocksource kvm-clock Aug 19 08:16:34.854523 kernel: VFS: Disk quotas dquot_6.6.0 Aug 19 
08:16:34.854532 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Aug 19 08:16:34.854540 kernel: pnp: PnP ACPI init Aug 19 08:16:34.854805 kernel: system 00:05: [mem 0xb0000000-0xbfffffff window] has been reserved Aug 19 08:16:34.854843 kernel: pnp: PnP ACPI: found 6 devices Aug 19 08:16:34.854854 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Aug 19 08:16:34.854865 kernel: NET: Registered PF_INET protocol family Aug 19 08:16:34.854875 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Aug 19 08:16:34.854892 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Aug 19 08:16:34.854902 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Aug 19 08:16:34.854912 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Aug 19 08:16:34.854922 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Aug 19 08:16:34.854932 kernel: TCP: Hash tables configured (established 32768 bind 32768) Aug 19 08:16:34.854943 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Aug 19 08:16:34.854954 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Aug 19 08:16:34.854964 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Aug 19 08:16:34.854978 kernel: NET: Registered PF_XDP protocol family Aug 19 08:16:34.855122 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Aug 19 08:16:34.855257 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Aug 19 08:16:34.855471 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Aug 19 08:16:34.855682 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xafffffff window] Aug 19 08:16:34.855865 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] Aug 19 08:16:34.856020 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window] Aug 19 08:16:34.856042 kernel: PCI: CLS 0 bytes, default 64 Aug 19 08:16:34.856062 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns Aug 19 08:16:34.856096 kernel: Initialise system trusted keyrings Aug 19 08:16:34.856111 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Aug 19 08:16:34.856122 kernel: Key type asymmetric registered Aug 19 08:16:34.856143 kernel: Asymmetric key parser 'x509' registered Aug 19 08:16:34.856156 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Aug 19 08:16:34.856167 kernel: io scheduler mq-deadline registered Aug 19 08:16:34.856181 kernel: io scheduler kyber registered Aug 19 08:16:34.856192 kernel: io scheduler bfq registered Aug 19 08:16:34.856202 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Aug 19 08:16:34.856219 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Aug 19 08:16:34.856230 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Aug 19 08:16:34.856241 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Aug 19 08:16:34.856252 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Aug 19 08:16:34.856263 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Aug 19 08:16:34.856274 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Aug 19 08:16:34.856285 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Aug 19 08:16:34.856296 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Aug 19 08:16:34.856624 kernel: rtc_cmos 00:04: RTC can wake from S4 Aug 19 
08:16:34.856652 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Aug 19 08:16:34.856860 kernel: rtc_cmos 00:04: registered as rtc0 Aug 19 08:16:34.857020 kernel: rtc_cmos 00:04: setting system clock to 2025-08-19T08:16:34 UTC (1755591394) Aug 19 08:16:34.857168 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Aug 19 08:16:34.857184 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Aug 19 08:16:34.857195 kernel: NET: Registered PF_INET6 protocol family Aug 19 08:16:34.857206 kernel: Segment Routing with IPv6 Aug 19 08:16:34.857217 kernel: In-situ OAM (IOAM) with IPv6 Aug 19 08:16:34.857234 kernel: NET: Registered PF_PACKET protocol family Aug 19 08:16:34.857245 kernel: Key type dns_resolver registered Aug 19 08:16:34.857256 kernel: IPI shorthand broadcast: enabled Aug 19 08:16:34.857266 kernel: sched_clock: Marking stable (3213080705, 110829810)->(3448933483, -125022968) Aug 19 08:16:34.857277 kernel: registered taskstats version 1 Aug 19 08:16:34.857288 kernel: Loading compiled-in X.509 certificates Aug 19 08:16:34.857298 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.41-flatcar: 93a065b103c00d4b81cc5822e4e7f9674e63afaf' Aug 19 08:16:34.857309 kernel: Demotion targets for Node 0: null Aug 19 08:16:34.857319 kernel: Key type .fscrypt registered Aug 19 08:16:34.857333 kernel: Key type fscrypt-provisioning registered Aug 19 08:16:34.857344 kernel: ima: No TPM chip found, activating TPM-bypass! Aug 19 08:16:34.857354 kernel: ima: Allocated hash algorithm: sha1 Aug 19 08:16:34.857375 kernel: ima: No architecture policies found Aug 19 08:16:34.857386 kernel: clk: Disabling unused clocks Aug 19 08:16:34.857397 kernel: Warning: unable to open an initial console. Aug 19 08:16:34.857408 kernel: Freeing unused kernel image (initmem) memory: 54040K Aug 19 08:16:34.857419 kernel: Write protecting the kernel read-only data: 24576k Aug 19 08:16:34.857433 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K Aug 19 08:16:34.857444 kernel: Run /init as init process Aug 19 08:16:34.857454 kernel: with arguments: Aug 19 08:16:34.857465 kernel: /init Aug 19 08:16:34.857475 kernel: with environment: Aug 19 08:16:34.857494 kernel: HOME=/ Aug 19 08:16:34.857506 kernel: TERM=linux Aug 19 08:16:34.857516 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Aug 19 08:16:34.857535 systemd[1]: Successfully made /usr/ read-only. Aug 19 08:16:34.857556 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Aug 19 08:16:34.857594 systemd[1]: Detected virtualization kvm. Aug 19 08:16:34.857631 systemd[1]: Detected architecture x86-64. Aug 19 08:16:34.857646 systemd[1]: Running in initrd. Aug 19 08:16:34.857657 systemd[1]: No hostname configured, using default hostname. Aug 19 08:16:34.857673 systemd[1]: Hostname set to . Aug 19 08:16:34.857696 systemd[1]: Initializing machine ID from VM UUID. Aug 19 08:16:34.857721 systemd[1]: Queued start job for default target initrd.target. Aug 19 08:16:34.857733 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
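The rtc_cmos line above shows the RTC seeding the system clock; the wall-clock string and the epoch value in parentheses describe the same instant. A one-line check in plain Python, using nothing from the log beyond the number itself:

    from datetime import datetime, timezone

    print(datetime.fromtimestamp(1755591394, tz=timezone.utc).isoformat())
    # 2025-08-19T08:16:34+00:00, matching "setting system clock to 2025-08-19T08:16:34 UTC"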
Aug 19 08:16:34.857745 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 19 08:16:34.857757 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Aug 19 08:16:34.857769 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Aug 19 08:16:34.857781 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Aug 19 08:16:34.857799 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Aug 19 08:16:34.857812 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Aug 19 08:16:34.857847 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Aug 19 08:16:34.857859 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 19 08:16:34.857870 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 19 08:16:34.857897 systemd[1]: Reached target paths.target - Path Units. Aug 19 08:16:34.857919 systemd[1]: Reached target slices.target - Slice Units. Aug 19 08:16:34.857953 systemd[1]: Reached target swap.target - Swaps. Aug 19 08:16:34.857985 systemd[1]: Reached target timers.target - Timer Units. Aug 19 08:16:34.858017 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Aug 19 08:16:34.858039 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Aug 19 08:16:34.858052 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Aug 19 08:16:34.858064 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Aug 19 08:16:34.858077 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Aug 19 08:16:34.858089 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Aug 19 08:16:34.858101 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Aug 19 08:16:34.858117 systemd[1]: Reached target sockets.target - Socket Units. Aug 19 08:16:34.858129 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Aug 19 08:16:34.858141 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Aug 19 08:16:34.858153 systemd[1]: Finished network-cleanup.service - Network Cleanup. Aug 19 08:16:34.858177 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Aug 19 08:16:34.858215 systemd[1]: Starting systemd-fsck-usr.service... Aug 19 08:16:34.858229 systemd[1]: Starting systemd-journald.service - Journal Service... Aug 19 08:16:34.858242 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Aug 19 08:16:34.858254 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 19 08:16:34.858276 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Aug 19 08:16:34.858290 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Aug 19 08:16:34.858306 systemd[1]: Finished systemd-fsck-usr.service. Aug 19 08:16:34.858395 systemd-journald[220]: Collecting audit messages is disabled. Aug 19 08:16:34.858429 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... 
Aug 19 08:16:34.858447 systemd-journald[220]: Journal started Aug 19 08:16:34.858492 systemd-journald[220]: Runtime Journal (/run/log/journal/664a42f9439244a69b06d447ff662aec) is 6M, max 48.6M, 42.5M free. Aug 19 08:16:34.856490 systemd-modules-load[221]: Inserted module 'overlay' Aug 19 08:16:34.860655 systemd[1]: Started systemd-journald.service - Journal Service. Aug 19 08:16:34.868726 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Aug 19 08:16:34.901658 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Aug 19 08:16:34.901684 kernel: Bridge firewalling registered Aug 19 08:16:34.899210 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Aug 19 08:16:34.899238 systemd-modules-load[221]: Inserted module 'br_netfilter' Aug 19 08:16:34.910086 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Aug 19 08:16:34.910552 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 08:16:34.916453 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Aug 19 08:16:34.920066 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Aug 19 08:16:34.921286 systemd-tmpfiles[238]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Aug 19 08:16:34.929706 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Aug 19 08:16:34.930532 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 19 08:16:34.943205 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 19 08:16:34.944090 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Aug 19 08:16:34.946936 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 19 08:16:34.961649 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 19 08:16:34.963188 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Aug 19 08:16:34.992626 dracut-cmdline[265]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=cc23dd01793203541561c15ffc568736bb5dae0d652141296dd11bf777bdf42f Aug 19 08:16:35.006891 systemd-resolved[260]: Positive Trust Anchors: Aug 19 08:16:35.006920 systemd-resolved[260]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 19 08:16:35.006953 systemd-resolved[260]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 19 08:16:35.010215 systemd-resolved[260]: Defaulting to hostname 'linux'. 
Aug 19 08:16:35.011789 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 19 08:16:35.017396 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 19 08:16:35.137905 kernel: SCSI subsystem initialized Aug 19 08:16:35.151930 kernel: Loading iSCSI transport class v2.0-870. Aug 19 08:16:35.165896 kernel: iscsi: registered transport (tcp) Aug 19 08:16:35.195868 kernel: iscsi: registered transport (qla4xxx) Aug 19 08:16:35.195965 kernel: QLogic iSCSI HBA Driver Aug 19 08:16:35.223631 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Aug 19 08:16:35.245332 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Aug 19 08:16:35.246856 systemd[1]: Reached target network-pre.target - Preparation for Network. Aug 19 08:16:35.376801 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Aug 19 08:16:35.379085 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Aug 19 08:16:35.445883 kernel: raid6: avx2x4 gen() 20775 MB/s Aug 19 08:16:35.462859 kernel: raid6: avx2x2 gen() 27515 MB/s Aug 19 08:16:35.479938 kernel: raid6: avx2x1 gen() 23630 MB/s Aug 19 08:16:35.480022 kernel: raid6: using algorithm avx2x2 gen() 27515 MB/s Aug 19 08:16:35.497983 kernel: raid6: .... xor() 19285 MB/s, rmw enabled Aug 19 08:16:35.498040 kernel: raid6: using avx2x2 recovery algorithm Aug 19 08:16:35.520862 kernel: xor: automatically using best checksumming function avx Aug 19 08:16:35.696888 kernel: Btrfs loaded, zoned=no, fsverity=no Aug 19 08:16:35.705981 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Aug 19 08:16:35.709104 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 19 08:16:35.759629 systemd-udevd[474]: Using default interface naming scheme 'v255'. Aug 19 08:16:35.773667 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 19 08:16:35.775756 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Aug 19 08:16:35.811009 dracut-pre-trigger[476]: rd.md=0: removing MD RAID activation Aug 19 08:16:35.847116 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Aug 19 08:16:35.851147 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Aug 19 08:16:35.952245 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Aug 19 08:16:35.957990 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Aug 19 08:16:35.999896 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Aug 19 08:16:36.003769 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Aug 19 08:16:36.009841 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Aug 19 08:16:36.013921 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Aug 19 08:16:36.013952 kernel: GPT:9289727 != 19775487 Aug 19 08:16:36.013964 kernel: GPT:Alternate GPT header not at the end of the disk. Aug 19 08:16:36.016004 kernel: GPT:9289727 != 19775487 Aug 19 08:16:36.016033 kernel: GPT: Use GNU Parted to correct GPT errors. Aug 19 08:16:36.016049 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Aug 19 08:16:36.023870 kernel: cryptd: max_cpu_qlen set to 1000 Aug 19 08:16:36.026866 kernel: libata version 3.00 loaded. 
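The GPT warnings above ("GPT:9289727 != 19775487") are the usual sign that the backup GPT header sits where the end of a smaller original disk image used to be, while the virtual disk now exposes 19775488 sectors; the kernel itself points at GNU Parted for the repair. The arithmetic, as a sketch (reading the two LBAs this way is my interpretation, not spelled out in the log):

    SECTOR = 512
    disk_sectors = 19775488          # "virtio1: [vda] 19775488 512-byte logical blocks"
    alt_header_lba = 9289727         # where the backup GPT header was actually found
    expected_lba = disk_sectors - 1  # where GPT expects the backup header on this disk

    print(expected_lba)                   # 19775487, the other number in the warning
    print(alt_header_lba * SECTOR / 1e9)  # ~4.76 GB: roughly where the originally sized image ended
    print(disk_sectors * SECTOR / 1e9)    # ~10.13 GB, the "10.1 GB" the kernel reports for vda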
Aug 19 08:16:36.033852 kernel: AES CTR mode by8 optimization enabled Aug 19 08:16:36.040849 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 19 08:16:36.041661 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 08:16:36.045498 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Aug 19 08:16:36.051154 kernel: ahci 0000:00:1f.2: version 3.0 Aug 19 08:16:36.051438 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Aug 19 08:16:36.051791 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 19 08:16:36.056690 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Aug 19 08:16:36.056963 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Aug 19 08:16:36.057143 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Aug 19 08:16:36.059203 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Aug 19 08:16:36.066854 kernel: scsi host0: ahci Aug 19 08:16:36.067105 kernel: scsi host1: ahci Aug 19 08:16:36.068092 kernel: scsi host2: ahci Aug 19 08:16:36.069796 kernel: scsi host3: ahci Aug 19 08:16:36.071317 kernel: scsi host4: ahci Aug 19 08:16:36.071569 kernel: scsi host5: ahci Aug 19 08:16:36.072735 kernel: ata1: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4100 irq 34 lpm-pol 0 Aug 19 08:16:36.072767 kernel: ata2: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4180 irq 34 lpm-pol 0 Aug 19 08:16:36.075395 kernel: ata3: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4200 irq 34 lpm-pol 0 Aug 19 08:16:36.075424 kernel: ata4: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4280 irq 34 lpm-pol 0 Aug 19 08:16:36.076528 kernel: ata5: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4300 irq 34 lpm-pol 0 Aug 19 08:16:36.077497 kernel: ata6: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4380 irq 34 lpm-pol 0 Aug 19 08:16:36.089578 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Aug 19 08:16:36.114546 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Aug 19 08:16:36.142675 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 08:16:36.150991 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Aug 19 08:16:36.151323 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Aug 19 08:16:36.168219 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Aug 19 08:16:36.171909 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Aug 19 08:16:36.391851 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Aug 19 08:16:36.391945 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Aug 19 08:16:36.391958 kernel: ata3.00: applying bridge limits Aug 19 08:16:36.391972 kernel: ata2: SATA link down (SStatus 0 SControl 300) Aug 19 08:16:36.392001 kernel: ata4: SATA link down (SStatus 0 SControl 300) Aug 19 08:16:36.392856 kernel: ata5: SATA link down (SStatus 0 SControl 300) Aug 19 08:16:36.394854 kernel: ata6: SATA link down (SStatus 0 SControl 300) Aug 19 08:16:36.394882 kernel: ata1: SATA link down (SStatus 0 SControl 300) Aug 19 08:16:36.395869 kernel: ata3.00: configured for UDMA/100 Aug 19 08:16:36.396625 disk-uuid[634]: Primary Header is updated. 
Aug 19 08:16:36.396625 disk-uuid[634]: Secondary Entries is updated. Aug 19 08:16:36.396625 disk-uuid[634]: Secondary Header is updated. Aug 19 08:16:36.401483 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Aug 19 08:16:36.401551 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Aug 19 08:16:36.409857 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Aug 19 08:16:36.445855 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Aug 19 08:16:36.446133 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Aug 19 08:16:36.469869 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Aug 19 08:16:36.899305 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Aug 19 08:16:36.901226 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Aug 19 08:16:36.903036 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 19 08:16:36.904369 systemd[1]: Reached target remote-fs.target - Remote File Systems. Aug 19 08:16:36.906806 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Aug 19 08:16:36.939469 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Aug 19 08:16:37.414862 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Aug 19 08:16:37.415681 disk-uuid[635]: The operation has completed successfully. Aug 19 08:16:37.450730 systemd[1]: disk-uuid.service: Deactivated successfully. Aug 19 08:16:37.450879 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Aug 19 08:16:37.489938 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Aug 19 08:16:37.518410 sh[664]: Success Aug 19 08:16:37.536874 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Aug 19 08:16:37.536935 kernel: device-mapper: uevent: version 1.0.3 Aug 19 08:16:37.536948 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Aug 19 08:16:37.547855 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Aug 19 08:16:37.577414 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Aug 19 08:16:37.589573 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Aug 19 08:16:37.619276 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Aug 19 08:16:37.623872 kernel: BTRFS: device fsid 99050df3-5e04-4f37-acde-dec46aab7896 devid 1 transid 39 /dev/mapper/usr (253:0) scanned by mount (676) Aug 19 08:16:37.623908 kernel: BTRFS info (device dm-0): first mount of filesystem 99050df3-5e04-4f37-acde-dec46aab7896 Aug 19 08:16:37.625331 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Aug 19 08:16:37.625348 kernel: BTRFS info (device dm-0): using free-space-tree Aug 19 08:16:37.630777 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Aug 19 08:16:37.631598 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Aug 19 08:16:37.633393 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Aug 19 08:16:37.638006 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Aug 19 08:16:37.639975 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Aug 19 08:16:37.664885 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (707) Aug 19 08:16:37.666950 kernel: BTRFS info (device vda6): first mount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f Aug 19 08:16:37.667002 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Aug 19 08:16:37.667021 kernel: BTRFS info (device vda6): using free-space-tree Aug 19 08:16:37.674872 kernel: BTRFS info (device vda6): last unmount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f Aug 19 08:16:37.677099 systemd[1]: Finished ignition-setup.service - Ignition (setup). Aug 19 08:16:37.678474 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Aug 19 08:16:37.843745 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Aug 19 08:16:37.846116 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 19 08:16:37.924707 systemd-networkd[847]: lo: Link UP Aug 19 08:16:37.924719 systemd-networkd[847]: lo: Gained carrier Aug 19 08:16:37.926674 systemd-networkd[847]: Enumeration completed Aug 19 08:16:37.926938 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 19 08:16:37.928601 systemd-networkd[847]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 08:16:37.928618 systemd-networkd[847]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 19 08:16:37.928730 systemd[1]: Reached target network.target - Network. Aug 19 08:16:37.931598 systemd-networkd[847]: eth0: Link UP Aug 19 08:16:37.931850 systemd-networkd[847]: eth0: Gained carrier Aug 19 08:16:37.931861 systemd-networkd[847]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 08:16:37.950529 ignition[748]: Ignition 2.21.0 Aug 19 08:16:37.950544 ignition[748]: Stage: fetch-offline Aug 19 08:16:37.950581 ignition[748]: no configs at "/usr/lib/ignition/base.d" Aug 19 08:16:37.950598 ignition[748]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Aug 19 08:16:37.950722 ignition[748]: parsed url from cmdline: "" Aug 19 08:16:37.950726 ignition[748]: no config URL provided Aug 19 08:16:37.950732 ignition[748]: reading system config file "/usr/lib/ignition/user.ign" Aug 19 08:16:37.950744 ignition[748]: no config at "/usr/lib/ignition/user.ign" Aug 19 08:16:37.955403 systemd-networkd[847]: eth0: DHCPv4 address 10.0.0.123/16, gateway 10.0.0.1 acquired from 10.0.0.1 Aug 19 08:16:37.950777 ignition[748]: op(1): [started] loading QEMU firmware config module Aug 19 08:16:37.950785 ignition[748]: op(1): executing: "modprobe" "qemu_fw_cfg" Aug 19 08:16:37.964930 ignition[748]: op(1): [finished] loading QEMU firmware config module Aug 19 08:16:37.964962 ignition[748]: QEMU firmware config was not found. Ignoring... Aug 19 08:16:38.004579 ignition[748]: parsing config with SHA512: 61405fc5bd3855463e224eff2a9240123f29bf8b095c6dc0dadd4561c945aa290cf00b30e71b5c0fd8801d0fdb9379d06d1962e45b5053a4b9c7fcb0c0399a6b Aug 19 08:16:38.008668 unknown[748]: fetched base config from "system" Aug 19 08:16:38.008685 unknown[748]: fetched user config from "qemu" Aug 19 08:16:38.009130 ignition[748]: fetch-offline: fetch-offline passed Aug 19 08:16:38.009201 ignition[748]: Ignition finished successfully Aug 19 08:16:38.018323 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
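Ignition's fetch-offline stage above reads /usr/lib/ignition/user.ign, loads the qemu_fw_cfg module, and logs a SHA512 for the config it parses. Treating that digest as simply the hash of the config bytes is an assumption on my part; under that assumption, the same bookkeeping looks like this (the path in the comment is the one named in the log, the function is hypothetical):

    import hashlib

    def config_digest(path: str) -> str:
        # Hash the raw config bytes, analogous to Ignition's
        # "parsing config with SHA512: <digest>" line above.
        with open(path, "rb") as f:
            return hashlib.sha512(f.read()).hexdigest()

    # Example usage on a machine where the file exists:
    # print(config_digest("/usr/lib/ignition/user.ign"))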
Aug 19 08:16:38.018782 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Aug 19 08:16:38.020753 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Aug 19 08:16:38.094950 ignition[859]: Ignition 2.21.0 Aug 19 08:16:38.094963 ignition[859]: Stage: kargs Aug 19 08:16:38.095096 ignition[859]: no configs at "/usr/lib/ignition/base.d" Aug 19 08:16:38.095106 ignition[859]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Aug 19 08:16:38.097688 ignition[859]: kargs: kargs passed Aug 19 08:16:38.097789 ignition[859]: Ignition finished successfully Aug 19 08:16:38.115255 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Aug 19 08:16:38.117898 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Aug 19 08:16:38.167353 ignition[867]: Ignition 2.21.0 Aug 19 08:16:38.167371 ignition[867]: Stage: disks Aug 19 08:16:38.169456 ignition[867]: no configs at "/usr/lib/ignition/base.d" Aug 19 08:16:38.169472 ignition[867]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Aug 19 08:16:38.174781 ignition[867]: disks: disks passed Aug 19 08:16:38.174896 ignition[867]: Ignition finished successfully Aug 19 08:16:38.179901 systemd[1]: Finished ignition-disks.service - Ignition (disks). Aug 19 08:16:38.180456 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Aug 19 08:16:38.182308 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Aug 19 08:16:38.182627 systemd[1]: Reached target local-fs.target - Local File Systems. Aug 19 08:16:38.183131 systemd[1]: Reached target sysinit.target - System Initialization. Aug 19 08:16:38.183686 systemd[1]: Reached target basic.target - Basic System. Aug 19 08:16:38.193101 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Aug 19 08:16:38.237771 systemd-fsck[877]: ROOT: clean, 15/553520 files, 52789/553472 blocks Aug 19 08:16:38.289933 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Aug 19 08:16:38.292523 systemd[1]: Mounting sysroot.mount - /sysroot... Aug 19 08:16:38.471878 kernel: EXT4-fs (vda9): mounted filesystem 41966107-04fa-426e-9830-6b4efa50e27b r/w with ordered data mode. Quota mode: none. Aug 19 08:16:38.473005 systemd[1]: Mounted sysroot.mount - /sysroot. Aug 19 08:16:38.474955 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Aug 19 08:16:38.476609 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Aug 19 08:16:38.478794 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Aug 19 08:16:38.480246 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Aug 19 08:16:38.480336 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Aug 19 08:16:38.480373 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Aug 19 08:16:38.494869 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Aug 19 08:16:38.497596 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Aug 19 08:16:38.503941 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (885) Aug 19 08:16:38.503979 kernel: BTRFS info (device vda6): first mount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f Aug 19 08:16:38.503994 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Aug 19 08:16:38.504008 kernel: BTRFS info (device vda6): using free-space-tree Aug 19 08:16:38.508185 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Aug 19 08:16:38.548589 initrd-setup-root[909]: cut: /sysroot/etc/passwd: No such file or directory Aug 19 08:16:38.575043 initrd-setup-root[916]: cut: /sysroot/etc/group: No such file or directory Aug 19 08:16:38.580896 initrd-setup-root[923]: cut: /sysroot/etc/shadow: No such file or directory Aug 19 08:16:38.586665 initrd-setup-root[930]: cut: /sysroot/etc/gshadow: No such file or directory Aug 19 08:16:38.696484 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Aug 19 08:16:38.699372 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Aug 19 08:16:38.700681 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Aug 19 08:16:38.723255 systemd[1]: sysroot-oem.mount: Deactivated successfully. Aug 19 08:16:38.724640 kernel: BTRFS info (device vda6): last unmount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f Aug 19 08:16:38.738525 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Aug 19 08:16:38.800969 ignition[999]: INFO : Ignition 2.21.0 Aug 19 08:16:38.800969 ignition[999]: INFO : Stage: mount Aug 19 08:16:38.803001 ignition[999]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 19 08:16:38.803001 ignition[999]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Aug 19 08:16:38.807272 ignition[999]: INFO : mount: mount passed Aug 19 08:16:38.808125 ignition[999]: INFO : Ignition finished successfully Aug 19 08:16:38.812095 systemd[1]: Finished ignition-mount.service - Ignition (mount). Aug 19 08:16:38.815767 systemd[1]: Starting ignition-files.service - Ignition (files)... Aug 19 08:16:38.848962 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Aug 19 08:16:38.881131 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1011) Aug 19 08:16:38.881162 kernel: BTRFS info (device vda6): first mount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f Aug 19 08:16:38.881173 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Aug 19 08:16:38.881967 kernel: BTRFS info (device vda6): using free-space-tree Aug 19 08:16:38.886678 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Aug 19 08:16:38.938777 ignition[1028]: INFO : Ignition 2.21.0 Aug 19 08:16:38.938777 ignition[1028]: INFO : Stage: files Aug 19 08:16:38.940623 ignition[1028]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 19 08:16:38.940623 ignition[1028]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Aug 19 08:16:38.944250 ignition[1028]: DEBUG : files: compiled without relabeling support, skipping Aug 19 08:16:38.945960 ignition[1028]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Aug 19 08:16:38.945960 ignition[1028]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Aug 19 08:16:38.948885 ignition[1028]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Aug 19 08:16:38.950428 ignition[1028]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Aug 19 08:16:38.952275 unknown[1028]: wrote ssh authorized keys file for user: core Aug 19 08:16:38.953360 ignition[1028]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Aug 19 08:16:38.955945 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Aug 19 08:16:38.958353 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Aug 19 08:16:39.006055 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Aug 19 08:16:39.161670 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Aug 19 08:16:39.161670 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Aug 19 08:16:39.165714 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Aug 19 08:16:39.165714 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Aug 19 08:16:39.165714 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Aug 19 08:16:39.165714 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Aug 19 08:16:39.165714 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Aug 19 08:16:39.165714 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Aug 19 08:16:39.165714 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Aug 19 08:16:39.177844 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Aug 19 08:16:39.177844 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Aug 19 08:16:39.177844 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Aug 19 08:16:39.177844 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(9): 
[finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Aug 19 08:16:39.177844 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Aug 19 08:16:39.177844 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Aug 19 08:16:39.495085 systemd-networkd[847]: eth0: Gained IPv6LL Aug 19 08:16:39.628151 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Aug 19 08:16:40.376114 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Aug 19 08:16:40.376114 ignition[1028]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Aug 19 08:16:40.380963 ignition[1028]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Aug 19 08:16:40.383742 ignition[1028]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Aug 19 08:16:40.383742 ignition[1028]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Aug 19 08:16:40.383742 ignition[1028]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Aug 19 08:16:40.388657 ignition[1028]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Aug 19 08:16:40.388657 ignition[1028]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Aug 19 08:16:40.388657 ignition[1028]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Aug 19 08:16:40.388657 ignition[1028]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Aug 19 08:16:40.413091 ignition[1028]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Aug 19 08:16:40.418950 ignition[1028]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Aug 19 08:16:40.421028 ignition[1028]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Aug 19 08:16:40.421028 ignition[1028]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Aug 19 08:16:40.421028 ignition[1028]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Aug 19 08:16:40.421028 ignition[1028]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Aug 19 08:16:40.421028 ignition[1028]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Aug 19 08:16:40.421028 ignition[1028]: INFO : files: files passed Aug 19 08:16:40.421028 ignition[1028]: INFO : Ignition finished successfully Aug 19 08:16:40.428944 systemd[1]: Finished ignition-files.service - Ignition (files). Aug 19 08:16:40.432661 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Aug 19 08:16:40.434562 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
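The files stage fetches remote artifacts (the helm tarball, the kubernetes sysext image) with "GET ...: attempt #1" and writes them under /sysroot. A rough Python sketch of that fetch-and-write pattern; the URL and destination path are copied from the log, while the retry count, timeout, and function names are assumptions rather than Ignition's actual logic:

    #!/usr/bin/env python3
    """Sketch: fetch a remote artifact with retries and place it under /sysroot."""
    import time
    import urllib.request
    from pathlib import Path

    def fetch_to_sysroot(url, dest, sysroot="/sysroot", attempts=3):
        target = Path(sysroot) / dest.lstrip("/")
        target.parent.mkdir(parents=True, exist_ok=True)
        for attempt in range(1, attempts + 1):
            print(f"GET {url}: attempt #{attempt}")
            try:
                with urllib.request.urlopen(url, timeout=30) as resp:
                    target.write_bytes(resp.read())
                print(f'[finished] writing file "{target}"')
                return target
            except OSError as exc:
                print(f"GET result: {exc}")
                time.sleep(2 * attempt)   # illustrative backoff
        raise RuntimeError(f"giving up on {url}")

    if __name__ == "__main__":
        fetch_to_sysroot(
            "https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz",
            "/opt/helm-v3.17.0-linux-amd64.tar.gz",
        )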
Aug 19 08:16:40.469345 systemd[1]: ignition-quench.service: Deactivated successfully. Aug 19 08:16:40.469487 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Aug 19 08:16:40.473428 initrd-setup-root-after-ignition[1056]: grep: /sysroot/oem/oem-release: No such file or directory Aug 19 08:16:40.476681 initrd-setup-root-after-ignition[1059]: grep: Aug 19 08:16:40.477851 initrd-setup-root-after-ignition[1063]: grep: Aug 19 08:16:40.478853 initrd-setup-root-after-ignition[1059]: /sysroot/etc/flatcar/enabled-sysext.conf Aug 19 08:16:40.480185 initrd-setup-root-after-ignition[1063]: /sysroot/etc/flatcar/enabled-sysext.conf Aug 19 08:16:40.481522 initrd-setup-root-after-ignition[1059]: : No such file or directory Aug 19 08:16:40.483063 initrd-setup-root-after-ignition[1063]: : No such file or directory Aug 19 08:16:40.483768 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Aug 19 08:16:40.484782 initrd-setup-root-after-ignition[1059]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Aug 19 08:16:40.486011 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Aug 19 08:16:40.491559 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Aug 19 08:16:40.555449 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Aug 19 08:16:40.556626 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Aug 19 08:16:40.559152 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Aug 19 08:16:40.561162 systemd[1]: Reached target initrd.target - Initrd Default Target. Aug 19 08:16:40.561478 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Aug 19 08:16:40.562504 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Aug 19 08:16:40.594327 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Aug 19 08:16:40.597087 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Aug 19 08:16:40.623927 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Aug 19 08:16:40.624410 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 19 08:16:40.626951 systemd[1]: Stopped target timers.target - Timer Units. Aug 19 08:16:40.627502 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Aug 19 08:16:40.627639 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Aug 19 08:16:40.633202 systemd[1]: Stopped target initrd.target - Initrd Default Target. Aug 19 08:16:40.636477 systemd[1]: Stopped target basic.target - Basic System. Aug 19 08:16:40.636969 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Aug 19 08:16:40.637489 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Aug 19 08:16:40.637854 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Aug 19 08:16:40.638341 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Aug 19 08:16:40.638701 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Aug 19 08:16:40.639399 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Aug 19 08:16:40.639779 systemd[1]: Stopped target sysinit.target - System Initialization. Aug 19 08:16:40.651385 systemd[1]: Stopped target local-fs.target - Local File Systems. 
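The interleaved grep output above boils down to three probes by initrd-setup-root-after-ignition: /sysroot/oem/oem-release, /sysroot/etc/flatcar/enabled-sysext.conf, and /sysroot/usr/share/flatcar/enabled-sysext.conf, none of which exist yet. A small sketch that makes the same checks explicit; only the file paths are taken from the log, and what the real service greps for inside them is not visible here:

    #!/usr/bin/env python3
    """Sketch: the three files probed by initrd-setup-root-after-ignition."""
    from pathlib import Path

    for path in ("/sysroot/oem/oem-release",
                 "/sysroot/etc/flatcar/enabled-sysext.conf",
                 "/sysroot/usr/share/flatcar/enabled-sysext.conf"):
        present = Path(path).exists()
        print(path, "->", "present" if present else "grep: No such file or directory")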
Aug 19 08:16:40.653199 systemd[1]: Stopped target swap.target - Swaps. Aug 19 08:16:40.655199 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Aug 19 08:16:40.655393 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Aug 19 08:16:40.658862 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Aug 19 08:16:40.659488 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 19 08:16:40.659800 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Aug 19 08:16:40.664667 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 19 08:16:40.665432 systemd[1]: dracut-initqueue.service: Deactivated successfully. Aug 19 08:16:40.665561 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Aug 19 08:16:40.669635 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Aug 19 08:16:40.669813 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Aug 19 08:16:40.672934 systemd[1]: Stopped target paths.target - Path Units. Aug 19 08:16:40.673401 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Aug 19 08:16:40.680953 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 19 08:16:40.681598 systemd[1]: Stopped target slices.target - Slice Units. Aug 19 08:16:40.682187 systemd[1]: Stopped target sockets.target - Socket Units. Aug 19 08:16:40.682587 systemd[1]: iscsid.socket: Deactivated successfully. Aug 19 08:16:40.682733 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Aug 19 08:16:40.688289 systemd[1]: iscsiuio.socket: Deactivated successfully. Aug 19 08:16:40.688381 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Aug 19 08:16:40.690188 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Aug 19 08:16:40.690319 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Aug 19 08:16:40.692306 systemd[1]: ignition-files.service: Deactivated successfully. Aug 19 08:16:40.692447 systemd[1]: Stopped ignition-files.service - Ignition (files). Aug 19 08:16:40.697079 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Aug 19 08:16:40.697505 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Aug 19 08:16:40.697610 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Aug 19 08:16:40.698737 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Aug 19 08:16:40.704071 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Aug 19 08:16:40.704209 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Aug 19 08:16:40.706351 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Aug 19 08:16:40.706461 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Aug 19 08:16:40.712679 systemd[1]: initrd-cleanup.service: Deactivated successfully. Aug 19 08:16:40.712795 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Aug 19 08:16:40.736254 systemd[1]: sysroot-boot.mount: Deactivated successfully. 
Aug 19 08:16:40.757046 ignition[1083]: INFO : Ignition 2.21.0 Aug 19 08:16:40.757046 ignition[1083]: INFO : Stage: umount Aug 19 08:16:40.759706 ignition[1083]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 19 08:16:40.759706 ignition[1083]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Aug 19 08:16:40.762695 ignition[1083]: INFO : umount: umount passed Aug 19 08:16:40.762695 ignition[1083]: INFO : Ignition finished successfully Aug 19 08:16:40.768199 systemd[1]: ignition-mount.service: Deactivated successfully. Aug 19 08:16:40.768479 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Aug 19 08:16:40.769513 systemd[1]: Stopped target network.target - Network. Aug 19 08:16:40.771736 systemd[1]: ignition-disks.service: Deactivated successfully. Aug 19 08:16:40.771853 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Aug 19 08:16:40.772618 systemd[1]: ignition-kargs.service: Deactivated successfully. Aug 19 08:16:40.772668 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Aug 19 08:16:40.775562 systemd[1]: ignition-setup.service: Deactivated successfully. Aug 19 08:16:40.775622 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Aug 19 08:16:40.776068 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Aug 19 08:16:40.776170 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Aug 19 08:16:40.776887 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Aug 19 08:16:40.777454 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Aug 19 08:16:40.788492 systemd[1]: systemd-resolved.service: Deactivated successfully. Aug 19 08:16:40.788728 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Aug 19 08:16:40.794922 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Aug 19 08:16:40.795574 systemd[1]: systemd-networkd.service: Deactivated successfully. Aug 19 08:16:40.795845 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Aug 19 08:16:40.833539 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Aug 19 08:16:40.834984 systemd[1]: Stopped target network-pre.target - Preparation for Network. Aug 19 08:16:40.835612 systemd[1]: systemd-networkd.socket: Deactivated successfully. Aug 19 08:16:40.835672 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Aug 19 08:16:40.837193 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Aug 19 08:16:40.837537 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Aug 19 08:16:40.837608 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Aug 19 08:16:40.838201 systemd[1]: systemd-sysctl.service: Deactivated successfully. Aug 19 08:16:40.838272 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Aug 19 08:16:40.840932 systemd[1]: systemd-modules-load.service: Deactivated successfully. Aug 19 08:16:40.840996 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Aug 19 08:16:40.842278 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Aug 19 08:16:40.842346 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 19 08:16:40.882495 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Aug 19 08:16:40.884989 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Aug 19 08:16:40.885104 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Aug 19 08:16:40.904129 systemd[1]: systemd-udevd.service: Deactivated successfully. Aug 19 08:16:40.904409 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 19 08:16:40.905403 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Aug 19 08:16:40.905461 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Aug 19 08:16:40.908652 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Aug 19 08:16:40.908696 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Aug 19 08:16:40.909199 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Aug 19 08:16:40.909270 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Aug 19 08:16:40.915628 systemd[1]: dracut-cmdline.service: Deactivated successfully. Aug 19 08:16:40.915701 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Aug 19 08:16:40.918744 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Aug 19 08:16:40.918808 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 19 08:16:40.924948 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Aug 19 08:16:40.925369 systemd[1]: systemd-network-generator.service: Deactivated successfully. Aug 19 08:16:40.925483 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Aug 19 08:16:40.932914 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Aug 19 08:16:40.933012 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 19 08:16:40.936946 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 19 08:16:40.937024 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 08:16:40.941657 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Aug 19 08:16:40.941732 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Aug 19 08:16:40.941782 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Aug 19 08:16:40.942233 systemd[1]: sysroot-boot.service: Deactivated successfully. Aug 19 08:16:40.944020 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Aug 19 08:16:40.946120 systemd[1]: network-cleanup.service: Deactivated successfully. Aug 19 08:16:40.946244 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Aug 19 08:16:40.949119 systemd[1]: initrd-setup-root.service: Deactivated successfully. Aug 19 08:16:40.949243 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Aug 19 08:16:40.953577 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Aug 19 08:16:40.953745 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Aug 19 08:16:40.954544 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Aug 19 08:16:40.958269 systemd[1]: Starting initrd-switch-root.service - Switch Root... Aug 19 08:16:40.994288 systemd[1]: Switching root. 
Aug 19 08:16:41.050329 systemd-journald[220]: Journal stopped Aug 19 08:16:42.532029 systemd-journald[220]: Received SIGTERM from PID 1 (systemd). Aug 19 08:16:42.532119 kernel: SELinux: policy capability network_peer_controls=1 Aug 19 08:16:42.532145 kernel: SELinux: policy capability open_perms=1 Aug 19 08:16:42.532160 kernel: SELinux: policy capability extended_socket_class=1 Aug 19 08:16:42.532186 kernel: SELinux: policy capability always_check_network=0 Aug 19 08:16:42.532203 kernel: SELinux: policy capability cgroup_seclabel=1 Aug 19 08:16:42.532227 kernel: SELinux: policy capability nnp_nosuid_transition=1 Aug 19 08:16:42.532245 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Aug 19 08:16:42.532263 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Aug 19 08:16:42.532285 kernel: SELinux: policy capability userspace_initial_context=0 Aug 19 08:16:42.532313 kernel: audit: type=1403 audit(1755591401.554:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Aug 19 08:16:42.532331 systemd[1]: Successfully loaded SELinux policy in 98.183ms. Aug 19 08:16:42.532358 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 10.256ms. Aug 19 08:16:42.532376 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Aug 19 08:16:42.532394 systemd[1]: Detected virtualization kvm. Aug 19 08:16:42.532410 systemd[1]: Detected architecture x86-64. Aug 19 08:16:42.532426 systemd[1]: Detected first boot. Aug 19 08:16:42.532441 systemd[1]: Initializing machine ID from VM UUID. Aug 19 08:16:42.532461 zram_generator::config[1128]: No configuration found. Aug 19 08:16:42.532483 kernel: Guest personality initialized and is inactive Aug 19 08:16:42.532498 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Aug 19 08:16:42.532512 kernel: Initialized host personality Aug 19 08:16:42.532526 kernel: NET: Registered PF_VSOCK protocol family Aug 19 08:16:42.532540 systemd[1]: Populated /etc with preset unit settings. Aug 19 08:16:42.532556 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Aug 19 08:16:42.532572 systemd[1]: initrd-switch-root.service: Deactivated successfully. Aug 19 08:16:42.532592 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Aug 19 08:16:42.532609 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Aug 19 08:16:42.532626 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Aug 19 08:16:42.532643 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Aug 19 08:16:42.532659 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Aug 19 08:16:42.532688 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Aug 19 08:16:42.532705 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Aug 19 08:16:42.532721 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Aug 19 08:16:42.532738 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Aug 19 08:16:42.532763 systemd[1]: Created slice user.slice - User and Session Slice. 
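"Initializing machine ID from VM UUID" indicates that, on this detected first boot, the machine ID is seeded from the hypervisor-provided VM UUID rather than generated at random. A sketch of one plausible source of that UUID on a QEMU/KVM guest, /sys/class/dmi/id/product_uuid; systemd's exact lookup order is not shown in the log and is assumed here:

    #!/usr/bin/env python3
    """Sketch: deriving a machine-id-shaped value from the VM UUID (assumed source)."""
    import re
    from pathlib import Path

    def machine_id_from_vm_uuid(path="/sys/class/dmi/id/product_uuid"):
        raw = Path(path).read_text().strip().lower()
        mid = re.sub(r"[^0-9a-f]", "", raw)   # machine-id is 32 hex chars, no dashes
        if len(mid) != 32:
            raise ValueError(f"unexpected UUID format: {raw!r}")
        return mid

    if __name__ == "__main__":
        print(machine_id_from_vm_uuid())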
Aug 19 08:16:42.532779 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 19 08:16:42.532796 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 19 08:16:42.532813 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Aug 19 08:16:42.532947 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Aug 19 08:16:42.532967 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Aug 19 08:16:42.532985 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Aug 19 08:16:42.533011 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Aug 19 08:16:42.533028 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 19 08:16:42.533045 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 19 08:16:42.533061 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Aug 19 08:16:42.533078 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Aug 19 08:16:42.533094 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Aug 19 08:16:42.533111 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Aug 19 08:16:42.533128 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 19 08:16:42.533151 systemd[1]: Reached target remote-fs.target - Remote File Systems. Aug 19 08:16:42.533168 systemd[1]: Reached target slices.target - Slice Units. Aug 19 08:16:42.533203 systemd[1]: Reached target swap.target - Swaps. Aug 19 08:16:42.533221 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Aug 19 08:16:42.533238 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Aug 19 08:16:42.533341 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Aug 19 08:16:42.533358 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Aug 19 08:16:42.533375 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Aug 19 08:16:42.533392 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Aug 19 08:16:42.533408 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Aug 19 08:16:42.533425 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Aug 19 08:16:42.533446 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Aug 19 08:16:42.533463 systemd[1]: Mounting media.mount - External Media Directory... Aug 19 08:16:42.533481 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:16:42.533498 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Aug 19 08:16:42.533514 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Aug 19 08:16:42.533531 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Aug 19 08:16:42.533548 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Aug 19 08:16:42.533564 systemd[1]: Reached target machines.target - Containers. 
Aug 19 08:16:42.533584 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Aug 19 08:16:42.533601 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 08:16:42.533618 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Aug 19 08:16:42.533635 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Aug 19 08:16:42.533651 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 19 08:16:42.533668 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Aug 19 08:16:42.533685 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 19 08:16:42.533713 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Aug 19 08:16:42.533730 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 19 08:16:42.533757 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Aug 19 08:16:42.533774 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Aug 19 08:16:42.533791 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Aug 19 08:16:42.533807 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Aug 19 08:16:42.533840 systemd[1]: Stopped systemd-fsck-usr.service. Aug 19 08:16:42.533859 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 08:16:42.533876 systemd[1]: Starting systemd-journald.service - Journal Service... Aug 19 08:16:42.533893 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Aug 19 08:16:42.533921 kernel: loop: module loaded Aug 19 08:16:42.533937 kernel: fuse: init (API version 7.41) Aug 19 08:16:42.533953 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Aug 19 08:16:42.533970 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Aug 19 08:16:42.533987 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Aug 19 08:16:42.534003 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Aug 19 08:16:42.534030 kernel: ACPI: bus type drm_connector registered Aug 19 08:16:42.534046 systemd[1]: verity-setup.service: Deactivated successfully. Aug 19 08:16:42.534062 systemd[1]: Stopped verity-setup.service. Aug 19 08:16:42.534087 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:16:42.534104 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Aug 19 08:16:42.534129 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Aug 19 08:16:42.534191 systemd-journald[1205]: Collecting audit messages is disabled. Aug 19 08:16:42.534234 systemd[1]: Mounted media.mount - External Media Directory. Aug 19 08:16:42.534251 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. 
Aug 19 08:16:42.534267 systemd-journald[1205]: Journal started Aug 19 08:16:42.534307 systemd-journald[1205]: Runtime Journal (/run/log/journal/664a42f9439244a69b06d447ff662aec) is 6M, max 48.6M, 42.5M free. Aug 19 08:16:42.254518 systemd[1]: Queued start job for default target multi-user.target. Aug 19 08:16:42.276124 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Aug 19 08:16:42.276660 systemd[1]: systemd-journald.service: Deactivated successfully. Aug 19 08:16:42.538129 systemd[1]: Started systemd-journald.service - Journal Service. Aug 19 08:16:42.539249 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Aug 19 08:16:42.540570 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Aug 19 08:16:42.542046 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Aug 19 08:16:42.543845 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Aug 19 08:16:42.545515 systemd[1]: modprobe@configfs.service: Deactivated successfully. Aug 19 08:16:42.545808 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Aug 19 08:16:42.547426 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 19 08:16:42.547736 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 19 08:16:42.549363 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 19 08:16:42.549651 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 19 08:16:42.551217 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 19 08:16:42.551511 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 19 08:16:42.553165 systemd[1]: modprobe@fuse.service: Deactivated successfully. Aug 19 08:16:42.553491 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Aug 19 08:16:42.555239 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 19 08:16:42.555540 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 19 08:16:42.557506 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Aug 19 08:16:42.559392 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Aug 19 08:16:42.561138 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Aug 19 08:16:42.562913 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Aug 19 08:16:42.580515 systemd[1]: Reached target network-pre.target - Preparation for Network. Aug 19 08:16:42.583672 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Aug 19 08:16:42.586996 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Aug 19 08:16:42.588478 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Aug 19 08:16:42.588516 systemd[1]: Reached target local-fs.target - Local File Systems. Aug 19 08:16:42.590964 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Aug 19 08:16:42.603689 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Aug 19 08:16:42.605240 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 08:16:42.606716 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... 
Aug 19 08:16:42.610946 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Aug 19 08:16:42.612542 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 19 08:16:42.616593 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Aug 19 08:16:42.618015 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 19 08:16:42.619812 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Aug 19 08:16:42.624977 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Aug 19 08:16:42.629100 systemd[1]: Starting systemd-sysusers.service - Create System Users... Aug 19 08:16:42.636470 systemd-journald[1205]: Time spent on flushing to /var/log/journal/664a42f9439244a69b06d447ff662aec is 24.477ms for 981 entries. Aug 19 08:16:42.636470 systemd-journald[1205]: System Journal (/var/log/journal/664a42f9439244a69b06d447ff662aec) is 8M, max 195.6M, 187.6M free. Aug 19 08:16:42.673604 systemd-journald[1205]: Received client request to flush runtime journal. Aug 19 08:16:42.673651 kernel: loop0: detected capacity change from 0 to 128016 Aug 19 08:16:42.634083 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Aug 19 08:16:42.635721 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Aug 19 08:16:42.645726 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Aug 19 08:16:42.649764 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Aug 19 08:16:42.655200 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Aug 19 08:16:42.664451 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Aug 19 08:16:42.678251 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Aug 19 08:16:42.700534 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Aug 19 08:16:42.737716 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Aug 19 08:16:42.742026 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Aug 19 08:16:42.746940 systemd[1]: Finished systemd-sysusers.service - Create System Users. Aug 19 08:16:42.751360 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Aug 19 08:16:42.757853 kernel: loop1: detected capacity change from 0 to 224512 Aug 19 08:16:42.779481 systemd-tmpfiles[1264]: ACLs are not supported, ignoring. Aug 19 08:16:42.779507 systemd-tmpfiles[1264]: ACLs are not supported, ignoring. Aug 19 08:16:42.785771 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 19 08:16:42.789880 kernel: loop2: detected capacity change from 0 to 111000 Aug 19 08:16:42.825876 kernel: loop3: detected capacity change from 0 to 128016 Aug 19 08:16:42.836892 kernel: loop4: detected capacity change from 0 to 224512 Aug 19 08:16:42.853877 kernel: loop5: detected capacity change from 0 to 111000 Aug 19 08:16:42.868664 (sd-merge)[1269]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Aug 19 08:16:42.869483 (sd-merge)[1269]: Merged extensions into '/usr'. 
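The (sd-merge) lines show systemd-sysext combining the containerd-flatcar, docker-flatcar, and kubernetes extension images into /usr. Conceptually this is an overlay mount with /usr as the bottom layer and each extension's /usr tree stacked on top; the sketch below only illustrates that idea, and the staging directory and mount options are assumptions, not the exact ones systemd-sysext uses:

    #!/usr/bin/env python3
    """Sketch: picturing a sysext merge of extension images into /usr as overlayfs."""
    extensions = ["containerd-flatcar", "docker-flatcar", "kubernetes"]   # from the log

    # Assume each image is mounted read-only somewhere like
    # /run/systemd/sysext/<name>/usr; /usr itself becomes the lowest layer.
    lowerdirs = [f"/run/systemd/sysext/{name}/usr" for name in reversed(extensions)]
    lowerdirs.append("/usr")

    # In overlayfs the first lowerdir has the highest precedence.
    print("mount -t overlay overlay -o ro,lowerdir=" + ":".join(lowerdirs) + " /usr")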
Aug 19 08:16:42.912108 systemd[1]: Reload requested from client PID 1247 ('systemd-sysext') (unit systemd-sysext.service)... Aug 19 08:16:42.912137 systemd[1]: Reloading... Aug 19 08:16:42.993862 zram_generator::config[1294]: No configuration found. Aug 19 08:16:43.287933 ldconfig[1242]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Aug 19 08:16:43.309623 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Aug 19 08:16:43.309810 systemd[1]: Reloading finished in 396 ms. Aug 19 08:16:43.343705 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Aug 19 08:16:43.345356 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Aug 19 08:16:43.371207 systemd[1]: Starting ensure-sysext.service... Aug 19 08:16:43.373642 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Aug 19 08:16:43.399427 systemd[1]: Reload requested from client PID 1332 ('systemctl') (unit ensure-sysext.service)... Aug 19 08:16:43.399444 systemd[1]: Reloading... Aug 19 08:16:43.408632 systemd-tmpfiles[1333]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Aug 19 08:16:43.408679 systemd-tmpfiles[1333]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Aug 19 08:16:43.409126 systemd-tmpfiles[1333]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Aug 19 08:16:43.409435 systemd-tmpfiles[1333]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Aug 19 08:16:43.410399 systemd-tmpfiles[1333]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Aug 19 08:16:43.410739 systemd-tmpfiles[1333]: ACLs are not supported, ignoring. Aug 19 08:16:43.410813 systemd-tmpfiles[1333]: ACLs are not supported, ignoring. Aug 19 08:16:43.415686 systemd-tmpfiles[1333]: Detected autofs mount point /boot during canonicalization of boot. Aug 19 08:16:43.415702 systemd-tmpfiles[1333]: Skipping /boot Aug 19 08:16:43.427358 systemd-tmpfiles[1333]: Detected autofs mount point /boot during canonicalization of boot. Aug 19 08:16:43.427372 systemd-tmpfiles[1333]: Skipping /boot Aug 19 08:16:43.466874 zram_generator::config[1360]: No configuration found. Aug 19 08:16:43.654628 systemd[1]: Reloading finished in 254 ms. Aug 19 08:16:43.676818 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Aug 19 08:16:43.699410 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 19 08:16:43.709272 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 19 08:16:43.712489 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Aug 19 08:16:43.715076 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Aug 19 08:16:43.727031 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 19 08:16:43.730665 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 19 08:16:43.736205 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Aug 19 08:16:43.743123 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). 
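systemd-tmpfiles warns about several "Duplicate line for path ..., ignoring" entries and skips /boot because it is still an autofs automount point at this stage. A sketch of how the duplicate-path warnings can be reproduced by scanning tmpfiles.d fragments; it only compares the path column, whereas the real parser also weighs line types, arguments, and config precedence:

    #!/usr/bin/env python3
    """Sketch: spot duplicate path entries across tmpfiles.d fragments."""
    from pathlib import Path

    def duplicate_paths(dirs=("/usr/lib/tmpfiles.d", "/etc/tmpfiles.d")):
        seen, dupes = {}, []
        for d in dirs:
            for conf in sorted(Path(d).glob("*.conf")):
                for lineno, line in enumerate(conf.read_text().splitlines(), 1):
                    line = line.strip()
                    if not line or line.startswith("#"):
                        continue
                    fields = line.split()
                    if len(fields) < 2:
                        continue
                    path = fields[1]
                    if path in seen:
                        dupes.append((f"{conf}:{lineno}", path, seen[path]))
                    else:
                        seen[path] = f"{conf}:{lineno}"
        return dupes

    for where, path, first in duplicate_paths():
        print(f'{where}: Duplicate line for path "{path}", first seen at {first}')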
Aug 19 08:16:43.743558 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 08:16:43.752115 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 19 08:16:43.755341 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 19 08:16:43.758887 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 19 08:16:43.760120 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 08:16:43.760237 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 08:16:43.764093 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Aug 19 08:16:43.765169 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:16:43.771350 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Aug 19 08:16:43.773914 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 19 08:16:43.774313 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 19 08:16:43.777325 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 19 08:16:43.777801 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 19 08:16:43.779616 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 19 08:16:43.779888 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 19 08:16:43.789748 systemd-udevd[1403]: Using default interface naming scheme 'v255'. Aug 19 08:16:43.792726 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:16:43.793833 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 08:16:43.796491 augenrules[1432]: No rules Aug 19 08:16:43.796980 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 19 08:16:43.799424 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Aug 19 08:16:43.803006 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 19 08:16:43.806117 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 19 08:16:43.806680 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 08:16:43.806778 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 08:16:43.808174 systemd[1]: Starting systemd-update-done.service - Update is Completed... Aug 19 08:16:43.809394 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:16:43.810599 systemd[1]: audit-rules.service: Deactivated successfully. Aug 19 08:16:43.816254 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
Aug 19 08:16:43.820164 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Aug 19 08:16:43.823085 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 19 08:16:43.823722 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 19 08:16:43.825900 systemd[1]: Started systemd-userdbd.service - User Database Manager. Aug 19 08:16:43.828931 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Aug 19 08:16:43.830630 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 19 08:16:43.831386 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 19 08:16:43.833978 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 19 08:16:43.834211 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 19 08:16:43.835987 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 19 08:16:43.836321 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 19 08:16:43.839741 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 19 08:16:43.842726 systemd[1]: Finished systemd-update-done.service - Update is Completed. Aug 19 08:16:43.856000 systemd[1]: Finished ensure-sysext.service. Aug 19 08:16:43.874948 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 19 08:16:43.876209 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 19 08:16:43.876307 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 19 08:16:43.878619 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Aug 19 08:16:43.879777 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Aug 19 08:16:43.966728 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Aug 19 08:16:43.992255 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Aug 19 08:16:43.999004 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Aug 19 08:16:44.025874 kernel: mousedev: PS/2 mouse device common for all mice Aug 19 08:16:44.033155 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Aug 19 08:16:44.042976 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Aug 19 08:16:44.051609 kernel: ACPI: button: Power Button [PWRF] Aug 19 08:16:44.070992 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Aug 19 08:16:44.071295 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Aug 19 08:16:44.090714 systemd-resolved[1402]: Positive Trust Anchors: Aug 19 08:16:44.091102 systemd-resolved[1402]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 19 08:16:44.091199 systemd-resolved[1402]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 19 08:16:44.097319 systemd-resolved[1402]: Defaulting to hostname 'linux'. Aug 19 08:16:44.099766 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 19 08:16:44.101375 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 19 08:16:44.136298 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 19 08:16:44.146992 systemd-networkd[1480]: lo: Link UP Aug 19 08:16:44.147278 systemd-networkd[1480]: lo: Gained carrier Aug 19 08:16:44.149006 systemd-networkd[1480]: Enumeration completed Aug 19 08:16:44.149154 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 19 08:16:44.150763 systemd[1]: Reached target network.target - Network. Aug 19 08:16:44.182188 systemd-networkd[1480]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 08:16:44.184456 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Aug 19 08:16:44.184922 systemd-networkd[1480]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 19 08:16:44.186699 systemd-networkd[1480]: eth0: Link UP Aug 19 08:16:44.187116 systemd-networkd[1480]: eth0: Gained carrier Aug 19 08:16:44.187207 systemd-networkd[1480]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 08:16:44.192010 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Aug 19 08:16:44.204937 systemd-networkd[1480]: eth0: DHCPv4 address 10.0.0.123/16, gateway 10.0.0.1 acquired from 10.0.0.1 Aug 19 08:16:44.221487 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Aug 19 08:16:44.223363 systemd[1]: Reached target time-set.target - System Time Set. Aug 19 08:16:44.768446 systemd-resolved[1402]: Clock change detected. Flushing caches. Aug 19 08:16:44.768627 systemd-timesyncd[1482]: Contacted time server 10.0.0.1:123 (10.0.0.1). Aug 19 08:16:44.768704 systemd-timesyncd[1482]: Initial clock synchronization to Tue 2025-08-19 08:16:44.768377 UTC. Aug 19 08:16:44.771948 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Aug 19 08:16:44.787701 kernel: kvm_amd: TSC scaling supported Aug 19 08:16:44.787779 kernel: kvm_amd: Nested Virtualization enabled Aug 19 08:16:44.787824 kernel: kvm_amd: Nested Paging enabled Aug 19 08:16:44.789117 kernel: kvm_amd: LBR virtualization supported Aug 19 08:16:44.790260 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported Aug 19 08:16:44.790298 kernel: kvm_amd: Virtual GIF supported Aug 19 08:16:44.876500 kernel: EDAC MC: Ver: 3.0.0 Aug 19 08:16:44.892271 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
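Once systemd-timesyncd reaches the time server at 10.0.0.1:123 it steps the real-time clock, which is why systemd-resolved logs "Clock change detected. Flushing caches." and the journal timestamps jump from 08:16:44.22 to 08:16:44.76. The step can be estimated from the two adjacent entries; scheduling delay between the lines is ignored:

    #!/usr/bin/env python3
    """Sketch: estimate the clock step from the journal timestamps around the sync."""
    from datetime import datetime

    before = datetime.strptime("Aug 19 08:16:44.223363", "%b %d %H:%M:%S.%f")
    after  = datetime.strptime("Aug 19 08:16:44.768446", "%b %d %H:%M:%S.%f")

    step = (after - before).total_seconds()
    print(f"approximate clock step: {step:.3f} s")   # ~0.545 s forward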
Aug 19 08:16:44.893939 systemd[1]: Reached target sysinit.target - System Initialization. Aug 19 08:16:44.895198 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Aug 19 08:16:44.896526 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Aug 19 08:16:44.897818 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Aug 19 08:16:44.899280 systemd[1]: Started logrotate.timer - Daily rotation of log files. Aug 19 08:16:44.900569 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Aug 19 08:16:44.901884 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Aug 19 08:16:44.903187 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Aug 19 08:16:44.903231 systemd[1]: Reached target paths.target - Path Units. Aug 19 08:16:44.904173 systemd[1]: Reached target timers.target - Timer Units. Aug 19 08:16:44.906358 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Aug 19 08:16:44.910257 systemd[1]: Starting docker.socket - Docker Socket for the API... Aug 19 08:16:44.914326 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Aug 19 08:16:44.915866 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Aug 19 08:16:44.917167 systemd[1]: Reached target ssh-access.target - SSH Access Available. Aug 19 08:16:44.927978 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Aug 19 08:16:44.929642 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Aug 19 08:16:44.931804 systemd[1]: Listening on docker.socket - Docker Socket for the API. Aug 19 08:16:44.933857 systemd[1]: Reached target sockets.target - Socket Units. Aug 19 08:16:44.934913 systemd[1]: Reached target basic.target - Basic System. Aug 19 08:16:44.935944 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Aug 19 08:16:44.935996 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Aug 19 08:16:44.937590 systemd[1]: Starting containerd.service - containerd container runtime... Aug 19 08:16:44.940744 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Aug 19 08:16:44.954051 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Aug 19 08:16:44.957038 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Aug 19 08:16:44.960059 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Aug 19 08:16:44.961241 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Aug 19 08:16:44.962400 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Aug 19 08:16:44.969482 jq[1530]: false Aug 19 08:16:44.968150 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Aug 19 08:16:44.969791 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Aug 19 08:16:44.974221 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... 
Aug 19 08:16:44.976391 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Aug 19 08:16:44.981839 systemd[1]: Starting systemd-logind.service - User Login Management... Aug 19 08:16:44.982100 google_oslogin_nss_cache[1532]: oslogin_cache_refresh[1532]: Refreshing passwd entry cache Aug 19 08:16:44.982115 oslogin_cache_refresh[1532]: Refreshing passwd entry cache Aug 19 08:16:44.983837 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Aug 19 08:16:44.986857 extend-filesystems[1531]: Found /dev/vda6 Aug 19 08:16:44.991858 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Aug 19 08:16:44.993161 extend-filesystems[1531]: Found /dev/vda9 Aug 19 08:16:44.993379 oslogin_cache_refresh[1532]: Failure getting users, quitting Aug 19 08:16:44.994128 google_oslogin_nss_cache[1532]: oslogin_cache_refresh[1532]: Failure getting users, quitting Aug 19 08:16:44.994128 google_oslogin_nss_cache[1532]: oslogin_cache_refresh[1532]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Aug 19 08:16:44.993401 oslogin_cache_refresh[1532]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Aug 19 08:16:44.997000 google_oslogin_nss_cache[1532]: oslogin_cache_refresh[1532]: Refreshing group entry cache Aug 19 08:16:44.994533 systemd[1]: Starting update-engine.service - Update Engine... Aug 19 08:16:44.994509 oslogin_cache_refresh[1532]: Refreshing group entry cache Aug 19 08:16:44.997511 extend-filesystems[1531]: Checking size of /dev/vda9 Aug 19 08:16:44.997856 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Aug 19 08:16:45.001288 google_oslogin_nss_cache[1532]: oslogin_cache_refresh[1532]: Failure getting groups, quitting Aug 19 08:16:45.001288 google_oslogin_nss_cache[1532]: oslogin_cache_refresh[1532]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Aug 19 08:16:45.001278 oslogin_cache_refresh[1532]: Failure getting groups, quitting Aug 19 08:16:45.001291 oslogin_cache_refresh[1532]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Aug 19 08:16:45.004093 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Aug 19 08:16:45.006045 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Aug 19 08:16:45.006986 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Aug 19 08:16:45.007431 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Aug 19 08:16:45.007775 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Aug 19 08:16:45.009501 systemd[1]: motdgen.service: Deactivated successfully. Aug 19 08:16:45.009982 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Aug 19 08:16:45.012787 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Aug 19 08:16:45.014128 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
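The google_oslogin_nss_cache lines show both cache refreshes failing ("Failure getting users, quitting", "Failure getting groups, quitting") and falling back to writing empty cache files while removing the stale .bak copies. A sketch of that fallback; the cache paths and the failing fetch are illustrative, and only the observable behaviour comes from the log:

    #!/usr/bin/env python3
    """Sketch: oslogin-style cache refresh that degrades to an empty cache on failure."""
    import os

    def refresh_cache(label, cache_path, fetch):
        print(f"Refreshing {label} entry cache")
        try:
            return fetch()
        except OSError:
            print(f"Failure getting {label}s, quitting")
            open(cache_path, "w").close()             # produce an empty cache file
            bak = cache_path + ".bak"
            if os.path.exists(bak):
                os.remove(bak)
            print(f"Produced empty {label} cache file, removing {bak}.")
            return []

    def fetch_from_metadata():
        # Hypothetical fetch: this QEMU VM has no GCE metadata service,
        # so the lookup fails, matching the log.
        raise OSError("metadata server unreachable")

    refresh_cache("user", "/tmp/oslogin_passwd.cache", fetch_from_metadata)
    refresh_cache("group", "/tmp/oslogin_group.cache", fetch_from_metadata)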
Aug 19 08:16:45.028767 update_engine[1547]: I20250819 08:16:45.027016 1547 main.cc:92] Flatcar Update Engine starting Aug 19 08:16:45.029090 extend-filesystems[1531]: Resized partition /dev/vda9 Aug 19 08:16:45.045352 extend-filesystems[1568]: resize2fs 1.47.2 (1-Jan-2025) Aug 19 08:16:45.047328 jq[1550]: true Aug 19 08:16:45.052866 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Aug 19 08:16:45.052746 (ntainerd)[1567]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Aug 19 08:16:45.078114 tar[1557]: linux-amd64/LICENSE Aug 19 08:16:45.080090 tar[1557]: linux-amd64/helm Aug 19 08:16:45.109707 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Aug 19 08:16:45.307579 extend-filesystems[1568]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Aug 19 08:16:45.307579 extend-filesystems[1568]: old_desc_blocks = 1, new_desc_blocks = 1 Aug 19 08:16:45.307579 extend-filesystems[1568]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Aug 19 08:16:45.312139 jq[1570]: true Aug 19 08:16:45.312338 extend-filesystems[1531]: Resized filesystem in /dev/vda9 Aug 19 08:16:45.313342 systemd[1]: extend-filesystems.service: Deactivated successfully. Aug 19 08:16:45.319018 dbus-daemon[1528]: [system] SELinux support is enabled Aug 19 08:16:45.314884 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Aug 19 08:16:45.319976 systemd[1]: Started dbus.service - D-Bus System Message Bus. Aug 19 08:16:45.325763 update_engine[1547]: I20250819 08:16:45.325444 1547 update_check_scheduler.cc:74] Next update check in 3m57s Aug 19 08:16:45.325720 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Aug 19 08:16:45.325761 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Aug 19 08:16:45.327274 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Aug 19 08:16:45.327307 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Aug 19 08:16:45.333553 systemd[1]: Started update-engine.service - Update Engine. Aug 19 08:16:45.342509 systemd[1]: Started locksmithd.service - Cluster reboot manager. Aug 19 08:16:45.359814 systemd-logind[1541]: Watching system buttons on /dev/input/event2 (Power Button) Aug 19 08:16:45.361120 systemd-logind[1541]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Aug 19 08:16:45.467392 sshd_keygen[1555]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Aug 19 08:16:45.467637 systemd-logind[1541]: New seat seat0. Aug 19 08:16:45.473722 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Aug 19 08:16:45.484357 systemd[1]: Starting issuegen.service - Generate /run/issue... Aug 19 08:16:45.489405 systemd[1]: Started systemd-logind.service - User Login Management. Aug 19 08:16:45.514389 locksmithd[1578]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Aug 19 08:16:45.517865 systemd[1]: issuegen.service: Deactivated successfully. Aug 19 08:16:45.518290 systemd[1]: Finished issuegen.service - Generate /run/issue. 
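
The extend-filesystems/EXT4 lines record an online grow of the root filesystem on /dev/vda9 from 553472 to 1864699 4 KiB blocks, i.e. from roughly 2.1 GiB to about 7.1 GiB (1864699 × 4096 ≈ 7.6 GB), done while / stayed mounted. A manual equivalent, assuming the partition itself has already been enlarged as it is here, is just:

    # grow the mounted ext4 filesystem to fill its partition (online resize)
    sudo resize2fs /dev/vda9

    # confirm the new geometry and the visible free space
    sudo dumpe2fs -h /dev/vda9 | grep -E 'Block (count|size)'
    df -h /
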
Aug 19 08:16:45.529342 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Aug 19 08:16:45.550142 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Aug 19 08:16:45.554378 systemd[1]: Started getty@tty1.service - Getty on tty1. Aug 19 08:16:45.559353 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Aug 19 08:16:45.560943 systemd[1]: Reached target getty.target - Login Prompts. Aug 19 08:16:45.814216 tar[1557]: linux-amd64/README.md Aug 19 08:16:45.885634 bash[1599]: Updated "/home/core/.ssh/authorized_keys" Aug 19 08:16:45.888113 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Aug 19 08:16:45.890106 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Aug 19 08:16:45.893201 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Aug 19 08:16:45.917844 containerd[1567]: time="2025-08-19T08:16:45Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Aug 19 08:16:45.918789 containerd[1567]: time="2025-08-19T08:16:45.918711867Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Aug 19 08:16:45.930810 containerd[1567]: time="2025-08-19T08:16:45.930760036Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="27.39µs" Aug 19 08:16:45.930902 containerd[1567]: time="2025-08-19T08:16:45.930874150Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Aug 19 08:16:45.930927 containerd[1567]: time="2025-08-19T08:16:45.930905709Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Aug 19 08:16:45.931445 containerd[1567]: time="2025-08-19T08:16:45.931348960Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Aug 19 08:16:45.931505 containerd[1567]: time="2025-08-19T08:16:45.931468184Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Aug 19 08:16:45.931527 containerd[1567]: time="2025-08-19T08:16:45.931507668Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 19 08:16:45.931630 containerd[1567]: time="2025-08-19T08:16:45.931596384Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 19 08:16:45.931630 containerd[1567]: time="2025-08-19T08:16:45.931613246Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 19 08:16:45.931979 containerd[1567]: time="2025-08-19T08:16:45.931942514Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 19 08:16:45.931979 containerd[1567]: time="2025-08-19T08:16:45.931962281Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 19 08:16:45.931979 containerd[1567]: time="2025-08-19T08:16:45.931974484Z" level=info msg="skip loading plugin" error="devmapper not 
configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 19 08:16:45.932052 containerd[1567]: time="2025-08-19T08:16:45.931983831Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Aug 19 08:16:45.932121 containerd[1567]: time="2025-08-19T08:16:45.932097094Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Aug 19 08:16:45.932393 containerd[1567]: time="2025-08-19T08:16:45.932358674Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 19 08:16:45.932422 containerd[1567]: time="2025-08-19T08:16:45.932397347Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 19 08:16:45.932422 containerd[1567]: time="2025-08-19T08:16:45.932407576Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Aug 19 08:16:45.932511 containerd[1567]: time="2025-08-19T08:16:45.932488167Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Aug 19 08:16:45.932813 containerd[1567]: time="2025-08-19T08:16:45.932770427Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Aug 19 08:16:45.932898 containerd[1567]: time="2025-08-19T08:16:45.932866667Z" level=info msg="metadata content store policy set" policy=shared Aug 19 08:16:46.093869 containerd[1567]: time="2025-08-19T08:16:46.093712097Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Aug 19 08:16:46.093869 containerd[1567]: time="2025-08-19T08:16:46.093829728Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Aug 19 08:16:46.093869 containerd[1567]: time="2025-08-19T08:16:46.093851398Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Aug 19 08:16:46.093869 containerd[1567]: time="2025-08-19T08:16:46.093868681Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Aug 19 08:16:46.094029 containerd[1567]: time="2025-08-19T08:16:46.093890712Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Aug 19 08:16:46.094029 containerd[1567]: time="2025-08-19T08:16:46.093906672Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Aug 19 08:16:46.094029 containerd[1567]: time="2025-08-19T08:16:46.093930968Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Aug 19 08:16:46.094029 containerd[1567]: time="2025-08-19T08:16:46.093949482Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Aug 19 08:16:46.094029 containerd[1567]: time="2025-08-19T08:16:46.093966965Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Aug 19 08:16:46.094029 containerd[1567]: time="2025-08-19T08:16:46.093984027Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Aug 19 08:16:46.094029 containerd[1567]: time="2025-08-19T08:16:46.093997402Z" level=info msg="loading 
plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Aug 19 08:16:46.094029 containerd[1567]: time="2025-08-19T08:16:46.094014885Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Aug 19 08:16:46.094315 containerd[1567]: time="2025-08-19T08:16:46.094272238Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Aug 19 08:16:46.094376 containerd[1567]: time="2025-08-19T08:16:46.094318034Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Aug 19 08:16:46.094376 containerd[1567]: time="2025-08-19T08:16:46.094339945Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Aug 19 08:16:46.094416 containerd[1567]: time="2025-08-19T08:16:46.094382665Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Aug 19 08:16:46.094416 containerd[1567]: time="2025-08-19T08:16:46.094402061Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Aug 19 08:16:46.094479 containerd[1567]: time="2025-08-19T08:16:46.094416779Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Aug 19 08:16:46.094479 containerd[1567]: time="2025-08-19T08:16:46.094437348Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Aug 19 08:16:46.094536 containerd[1567]: time="2025-08-19T08:16:46.094488203Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Aug 19 08:16:46.094536 containerd[1567]: time="2025-08-19T08:16:46.094508541Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Aug 19 08:16:46.094536 containerd[1567]: time="2025-08-19T08:16:46.094526274Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Aug 19 08:16:46.094594 containerd[1567]: time="2025-08-19T08:16:46.094541102Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Aug 19 08:16:46.094794 containerd[1567]: time="2025-08-19T08:16:46.094658152Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Aug 19 08:16:46.094794 containerd[1567]: time="2025-08-19T08:16:46.094781493Z" level=info msg="Start snapshots syncer" Aug 19 08:16:46.094847 containerd[1567]: time="2025-08-19T08:16:46.094830876Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Aug 19 08:16:46.095259 containerd[1567]: time="2025-08-19T08:16:46.095195600Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Aug 19 08:16:46.095426 containerd[1567]: time="2025-08-19T08:16:46.095280770Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Aug 19 08:16:46.097930 containerd[1567]: time="2025-08-19T08:16:46.097887208Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Aug 19 08:16:46.098096 containerd[1567]: time="2025-08-19T08:16:46.098058289Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Aug 19 08:16:46.098122 containerd[1567]: time="2025-08-19T08:16:46.098092984Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Aug 19 08:16:46.098122 containerd[1567]: time="2025-08-19T08:16:46.098108764Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Aug 19 08:16:46.098160 containerd[1567]: time="2025-08-19T08:16:46.098124123Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Aug 19 08:16:46.098160 containerd[1567]: time="2025-08-19T08:16:46.098149270Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Aug 19 08:16:46.098204 containerd[1567]: time="2025-08-19T08:16:46.098163797Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Aug 19 08:16:46.098204 containerd[1567]: time="2025-08-19T08:16:46.098179196Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Aug 19 08:16:46.098251 containerd[1567]: time="2025-08-19T08:16:46.098211387Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Aug 19 08:16:46.098272 containerd[1567]: 
time="2025-08-19T08:16:46.098248426Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Aug 19 08:16:46.098272 containerd[1567]: time="2025-08-19T08:16:46.098266530Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Aug 19 08:16:46.098356 containerd[1567]: time="2025-08-19T08:16:46.098321934Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 19 08:16:46.098381 containerd[1567]: time="2025-08-19T08:16:46.098350027Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 19 08:16:46.098381 containerd[1567]: time="2025-08-19T08:16:46.098364373Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 19 08:16:46.098428 containerd[1567]: time="2025-08-19T08:16:46.098379181Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 19 08:16:46.098428 containerd[1567]: time="2025-08-19T08:16:46.098392176Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Aug 19 08:16:46.098428 containerd[1567]: time="2025-08-19T08:16:46.098405941Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Aug 19 08:16:46.098428 containerd[1567]: time="2025-08-19T08:16:46.098420469Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Aug 19 08:16:46.098521 containerd[1567]: time="2025-08-19T08:16:46.098445295Z" level=info msg="runtime interface created" Aug 19 08:16:46.098521 containerd[1567]: time="2025-08-19T08:16:46.098477726Z" level=info msg="created NRI interface" Aug 19 08:16:46.098521 containerd[1567]: time="2025-08-19T08:16:46.098490751Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Aug 19 08:16:46.098521 containerd[1567]: time="2025-08-19T08:16:46.098505859Z" level=info msg="Connect containerd service" Aug 19 08:16:46.098604 containerd[1567]: time="2025-08-19T08:16:46.098539382Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Aug 19 08:16:46.099672 containerd[1567]: time="2025-08-19T08:16:46.099629036Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 19 08:16:46.219193 containerd[1567]: time="2025-08-19T08:16:46.219136788Z" level=info msg="Start subscribing containerd event" Aug 19 08:16:46.219353 containerd[1567]: time="2025-08-19T08:16:46.219201830Z" level=info msg="Start recovering state" Aug 19 08:16:46.219422 containerd[1567]: time="2025-08-19T08:16:46.219362882Z" level=info msg="Start event monitor" Aug 19 08:16:46.219422 containerd[1567]: time="2025-08-19T08:16:46.219371809Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Aug 19 08:16:46.219501 containerd[1567]: time="2025-08-19T08:16:46.219445086Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Aug 19 08:16:46.219501 containerd[1567]: time="2025-08-19T08:16:46.219382900Z" level=info msg="Start cni network conf syncer for default" Aug 19 08:16:46.219555 containerd[1567]: time="2025-08-19T08:16:46.219506902Z" level=info msg="Start streaming server" Aug 19 08:16:46.219555 containerd[1567]: time="2025-08-19T08:16:46.219519456Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Aug 19 08:16:46.219555 containerd[1567]: time="2025-08-19T08:16:46.219527501Z" level=info msg="runtime interface starting up..." Aug 19 08:16:46.219555 containerd[1567]: time="2025-08-19T08:16:46.219538792Z" level=info msg="starting plugins..." Aug 19 08:16:46.219677 containerd[1567]: time="2025-08-19T08:16:46.219559020Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Aug 19 08:16:46.220083 systemd[1]: Started containerd.service - containerd container runtime. Aug 19 08:16:46.221074 containerd[1567]: time="2025-08-19T08:16:46.221009501Z" level=info msg="containerd successfully booted in 0.303825s" Aug 19 08:16:46.568018 systemd-networkd[1480]: eth0: Gained IPv6LL Aug 19 08:16:46.573592 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Aug 19 08:16:46.576218 systemd[1]: Reached target network-online.target - Network is Online. Aug 19 08:16:46.580257 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Aug 19 08:16:46.583205 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:16:46.605749 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Aug 19 08:16:46.639917 systemd[1]: coreos-metadata.service: Deactivated successfully. Aug 19 08:16:46.640377 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Aug 19 08:16:46.642444 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Aug 19 08:16:46.646509 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Aug 19 08:16:48.072087 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:16:48.075533 systemd[1]: Reached target multi-user.target - Multi-User System. Aug 19 08:16:48.076870 systemd[1]: Startup finished in 3.281s (kernel) + 6.904s (initrd) + 6.046s (userspace) = 16.232s. Aug 19 08:16:48.101853 (kubelet)[1664]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 08:16:48.870303 kubelet[1664]: E0819 08:16:48.870226 1664 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 08:16:48.874693 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 08:16:48.874970 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 08:16:48.875372 systemd[1]: kubelet.service: Consumed 2.028s CPU time, 264.5M memory peak. Aug 19 08:16:48.979898 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Aug 19 08:16:48.981541 systemd[1]: Started sshd@0-10.0.0.123:22-10.0.0.1:55268.service - OpenSSH per-connection server daemon (10.0.0.1:55268). 
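
kubelet.service is started right as boot finishes and exits less than a second later with status 1: /var/lib/kubelet/config.yaml does not exist yet. On a node like this the file is only written when the node is initialized or joined (typically by kubeadm, which this log does not show), so the failure and the scheduled restarts that follow are expected until that happens. A sketch of how to confirm the state, using standard tools rather than anything from this log:

    # the file kubelet is failing on
    ls -l /var/lib/kubelet/config.yaml

    # current unit state and how many times systemd has restarted it
    systemctl status kubelet --no-pager
    systemctl show kubelet -p NRestarts

    # kubeadm writes /var/lib/kubelet/config.yaml during init/join, e.g.
    # sudo kubeadm init --pod-network-cidr=10.244.0.0/16   # hypothetical CIDR
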
Aug 19 08:16:49.179137 sshd[1677]: Accepted publickey for core from 10.0.0.1 port 55268 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:16:49.181940 sshd-session[1677]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:49.190257 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Aug 19 08:16:49.191486 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Aug 19 08:16:49.198216 systemd-logind[1541]: New session 1 of user core. Aug 19 08:16:49.221180 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Aug 19 08:16:49.224668 systemd[1]: Starting user@500.service - User Manager for UID 500... Aug 19 08:16:49.254354 (systemd)[1682]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Aug 19 08:16:49.257363 systemd-logind[1541]: New session c1 of user core. Aug 19 08:16:49.454841 systemd[1682]: Queued start job for default target default.target. Aug 19 08:16:49.471777 systemd[1682]: Created slice app.slice - User Application Slice. Aug 19 08:16:49.471830 systemd[1682]: Reached target paths.target - Paths. Aug 19 08:16:49.471898 systemd[1682]: Reached target timers.target - Timers. Aug 19 08:16:49.474021 systemd[1682]: Starting dbus.socket - D-Bus User Message Bus Socket... Aug 19 08:16:49.491429 systemd[1682]: Listening on dbus.socket - D-Bus User Message Bus Socket. Aug 19 08:16:49.491799 systemd[1682]: Reached target sockets.target - Sockets. Aug 19 08:16:49.491871 systemd[1682]: Reached target basic.target - Basic System. Aug 19 08:16:49.491928 systemd[1682]: Reached target default.target - Main User Target. Aug 19 08:16:49.491976 systemd[1682]: Startup finished in 223ms. Aug 19 08:16:49.493727 systemd[1]: Started user@500.service - User Manager for UID 500. Aug 19 08:16:49.509803 systemd[1]: Started session-1.scope - Session 1 of User core. Aug 19 08:16:49.576119 systemd[1]: Started sshd@1-10.0.0.123:22-10.0.0.1:55284.service - OpenSSH per-connection server daemon (10.0.0.1:55284). Aug 19 08:16:49.644739 sshd[1693]: Accepted publickey for core from 10.0.0.1 port 55284 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:16:49.646333 sshd-session[1693]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:49.651680 systemd-logind[1541]: New session 2 of user core. Aug 19 08:16:49.666671 systemd[1]: Started session-2.scope - Session 2 of User core. Aug 19 08:16:49.724248 sshd[1696]: Connection closed by 10.0.0.1 port 55284 Aug 19 08:16:49.724538 sshd-session[1693]: pam_unix(sshd:session): session closed for user core Aug 19 08:16:49.742725 systemd[1]: sshd@1-10.0.0.123:22-10.0.0.1:55284.service: Deactivated successfully. Aug 19 08:16:49.745308 systemd[1]: session-2.scope: Deactivated successfully. Aug 19 08:16:49.746361 systemd-logind[1541]: Session 2 logged out. Waiting for processes to exit. Aug 19 08:16:49.750319 systemd[1]: Started sshd@2-10.0.0.123:22-10.0.0.1:55298.service - OpenSSH per-connection server daemon (10.0.0.1:55298). Aug 19 08:16:49.751715 systemd-logind[1541]: Removed session 2. Aug 19 08:16:49.816554 sshd[1702]: Accepted publickey for core from 10.0.0.1 port 55298 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:16:49.818445 sshd-session[1702]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:49.824125 systemd-logind[1541]: New session 3 of user core. 
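
Each accepted SSH connection becomes a session-N.scope inside user-500.slice, and the first login also starts user@500.service, the per-user systemd instance whose own startup the log times at 223ms. The same picture is available from logind and systemctl (generic commands, not taken from the log):

    # sessions known to logind, with their users and TTYs
    loginctl list-sessions

    # the per-user manager for uid 500 ("core") and its session scopes
    systemctl status user@500.service --no-pager
    loginctl user-status core
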
Aug 19 08:16:49.833780 systemd[1]: Started session-3.scope - Session 3 of User core. Aug 19 08:16:49.886396 sshd[1705]: Connection closed by 10.0.0.1 port 55298 Aug 19 08:16:49.886813 sshd-session[1702]: pam_unix(sshd:session): session closed for user core Aug 19 08:16:49.901891 systemd[1]: sshd@2-10.0.0.123:22-10.0.0.1:55298.service: Deactivated successfully. Aug 19 08:16:49.904438 systemd[1]: session-3.scope: Deactivated successfully. Aug 19 08:16:49.905385 systemd-logind[1541]: Session 3 logged out. Waiting for processes to exit. Aug 19 08:16:49.909476 systemd[1]: Started sshd@3-10.0.0.123:22-10.0.0.1:55310.service - OpenSSH per-connection server daemon (10.0.0.1:55310). Aug 19 08:16:49.910163 systemd-logind[1541]: Removed session 3. Aug 19 08:16:49.968486 sshd[1711]: Accepted publickey for core from 10.0.0.1 port 55310 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:16:49.970474 sshd-session[1711]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:49.975986 systemd-logind[1541]: New session 4 of user core. Aug 19 08:16:49.990693 systemd[1]: Started session-4.scope - Session 4 of User core. Aug 19 08:16:50.045303 sshd[1715]: Connection closed by 10.0.0.1 port 55310 Aug 19 08:16:50.045776 sshd-session[1711]: pam_unix(sshd:session): session closed for user core Aug 19 08:16:50.060261 systemd[1]: sshd@3-10.0.0.123:22-10.0.0.1:55310.service: Deactivated successfully. Aug 19 08:16:50.062147 systemd[1]: session-4.scope: Deactivated successfully. Aug 19 08:16:50.062943 systemd-logind[1541]: Session 4 logged out. Waiting for processes to exit. Aug 19 08:16:50.066370 systemd[1]: Started sshd@4-10.0.0.123:22-10.0.0.1:55322.service - OpenSSH per-connection server daemon (10.0.0.1:55322). Aug 19 08:16:50.067273 systemd-logind[1541]: Removed session 4. Aug 19 08:16:50.128737 sshd[1721]: Accepted publickey for core from 10.0.0.1 port 55322 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:16:50.130647 sshd-session[1721]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:50.136984 systemd-logind[1541]: New session 5 of user core. Aug 19 08:16:50.150837 systemd[1]: Started session-5.scope - Session 5 of User core. Aug 19 08:16:50.214684 sudo[1726]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Aug 19 08:16:50.215106 sudo[1726]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:16:50.236789 sudo[1726]: pam_unix(sudo:session): session closed for user root Aug 19 08:16:50.238740 sshd[1725]: Connection closed by 10.0.0.1 port 55322 Aug 19 08:16:50.239166 sshd-session[1721]: pam_unix(sshd:session): session closed for user core Aug 19 08:16:50.252855 systemd[1]: sshd@4-10.0.0.123:22-10.0.0.1:55322.service: Deactivated successfully. Aug 19 08:16:50.255016 systemd[1]: session-5.scope: Deactivated successfully. Aug 19 08:16:50.255996 systemd-logind[1541]: Session 5 logged out. Waiting for processes to exit. Aug 19 08:16:50.259494 systemd[1]: Started sshd@5-10.0.0.123:22-10.0.0.1:55338.service - OpenSSH per-connection server daemon (10.0.0.1:55338). Aug 19 08:16:50.260050 systemd-logind[1541]: Removed session 5. 
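
The session-5 sudo entry runs /usr/sbin/setenforce 1, switching SELinux to enforcing at runtime (dbus-daemon reported SELinux support enabled earlier in this log). setenforce only changes the running kernel state; persistence across reboots depends on the loaded policy and configuration. Checking and repeating the change by hand looks like:

    # current SELinux mode and policy summary
    getenforce
    sestatus

    # what the logged sudo call did: enforce for the running system only
    sudo setenforce 1
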
Aug 19 08:16:50.315797 sshd[1732]: Accepted publickey for core from 10.0.0.1 port 55338 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:16:50.317281 sshd-session[1732]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:50.322307 systemd-logind[1541]: New session 6 of user core. Aug 19 08:16:50.331643 systemd[1]: Started session-6.scope - Session 6 of User core. Aug 19 08:16:50.387289 sudo[1737]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Aug 19 08:16:50.387659 sudo[1737]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:16:50.526359 sudo[1737]: pam_unix(sudo:session): session closed for user root Aug 19 08:16:50.533769 sudo[1736]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Aug 19 08:16:50.534105 sudo[1736]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:16:50.545825 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 19 08:16:50.601955 augenrules[1759]: No rules Aug 19 08:16:50.604167 systemd[1]: audit-rules.service: Deactivated successfully. Aug 19 08:16:50.604525 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 19 08:16:50.605787 sudo[1736]: pam_unix(sudo:session): session closed for user root Aug 19 08:16:50.607481 sshd[1735]: Connection closed by 10.0.0.1 port 55338 Aug 19 08:16:50.607970 sshd-session[1732]: pam_unix(sshd:session): session closed for user core Aug 19 08:16:50.617705 systemd[1]: sshd@5-10.0.0.123:22-10.0.0.1:55338.service: Deactivated successfully. Aug 19 08:16:50.619734 systemd[1]: session-6.scope: Deactivated successfully. Aug 19 08:16:50.620611 systemd-logind[1541]: Session 6 logged out. Waiting for processes to exit. Aug 19 08:16:50.623304 systemd[1]: Started sshd@6-10.0.0.123:22-10.0.0.1:55348.service - OpenSSH per-connection server daemon (10.0.0.1:55348). Aug 19 08:16:50.624234 systemd-logind[1541]: Removed session 6. Aug 19 08:16:50.688926 sshd[1768]: Accepted publickey for core from 10.0.0.1 port 55348 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:16:50.690419 sshd-session[1768]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:16:50.695419 systemd-logind[1541]: New session 7 of user core. Aug 19 08:16:50.704579 systemd[1]: Started session-7.scope - Session 7 of User core. Aug 19 08:16:50.758207 sudo[1772]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Aug 19 08:16:50.758554 sudo[1772]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:16:51.408818 systemd[1]: Starting docker.service - Docker Application Container Engine... 
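
In session-6 the shipped audit rule files (80-selinux.rules, 99-default.rules) are removed and audit-rules.service is restarted; augenrules then reports "No rules", meaning the compiled rule set loaded into the kernel is empty. The same load/verify cycle by hand uses the standard auditd tooling (not shown in the log itself):

    # merge /etc/audit/rules.d/*.rules and load the result into the kernel
    sudo augenrules --load

    # confirm the kernel-side rule list is empty
    sudo auditctl -l
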
Aug 19 08:16:51.423859 (dockerd)[1792]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Aug 19 08:16:51.816216 dockerd[1792]: time="2025-08-19T08:16:51.816055237Z" level=info msg="Starting up" Aug 19 08:16:51.817071 dockerd[1792]: time="2025-08-19T08:16:51.817035877Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Aug 19 08:16:51.840109 dockerd[1792]: time="2025-08-19T08:16:51.840067277Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Aug 19 08:16:52.184671 dockerd[1792]: time="2025-08-19T08:16:52.184505575Z" level=info msg="Loading containers: start." Aug 19 08:16:52.195489 kernel: Initializing XFRM netlink socket Aug 19 08:16:52.521669 systemd-networkd[1480]: docker0: Link UP Aug 19 08:16:52.528022 dockerd[1792]: time="2025-08-19T08:16:52.527953094Z" level=info msg="Loading containers: done." Aug 19 08:16:52.543567 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck4250533254-merged.mount: Deactivated successfully. Aug 19 08:16:52.546697 dockerd[1792]: time="2025-08-19T08:16:52.546635096Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Aug 19 08:16:52.546779 dockerd[1792]: time="2025-08-19T08:16:52.546752126Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Aug 19 08:16:52.546922 dockerd[1792]: time="2025-08-19T08:16:52.546890064Z" level=info msg="Initializing buildkit" Aug 19 08:16:52.581914 dockerd[1792]: time="2025-08-19T08:16:52.581849188Z" level=info msg="Completed buildkit initialization" Aug 19 08:16:52.586591 dockerd[1792]: time="2025-08-19T08:16:52.586522624Z" level=info msg="Daemon has completed initialization" Aug 19 08:16:52.586748 dockerd[1792]: time="2025-08-19T08:16:52.586601121Z" level=info msg="API listen on /run/docker.sock" Aug 19 08:16:52.586769 systemd[1]: Started docker.service - Docker Application Container Engine. Aug 19 08:16:53.408536 containerd[1567]: time="2025-08-19T08:16:53.408477887Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\"" Aug 19 08:16:54.200697 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3844888874.mount: Deactivated successfully. 
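
dockerd starts with the overlay2 storage driver and warns that native diff is disabled because the kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled; the docker0 bridge also comes up (systemd-networkd logs the link). Both facts are easy to confirm once the daemon is running (generic commands, shown as a sketch):

    # storage driver in use (expected: overlay2)
    docker info --format '{{.Driver}}'

    # the bridge interface the daemon created
    ip addr show docker0
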
Aug 19 08:16:55.780747 containerd[1567]: time="2025-08-19T08:16:55.780681262Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:55.781331 containerd[1567]: time="2025-08-19T08:16:55.781281388Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.8: active requests=0, bytes read=28800687" Aug 19 08:16:55.782553 containerd[1567]: time="2025-08-19T08:16:55.782512998Z" level=info msg="ImageCreate event name:\"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:55.785991 containerd[1567]: time="2025-08-19T08:16:55.785952810Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:55.786883 containerd[1567]: time="2025-08-19T08:16:55.786840936Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.8\" with image id \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\", size \"28797487\" in 2.378311252s" Aug 19 08:16:55.786937 containerd[1567]: time="2025-08-19T08:16:55.786884718Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\" returns image reference \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\"" Aug 19 08:16:55.788125 containerd[1567]: time="2025-08-19T08:16:55.788090660Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\"" Aug 19 08:16:57.169845 containerd[1567]: time="2025-08-19T08:16:57.169741753Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:57.170583 containerd[1567]: time="2025-08-19T08:16:57.170505926Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.8: active requests=0, bytes read=24784128" Aug 19 08:16:57.171775 containerd[1567]: time="2025-08-19T08:16:57.171737897Z" level=info msg="ImageCreate event name:\"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:57.177775 containerd[1567]: time="2025-08-19T08:16:57.177714207Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:16:57.179173 containerd[1567]: time="2025-08-19T08:16:57.179107771Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.8\" with image id \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\", size \"26387322\" in 1.390987895s" Aug 19 08:16:57.179173 containerd[1567]: time="2025-08-19T08:16:57.179162083Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\" returns image reference \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\"" Aug 19 08:16:57.179852 
containerd[1567]: time="2025-08-19T08:16:57.179815298Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\"" Aug 19 08:16:59.125217 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Aug 19 08:16:59.127173 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:16:59.522991 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:16:59.541791 (kubelet)[2080]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 08:17:00.288683 kubelet[2080]: E0819 08:17:00.288571 2080 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 08:17:00.296772 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 08:17:00.297032 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 08:17:00.297509 systemd[1]: kubelet.service: Consumed 504ms CPU time, 111.5M memory peak. Aug 19 08:17:00.476742 containerd[1567]: time="2025-08-19T08:17:00.476653775Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:00.477802 containerd[1567]: time="2025-08-19T08:17:00.477747526Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.8: active requests=0, bytes read=19175036" Aug 19 08:17:00.479741 containerd[1567]: time="2025-08-19T08:17:00.479708695Z" level=info msg="ImageCreate event name:\"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:00.483919 containerd[1567]: time="2025-08-19T08:17:00.483845865Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:00.484921 containerd[1567]: time="2025-08-19T08:17:00.484877480Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.8\" with image id \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\", size \"20778248\" in 3.30502368s" Aug 19 08:17:00.484921 containerd[1567]: time="2025-08-19T08:17:00.484916022Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\" returns image reference \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\"" Aug 19 08:17:00.485520 containerd[1567]: time="2025-08-19T08:17:00.485489929Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\"" Aug 19 08:17:01.654481 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3173377769.mount: Deactivated successfully. 
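
The PullImage/Pulled lines come from containerd's CRI plugin; the images end up in the k8s.io namespace behind the socket the log shows containerd serving, /run/containerd/containerd.sock. Inspecting them uses the standard CRI and containerd CLIs (a sketch; neither command appears in the log):

    # images known to the CRI plugin
    sudo crictl --runtime-endpoint unix:///run/containerd/containerd.sock images

    # the same content through containerd's own namespace
    sudo ctr -n k8s.io images ls | grep kube-scheduler
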
Aug 19 08:17:03.064057 containerd[1567]: time="2025-08-19T08:17:03.063950894Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:03.065151 containerd[1567]: time="2025-08-19T08:17:03.065092075Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.8: active requests=0, bytes read=30897170" Aug 19 08:17:03.071346 containerd[1567]: time="2025-08-19T08:17:03.071223486Z" level=info msg="ImageCreate event name:\"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:03.112914 containerd[1567]: time="2025-08-19T08:17:03.112785584Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:03.113840 containerd[1567]: time="2025-08-19T08:17:03.113745204Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.8\" with image id \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\", repo tag \"registry.k8s.io/kube-proxy:v1.32.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\", size \"30896189\" in 2.628217785s" Aug 19 08:17:03.113915 containerd[1567]: time="2025-08-19T08:17:03.113850221Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\" returns image reference \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\"" Aug 19 08:17:03.114943 containerd[1567]: time="2025-08-19T08:17:03.114917283Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Aug 19 08:17:03.829569 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1974824153.mount: Deactivated successfully. 
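
The pull timings and sizes in these messages give a rough effective download rate: kube-proxy, for instance, is 30896189 bytes fetched in about 2.63s, roughly 11.7 MB/s. A quick check of that arithmetic:

    # effective pull rate for kube-proxy from the figures in the log (MB/s)
    echo 'scale=1; 30896189 / 2.628 / 1000000' | bc   # prints about 11.7
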
Aug 19 08:17:04.921319 containerd[1567]: time="2025-08-19T08:17:04.921207096Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:04.921991 containerd[1567]: time="2025-08-19T08:17:04.921947886Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Aug 19 08:17:04.923475 containerd[1567]: time="2025-08-19T08:17:04.923410219Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:04.926953 containerd[1567]: time="2025-08-19T08:17:04.926881460Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:04.928104 containerd[1567]: time="2025-08-19T08:17:04.928061794Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.813113273s" Aug 19 08:17:04.928104 containerd[1567]: time="2025-08-19T08:17:04.928097621Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Aug 19 08:17:04.928708 containerd[1567]: time="2025-08-19T08:17:04.928631142Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Aug 19 08:17:06.914698 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1677303350.mount: Deactivated successfully. 
Aug 19 08:17:07.003686 containerd[1567]: time="2025-08-19T08:17:07.003597045Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 08:17:07.004488 containerd[1567]: time="2025-08-19T08:17:07.004394530Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Aug 19 08:17:07.006696 containerd[1567]: time="2025-08-19T08:17:07.006622850Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 08:17:07.009075 containerd[1567]: time="2025-08-19T08:17:07.009026549Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 08:17:07.009969 containerd[1567]: time="2025-08-19T08:17:07.009916488Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 2.081256723s" Aug 19 08:17:07.009969 containerd[1567]: time="2025-08-19T08:17:07.009957014Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Aug 19 08:17:07.010735 containerd[1567]: time="2025-08-19T08:17:07.010698244Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Aug 19 08:17:09.365247 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount17232145.mount: Deactivated successfully. Aug 19 08:17:10.377161 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Aug 19 08:17:10.379348 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:17:10.865971 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:17:10.878991 (kubelet)[2212]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 08:17:10.956766 kubelet[2212]: E0819 08:17:10.956654 2212 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 08:17:10.962051 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 08:17:10.962333 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 08:17:10.963603 systemd[1]: kubelet.service: Consumed 278ms CPU time, 111M memory peak. 
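
The second scheduled kubelet restart fails exactly like the first (config.yaml is still missing) and systemd again records the CPU time and memory the attempt consumed. Pulling just these failures out of the journal is a one-liner (generic journalctl usage):

    # last kubelet messages from the current boot
    journalctl -b -u kubelet --no-pager -n 20

    # only error-priority lines
    journalctl -b -u kubelet -p err --no-pager
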
Aug 19 08:17:12.129646 containerd[1567]: time="2025-08-19T08:17:12.129566378Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:12.130796 containerd[1567]: time="2025-08-19T08:17:12.130766399Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682056" Aug 19 08:17:12.132818 containerd[1567]: time="2025-08-19T08:17:12.132792820Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:12.137124 containerd[1567]: time="2025-08-19T08:17:12.137053732Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:12.138648 containerd[1567]: time="2025-08-19T08:17:12.138597949Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 5.127863195s" Aug 19 08:17:12.138648 containerd[1567]: time="2025-08-19T08:17:12.138636280Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Aug 19 08:17:14.115587 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:17:14.115763 systemd[1]: kubelet.service: Consumed 278ms CPU time, 111M memory peak. Aug 19 08:17:14.118203 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:17:14.147134 systemd[1]: Reload requested from client PID 2252 ('systemctl') (unit session-7.scope)... Aug 19 08:17:14.147153 systemd[1]: Reloading... Aug 19 08:17:14.252490 zram_generator::config[2297]: No configuration found. Aug 19 08:17:14.854542 systemd[1]: Reloading finished in 706 ms. Aug 19 08:17:14.933390 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Aug 19 08:17:14.933536 systemd[1]: kubelet.service: Failed with result 'signal'. Aug 19 08:17:14.933974 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:17:14.934032 systemd[1]: kubelet.service: Consumed 170ms CPU time, 98.5M memory peak. Aug 19 08:17:14.936935 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:17:15.158128 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:17:15.163576 (kubelet)[2342]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 19 08:17:15.207946 kubelet[2342]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 08:17:15.207946 kubelet[2342]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Aug 19 08:17:15.207946 kubelet[2342]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 08:17:15.208421 kubelet[2342]: I0819 08:17:15.208026 2342 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 19 08:17:15.479122 kubelet[2342]: I0819 08:17:15.478932 2342 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Aug 19 08:17:15.479122 kubelet[2342]: I0819 08:17:15.478975 2342 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 19 08:17:15.479318 kubelet[2342]: I0819 08:17:15.479281 2342 server.go:954] "Client rotation is on, will bootstrap in background" Aug 19 08:17:15.510549 kubelet[2342]: E0819 08:17:15.510483 2342 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.123:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.123:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:17:15.513576 kubelet[2342]: I0819 08:17:15.513508 2342 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 19 08:17:15.528117 kubelet[2342]: I0819 08:17:15.528065 2342 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 19 08:17:15.533940 kubelet[2342]: I0819 08:17:15.533895 2342 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Aug 19 08:17:15.535289 kubelet[2342]: I0819 08:17:15.535223 2342 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 19 08:17:15.535499 kubelet[2342]: I0819 08:17:15.535275 2342 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 19 08:17:15.535674 kubelet[2342]: I0819 08:17:15.535520 
2342 topology_manager.go:138] "Creating topology manager with none policy" Aug 19 08:17:15.535674 kubelet[2342]: I0819 08:17:15.535530 2342 container_manager_linux.go:304] "Creating device plugin manager" Aug 19 08:17:15.535796 kubelet[2342]: I0819 08:17:15.535774 2342 state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:17:15.637884 kubelet[2342]: I0819 08:17:15.637828 2342 kubelet.go:446] "Attempting to sync node with API server" Aug 19 08:17:15.637884 kubelet[2342]: I0819 08:17:15.637899 2342 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 19 08:17:15.638050 kubelet[2342]: I0819 08:17:15.637978 2342 kubelet.go:352] "Adding apiserver pod source" Aug 19 08:17:15.638050 kubelet[2342]: I0819 08:17:15.638001 2342 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 19 08:17:15.641596 kubelet[2342]: I0819 08:17:15.641553 2342 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Aug 19 08:17:15.641999 kubelet[2342]: I0819 08:17:15.641979 2342 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Aug 19 08:17:15.643064 kubelet[2342]: W0819 08:17:15.643013 2342 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Aug 19 08:17:15.645077 kubelet[2342]: W0819 08:17:15.644783 2342 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.123:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.123:6443: connect: connection refused Aug 19 08:17:15.645077 kubelet[2342]: I0819 08:17:15.644879 2342 watchdog_linux.go:99] "Systemd watchdog is not enabled" Aug 19 08:17:15.645077 kubelet[2342]: I0819 08:17:15.644927 2342 server.go:1287] "Started kubelet" Aug 19 08:17:15.645077 kubelet[2342]: E0819 08:17:15.644915 2342 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.123:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.123:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:17:15.645754 kubelet[2342]: W0819 08:17:15.645686 2342 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.123:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.123:6443: connect: connection refused Aug 19 08:17:15.645754 kubelet[2342]: E0819 08:17:15.645745 2342 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.123:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.123:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:17:15.649482 kubelet[2342]: I0819 08:17:15.648761 2342 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Aug 19 08:17:15.649608 kubelet[2342]: I0819 08:17:15.649526 2342 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 19 08:17:15.653244 kubelet[2342]: I0819 08:17:15.653202 2342 server.go:479] "Adding debug handlers to kubelet server" Aug 19 08:17:15.653666 kubelet[2342]: I0819 08:17:15.653608 2342 fs_resource_analyzer.go:67] "Starting FS 
ResourceAnalyzer" Aug 19 08:17:15.653909 kubelet[2342]: I0819 08:17:15.653870 2342 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 19 08:17:15.654561 kubelet[2342]: I0819 08:17:15.654507 2342 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 19 08:17:15.657276 kubelet[2342]: E0819 08:17:15.655200 2342 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.123:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.123:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.185d1d1bf45524e6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-08-19 08:17:15.644904678 +0000 UTC m=+0.476612109,LastTimestamp:2025-08-19 08:17:15.644904678 +0000 UTC m=+0.476612109,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Aug 19 08:17:15.658542 kubelet[2342]: I0819 08:17:15.658506 2342 volume_manager.go:297] "Starting Kubelet Volume Manager" Aug 19 08:17:15.658718 kubelet[2342]: I0819 08:17:15.658696 2342 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Aug 19 08:17:15.658798 kubelet[2342]: I0819 08:17:15.658779 2342 reconciler.go:26] "Reconciler: start to sync state" Aug 19 08:17:15.659843 kubelet[2342]: E0819 08:17:15.659802 2342 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 19 08:17:15.659971 kubelet[2342]: W0819 08:17:15.659945 2342 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.123:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.123:6443: connect: connection refused Aug 19 08:17:15.660009 kubelet[2342]: E0819 08:17:15.659994 2342 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.123:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.123:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:17:15.660733 kubelet[2342]: E0819 08:17:15.660697 2342 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 08:17:15.661220 kubelet[2342]: I0819 08:17:15.661157 2342 factory.go:221] Registration of the containerd container factory successfully Aug 19 08:17:15.661220 kubelet[2342]: I0819 08:17:15.661175 2342 factory.go:221] Registration of the systemd container factory successfully Aug 19 08:17:15.661312 kubelet[2342]: I0819 08:17:15.661284 2342 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 19 08:17:15.661673 kubelet[2342]: E0819 08:17:15.661590 2342 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.123:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.123:6443: connect: connection refused" interval="200ms" Aug 19 08:17:15.672686 kubelet[2342]: I0819 08:17:15.672650 2342 cpu_manager.go:221] "Starting CPU manager" policy="none" Aug 19 08:17:15.672686 kubelet[2342]: I0819 08:17:15.672675 2342 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Aug 19 08:17:15.672833 kubelet[2342]: I0819 08:17:15.672716 2342 state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:17:15.761033 kubelet[2342]: E0819 08:17:15.760844 2342 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 08:17:15.861551 kubelet[2342]: E0819 08:17:15.861492 2342 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 08:17:15.863390 kubelet[2342]: E0819 08:17:15.863284 2342 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.123:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.123:6443: connect: connection refused" interval="400ms" Aug 19 08:17:15.961752 kubelet[2342]: E0819 08:17:15.961636 2342 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 08:17:16.062840 kubelet[2342]: E0819 08:17:16.062607 2342 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 08:17:16.082959 kubelet[2342]: I0819 08:17:16.082391 2342 policy_none.go:49] "None policy: Start" Aug 19 08:17:16.082959 kubelet[2342]: I0819 08:17:16.082472 2342 memory_manager.go:186] "Starting memorymanager" policy="None" Aug 19 08:17:16.082959 kubelet[2342]: I0819 08:17:16.082500 2342 state_mem.go:35] "Initializing new in-memory state store" Aug 19 
08:17:16.086916 kubelet[2342]: I0819 08:17:16.086845 2342 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Aug 19 08:17:16.089310 kubelet[2342]: I0819 08:17:16.089244 2342 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Aug 19 08:17:16.089310 kubelet[2342]: I0819 08:17:16.089279 2342 status_manager.go:227] "Starting to sync pod status with apiserver" Aug 19 08:17:16.089566 kubelet[2342]: I0819 08:17:16.089322 2342 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Aug 19 08:17:16.089566 kubelet[2342]: I0819 08:17:16.089334 2342 kubelet.go:2382] "Starting kubelet main sync loop" Aug 19 08:17:16.089566 kubelet[2342]: E0819 08:17:16.089393 2342 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 19 08:17:16.090236 kubelet[2342]: W0819 08:17:16.090170 2342 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.123:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.123:6443: connect: connection refused Aug 19 08:17:16.090319 kubelet[2342]: E0819 08:17:16.090240 2342 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.123:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.123:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:17:16.094005 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Aug 19 08:17:16.107967 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Aug 19 08:17:16.113093 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Aug 19 08:17:16.127560 kubelet[2342]: I0819 08:17:16.127499 2342 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Aug 19 08:17:16.127914 kubelet[2342]: I0819 08:17:16.127832 2342 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 19 08:17:16.127914 kubelet[2342]: I0819 08:17:16.127866 2342 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 19 08:17:16.129618 kubelet[2342]: E0819 08:17:16.129546 2342 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Aug 19 08:17:16.129756 kubelet[2342]: E0819 08:17:16.129636 2342 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Aug 19 08:17:16.130222 kubelet[2342]: I0819 08:17:16.130085 2342 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 19 08:17:16.201175 systemd[1]: Created slice kubepods-burstable-podd5a2bcdaf7f37e3707ed2dc1143c0d3f.slice - libcontainer container kubepods-burstable-podd5a2bcdaf7f37e3707ed2dc1143c0d3f.slice. 
Aug 19 08:17:16.224315 kubelet[2342]: E0819 08:17:16.224139 2342 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:17:16.228492 systemd[1]: Created slice kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice - libcontainer container kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice. Aug 19 08:17:16.230418 kubelet[2342]: I0819 08:17:16.230371 2342 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Aug 19 08:17:16.230954 kubelet[2342]: E0819 08:17:16.230888 2342 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.123:6443/api/v1/nodes\": dial tcp 10.0.0.123:6443: connect: connection refused" node="localhost" Aug 19 08:17:16.245068 kubelet[2342]: E0819 08:17:16.245002 2342 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:17:16.249494 systemd[1]: Created slice kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice - libcontainer container kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice. Aug 19 08:17:16.251798 kubelet[2342]: E0819 08:17:16.251772 2342 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:17:16.262167 kubelet[2342]: I0819 08:17:16.262135 2342 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d5a2bcdaf7f37e3707ed2dc1143c0d3f-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"d5a2bcdaf7f37e3707ed2dc1143c0d3f\") " pod="kube-system/kube-apiserver-localhost" Aug 19 08:17:16.262167 kubelet[2342]: I0819 08:17:16.262166 2342 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d5a2bcdaf7f37e3707ed2dc1143c0d3f-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"d5a2bcdaf7f37e3707ed2dc1143c0d3f\") " pod="kube-system/kube-apiserver-localhost" Aug 19 08:17:16.262281 kubelet[2342]: I0819 08:17:16.262183 2342 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:17:16.262281 kubelet[2342]: I0819 08:17:16.262198 2342 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost" Aug 19 08:17:16.262281 kubelet[2342]: I0819 08:17:16.262215 2342 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d5a2bcdaf7f37e3707ed2dc1143c0d3f-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"d5a2bcdaf7f37e3707ed2dc1143c0d3f\") " pod="kube-system/kube-apiserver-localhost" Aug 19 08:17:16.262281 kubelet[2342]: I0819 08:17:16.262232 2342 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:17:16.262281 kubelet[2342]: I0819 08:17:16.262274 2342 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:17:16.262410 kubelet[2342]: I0819 08:17:16.262288 2342 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:17:16.262410 kubelet[2342]: I0819 08:17:16.262307 2342 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:17:16.264799 kubelet[2342]: E0819 08:17:16.264751 2342 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.123:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.123:6443: connect: connection refused" interval="800ms" Aug 19 08:17:16.433017 kubelet[2342]: I0819 08:17:16.432965 2342 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Aug 19 08:17:16.433448 kubelet[2342]: E0819 08:17:16.433405 2342 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.123:6443/api/v1/nodes\": dial tcp 10.0.0.123:6443: connect: connection refused" node="localhost" Aug 19 08:17:16.456066 kubelet[2342]: W0819 08:17:16.455962 2342 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.123:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.123:6443: connect: connection refused Aug 19 08:17:16.456194 kubelet[2342]: E0819 08:17:16.456077 2342 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.123:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.123:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:17:16.521906 kubelet[2342]: W0819 08:17:16.521799 2342 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.123:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.123:6443: connect: connection refused Aug 19 08:17:16.521906 kubelet[2342]: E0819 08:17:16.521896 2342 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.123:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial 
tcp 10.0.0.123:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:17:16.526768 containerd[1567]: time="2025-08-19T08:17:16.526696187Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:d5a2bcdaf7f37e3707ed2dc1143c0d3f,Namespace:kube-system,Attempt:0,}" Aug 19 08:17:16.546777 containerd[1567]: time="2025-08-19T08:17:16.546714064Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,}" Aug 19 08:17:16.553809 containerd[1567]: time="2025-08-19T08:17:16.553756514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,}" Aug 19 08:17:16.836208 kubelet[2342]: I0819 08:17:16.835932 2342 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Aug 19 08:17:16.836922 kubelet[2342]: E0819 08:17:16.836827 2342 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.123:6443/api/v1/nodes\": dial tcp 10.0.0.123:6443: connect: connection refused" node="localhost" Aug 19 08:17:16.837727 containerd[1567]: time="2025-08-19T08:17:16.837668572Z" level=info msg="connecting to shim d3f34804bf36ae8906afcf05fbd5345b0eb6e9dbdef87f65d10e932fef3e1709" address="unix:///run/containerd/s/57700f66b36f82bb2bb20fd4c5a8025f3932698a39284f36c1c75fa65a9ec179" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:16.845176 containerd[1567]: time="2025-08-19T08:17:16.845090543Z" level=info msg="connecting to shim e2c6965f95f9fab09c4ad8f594dfb3f03afc58f72c546dee2727fde19f1a121c" address="unix:///run/containerd/s/885f9cda5ae11975797c64c4974e279a63ce371cb22fb515a9c529bc6d3ed57b" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:16.857337 containerd[1567]: time="2025-08-19T08:17:16.857260270Z" level=info msg="connecting to shim 53a49b0328edcd71da6cca24cd0c06d5abd9452f35092c6f480aed033addf102" address="unix:///run/containerd/s/67727de3bd66df34d5257f3e05c47698f820f3a51596ff955769b052f3907270" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:16.882717 systemd[1]: Started cri-containerd-d3f34804bf36ae8906afcf05fbd5345b0eb6e9dbdef87f65d10e932fef3e1709.scope - libcontainer container d3f34804bf36ae8906afcf05fbd5345b0eb6e9dbdef87f65d10e932fef3e1709. Aug 19 08:17:16.889244 systemd[1]: Started cri-containerd-e2c6965f95f9fab09c4ad8f594dfb3f03afc58f72c546dee2727fde19f1a121c.scope - libcontainer container e2c6965f95f9fab09c4ad8f594dfb3f03afc58f72c546dee2727fde19f1a121c. Aug 19 08:17:16.897275 systemd[1]: Started cri-containerd-53a49b0328edcd71da6cca24cd0c06d5abd9452f35092c6f480aed033addf102.scope - libcontainer container 53a49b0328edcd71da6cca24cd0c06d5abd9452f35092c6f480aed033addf102. 
Aug 19 08:17:16.905154 kubelet[2342]: W0819 08:17:16.905110 2342 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.123:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.123:6443: connect: connection refused Aug 19 08:17:16.905250 kubelet[2342]: E0819 08:17:16.905169 2342 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.123:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.123:6443: connect: connection refused" logger="UnhandledError" Aug 19 08:17:16.960272 containerd[1567]: time="2025-08-19T08:17:16.960190468Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:d5a2bcdaf7f37e3707ed2dc1143c0d3f,Namespace:kube-system,Attempt:0,} returns sandbox id \"d3f34804bf36ae8906afcf05fbd5345b0eb6e9dbdef87f65d10e932fef3e1709\"" Aug 19 08:17:16.962502 containerd[1567]: time="2025-08-19T08:17:16.962426612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,} returns sandbox id \"e2c6965f95f9fab09c4ad8f594dfb3f03afc58f72c546dee2727fde19f1a121c\"" Aug 19 08:17:16.964344 containerd[1567]: time="2025-08-19T08:17:16.964266664Z" level=info msg="CreateContainer within sandbox \"d3f34804bf36ae8906afcf05fbd5345b0eb6e9dbdef87f65d10e932fef3e1709\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Aug 19 08:17:16.965187 containerd[1567]: time="2025-08-19T08:17:16.965158216Z" level=info msg="CreateContainer within sandbox \"e2c6965f95f9fab09c4ad8f594dfb3f03afc58f72c546dee2727fde19f1a121c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Aug 19 08:17:16.976056 containerd[1567]: time="2025-08-19T08:17:16.976010211Z" level=info msg="Container 98b62080cafbff7c1aa10fd0c193a3064fa6880f38674220f12a54f5205fd41e: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:16.976171 containerd[1567]: time="2025-08-19T08:17:16.976107674Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,} returns sandbox id \"53a49b0328edcd71da6cca24cd0c06d5abd9452f35092c6f480aed033addf102\"" Aug 19 08:17:16.979731 containerd[1567]: time="2025-08-19T08:17:16.979693209Z" level=info msg="CreateContainer within sandbox \"53a49b0328edcd71da6cca24cd0c06d5abd9452f35092c6f480aed033addf102\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Aug 19 08:17:16.987604 containerd[1567]: time="2025-08-19T08:17:16.987552411Z" level=info msg="CreateContainer within sandbox \"d3f34804bf36ae8906afcf05fbd5345b0eb6e9dbdef87f65d10e932fef3e1709\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"98b62080cafbff7c1aa10fd0c193a3064fa6880f38674220f12a54f5205fd41e\"" Aug 19 08:17:16.988224 containerd[1567]: time="2025-08-19T08:17:16.988195858Z" level=info msg="StartContainer for \"98b62080cafbff7c1aa10fd0c193a3064fa6880f38674220f12a54f5205fd41e\"" Aug 19 08:17:16.989579 containerd[1567]: time="2025-08-19T08:17:16.989551401Z" level=info msg="connecting to shim 98b62080cafbff7c1aa10fd0c193a3064fa6880f38674220f12a54f5205fd41e" address="unix:///run/containerd/s/57700f66b36f82bb2bb20fd4c5a8025f3932698a39284f36c1c75fa65a9ec179" protocol=ttrpc version=3 Aug 19 08:17:16.992342 containerd[1567]: 
time="2025-08-19T08:17:16.992299445Z" level=info msg="Container 302ed6ba4640de3e7cfa9ea1fe509c0efc2e6f31fc7d32d8ff9cd8b093556ce4: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:17.000578 containerd[1567]: time="2025-08-19T08:17:17.000535467Z" level=info msg="Container 73af5888a36730cb06c41fb47e03aa96fd4befd1cc297f8a566dc86b48c0399c: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:17.006190 containerd[1567]: time="2025-08-19T08:17:17.006127103Z" level=info msg="CreateContainer within sandbox \"e2c6965f95f9fab09c4ad8f594dfb3f03afc58f72c546dee2727fde19f1a121c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"302ed6ba4640de3e7cfa9ea1fe509c0efc2e6f31fc7d32d8ff9cd8b093556ce4\"" Aug 19 08:17:17.007079 containerd[1567]: time="2025-08-19T08:17:17.007053345Z" level=info msg="StartContainer for \"302ed6ba4640de3e7cfa9ea1fe509c0efc2e6f31fc7d32d8ff9cd8b093556ce4\"" Aug 19 08:17:17.009205 containerd[1567]: time="2025-08-19T08:17:17.009180788Z" level=info msg="connecting to shim 302ed6ba4640de3e7cfa9ea1fe509c0efc2e6f31fc7d32d8ff9cd8b093556ce4" address="unix:///run/containerd/s/885f9cda5ae11975797c64c4974e279a63ce371cb22fb515a9c529bc6d3ed57b" protocol=ttrpc version=3 Aug 19 08:17:17.012296 containerd[1567]: time="2025-08-19T08:17:17.012253178Z" level=info msg="CreateContainer within sandbox \"53a49b0328edcd71da6cca24cd0c06d5abd9452f35092c6f480aed033addf102\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"73af5888a36730cb06c41fb47e03aa96fd4befd1cc297f8a566dc86b48c0399c\"" Aug 19 08:17:17.012808 containerd[1567]: time="2025-08-19T08:17:17.012781575Z" level=info msg="StartContainer for \"73af5888a36730cb06c41fb47e03aa96fd4befd1cc297f8a566dc86b48c0399c\"" Aug 19 08:17:17.013786 containerd[1567]: time="2025-08-19T08:17:17.013765006Z" level=info msg="connecting to shim 73af5888a36730cb06c41fb47e03aa96fd4befd1cc297f8a566dc86b48c0399c" address="unix:///run/containerd/s/67727de3bd66df34d5257f3e05c47698f820f3a51596ff955769b052f3907270" protocol=ttrpc version=3 Aug 19 08:17:17.014430 systemd[1]: Started cri-containerd-98b62080cafbff7c1aa10fd0c193a3064fa6880f38674220f12a54f5205fd41e.scope - libcontainer container 98b62080cafbff7c1aa10fd0c193a3064fa6880f38674220f12a54f5205fd41e. Aug 19 08:17:17.036673 systemd[1]: Started cri-containerd-73af5888a36730cb06c41fb47e03aa96fd4befd1cc297f8a566dc86b48c0399c.scope - libcontainer container 73af5888a36730cb06c41fb47e03aa96fd4befd1cc297f8a566dc86b48c0399c. Aug 19 08:17:17.043473 systemd[1]: Started cri-containerd-302ed6ba4640de3e7cfa9ea1fe509c0efc2e6f31fc7d32d8ff9cd8b093556ce4.scope - libcontainer container 302ed6ba4640de3e7cfa9ea1fe509c0efc2e6f31fc7d32d8ff9cd8b093556ce4. 
Aug 19 08:17:17.066314 kubelet[2342]: E0819 08:17:17.066257 2342 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.123:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.123:6443: connect: connection refused" interval="1.6s" Aug 19 08:17:17.101746 containerd[1567]: time="2025-08-19T08:17:17.101559463Z" level=info msg="StartContainer for \"98b62080cafbff7c1aa10fd0c193a3064fa6880f38674220f12a54f5205fd41e\" returns successfully" Aug 19 08:17:17.108079 kubelet[2342]: E0819 08:17:17.107970 2342 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:17:17.115884 containerd[1567]: time="2025-08-19T08:17:17.115350512Z" level=info msg="StartContainer for \"73af5888a36730cb06c41fb47e03aa96fd4befd1cc297f8a566dc86b48c0399c\" returns successfully" Aug 19 08:17:17.116229 containerd[1567]: time="2025-08-19T08:17:17.116192632Z" level=info msg="StartContainer for \"302ed6ba4640de3e7cfa9ea1fe509c0efc2e6f31fc7d32d8ff9cd8b093556ce4\" returns successfully" Aug 19 08:17:17.639691 kubelet[2342]: I0819 08:17:17.639441 2342 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Aug 19 08:17:18.112349 kubelet[2342]: E0819 08:17:18.112185 2342 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:17:18.115739 kubelet[2342]: E0819 08:17:18.115697 2342 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:17:18.116000 kubelet[2342]: E0819 08:17:18.115976 2342 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:17:19.117483 kubelet[2342]: E0819 08:17:19.117324 2342 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:17:19.117483 kubelet[2342]: E0819 08:17:19.117477 2342 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:17:19.176139 kubelet[2342]: E0819 08:17:19.175972 2342 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Aug 19 08:17:19.517649 kubelet[2342]: I0819 08:17:19.517561 2342 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Aug 19 08:17:19.518060 kubelet[2342]: E0819 08:17:19.517680 2342 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Aug 19 08:17:19.560961 kubelet[2342]: I0819 08:17:19.560889 2342 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Aug 19 08:17:19.641576 kubelet[2342]: I0819 08:17:19.641485 2342 apiserver.go:52] "Watching apiserver" Aug 19 08:17:19.659703 kubelet[2342]: I0819 08:17:19.659626 2342 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Aug 19 08:17:19.703278 kubelet[2342]: E0819 08:17:19.703134 2342 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was 
found" pod="kube-system/kube-scheduler-localhost" Aug 19 08:17:19.703278 kubelet[2342]: I0819 08:17:19.703184 2342 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Aug 19 08:17:19.705477 kubelet[2342]: E0819 08:17:19.705397 2342 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Aug 19 08:17:19.705633 kubelet[2342]: I0819 08:17:19.705491 2342 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Aug 19 08:17:19.708482 kubelet[2342]: E0819 08:17:19.708392 2342 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Aug 19 08:17:20.117843 kubelet[2342]: I0819 08:17:20.117808 2342 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Aug 19 08:17:20.118431 kubelet[2342]: I0819 08:17:20.117901 2342 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Aug 19 08:17:20.120022 kubelet[2342]: E0819 08:17:20.119987 2342 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Aug 19 08:17:20.120128 kubelet[2342]: E0819 08:17:20.120010 2342 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Aug 19 08:17:21.611744 systemd[1]: Reload requested from client PID 2615 ('systemctl') (unit session-7.scope)... Aug 19 08:17:21.611761 systemd[1]: Reloading... Aug 19 08:17:21.703500 zram_generator::config[2658]: No configuration found. Aug 19 08:17:21.990170 systemd[1]: Reloading finished in 377 ms. Aug 19 08:17:22.024136 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:17:22.039679 systemd[1]: kubelet.service: Deactivated successfully. Aug 19 08:17:22.040158 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:17:22.040255 systemd[1]: kubelet.service: Consumed 1.068s CPU time, 132.5M memory peak. Aug 19 08:17:22.043076 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:17:22.359617 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:17:22.378182 (kubelet)[2703]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 19 08:17:22.435542 kubelet[2703]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 08:17:22.435542 kubelet[2703]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Aug 19 08:17:22.435542 kubelet[2703]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 08:17:22.435542 kubelet[2703]: I0819 08:17:22.435207 2703 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 19 08:17:22.445941 kubelet[2703]: I0819 08:17:22.445871 2703 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Aug 19 08:17:22.445941 kubelet[2703]: I0819 08:17:22.445922 2703 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 19 08:17:22.446373 kubelet[2703]: I0819 08:17:22.446336 2703 server.go:954] "Client rotation is on, will bootstrap in background" Aug 19 08:17:22.448155 kubelet[2703]: I0819 08:17:22.448116 2703 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Aug 19 08:17:22.455608 kubelet[2703]: I0819 08:17:22.455554 2703 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 19 08:17:22.460130 kubelet[2703]: I0819 08:17:22.460062 2703 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 19 08:17:22.469245 kubelet[2703]: I0819 08:17:22.469184 2703 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Aug 19 08:17:22.469591 kubelet[2703]: I0819 08:17:22.469534 2703 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 19 08:17:22.469854 kubelet[2703]: I0819 08:17:22.469580 2703 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 19 08:17:22.469946 kubelet[2703]: I0819 08:17:22.469861 2703 topology_manager.go:138] "Creating topology manager with none policy" Aug 19 08:17:22.469946 kubelet[2703]: I0819 08:17:22.469875 2703 container_manager_linux.go:304] "Creating device plugin manager" Aug 19 08:17:22.469946 kubelet[2703]: I0819 08:17:22.469939 2703 
state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:17:22.470235 kubelet[2703]: I0819 08:17:22.470208 2703 kubelet.go:446] "Attempting to sync node with API server" Aug 19 08:17:22.470270 kubelet[2703]: I0819 08:17:22.470239 2703 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 19 08:17:22.470270 kubelet[2703]: I0819 08:17:22.470268 2703 kubelet.go:352] "Adding apiserver pod source" Aug 19 08:17:22.470323 kubelet[2703]: I0819 08:17:22.470282 2703 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 19 08:17:22.471863 kubelet[2703]: I0819 08:17:22.471812 2703 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Aug 19 08:17:22.472370 kubelet[2703]: I0819 08:17:22.472338 2703 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Aug 19 08:17:22.473076 kubelet[2703]: I0819 08:17:22.473008 2703 watchdog_linux.go:99] "Systemd watchdog is not enabled" Aug 19 08:17:22.473736 kubelet[2703]: I0819 08:17:22.473686 2703 server.go:1287] "Started kubelet" Aug 19 08:17:22.476780 kubelet[2703]: I0819 08:17:22.476752 2703 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 19 08:17:22.479621 kubelet[2703]: I0819 08:17:22.479567 2703 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Aug 19 08:17:22.480969 kubelet[2703]: I0819 08:17:22.480763 2703 server.go:479] "Adding debug handlers to kubelet server" Aug 19 08:17:22.482031 kubelet[2703]: I0819 08:17:22.481963 2703 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 19 08:17:22.483063 kubelet[2703]: I0819 08:17:22.483030 2703 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 19 08:17:22.483377 kubelet[2703]: I0819 08:17:22.483346 2703 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 19 08:17:22.486884 kubelet[2703]: E0819 08:17:22.486849 2703 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 19 08:17:22.490291 kubelet[2703]: I0819 08:17:22.490238 2703 volume_manager.go:297] "Starting Kubelet Volume Manager" Aug 19 08:17:22.490383 kubelet[2703]: E0819 08:17:22.490231 2703 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 08:17:22.490882 kubelet[2703]: I0819 08:17:22.490853 2703 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Aug 19 08:17:22.491106 kubelet[2703]: I0819 08:17:22.491061 2703 reconciler.go:26] "Reconciler: start to sync state" Aug 19 08:17:22.491542 kubelet[2703]: I0819 08:17:22.491485 2703 factory.go:221] Registration of the systemd container factory successfully Aug 19 08:17:22.491704 kubelet[2703]: I0819 08:17:22.491621 2703 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 19 08:17:22.494027 kubelet[2703]: I0819 08:17:22.493336 2703 factory.go:221] Registration of the containerd container factory successfully Aug 19 08:17:22.500993 kubelet[2703]: I0819 08:17:22.500946 2703 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Aug 19 08:17:22.502605 kubelet[2703]: I0819 08:17:22.502588 2703 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Aug 19 08:17:22.502722 kubelet[2703]: I0819 08:17:22.502711 2703 status_manager.go:227] "Starting to sync pod status with apiserver" Aug 19 08:17:22.502816 kubelet[2703]: I0819 08:17:22.502802 2703 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Aug 19 08:17:22.502872 kubelet[2703]: I0819 08:17:22.502862 2703 kubelet.go:2382] "Starting kubelet main sync loop" Aug 19 08:17:22.502978 kubelet[2703]: E0819 08:17:22.502960 2703 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 19 08:17:22.542638 kubelet[2703]: I0819 08:17:22.542602 2703 cpu_manager.go:221] "Starting CPU manager" policy="none" Aug 19 08:17:22.542832 kubelet[2703]: I0819 08:17:22.542801 2703 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Aug 19 08:17:22.542832 kubelet[2703]: I0819 08:17:22.542826 2703 state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:17:22.543026 kubelet[2703]: I0819 08:17:22.543008 2703 state_mem.go:88] "Updated default CPUSet" cpuSet="" Aug 19 08:17:22.543052 kubelet[2703]: I0819 08:17:22.543024 2703 state_mem.go:96] "Updated CPUSet assignments" assignments={} Aug 19 08:17:22.543052 kubelet[2703]: I0819 08:17:22.543051 2703 policy_none.go:49] "None policy: Start" Aug 19 08:17:22.543107 kubelet[2703]: I0819 08:17:22.543062 2703 memory_manager.go:186] "Starting memorymanager" policy="None" Aug 19 08:17:22.543107 kubelet[2703]: I0819 08:17:22.543073 2703 state_mem.go:35] "Initializing new in-memory state store" Aug 19 08:17:22.543213 kubelet[2703]: I0819 08:17:22.543198 2703 state_mem.go:75] "Updated machine memory state" Aug 19 08:17:22.548530 kubelet[2703]: I0819 08:17:22.548494 2703 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Aug 19 08:17:22.548847 kubelet[2703]: I0819 08:17:22.548756 2703 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 19 08:17:22.548847 kubelet[2703]: I0819 08:17:22.548776 2703 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 19 08:17:22.549267 kubelet[2703]: I0819 08:17:22.549239 2703 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 19 08:17:22.551689 kubelet[2703]: E0819 08:17:22.550868 2703 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Aug 19 08:17:22.604181 kubelet[2703]: I0819 08:17:22.604079 2703 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Aug 19 08:17:22.604181 kubelet[2703]: I0819 08:17:22.604132 2703 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Aug 19 08:17:22.604526 kubelet[2703]: I0819 08:17:22.604136 2703 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Aug 19 08:17:22.651556 kubelet[2703]: I0819 08:17:22.651512 2703 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Aug 19 08:17:22.660852 kubelet[2703]: I0819 08:17:22.660808 2703 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Aug 19 08:17:22.661226 kubelet[2703]: I0819 08:17:22.660918 2703 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Aug 19 08:17:22.792561 kubelet[2703]: I0819 08:17:22.792488 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d5a2bcdaf7f37e3707ed2dc1143c0d3f-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"d5a2bcdaf7f37e3707ed2dc1143c0d3f\") " pod="kube-system/kube-apiserver-localhost" Aug 19 08:17:22.792561 kubelet[2703]: I0819 08:17:22.792545 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d5a2bcdaf7f37e3707ed2dc1143c0d3f-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"d5a2bcdaf7f37e3707ed2dc1143c0d3f\") " pod="kube-system/kube-apiserver-localhost" Aug 19 08:17:22.792561 kubelet[2703]: I0819 08:17:22.792570 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:17:22.792786 kubelet[2703]: I0819 08:17:22.792586 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:17:22.792786 kubelet[2703]: I0819 08:17:22.792606 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:17:22.792786 kubelet[2703]: I0819 08:17:22.792635 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost" Aug 19 08:17:22.792786 kubelet[2703]: I0819 08:17:22.792656 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/d5a2bcdaf7f37e3707ed2dc1143c0d3f-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"d5a2bcdaf7f37e3707ed2dc1143c0d3f\") " pod="kube-system/kube-apiserver-localhost" Aug 19 08:17:22.792786 kubelet[2703]: I0819 08:17:22.792722 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:17:22.792945 kubelet[2703]: I0819 08:17:22.792786 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:17:23.470897 kubelet[2703]: I0819 08:17:23.470838 2703 apiserver.go:52] "Watching apiserver" Aug 19 08:17:23.491931 kubelet[2703]: I0819 08:17:23.491859 2703 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Aug 19 08:17:23.521202 kubelet[2703]: I0819 08:17:23.521139 2703 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Aug 19 08:17:23.521529 kubelet[2703]: I0819 08:17:23.521489 2703 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Aug 19 08:17:23.530534 kubelet[2703]: E0819 08:17:23.530444 2703 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Aug 19 08:17:23.531258 kubelet[2703]: E0819 08:17:23.531232 2703 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Aug 19 08:17:23.555281 kubelet[2703]: I0819 08:17:23.555192 2703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.555152959 podStartE2EDuration="1.555152959s" podCreationTimestamp="2025-08-19 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:17:23.547228587 +0000 UTC m=+1.163575226" watchObservedRunningTime="2025-08-19 08:17:23.555152959 +0000 UTC m=+1.171499598" Aug 19 08:17:23.555529 kubelet[2703]: I0819 08:17:23.555368 2703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.555360355 podStartE2EDuration="1.555360355s" podCreationTimestamp="2025-08-19 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:17:23.555120126 +0000 UTC m=+1.171466775" watchObservedRunningTime="2025-08-19 08:17:23.555360355 +0000 UTC m=+1.171706994" Aug 19 08:17:23.563825 kubelet[2703]: I0819 08:17:23.563743 2703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.563714557 podStartE2EDuration="1.563714557s" podCreationTimestamp="2025-08-19 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-08-19 08:17:23.563613694 +0000 UTC m=+1.179960333" watchObservedRunningTime="2025-08-19 08:17:23.563714557 +0000 UTC m=+1.180061196" Aug 19 08:17:26.172042 kubelet[2703]: I0819 08:17:26.171978 2703 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Aug 19 08:17:26.172817 kubelet[2703]: I0819 08:17:26.172691 2703 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Aug 19 08:17:26.172858 containerd[1567]: time="2025-08-19T08:17:26.172408529Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Aug 19 08:17:27.010945 systemd[1]: Created slice kubepods-besteffort-pod0fc3e9f7_1d5b_4497_ac89_afac141cea85.slice - libcontainer container kubepods-besteffort-pod0fc3e9f7_1d5b_4497_ac89_afac141cea85.slice. Aug 19 08:17:27.021498 kubelet[2703]: I0819 08:17:27.021445 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7zn6\" (UniqueName: \"kubernetes.io/projected/0fc3e9f7-1d5b-4497-ac89-afac141cea85-kube-api-access-k7zn6\") pod \"kube-proxy-wqlgm\" (UID: \"0fc3e9f7-1d5b-4497-ac89-afac141cea85\") " pod="kube-system/kube-proxy-wqlgm" Aug 19 08:17:27.021498 kubelet[2703]: I0819 08:17:27.021496 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0fc3e9f7-1d5b-4497-ac89-afac141cea85-xtables-lock\") pod \"kube-proxy-wqlgm\" (UID: \"0fc3e9f7-1d5b-4497-ac89-afac141cea85\") " pod="kube-system/kube-proxy-wqlgm" Aug 19 08:17:27.021619 kubelet[2703]: I0819 08:17:27.021515 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/0fc3e9f7-1d5b-4497-ac89-afac141cea85-kube-proxy\") pod \"kube-proxy-wqlgm\" (UID: \"0fc3e9f7-1d5b-4497-ac89-afac141cea85\") " pod="kube-system/kube-proxy-wqlgm" Aug 19 08:17:27.021619 kubelet[2703]: I0819 08:17:27.021528 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0fc3e9f7-1d5b-4497-ac89-afac141cea85-lib-modules\") pod \"kube-proxy-wqlgm\" (UID: \"0fc3e9f7-1d5b-4497-ac89-afac141cea85\") " pod="kube-system/kube-proxy-wqlgm" Aug 19 08:17:27.432062 systemd[1]: Created slice kubepods-besteffort-pod1f5c9bba_6bbf_42b9_9a46_035bdd50247f.slice - libcontainer container kubepods-besteffort-pod1f5c9bba_6bbf_42b9_9a46_035bdd50247f.slice. 
Aug 19 08:17:27.525752 kubelet[2703]: I0819 08:17:27.525682 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvtrn\" (UniqueName: \"kubernetes.io/projected/1f5c9bba-6bbf-42b9-9a46-035bdd50247f-kube-api-access-tvtrn\") pod \"tigera-operator-747864d56d-ngzs6\" (UID: \"1f5c9bba-6bbf-42b9-9a46-035bdd50247f\") " pod="tigera-operator/tigera-operator-747864d56d-ngzs6" Aug 19 08:17:27.525752 kubelet[2703]: I0819 08:17:27.525730 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1f5c9bba-6bbf-42b9-9a46-035bdd50247f-var-lib-calico\") pod \"tigera-operator-747864d56d-ngzs6\" (UID: \"1f5c9bba-6bbf-42b9-9a46-035bdd50247f\") " pod="tigera-operator/tigera-operator-747864d56d-ngzs6" Aug 19 08:17:27.622473 containerd[1567]: time="2025-08-19T08:17:27.622387859Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-wqlgm,Uid:0fc3e9f7-1d5b-4497-ac89-afac141cea85,Namespace:kube-system,Attempt:0,}" Aug 19 08:17:27.651110 containerd[1567]: time="2025-08-19T08:17:27.651055863Z" level=info msg="connecting to shim 37e61660ab4b23dce5f954d9d1bd5e004f1e239333120db8aafdc8f8130a4393" address="unix:///run/containerd/s/bf9cd661806afeaf25c342f81a29ff8a20d8849a59b62d97b099c72042c22d49" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:27.709612 systemd[1]: Started cri-containerd-37e61660ab4b23dce5f954d9d1bd5e004f1e239333120db8aafdc8f8130a4393.scope - libcontainer container 37e61660ab4b23dce5f954d9d1bd5e004f1e239333120db8aafdc8f8130a4393. Aug 19 08:17:27.735800 containerd[1567]: time="2025-08-19T08:17:27.735753321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-ngzs6,Uid:1f5c9bba-6bbf-42b9-9a46-035bdd50247f,Namespace:tigera-operator,Attempt:0,}" Aug 19 08:17:27.744948 containerd[1567]: time="2025-08-19T08:17:27.744890981Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-wqlgm,Uid:0fc3e9f7-1d5b-4497-ac89-afac141cea85,Namespace:kube-system,Attempt:0,} returns sandbox id \"37e61660ab4b23dce5f954d9d1bd5e004f1e239333120db8aafdc8f8130a4393\"" Aug 19 08:17:27.748083 containerd[1567]: time="2025-08-19T08:17:27.748041850Z" level=info msg="CreateContainer within sandbox \"37e61660ab4b23dce5f954d9d1bd5e004f1e239333120db8aafdc8f8130a4393\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Aug 19 08:17:27.759894 containerd[1567]: time="2025-08-19T08:17:27.759839686Z" level=info msg="connecting to shim ec824d8bc15a825eb30cffdd45f3f3dd73def1b0017c9da46266123fe5a73017" address="unix:///run/containerd/s/bacd43730a6e8ccf05697613f48cd04d45a2cd4c55d984fc5b160cc5f1e6ba33" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:27.763192 containerd[1567]: time="2025-08-19T08:17:27.763131282Z" level=info msg="Container f1335f506e893e9a51bfe2c7fbcdbd04d571ce8bdf81a20c9f60065012f1e827: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:27.777438 containerd[1567]: time="2025-08-19T08:17:27.777290597Z" level=info msg="CreateContainer within sandbox \"37e61660ab4b23dce5f954d9d1bd5e004f1e239333120db8aafdc8f8130a4393\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f1335f506e893e9a51bfe2c7fbcdbd04d571ce8bdf81a20c9f60065012f1e827\"" Aug 19 08:17:27.778148 containerd[1567]: time="2025-08-19T08:17:27.778109685Z" level=info msg="StartContainer for \"f1335f506e893e9a51bfe2c7fbcdbd04d571ce8bdf81a20c9f60065012f1e827\"" Aug 19 08:17:27.780058 containerd[1567]: 
time="2025-08-19T08:17:27.780021950Z" level=info msg="connecting to shim f1335f506e893e9a51bfe2c7fbcdbd04d571ce8bdf81a20c9f60065012f1e827" address="unix:///run/containerd/s/bf9cd661806afeaf25c342f81a29ff8a20d8849a59b62d97b099c72042c22d49" protocol=ttrpc version=3 Aug 19 08:17:27.794658 systemd[1]: Started cri-containerd-ec824d8bc15a825eb30cffdd45f3f3dd73def1b0017c9da46266123fe5a73017.scope - libcontainer container ec824d8bc15a825eb30cffdd45f3f3dd73def1b0017c9da46266123fe5a73017. Aug 19 08:17:27.801345 systemd[1]: Started cri-containerd-f1335f506e893e9a51bfe2c7fbcdbd04d571ce8bdf81a20c9f60065012f1e827.scope - libcontainer container f1335f506e893e9a51bfe2c7fbcdbd04d571ce8bdf81a20c9f60065012f1e827. Aug 19 08:17:27.852444 containerd[1567]: time="2025-08-19T08:17:27.852389330Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-ngzs6,Uid:1f5c9bba-6bbf-42b9-9a46-035bdd50247f,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"ec824d8bc15a825eb30cffdd45f3f3dd73def1b0017c9da46266123fe5a73017\"" Aug 19 08:17:27.857282 containerd[1567]: time="2025-08-19T08:17:27.857232235Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Aug 19 08:17:27.859502 containerd[1567]: time="2025-08-19T08:17:27.859431396Z" level=info msg="StartContainer for \"f1335f506e893e9a51bfe2c7fbcdbd04d571ce8bdf81a20c9f60065012f1e827\" returns successfully" Aug 19 08:17:29.714763 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2409322209.mount: Deactivated successfully. Aug 19 08:17:30.496756 containerd[1567]: time="2025-08-19T08:17:30.496680013Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:30.498381 containerd[1567]: time="2025-08-19T08:17:30.498327016Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543" Aug 19 08:17:30.499751 containerd[1567]: time="2025-08-19T08:17:30.499694509Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:30.502164 containerd[1567]: time="2025-08-19T08:17:30.502094671Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:30.502886 containerd[1567]: time="2025-08-19T08:17:30.502835436Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 2.645300966s" Aug 19 08:17:30.502886 containerd[1567]: time="2025-08-19T08:17:30.502872727Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\"" Aug 19 08:17:30.505304 containerd[1567]: time="2025-08-19T08:17:30.505261457Z" level=info msg="CreateContainer within sandbox \"ec824d8bc15a825eb30cffdd45f3f3dd73def1b0017c9da46266123fe5a73017\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Aug 19 08:17:30.515184 containerd[1567]: time="2025-08-19T08:17:30.515112709Z" level=info msg="Container 6c5ed56b91bd3d14fecf6bcf54e93319fdf69cc68e8191bf89c805c85f69af6f: 
CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:30.519362 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3059581903.mount: Deactivated successfully. Aug 19 08:17:30.521710 containerd[1567]: time="2025-08-19T08:17:30.521672839Z" level=info msg="CreateContainer within sandbox \"ec824d8bc15a825eb30cffdd45f3f3dd73def1b0017c9da46266123fe5a73017\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"6c5ed56b91bd3d14fecf6bcf54e93319fdf69cc68e8191bf89c805c85f69af6f\"" Aug 19 08:17:30.522234 containerd[1567]: time="2025-08-19T08:17:30.522208676Z" level=info msg="StartContainer for \"6c5ed56b91bd3d14fecf6bcf54e93319fdf69cc68e8191bf89c805c85f69af6f\"" Aug 19 08:17:30.523097 containerd[1567]: time="2025-08-19T08:17:30.523070410Z" level=info msg="connecting to shim 6c5ed56b91bd3d14fecf6bcf54e93319fdf69cc68e8191bf89c805c85f69af6f" address="unix:///run/containerd/s/bacd43730a6e8ccf05697613f48cd04d45a2cd4c55d984fc5b160cc5f1e6ba33" protocol=ttrpc version=3 Aug 19 08:17:30.584772 systemd[1]: Started cri-containerd-6c5ed56b91bd3d14fecf6bcf54e93319fdf69cc68e8191bf89c805c85f69af6f.scope - libcontainer container 6c5ed56b91bd3d14fecf6bcf54e93319fdf69cc68e8191bf89c805c85f69af6f. Aug 19 08:17:30.623606 containerd[1567]: time="2025-08-19T08:17:30.623535125Z" level=info msg="StartContainer for \"6c5ed56b91bd3d14fecf6bcf54e93319fdf69cc68e8191bf89c805c85f69af6f\" returns successfully" Aug 19 08:17:30.845234 update_engine[1547]: I20250819 08:17:30.844784 1547 update_attempter.cc:509] Updating boot flags... Aug 19 08:17:31.604790 kubelet[2703]: I0819 08:17:31.604699 2703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-wqlgm" podStartSLOduration=5.60465768 podStartE2EDuration="5.60465768s" podCreationTimestamp="2025-08-19 08:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:17:28.545234329 +0000 UTC m=+6.161580968" watchObservedRunningTime="2025-08-19 08:17:31.60465768 +0000 UTC m=+9.221004319" Aug 19 08:17:32.645033 kubelet[2703]: I0819 08:17:32.644905 2703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-747864d56d-ngzs6" podStartSLOduration=2.995185942 podStartE2EDuration="5.644877159s" podCreationTimestamp="2025-08-19 08:17:27 +0000 UTC" firstStartedPulling="2025-08-19 08:17:27.85400452 +0000 UTC m=+5.470351159" lastFinishedPulling="2025-08-19 08:17:30.503695737 +0000 UTC m=+8.120042376" observedRunningTime="2025-08-19 08:17:31.605152807 +0000 UTC m=+9.221499446" watchObservedRunningTime="2025-08-19 08:17:32.644877159 +0000 UTC m=+10.261223798" Aug 19 08:17:36.623174 sudo[1772]: pam_unix(sudo:session): session closed for user root Aug 19 08:17:36.625799 sshd[1771]: Connection closed by 10.0.0.1 port 55348 Aug 19 08:17:36.632560 sshd-session[1768]: pam_unix(sshd:session): session closed for user core Aug 19 08:17:36.644287 systemd[1]: sshd@6-10.0.0.123:22-10.0.0.1:55348.service: Deactivated successfully. Aug 19 08:17:36.647133 systemd[1]: session-7.scope: Deactivated successfully. Aug 19 08:17:36.647395 systemd[1]: session-7.scope: Consumed 5.089s CPU time, 221.2M memory peak. Aug 19 08:17:36.648941 systemd-logind[1541]: Session 7 logged out. Waiting for processes to exit. Aug 19 08:17:36.650573 systemd-logind[1541]: Removed session 7. 
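The pod_startup_latency_tracker lines above report two figures per pod. A minimal sketch follows, assuming the relationship the logged numbers themselves suggest (the SLO duration is the end-to-end duration minus the image-pull window); the timestamps are copied verbatim from the tigera-operator entry, everything else is illustrative.

```go
// Sketch: reproduce the kubelet's start-up durations for tigera-operator-747864d56d-ngzs6
// from the timestamps logged above. Not kubelet code; just the arithmetic.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05 -0700 MST" // fractional seconds in the input are accepted even without ".9" in the layout

	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	// Timestamps copied from the kubelet line for tigera-operator-747864d56d-ngzs6.
	created := parse("2025-08-19 08:17:27 +0000 UTC")
	firstPull := parse("2025-08-19 08:17:27.85400452 +0000 UTC")
	lastPull := parse("2025-08-19 08:17:30.503695737 +0000 UTC")
	watchObserved := parse("2025-08-19 08:17:32.644877159 +0000 UTC")

	e2e := watchObserved.Sub(created) // 5.644877159s = podStartE2EDuration
	pull := lastPull.Sub(firstPull)   // 2.649691217s pulling quay.io/tigera/operator:v1.38.3
	slo := e2e - pull                 // 2.995185942s = podStartSLOduration

	fmt.Println(e2e, pull, slo)
}
```

The same arithmetic reproduces the calico-typha figures logged further below: 4.795413046s end to end, 2.766949762s pulling, 2.028463284s SLO duration.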
Aug 19 08:17:41.710744 systemd[1]: Created slice kubepods-besteffort-pod821208c1_648e_4b35_aa11_700947e83d77.slice - libcontainer container kubepods-besteffort-pod821208c1_648e_4b35_aa11_700947e83d77.slice. Aug 19 08:17:41.728179 kubelet[2703]: I0819 08:17:41.728102 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/821208c1-648e-4b35-aa11-700947e83d77-typha-certs\") pod \"calico-typha-8c476f69c-cqwwg\" (UID: \"821208c1-648e-4b35-aa11-700947e83d77\") " pod="calico-system/calico-typha-8c476f69c-cqwwg" Aug 19 08:17:41.728179 kubelet[2703]: I0819 08:17:41.728172 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/821208c1-648e-4b35-aa11-700947e83d77-tigera-ca-bundle\") pod \"calico-typha-8c476f69c-cqwwg\" (UID: \"821208c1-648e-4b35-aa11-700947e83d77\") " pod="calico-system/calico-typha-8c476f69c-cqwwg" Aug 19 08:17:41.728760 kubelet[2703]: I0819 08:17:41.728202 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7b8g\" (UniqueName: \"kubernetes.io/projected/821208c1-648e-4b35-aa11-700947e83d77-kube-api-access-t7b8g\") pod \"calico-typha-8c476f69c-cqwwg\" (UID: \"821208c1-648e-4b35-aa11-700947e83d77\") " pod="calico-system/calico-typha-8c476f69c-cqwwg" Aug 19 08:17:41.777833 systemd[1]: Created slice kubepods-besteffort-poddff8300b_38ce_4240_aac9_0defa4bcece9.slice - libcontainer container kubepods-besteffort-poddff8300b_38ce_4240_aac9_0defa4bcece9.slice. Aug 19 08:17:41.831527 kubelet[2703]: I0819 08:17:41.829466 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/dff8300b-38ce-4240-aac9-0defa4bcece9-cni-net-dir\") pod \"calico-node-nz6kt\" (UID: \"dff8300b-38ce-4240-aac9-0defa4bcece9\") " pod="calico-system/calico-node-nz6kt" Aug 19 08:17:41.831527 kubelet[2703]: I0819 08:17:41.829538 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/dff8300b-38ce-4240-aac9-0defa4bcece9-flexvol-driver-host\") pod \"calico-node-nz6kt\" (UID: \"dff8300b-38ce-4240-aac9-0defa4bcece9\") " pod="calico-system/calico-node-nz6kt" Aug 19 08:17:41.831527 kubelet[2703]: I0819 08:17:41.829565 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/dff8300b-38ce-4240-aac9-0defa4bcece9-node-certs\") pod \"calico-node-nz6kt\" (UID: \"dff8300b-38ce-4240-aac9-0defa4bcece9\") " pod="calico-system/calico-node-nz6kt" Aug 19 08:17:41.831527 kubelet[2703]: I0819 08:17:41.829585 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/dff8300b-38ce-4240-aac9-0defa4bcece9-cni-log-dir\") pod \"calico-node-nz6kt\" (UID: \"dff8300b-38ce-4240-aac9-0defa4bcece9\") " pod="calico-system/calico-node-nz6kt" Aug 19 08:17:41.831527 kubelet[2703]: I0819 08:17:41.829606 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/dff8300b-38ce-4240-aac9-0defa4bcece9-var-lib-calico\") pod \"calico-node-nz6kt\" (UID: \"dff8300b-38ce-4240-aac9-0defa4bcece9\") " 
pod="calico-system/calico-node-nz6kt" Aug 19 08:17:41.831922 kubelet[2703]: I0819 08:17:41.829635 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/dff8300b-38ce-4240-aac9-0defa4bcece9-policysync\") pod \"calico-node-nz6kt\" (UID: \"dff8300b-38ce-4240-aac9-0defa4bcece9\") " pod="calico-system/calico-node-nz6kt" Aug 19 08:17:41.831922 kubelet[2703]: I0819 08:17:41.829663 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/dff8300b-38ce-4240-aac9-0defa4bcece9-var-run-calico\") pod \"calico-node-nz6kt\" (UID: \"dff8300b-38ce-4240-aac9-0defa4bcece9\") " pod="calico-system/calico-node-nz6kt" Aug 19 08:17:41.831922 kubelet[2703]: I0819 08:17:41.829720 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dff8300b-38ce-4240-aac9-0defa4bcece9-lib-modules\") pod \"calico-node-nz6kt\" (UID: \"dff8300b-38ce-4240-aac9-0defa4bcece9\") " pod="calico-system/calico-node-nz6kt" Aug 19 08:17:41.831922 kubelet[2703]: I0819 08:17:41.829747 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qkl2\" (UniqueName: \"kubernetes.io/projected/dff8300b-38ce-4240-aac9-0defa4bcece9-kube-api-access-5qkl2\") pod \"calico-node-nz6kt\" (UID: \"dff8300b-38ce-4240-aac9-0defa4bcece9\") " pod="calico-system/calico-node-nz6kt" Aug 19 08:17:41.831922 kubelet[2703]: I0819 08:17:41.829793 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dff8300b-38ce-4240-aac9-0defa4bcece9-tigera-ca-bundle\") pod \"calico-node-nz6kt\" (UID: \"dff8300b-38ce-4240-aac9-0defa4bcece9\") " pod="calico-system/calico-node-nz6kt" Aug 19 08:17:41.832081 kubelet[2703]: I0819 08:17:41.829818 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/dff8300b-38ce-4240-aac9-0defa4bcece9-cni-bin-dir\") pod \"calico-node-nz6kt\" (UID: \"dff8300b-38ce-4240-aac9-0defa4bcece9\") " pod="calico-system/calico-node-nz6kt" Aug 19 08:17:41.832081 kubelet[2703]: I0819 08:17:41.829855 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/dff8300b-38ce-4240-aac9-0defa4bcece9-xtables-lock\") pod \"calico-node-nz6kt\" (UID: \"dff8300b-38ce-4240-aac9-0defa4bcece9\") " pod="calico-system/calico-node-nz6kt" Aug 19 08:17:41.884828 kubelet[2703]: E0819 08:17:41.884378 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmzrp" podUID="af59fbd1-ef40-4fa1-8148-7cb071fdc3e9" Aug 19 08:17:41.930892 kubelet[2703]: I0819 08:17:41.930821 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/af59fbd1-ef40-4fa1-8148-7cb071fdc3e9-registration-dir\") pod \"csi-node-driver-fmzrp\" (UID: \"af59fbd1-ef40-4fa1-8148-7cb071fdc3e9\") " pod="calico-system/csi-node-driver-fmzrp" Aug 19 08:17:41.930892 
kubelet[2703]: I0819 08:17:41.930903 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/af59fbd1-ef40-4fa1-8148-7cb071fdc3e9-socket-dir\") pod \"csi-node-driver-fmzrp\" (UID: \"af59fbd1-ef40-4fa1-8148-7cb071fdc3e9\") " pod="calico-system/csi-node-driver-fmzrp" Aug 19 08:17:41.931139 kubelet[2703]: I0819 08:17:41.930954 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af59fbd1-ef40-4fa1-8148-7cb071fdc3e9-kubelet-dir\") pod \"csi-node-driver-fmzrp\" (UID: \"af59fbd1-ef40-4fa1-8148-7cb071fdc3e9\") " pod="calico-system/csi-node-driver-fmzrp" Aug 19 08:17:41.931139 kubelet[2703]: I0819 08:17:41.930976 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n42gs\" (UniqueName: \"kubernetes.io/projected/af59fbd1-ef40-4fa1-8148-7cb071fdc3e9-kube-api-access-n42gs\") pod \"csi-node-driver-fmzrp\" (UID: \"af59fbd1-ef40-4fa1-8148-7cb071fdc3e9\") " pod="calico-system/csi-node-driver-fmzrp" Aug 19 08:17:41.931139 kubelet[2703]: I0819 08:17:41.931005 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/af59fbd1-ef40-4fa1-8148-7cb071fdc3e9-varrun\") pod \"csi-node-driver-fmzrp\" (UID: \"af59fbd1-ef40-4fa1-8148-7cb071fdc3e9\") " pod="calico-system/csi-node-driver-fmzrp" Aug 19 08:17:41.936602 kubelet[2703]: E0819 08:17:41.935024 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:41.936602 kubelet[2703]: W0819 08:17:41.936491 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:41.936602 kubelet[2703]: E0819 08:17:41.936551 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:41.937639 kubelet[2703]: E0819 08:17:41.937621 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:41.937732 kubelet[2703]: W0819 08:17:41.937718 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:41.937801 kubelet[2703]: E0819 08:17:41.937785 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:41.942622 kubelet[2703]: E0819 08:17:41.942589 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:41.942754 kubelet[2703]: W0819 08:17:41.942737 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:41.942816 kubelet[2703]: E0819 08:17:41.942803 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:41.963156 kubelet[2703]: E0819 08:17:41.962063 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:41.963156 kubelet[2703]: W0819 08:17:41.963053 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:41.963156 kubelet[2703]: E0819 08:17:41.963090 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.015329 containerd[1567]: time="2025-08-19T08:17:42.015267255Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8c476f69c-cqwwg,Uid:821208c1-648e-4b35-aa11-700947e83d77,Namespace:calico-system,Attempt:0,}" Aug 19 08:17:42.032161 kubelet[2703]: E0819 08:17:42.032098 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.032161 kubelet[2703]: W0819 08:17:42.032134 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.032161 kubelet[2703]: E0819 08:17:42.032160 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.032521 kubelet[2703]: E0819 08:17:42.032394 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.032521 kubelet[2703]: W0819 08:17:42.032406 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.032521 kubelet[2703]: E0819 08:17:42.032424 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.032712 kubelet[2703]: E0819 08:17:42.032674 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.032712 kubelet[2703]: W0819 08:17:42.032692 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.032712 kubelet[2703]: E0819 08:17:42.032709 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:42.032938 kubelet[2703]: E0819 08:17:42.032914 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.032938 kubelet[2703]: W0819 08:17:42.032929 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.032996 kubelet[2703]: E0819 08:17:42.032946 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.033180 kubelet[2703]: E0819 08:17:42.033165 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.033180 kubelet[2703]: W0819 08:17:42.033178 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.033267 kubelet[2703]: E0819 08:17:42.033195 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.033622 kubelet[2703]: E0819 08:17:42.033584 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.033669 kubelet[2703]: W0819 08:17:42.033620 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.033669 kubelet[2703]: E0819 08:17:42.033655 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.033915 kubelet[2703]: E0819 08:17:42.033895 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.033915 kubelet[2703]: W0819 08:17:42.033908 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.033990 kubelet[2703]: E0819 08:17:42.033941 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.034160 kubelet[2703]: E0819 08:17:42.034132 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.034160 kubelet[2703]: W0819 08:17:42.034147 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.034218 kubelet[2703]: E0819 08:17:42.034178 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:42.034369 kubelet[2703]: E0819 08:17:42.034351 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.034369 kubelet[2703]: W0819 08:17:42.034363 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.034437 kubelet[2703]: E0819 08:17:42.034393 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.034585 kubelet[2703]: E0819 08:17:42.034566 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.034585 kubelet[2703]: W0819 08:17:42.034579 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.034652 kubelet[2703]: E0819 08:17:42.034618 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.034863 kubelet[2703]: E0819 08:17:42.034844 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.034863 kubelet[2703]: W0819 08:17:42.034860 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.034929 kubelet[2703]: E0819 08:17:42.034894 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.035096 kubelet[2703]: E0819 08:17:42.035076 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.035096 kubelet[2703]: W0819 08:17:42.035089 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.035168 kubelet[2703]: E0819 08:17:42.035109 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.035399 kubelet[2703]: E0819 08:17:42.035382 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.035399 kubelet[2703]: W0819 08:17:42.035394 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.035497 kubelet[2703]: E0819 08:17:42.035427 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:42.035741 kubelet[2703]: E0819 08:17:42.035631 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.035741 kubelet[2703]: W0819 08:17:42.035645 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.035741 kubelet[2703]: E0819 08:17:42.035677 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.035851 kubelet[2703]: E0819 08:17:42.035833 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.035851 kubelet[2703]: W0819 08:17:42.035845 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.035914 kubelet[2703]: E0819 08:17:42.035876 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.036852 kubelet[2703]: E0819 08:17:42.036079 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.036852 kubelet[2703]: W0819 08:17:42.036100 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.036852 kubelet[2703]: E0819 08:17:42.036133 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.038577 kubelet[2703]: E0819 08:17:42.037726 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.038577 kubelet[2703]: W0819 08:17:42.037745 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.038577 kubelet[2703]: E0819 08:17:42.037790 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.040283 kubelet[2703]: E0819 08:17:42.040250 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.040283 kubelet[2703]: W0819 08:17:42.040270 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.040419 kubelet[2703]: E0819 08:17:42.040395 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:42.041632 kubelet[2703]: E0819 08:17:42.041598 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.042001 kubelet[2703]: W0819 08:17:42.041680 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.042001 kubelet[2703]: E0819 08:17:42.041812 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.042283 kubelet[2703]: E0819 08:17:42.042264 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.042283 kubelet[2703]: W0819 08:17:42.042281 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.042349 kubelet[2703]: E0819 08:17:42.042324 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.042840 kubelet[2703]: E0819 08:17:42.042808 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.042840 kubelet[2703]: W0819 08:17:42.042827 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.043155 kubelet[2703]: E0819 08:17:42.043123 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.043198 kubelet[2703]: E0819 08:17:42.043186 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.043220 kubelet[2703]: W0819 08:17:42.043199 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.043250 kubelet[2703]: E0819 08:17:42.043217 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.043546 kubelet[2703]: E0819 08:17:42.043527 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.043546 kubelet[2703]: W0819 08:17:42.043543 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.043619 kubelet[2703]: E0819 08:17:42.043592 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:42.044248 kubelet[2703]: E0819 08:17:42.043886 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.044248 kubelet[2703]: W0819 08:17:42.043900 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.044248 kubelet[2703]: E0819 08:17:42.043920 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.044392 kubelet[2703]: E0819 08:17:42.044378 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.044444 kubelet[2703]: W0819 08:17:42.044433 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.044530 kubelet[2703]: E0819 08:17:42.044518 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.053309 kubelet[2703]: E0819 08:17:42.053280 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:42.053483 kubelet[2703]: W0819 08:17:42.053447 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:42.053556 kubelet[2703]: E0819 08:17:42.053544 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:42.066741 containerd[1567]: time="2025-08-19T08:17:42.066681966Z" level=info msg="connecting to shim b6f7456889ee5144d4a4aab61274c00894f9575bff6a344754070b253f28a888" address="unix:///run/containerd/s/00ae9dc4a617bfe026f9db381f16e6a955cd462a533d4450194e5a0ad7f6d6c8" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:42.083430 containerd[1567]: time="2025-08-19T08:17:42.083379786Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-nz6kt,Uid:dff8300b-38ce-4240-aac9-0defa4bcece9,Namespace:calico-system,Attempt:0,}" Aug 19 08:17:42.095769 systemd[1]: Started cri-containerd-b6f7456889ee5144d4a4aab61274c00894f9575bff6a344754070b253f28a888.scope - libcontainer container b6f7456889ee5144d4a4aab61274c00894f9575bff6a344754070b253f28a888. Aug 19 08:17:42.112775 containerd[1567]: time="2025-08-19T08:17:42.112712516Z" level=info msg="connecting to shim ac3b6fa6618651b8f7a4d17593bb88b7f8183c89950aeca8fec409fd3e3a3737" address="unix:///run/containerd/s/f9c2a4c7fc07c101fd3942919994487616b8a48c64b112d018c2d5dc5f2ddb22" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:17:42.143887 systemd[1]: Started cri-containerd-ac3b6fa6618651b8f7a4d17593bb88b7f8183c89950aeca8fec409fd3e3a3737.scope - libcontainer container ac3b6fa6618651b8f7a4d17593bb88b7f8183c89950aeca8fec409fd3e3a3737. 
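The block of kubelet errors above repeats a single failure: the FlexVolume helper at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds is not installed yet (the flexvol-driver-host host-path volume mounted into calico-node above suggests that pod is what will provide it), so each probe returns no output, and empty output cannot be parsed as JSON. A minimal sketch of that failure mode follows; it is illustrative, not the kubelet's actual driver-call code, and the driverStatus struct is an assumption kept to a single field.

```go
// Sketch: probing a missing FlexVolume driver binary with "init" yields no output,
// and unmarshalling the empty output is what produces "unexpected end of JSON input".
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// driverStatus stands in for the JSON a FlexVolume driver prints on "init"
// (e.g. {"status":"Success"}); the field set here is an assumption, kept minimal.
type driverStatus struct {
	Status string `json:"status"`
}

func probeInit(driver string) (driverStatus, error) {
	var st driverStatus
	out, err := exec.Command(driver, "init").Output() // fails while the binary is missing
	if err != nil {
		fmt.Println("driver call failed:", err) // analogous to the W "driver call failed" lines
	}
	// out is empty, so this returns "unexpected end of JSON input",
	// matching the E "Failed to unmarshal output for command: init" lines.
	err = json.Unmarshal(out, &st)
	return st, err
}

func main() {
	_, err := probeInit("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds")
	fmt.Println(err)
}
```

Once the calico-node pod below has started and dropped its helper into that directory, the kubelet's periodic plugin probe stops logging these errors.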
Aug 19 08:17:42.155857 containerd[1567]: time="2025-08-19T08:17:42.155708985Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8c476f69c-cqwwg,Uid:821208c1-648e-4b35-aa11-700947e83d77,Namespace:calico-system,Attempt:0,} returns sandbox id \"b6f7456889ee5144d4a4aab61274c00894f9575bff6a344754070b253f28a888\"" Aug 19 08:17:42.157922 containerd[1567]: time="2025-08-19T08:17:42.157888024Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Aug 19 08:17:42.185170 containerd[1567]: time="2025-08-19T08:17:42.185122789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-nz6kt,Uid:dff8300b-38ce-4240-aac9-0defa4bcece9,Namespace:calico-system,Attempt:0,} returns sandbox id \"ac3b6fa6618651b8f7a4d17593bb88b7f8183c89950aeca8fec409fd3e3a3737\"" Aug 19 08:17:43.503291 kubelet[2703]: E0819 08:17:43.503195 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmzrp" podUID="af59fbd1-ef40-4fa1-8148-7cb071fdc3e9" Aug 19 08:17:44.375535 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3293281803.mount: Deactivated successfully. Aug 19 08:17:44.910328 containerd[1567]: time="2025-08-19T08:17:44.910246542Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:44.914670 containerd[1567]: time="2025-08-19T08:17:44.914605686Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364" Aug 19 08:17:44.918676 containerd[1567]: time="2025-08-19T08:17:44.918593639Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:44.922860 containerd[1567]: time="2025-08-19T08:17:44.922807859Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:44.923494 containerd[1567]: time="2025-08-19T08:17:44.923429571Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 2.765172842s" Aug 19 08:17:44.923567 containerd[1567]: time="2025-08-19T08:17:44.923499663Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\"" Aug 19 08:17:44.924788 containerd[1567]: time="2025-08-19T08:17:44.924762422Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Aug 19 08:17:44.935868 containerd[1567]: time="2025-08-19T08:17:44.935804236Z" level=info msg="CreateContainer within sandbox \"b6f7456889ee5144d4a4aab61274c00894f9575bff6a344754070b253f28a888\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Aug 19 08:17:44.948533 containerd[1567]: time="2025-08-19T08:17:44.948440444Z" level=info msg="Container 85cfe07a13c3147d47d440373bcd779981a5fdbfc6fd43b450e3d1b6abda3f15: CDI devices 
from CRI Config.CDIDevices: []" Aug 19 08:17:44.960911 containerd[1567]: time="2025-08-19T08:17:44.960838133Z" level=info msg="CreateContainer within sandbox \"b6f7456889ee5144d4a4aab61274c00894f9575bff6a344754070b253f28a888\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"85cfe07a13c3147d47d440373bcd779981a5fdbfc6fd43b450e3d1b6abda3f15\"" Aug 19 08:17:44.961596 containerd[1567]: time="2025-08-19T08:17:44.961345569Z" level=info msg="StartContainer for \"85cfe07a13c3147d47d440373bcd779981a5fdbfc6fd43b450e3d1b6abda3f15\"" Aug 19 08:17:44.962715 containerd[1567]: time="2025-08-19T08:17:44.962635078Z" level=info msg="connecting to shim 85cfe07a13c3147d47d440373bcd779981a5fdbfc6fd43b450e3d1b6abda3f15" address="unix:///run/containerd/s/00ae9dc4a617bfe026f9db381f16e6a955cd462a533d4450194e5a0ad7f6d6c8" protocol=ttrpc version=3 Aug 19 08:17:44.987815 systemd[1]: Started cri-containerd-85cfe07a13c3147d47d440373bcd779981a5fdbfc6fd43b450e3d1b6abda3f15.scope - libcontainer container 85cfe07a13c3147d47d440373bcd779981a5fdbfc6fd43b450e3d1b6abda3f15. Aug 19 08:17:45.055024 containerd[1567]: time="2025-08-19T08:17:45.054954570Z" level=info msg="StartContainer for \"85cfe07a13c3147d47d440373bcd779981a5fdbfc6fd43b450e3d1b6abda3f15\" returns successfully" Aug 19 08:17:45.503792 kubelet[2703]: E0819 08:17:45.503719 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmzrp" podUID="af59fbd1-ef40-4fa1-8148-7cb071fdc3e9" Aug 19 08:17:45.634836 kubelet[2703]: E0819 08:17:45.634764 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.634836 kubelet[2703]: W0819 08:17:45.634802 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.634836 kubelet[2703]: E0819 08:17:45.634827 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:45.635232 kubelet[2703]: E0819 08:17:45.635176 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.635285 kubelet[2703]: W0819 08:17:45.635234 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.635285 kubelet[2703]: E0819 08:17:45.635274 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:45.635721 kubelet[2703]: E0819 08:17:45.635692 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.635721 kubelet[2703]: W0819 08:17:45.635709 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.635721 kubelet[2703]: E0819 08:17:45.635721 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:45.636173 kubelet[2703]: E0819 08:17:45.636131 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.636173 kubelet[2703]: W0819 08:17:45.636149 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.636173 kubelet[2703]: E0819 08:17:45.636162 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:45.636444 kubelet[2703]: E0819 08:17:45.636404 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.636444 kubelet[2703]: W0819 08:17:45.636422 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.636444 kubelet[2703]: E0819 08:17:45.636434 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:45.636752 kubelet[2703]: E0819 08:17:45.636707 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.636752 kubelet[2703]: W0819 08:17:45.636727 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.636752 kubelet[2703]: E0819 08:17:45.636742 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:45.636983 kubelet[2703]: E0819 08:17:45.636946 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.636983 kubelet[2703]: W0819 08:17:45.636971 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.636983 kubelet[2703]: E0819 08:17:45.636981 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:45.637226 kubelet[2703]: E0819 08:17:45.637202 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.637226 kubelet[2703]: W0819 08:17:45.637214 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.637226 kubelet[2703]: E0819 08:17:45.637224 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:45.637507 kubelet[2703]: E0819 08:17:45.637445 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.637507 kubelet[2703]: W0819 08:17:45.637479 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.637507 kubelet[2703]: E0819 08:17:45.637490 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:45.637807 kubelet[2703]: E0819 08:17:45.637711 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.637807 kubelet[2703]: W0819 08:17:45.637723 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.637807 kubelet[2703]: E0819 08:17:45.637733 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:45.638649 kubelet[2703]: E0819 08:17:45.638628 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.638649 kubelet[2703]: W0819 08:17:45.638641 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.638649 kubelet[2703]: E0819 08:17:45.638651 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:45.639014 kubelet[2703]: E0819 08:17:45.638968 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.639083 kubelet[2703]: W0819 08:17:45.639012 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.639083 kubelet[2703]: E0819 08:17:45.639052 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:45.639523 kubelet[2703]: E0819 08:17:45.639479 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.639523 kubelet[2703]: W0819 08:17:45.639500 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.639523 kubelet[2703]: E0819 08:17:45.639513 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:45.640313 kubelet[2703]: E0819 08:17:45.640279 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.640313 kubelet[2703]: W0819 08:17:45.640296 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.640313 kubelet[2703]: E0819 08:17:45.640309 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:45.640655 kubelet[2703]: E0819 08:17:45.640626 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.640655 kubelet[2703]: W0819 08:17:45.640643 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.640655 kubelet[2703]: E0819 08:17:45.640654 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:45.661838 kubelet[2703]: E0819 08:17:45.661762 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.661838 kubelet[2703]: W0819 08:17:45.661794 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.661838 kubelet[2703]: E0819 08:17:45.661821 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:45.662094 kubelet[2703]: E0819 08:17:45.662070 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.662094 kubelet[2703]: W0819 08:17:45.662081 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.662094 kubelet[2703]: E0819 08:17:45.662092 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:45.666632 kubelet[2703]: E0819 08:17:45.666612 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:45.666632 kubelet[2703]: W0819 08:17:45.666628 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:45.666632 kubelet[2703]: E0819 08:17:45.666639 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:45.795671 kubelet[2703]: I0819 08:17:45.795439 2703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-8c476f69c-cqwwg" podStartSLOduration=2.028463284 podStartE2EDuration="4.795413046s" podCreationTimestamp="2025-08-19 08:17:41 +0000 UTC" firstStartedPulling="2025-08-19 08:17:42.157591905 +0000 UTC m=+19.773938544" lastFinishedPulling="2025-08-19 08:17:44.924541667 +0000 UTC m=+22.540888306" observedRunningTime="2025-08-19 08:17:45.793922559 +0000 UTC m=+23.410269198" watchObservedRunningTime="2025-08-19 08:17:45.795413046 +0000 UTC m=+23.411759685" Aug 19 08:17:46.575541 kubelet[2703]: I0819 08:17:46.575489 2703 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 08:17:46.648437 kubelet[2703]: E0819 08:17:46.648388 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:46.648437 kubelet[2703]: W0819 08:17:46.648416 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:46.648437 kubelet[2703]: E0819 08:17:46.648444 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:46.648821 kubelet[2703]: E0819 08:17:46.648779 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:46.648869 kubelet[2703]: W0819 08:17:46.648819 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:46.648869 kubelet[2703]: E0819 08:17:46.648856 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:17:46.649225 kubelet[2703]: E0819 08:17:46.649202 2703 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:17:46.649225 kubelet[2703]: W0819 08:17:46.649218 2703 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:17:46.649310 kubelet[2703]: E0819 08:17:46.649231 2703 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:17:47.503751 kubelet[2703]: E0819 08:17:47.503680 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmzrp" podUID="af59fbd1-ef40-4fa1-8148-7cb071fdc3e9" Aug 19 08:17:49.504278 kubelet[2703]: E0819 08:17:49.504179 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmzrp" podUID="af59fbd1-ef40-4fa1-8148-7cb071fdc3e9" Aug 19 08:17:51.504222 kubelet[2703]: E0819 08:17:51.504119 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmzrp" podUID="af59fbd1-ef40-4fa1-8148-7cb071fdc3e9" Aug 19 08:17:51.787835 containerd[1567]: time="2025-08-19T08:17:51.787594629Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:51.794080 containerd[1567]: time="2025-08-19T08:17:51.793969981Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Aug 19 08:17:51.797211 containerd[1567]: time="2025-08-19T08:17:51.797127130Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:51.804078 containerd[1567]: time="2025-08-19T08:17:51.804001411Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:51.804935 containerd[1567]: time="2025-08-19T08:17:51.804866298Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 6.88005864s" Aug 19 08:17:51.805010 containerd[1567]: time="2025-08-19T08:17:51.804935639Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Aug 19 08:17:51.807868 containerd[1567]: time="2025-08-19T08:17:51.807786912Z" level=info msg="CreateContainer within sandbox \"ac3b6fa6618651b8f7a4d17593bb88b7f8183c89950aeca8fec409fd3e3a3737\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Aug 19 08:17:51.824875 containerd[1567]: time="2025-08-19T08:17:51.824791739Z" level=info msg="Container 1c1b2421c2ea5a62e5b02433918706c5bdb0a4b24aabb1404934dc92eca8a2c8: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:51.836967 containerd[1567]: time="2025-08-19T08:17:51.836884186Z" level=info msg="CreateContainer within sandbox 
\"ac3b6fa6618651b8f7a4d17593bb88b7f8183c89950aeca8fec409fd3e3a3737\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"1c1b2421c2ea5a62e5b02433918706c5bdb0a4b24aabb1404934dc92eca8a2c8\"" Aug 19 08:17:51.837662 containerd[1567]: time="2025-08-19T08:17:51.837514060Z" level=info msg="StartContainer for \"1c1b2421c2ea5a62e5b02433918706c5bdb0a4b24aabb1404934dc92eca8a2c8\"" Aug 19 08:17:51.839172 containerd[1567]: time="2025-08-19T08:17:51.839117707Z" level=info msg="connecting to shim 1c1b2421c2ea5a62e5b02433918706c5bdb0a4b24aabb1404934dc92eca8a2c8" address="unix:///run/containerd/s/f9c2a4c7fc07c101fd3942919994487616b8a48c64b112d018c2d5dc5f2ddb22" protocol=ttrpc version=3 Aug 19 08:17:51.867812 systemd[1]: Started cri-containerd-1c1b2421c2ea5a62e5b02433918706c5bdb0a4b24aabb1404934dc92eca8a2c8.scope - libcontainer container 1c1b2421c2ea5a62e5b02433918706c5bdb0a4b24aabb1404934dc92eca8a2c8. Aug 19 08:17:51.916837 containerd[1567]: time="2025-08-19T08:17:51.916769875Z" level=info msg="StartContainer for \"1c1b2421c2ea5a62e5b02433918706c5bdb0a4b24aabb1404934dc92eca8a2c8\" returns successfully" Aug 19 08:17:51.926503 systemd[1]: cri-containerd-1c1b2421c2ea5a62e5b02433918706c5bdb0a4b24aabb1404934dc92eca8a2c8.scope: Deactivated successfully. Aug 19 08:17:51.928909 containerd[1567]: time="2025-08-19T08:17:51.928853245Z" level=info msg="received exit event container_id:\"1c1b2421c2ea5a62e5b02433918706c5bdb0a4b24aabb1404934dc92eca8a2c8\" id:\"1c1b2421c2ea5a62e5b02433918706c5bdb0a4b24aabb1404934dc92eca8a2c8\" pid:3389 exited_at:{seconds:1755591471 nanos:928288683}" Aug 19 08:17:51.929296 containerd[1567]: time="2025-08-19T08:17:51.929055244Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1c1b2421c2ea5a62e5b02433918706c5bdb0a4b24aabb1404934dc92eca8a2c8\" id:\"1c1b2421c2ea5a62e5b02433918706c5bdb0a4b24aabb1404934dc92eca8a2c8\" pid:3389 exited_at:{seconds:1755591471 nanos:928288683}" Aug 19 08:17:51.960964 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1c1b2421c2ea5a62e5b02433918706c5bdb0a4b24aabb1404934dc92eca8a2c8-rootfs.mount: Deactivated successfully. 
Aug 19 08:17:53.503381 kubelet[2703]: E0819 08:17:53.503310 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmzrp" podUID="af59fbd1-ef40-4fa1-8148-7cb071fdc3e9" Aug 19 08:17:53.593721 containerd[1567]: time="2025-08-19T08:17:53.593613867Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Aug 19 08:17:55.503686 kubelet[2703]: E0819 08:17:55.503590 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmzrp" podUID="af59fbd1-ef40-4fa1-8148-7cb071fdc3e9" Aug 19 08:17:57.503432 kubelet[2703]: E0819 08:17:57.503355 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmzrp" podUID="af59fbd1-ef40-4fa1-8148-7cb071fdc3e9" Aug 19 08:17:58.224633 containerd[1567]: time="2025-08-19T08:17:58.223763795Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:58.225857 containerd[1567]: time="2025-08-19T08:17:58.225794190Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221" Aug 19 08:17:58.227592 containerd[1567]: time="2025-08-19T08:17:58.227538328Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:58.230427 containerd[1567]: time="2025-08-19T08:17:58.230373676Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:17:58.231162 containerd[1567]: time="2025-08-19T08:17:58.231109759Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 4.637442882s" Aug 19 08:17:58.231162 containerd[1567]: time="2025-08-19T08:17:58.231152749Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\"" Aug 19 08:17:58.234153 containerd[1567]: time="2025-08-19T08:17:58.233729932Z" level=info msg="CreateContainer within sandbox \"ac3b6fa6618651b8f7a4d17593bb88b7f8183c89950aeca8fec409fd3e3a3737\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Aug 19 08:17:58.247061 containerd[1567]: time="2025-08-19T08:17:58.246972306Z" level=info msg="Container 30309c01def42037515163ec44321c227bff9c994c81e8ece08f0e6ea5464c4a: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:17:58.262329 containerd[1567]: time="2025-08-19T08:17:58.262246458Z" level=info msg="CreateContainer within sandbox 
\"ac3b6fa6618651b8f7a4d17593bb88b7f8183c89950aeca8fec409fd3e3a3737\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"30309c01def42037515163ec44321c227bff9c994c81e8ece08f0e6ea5464c4a\"" Aug 19 08:17:58.263024 containerd[1567]: time="2025-08-19T08:17:58.262956533Z" level=info msg="StartContainer for \"30309c01def42037515163ec44321c227bff9c994c81e8ece08f0e6ea5464c4a\"" Aug 19 08:17:58.264762 containerd[1567]: time="2025-08-19T08:17:58.264693958Z" level=info msg="connecting to shim 30309c01def42037515163ec44321c227bff9c994c81e8ece08f0e6ea5464c4a" address="unix:///run/containerd/s/f9c2a4c7fc07c101fd3942919994487616b8a48c64b112d018c2d5dc5f2ddb22" protocol=ttrpc version=3 Aug 19 08:17:58.290824 systemd[1]: Started cri-containerd-30309c01def42037515163ec44321c227bff9c994c81e8ece08f0e6ea5464c4a.scope - libcontainer container 30309c01def42037515163ec44321c227bff9c994c81e8ece08f0e6ea5464c4a. Aug 19 08:17:58.345102 containerd[1567]: time="2025-08-19T08:17:58.345049856Z" level=info msg="StartContainer for \"30309c01def42037515163ec44321c227bff9c994c81e8ece08f0e6ea5464c4a\" returns successfully" Aug 19 08:17:59.504302 kubelet[2703]: E0819 08:17:59.504174 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fmzrp" podUID="af59fbd1-ef40-4fa1-8148-7cb071fdc3e9" Aug 19 08:17:59.901759 systemd[1]: cri-containerd-30309c01def42037515163ec44321c227bff9c994c81e8ece08f0e6ea5464c4a.scope: Deactivated successfully. Aug 19 08:17:59.902200 systemd[1]: cri-containerd-30309c01def42037515163ec44321c227bff9c994c81e8ece08f0e6ea5464c4a.scope: Consumed 623ms CPU time, 179.4M memory peak, 4.2M read from disk, 171.2M written to disk. Aug 19 08:17:59.903792 containerd[1567]: time="2025-08-19T08:17:59.903750321Z" level=info msg="received exit event container_id:\"30309c01def42037515163ec44321c227bff9c994c81e8ece08f0e6ea5464c4a\" id:\"30309c01def42037515163ec44321c227bff9c994c81e8ece08f0e6ea5464c4a\" pid:3451 exited_at:{seconds:1755591479 nanos:903501834}" Aug 19 08:17:59.904170 containerd[1567]: time="2025-08-19T08:17:59.903813480Z" level=info msg="TaskExit event in podsandbox handler container_id:\"30309c01def42037515163ec44321c227bff9c994c81e8ece08f0e6ea5464c4a\" id:\"30309c01def42037515163ec44321c227bff9c994c81e8ece08f0e6ea5464c4a\" pid:3451 exited_at:{seconds:1755591479 nanos:903501834}" Aug 19 08:17:59.928528 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-30309c01def42037515163ec44321c227bff9c994c81e8ece08f0e6ea5464c4a-rootfs.mount: Deactivated successfully. Aug 19 08:17:59.967651 kubelet[2703]: I0819 08:17:59.965826 2703 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Aug 19 08:17:59.975851 kubelet[2703]: I0819 08:17:59.975798 2703 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 08:18:00.281372 systemd[1]: Created slice kubepods-besteffort-pod1bc364cb_8d71_4f96_9547_19e65218a8e6.slice - libcontainer container kubepods-besteffort-pod1bc364cb_8d71_4f96_9547_19e65218a8e6.slice. Aug 19 08:18:00.296796 systemd[1]: Created slice kubepods-besteffort-pod8c0f1f13_1397_41d8_bdc1_892dcc18456b.slice - libcontainer container kubepods-besteffort-pod8c0f1f13_1397_41d8_bdc1_892dcc18456b.slice. 
Aug 19 08:18:00.307261 systemd[1]: Created slice kubepods-burstable-pod67ce1d8b_65f3_4306_aa87_3575d50b5ebe.slice - libcontainer container kubepods-burstable-pod67ce1d8b_65f3_4306_aa87_3575d50b5ebe.slice. Aug 19 08:18:00.315362 systemd[1]: Created slice kubepods-burstable-pod161d8145_a00d_4936_8415_8fb415a18f07.slice - libcontainer container kubepods-burstable-pod161d8145_a00d_4936_8415_8fb415a18f07.slice. Aug 19 08:18:00.321658 systemd[1]: Created slice kubepods-besteffort-podc6aeac62_957c_47bb_881b_da7164d33ac3.slice - libcontainer container kubepods-besteffort-podc6aeac62_957c_47bb_881b_da7164d33ac3.slice. Aug 19 08:18:00.327948 systemd[1]: Created slice kubepods-besteffort-pod7495ee8b_4990_47af_bdd8_8d350506c7a6.slice - libcontainer container kubepods-besteffort-pod7495ee8b_4990_47af_bdd8_8d350506c7a6.slice. Aug 19 08:18:00.333309 systemd[1]: Created slice kubepods-besteffort-pod50a4115f_7018_4a8c_a733_cdd9909ab0b4.slice - libcontainer container kubepods-besteffort-pod50a4115f_7018_4a8c_a733_cdd9909ab0b4.slice. Aug 19 08:18:00.370855 kubelet[2703]: I0819 08:18:00.370763 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/50a4115f-7018-4a8c-a733-cdd9909ab0b4-whisker-backend-key-pair\") pod \"whisker-84fb8898cd-bc727\" (UID: \"50a4115f-7018-4a8c-a733-cdd9909ab0b4\") " pod="calico-system/whisker-84fb8898cd-bc727" Aug 19 08:18:00.371059 kubelet[2703]: I0819 08:18:00.370840 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsllf\" (UniqueName: \"kubernetes.io/projected/50a4115f-7018-4a8c-a733-cdd9909ab0b4-kube-api-access-nsllf\") pod \"whisker-84fb8898cd-bc727\" (UID: \"50a4115f-7018-4a8c-a733-cdd9909ab0b4\") " pod="calico-system/whisker-84fb8898cd-bc727" Aug 19 08:18:00.371059 kubelet[2703]: I0819 08:18:00.370925 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67ce1d8b-65f3-4306-aa87-3575d50b5ebe-config-volume\") pod \"coredns-668d6bf9bc-6kgnl\" (UID: \"67ce1d8b-65f3-4306-aa87-3575d50b5ebe\") " pod="kube-system/coredns-668d6bf9bc-6kgnl" Aug 19 08:18:00.371059 kubelet[2703]: I0819 08:18:00.370967 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgskf\" (UniqueName: \"kubernetes.io/projected/67ce1d8b-65f3-4306-aa87-3575d50b5ebe-kube-api-access-lgskf\") pod \"coredns-668d6bf9bc-6kgnl\" (UID: \"67ce1d8b-65f3-4306-aa87-3575d50b5ebe\") " pod="kube-system/coredns-668d6bf9bc-6kgnl" Aug 19 08:18:00.371059 kubelet[2703]: I0819 08:18:00.371025 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7495ee8b-4990-47af-bdd8-8d350506c7a6-config\") pod \"goldmane-768f4c5c69-kf7tj\" (UID: \"7495ee8b-4990-47af-bdd8-8d350506c7a6\") " pod="calico-system/goldmane-768f4c5c69-kf7tj" Aug 19 08:18:00.371059 kubelet[2703]: I0819 08:18:00.371043 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6aeac62-957c-47bb-881b-da7164d33ac3-tigera-ca-bundle\") pod \"calico-kube-controllers-697fc54c9b-6gqqp\" (UID: \"c6aeac62-957c-47bb-881b-da7164d33ac3\") " pod="calico-system/calico-kube-controllers-697fc54c9b-6gqqp" Aug 19 08:18:00.371241 kubelet[2703]: I0819 
08:18:00.371072 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50a4115f-7018-4a8c-a733-cdd9909ab0b4-whisker-ca-bundle\") pod \"whisker-84fb8898cd-bc727\" (UID: \"50a4115f-7018-4a8c-a733-cdd9909ab0b4\") " pod="calico-system/whisker-84fb8898cd-bc727" Aug 19 08:18:00.371241 kubelet[2703]: I0819 08:18:00.371088 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7495ee8b-4990-47af-bdd8-8d350506c7a6-goldmane-ca-bundle\") pod \"goldmane-768f4c5c69-kf7tj\" (UID: \"7495ee8b-4990-47af-bdd8-8d350506c7a6\") " pod="calico-system/goldmane-768f4c5c69-kf7tj" Aug 19 08:18:00.371241 kubelet[2703]: I0819 08:18:00.371146 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfvzl\" (UniqueName: \"kubernetes.io/projected/7495ee8b-4990-47af-bdd8-8d350506c7a6-kube-api-access-kfvzl\") pod \"goldmane-768f4c5c69-kf7tj\" (UID: \"7495ee8b-4990-47af-bdd8-8d350506c7a6\") " pod="calico-system/goldmane-768f4c5c69-kf7tj" Aug 19 08:18:00.371241 kubelet[2703]: I0819 08:18:00.371182 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8c0f1f13-1397-41d8-bdc1-892dcc18456b-calico-apiserver-certs\") pod \"calico-apiserver-5865b8886b-dzrf9\" (UID: \"8c0f1f13-1397-41d8-bdc1-892dcc18456b\") " pod="calico-apiserver/calico-apiserver-5865b8886b-dzrf9" Aug 19 08:18:00.371241 kubelet[2703]: I0819 08:18:00.371207 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t82w\" (UniqueName: \"kubernetes.io/projected/8c0f1f13-1397-41d8-bdc1-892dcc18456b-kube-api-access-6t82w\") pod \"calico-apiserver-5865b8886b-dzrf9\" (UID: \"8c0f1f13-1397-41d8-bdc1-892dcc18456b\") " pod="calico-apiserver/calico-apiserver-5865b8886b-dzrf9" Aug 19 08:18:00.371391 kubelet[2703]: I0819 08:18:00.371237 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/7495ee8b-4990-47af-bdd8-8d350506c7a6-goldmane-key-pair\") pod \"goldmane-768f4c5c69-kf7tj\" (UID: \"7495ee8b-4990-47af-bdd8-8d350506c7a6\") " pod="calico-system/goldmane-768f4c5c69-kf7tj" Aug 19 08:18:00.371391 kubelet[2703]: I0819 08:18:00.371273 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql6nk\" (UniqueName: \"kubernetes.io/projected/c6aeac62-957c-47bb-881b-da7164d33ac3-kube-api-access-ql6nk\") pod \"calico-kube-controllers-697fc54c9b-6gqqp\" (UID: \"c6aeac62-957c-47bb-881b-da7164d33ac3\") " pod="calico-system/calico-kube-controllers-697fc54c9b-6gqqp" Aug 19 08:18:00.371391 kubelet[2703]: I0819 08:18:00.371304 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1bc364cb-8d71-4f96-9547-19e65218a8e6-calico-apiserver-certs\") pod \"calico-apiserver-5865b8886b-h88ng\" (UID: \"1bc364cb-8d71-4f96-9547-19e65218a8e6\") " pod="calico-apiserver/calico-apiserver-5865b8886b-h88ng" Aug 19 08:18:00.371391 kubelet[2703]: I0819 08:18:00.371333 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57gtw\" (UniqueName: 
\"kubernetes.io/projected/1bc364cb-8d71-4f96-9547-19e65218a8e6-kube-api-access-57gtw\") pod \"calico-apiserver-5865b8886b-h88ng\" (UID: \"1bc364cb-8d71-4f96-9547-19e65218a8e6\") " pod="calico-apiserver/calico-apiserver-5865b8886b-h88ng" Aug 19 08:18:00.371391 kubelet[2703]: I0819 08:18:00.371360 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/161d8145-a00d-4936-8415-8fb415a18f07-config-volume\") pod \"coredns-668d6bf9bc-qtqtf\" (UID: \"161d8145-a00d-4936-8415-8fb415a18f07\") " pod="kube-system/coredns-668d6bf9bc-qtqtf" Aug 19 08:18:00.371593 kubelet[2703]: I0819 08:18:00.371385 2703 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sg6j\" (UniqueName: \"kubernetes.io/projected/161d8145-a00d-4936-8415-8fb415a18f07-kube-api-access-6sg6j\") pod \"coredns-668d6bf9bc-qtqtf\" (UID: \"161d8145-a00d-4936-8415-8fb415a18f07\") " pod="kube-system/coredns-668d6bf9bc-qtqtf" Aug 19 08:18:00.658080 containerd[1567]: time="2025-08-19T08:18:00.658009121Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Aug 19 08:18:00.668071 containerd[1567]: time="2025-08-19T08:18:00.667986265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-kf7tj,Uid:7495ee8b-4990-47af-bdd8-8d350506c7a6,Namespace:calico-system,Attempt:0,}" Aug 19 08:18:00.668430 containerd[1567]: time="2025-08-19T08:18:00.668360768Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84fb8898cd-bc727,Uid:50a4115f-7018-4a8c-a733-cdd9909ab0b4,Namespace:calico-system,Attempt:0,}" Aug 19 08:18:00.844878 containerd[1567]: time="2025-08-19T08:18:00.844801338Z" level=error msg="Failed to destroy network for sandbox \"253adc3a1488585b11391b39333fa6595468843c60a653cef4d9c309ae73ded4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:00.846317 containerd[1567]: time="2025-08-19T08:18:00.846265679Z" level=error msg="Failed to destroy network for sandbox \"3831c37c6f47ff87199b9e3eb9ebc485a94f4e65575b8a1417cb44c3c881cdee\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:00.847562 containerd[1567]: time="2025-08-19T08:18:00.847433743Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-kf7tj,Uid:7495ee8b-4990-47af-bdd8-8d350506c7a6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"253adc3a1488585b11391b39333fa6595468843c60a653cef4d9c309ae73ded4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:00.847928 kubelet[2703]: E0819 08:18:00.847862 2703 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"253adc3a1488585b11391b39333fa6595468843c60a653cef4d9c309ae73ded4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:00.848394 kubelet[2703]: E0819 08:18:00.847978 2703 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"253adc3a1488585b11391b39333fa6595468843c60a653cef4d9c309ae73ded4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-kf7tj" Aug 19 08:18:00.848394 kubelet[2703]: E0819 08:18:00.848011 2703 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"253adc3a1488585b11391b39333fa6595468843c60a653cef4d9c309ae73ded4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-kf7tj" Aug 19 08:18:00.848394 kubelet[2703]: E0819 08:18:00.848085 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-kf7tj_calico-system(7495ee8b-4990-47af-bdd8-8d350506c7a6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-kf7tj_calico-system(7495ee8b-4990-47af-bdd8-8d350506c7a6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"253adc3a1488585b11391b39333fa6595468843c60a653cef4d9c309ae73ded4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-kf7tj" podUID="7495ee8b-4990-47af-bdd8-8d350506c7a6" Aug 19 08:18:00.849091 containerd[1567]: time="2025-08-19T08:18:00.849028710Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84fb8898cd-bc727,Uid:50a4115f-7018-4a8c-a733-cdd9909ab0b4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3831c37c6f47ff87199b9e3eb9ebc485a94f4e65575b8a1417cb44c3c881cdee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:00.849533 kubelet[2703]: E0819 08:18:00.849391 2703 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3831c37c6f47ff87199b9e3eb9ebc485a94f4e65575b8a1417cb44c3c881cdee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:00.849651 kubelet[2703]: E0819 08:18:00.849603 2703 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3831c37c6f47ff87199b9e3eb9ebc485a94f4e65575b8a1417cb44c3c881cdee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-84fb8898cd-bc727" Aug 19 08:18:00.849651 kubelet[2703]: E0819 08:18:00.849643 2703 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3831c37c6f47ff87199b9e3eb9ebc485a94f4e65575b8a1417cb44c3c881cdee\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-84fb8898cd-bc727" Aug 19 08:18:00.850083 kubelet[2703]: E0819 08:18:00.849825 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-84fb8898cd-bc727_calico-system(50a4115f-7018-4a8c-a733-cdd9909ab0b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-84fb8898cd-bc727_calico-system(50a4115f-7018-4a8c-a733-cdd9909ab0b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3831c37c6f47ff87199b9e3eb9ebc485a94f4e65575b8a1417cb44c3c881cdee\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-84fb8898cd-bc727" podUID="50a4115f-7018-4a8c-a733-cdd9909ab0b4" Aug 19 08:18:00.889268 containerd[1567]: time="2025-08-19T08:18:00.889195101Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5865b8886b-h88ng,Uid:1bc364cb-8d71-4f96-9547-19e65218a8e6,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:18:00.902466 containerd[1567]: time="2025-08-19T08:18:00.902381163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5865b8886b-dzrf9,Uid:8c0f1f13-1397-41d8-bdc1-892dcc18456b,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:18:00.912797 containerd[1567]: time="2025-08-19T08:18:00.912620110Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6kgnl,Uid:67ce1d8b-65f3-4306-aa87-3575d50b5ebe,Namespace:kube-system,Attempt:0,}" Aug 19 08:18:00.919517 containerd[1567]: time="2025-08-19T08:18:00.919430614Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-qtqtf,Uid:161d8145-a00d-4936-8415-8fb415a18f07,Namespace:kube-system,Attempt:0,}" Aug 19 08:18:00.936680 containerd[1567]: time="2025-08-19T08:18:00.936588148Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fc54c9b-6gqqp,Uid:c6aeac62-957c-47bb-881b-da7164d33ac3,Namespace:calico-system,Attempt:0,}" Aug 19 08:18:01.001910 containerd[1567]: time="2025-08-19T08:18:01.001846229Z" level=error msg="Failed to destroy network for sandbox \"8c007b6da08b54eba2c4c983eb1dd83d4d1746f45deda1eea1311fe6b363e174\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.005921 systemd[1]: run-netns-cni\x2d7b2c1b1d\x2d5591\x2da88d\x2deb76\x2d438aa9295eec.mount: Deactivated successfully. Aug 19 08:18:01.016491 containerd[1567]: time="2025-08-19T08:18:01.016314898Z" level=error msg="Failed to destroy network for sandbox \"5fd9c31534a89e0dba42578b12c09d7fe20797a71eda95283518aa3eca20aa56\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.019423 systemd[1]: run-netns-cni\x2da6f55c02\x2d21dc\x2dcbed\x2d4341\x2daa1404aed869.mount: Deactivated successfully. 
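[Editorial note] Every sandbox failure in this stretch carries the same hint from the Calico CNI plugin: /var/lib/calico/nodename does not exist yet, because the calico/node container (whose image pull begins at 08:18:00) has not started and populated /var/lib/calico/. The short sketch below mirrors the stat the plugin performs; the path comes straight from the error text.

    #!/usr/bin/env python3
    """Reproduce the stat the Calico CNI plugin complains about (illustrative)."""
    from pathlib import Path

    nodename = Path("/var/lib/calico/nodename")
    if nodename.is_file():
        print("calico/node has written its nodename:", nodename.read_text().strip())
    else:
        # Same condition as "stat /var/lib/calico/nodename: no such file or directory" in the log.
        print("missing", nodename, "- calico-node is not running or /var/lib/calico is not mounted")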
Aug 19 08:18:01.031090 containerd[1567]: time="2025-08-19T08:18:01.031002208Z" level=error msg="Failed to destroy network for sandbox \"6f4b286c36132daffb062358beba73a78724b7171a5eb2dfa22a4e502ffbf03b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.063550 containerd[1567]: time="2025-08-19T08:18:01.063427270Z" level=error msg="Failed to destroy network for sandbox \"fe2370195129bc44e443065e0524206ecf02846013f4b45826fd4f143c2aee8b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.065127 containerd[1567]: time="2025-08-19T08:18:01.065072510Z" level=error msg="Failed to destroy network for sandbox \"acd9c56a9913b59524df7bca44176f328cc533066b126125a19b6b8f80714be2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.067609 containerd[1567]: time="2025-08-19T08:18:01.067555434Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5865b8886b-h88ng,Uid:1bc364cb-8d71-4f96-9547-19e65218a8e6,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c007b6da08b54eba2c4c983eb1dd83d4d1746f45deda1eea1311fe6b363e174\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.067963 kubelet[2703]: E0819 08:18:01.067910 2703 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c007b6da08b54eba2c4c983eb1dd83d4d1746f45deda1eea1311fe6b363e174\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.068726 kubelet[2703]: E0819 08:18:01.068179 2703 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c007b6da08b54eba2c4c983eb1dd83d4d1746f45deda1eea1311fe6b363e174\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5865b8886b-h88ng" Aug 19 08:18:01.068726 kubelet[2703]: E0819 08:18:01.068207 2703 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c007b6da08b54eba2c4c983eb1dd83d4d1746f45deda1eea1311fe6b363e174\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5865b8886b-h88ng" Aug 19 08:18:01.068726 kubelet[2703]: E0819 08:18:01.068283 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5865b8886b-h88ng_calico-apiserver(1bc364cb-8d71-4f96-9547-19e65218a8e6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-5865b8886b-h88ng_calico-apiserver(1bc364cb-8d71-4f96-9547-19e65218a8e6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8c007b6da08b54eba2c4c983eb1dd83d4d1746f45deda1eea1311fe6b363e174\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5865b8886b-h88ng" podUID="1bc364cb-8d71-4f96-9547-19e65218a8e6" Aug 19 08:18:01.069570 containerd[1567]: time="2025-08-19T08:18:01.069491562Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6kgnl,Uid:67ce1d8b-65f3-4306-aa87-3575d50b5ebe,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5fd9c31534a89e0dba42578b12c09d7fe20797a71eda95283518aa3eca20aa56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.070308 kubelet[2703]: E0819 08:18:01.069896 2703 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5fd9c31534a89e0dba42578b12c09d7fe20797a71eda95283518aa3eca20aa56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.070308 kubelet[2703]: E0819 08:18:01.070004 2703 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5fd9c31534a89e0dba42578b12c09d7fe20797a71eda95283518aa3eca20aa56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6kgnl" Aug 19 08:18:01.070308 kubelet[2703]: E0819 08:18:01.070033 2703 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5fd9c31534a89e0dba42578b12c09d7fe20797a71eda95283518aa3eca20aa56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6kgnl" Aug 19 08:18:01.070427 kubelet[2703]: E0819 08:18:01.070104 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-6kgnl_kube-system(67ce1d8b-65f3-4306-aa87-3575d50b5ebe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-6kgnl_kube-system(67ce1d8b-65f3-4306-aa87-3575d50b5ebe)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5fd9c31534a89e0dba42578b12c09d7fe20797a71eda95283518aa3eca20aa56\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-6kgnl" podUID="67ce1d8b-65f3-4306-aa87-3575d50b5ebe" Aug 19 08:18:01.081430 containerd[1567]: time="2025-08-19T08:18:01.081341502Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-5865b8886b-dzrf9,Uid:8c0f1f13-1397-41d8-bdc1-892dcc18456b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f4b286c36132daffb062358beba73a78724b7171a5eb2dfa22a4e502ffbf03b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.081788 kubelet[2703]: E0819 08:18:01.081731 2703 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f4b286c36132daffb062358beba73a78724b7171a5eb2dfa22a4e502ffbf03b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.081880 kubelet[2703]: E0819 08:18:01.081822 2703 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f4b286c36132daffb062358beba73a78724b7171a5eb2dfa22a4e502ffbf03b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5865b8886b-dzrf9" Aug 19 08:18:01.081880 kubelet[2703]: E0819 08:18:01.081850 2703 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f4b286c36132daffb062358beba73a78724b7171a5eb2dfa22a4e502ffbf03b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5865b8886b-dzrf9" Aug 19 08:18:01.081969 kubelet[2703]: E0819 08:18:01.081899 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5865b8886b-dzrf9_calico-apiserver(8c0f1f13-1397-41d8-bdc1-892dcc18456b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5865b8886b-dzrf9_calico-apiserver(8c0f1f13-1397-41d8-bdc1-892dcc18456b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6f4b286c36132daffb062358beba73a78724b7171a5eb2dfa22a4e502ffbf03b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5865b8886b-dzrf9" podUID="8c0f1f13-1397-41d8-bdc1-892dcc18456b" Aug 19 08:18:01.083428 containerd[1567]: time="2025-08-19T08:18:01.083331590Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fc54c9b-6gqqp,Uid:c6aeac62-957c-47bb-881b-da7164d33ac3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe2370195129bc44e443065e0524206ecf02846013f4b45826fd4f143c2aee8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.083630 kubelet[2703]: E0819 08:18:01.083579 2703 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"fe2370195129bc44e443065e0524206ecf02846013f4b45826fd4f143c2aee8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.083713 kubelet[2703]: E0819 08:18:01.083643 2703 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe2370195129bc44e443065e0524206ecf02846013f4b45826fd4f143c2aee8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697fc54c9b-6gqqp" Aug 19 08:18:01.083713 kubelet[2703]: E0819 08:18:01.083679 2703 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe2370195129bc44e443065e0524206ecf02846013f4b45826fd4f143c2aee8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697fc54c9b-6gqqp" Aug 19 08:18:01.083798 kubelet[2703]: E0819 08:18:01.083723 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-697fc54c9b-6gqqp_calico-system(c6aeac62-957c-47bb-881b-da7164d33ac3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-697fc54c9b-6gqqp_calico-system(c6aeac62-957c-47bb-881b-da7164d33ac3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fe2370195129bc44e443065e0524206ecf02846013f4b45826fd4f143c2aee8b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-697fc54c9b-6gqqp" podUID="c6aeac62-957c-47bb-881b-da7164d33ac3" Aug 19 08:18:01.085431 containerd[1567]: time="2025-08-19T08:18:01.085361273Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-qtqtf,Uid:161d8145-a00d-4936-8415-8fb415a18f07,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"acd9c56a9913b59524df7bca44176f328cc533066b126125a19b6b8f80714be2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.085778 kubelet[2703]: E0819 08:18:01.085696 2703 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acd9c56a9913b59524df7bca44176f328cc533066b126125a19b6b8f80714be2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.085848 kubelet[2703]: E0819 08:18:01.085795 2703 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acd9c56a9913b59524df7bca44176f328cc533066b126125a19b6b8f80714be2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-qtqtf" Aug 19 08:18:01.085848 kubelet[2703]: E0819 08:18:01.085819 2703 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acd9c56a9913b59524df7bca44176f328cc533066b126125a19b6b8f80714be2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-qtqtf" Aug 19 08:18:01.085933 kubelet[2703]: E0819 08:18:01.085893 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-qtqtf_kube-system(161d8145-a00d-4936-8415-8fb415a18f07)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-qtqtf_kube-system(161d8145-a00d-4936-8415-8fb415a18f07)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"acd9c56a9913b59524df7bca44176f328cc533066b126125a19b6b8f80714be2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-qtqtf" podUID="161d8145-a00d-4936-8415-8fb415a18f07" Aug 19 08:18:01.509818 systemd[1]: Created slice kubepods-besteffort-podaf59fbd1_ef40_4fa1_8148_7cb071fdc3e9.slice - libcontainer container kubepods-besteffort-podaf59fbd1_ef40_4fa1_8148_7cb071fdc3e9.slice. Aug 19 08:18:01.512276 containerd[1567]: time="2025-08-19T08:18:01.512236855Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fmzrp,Uid:af59fbd1-ef40-4fa1-8148-7cb071fdc3e9,Namespace:calico-system,Attempt:0,}" Aug 19 08:18:01.666748 containerd[1567]: time="2025-08-19T08:18:01.666659180Z" level=error msg="Failed to destroy network for sandbox \"5a656754ee7526b78bd7909a2fb0f63fe1b9630bc8c17e725bcd85dee6369a42\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.669910 containerd[1567]: time="2025-08-19T08:18:01.669825649Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fmzrp,Uid:af59fbd1-ef40-4fa1-8148-7cb071fdc3e9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a656754ee7526b78bd7909a2fb0f63fe1b9630bc8c17e725bcd85dee6369a42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.670245 kubelet[2703]: E0819 08:18:01.670185 2703 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a656754ee7526b78bd7909a2fb0f63fe1b9630bc8c17e725bcd85dee6369a42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:01.670307 kubelet[2703]: E0819 08:18:01.670288 2703 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a656754ee7526b78bd7909a2fb0f63fe1b9630bc8c17e725bcd85dee6369a42\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fmzrp" Aug 19 08:18:01.670344 kubelet[2703]: E0819 08:18:01.670321 2703 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a656754ee7526b78bd7909a2fb0f63fe1b9630bc8c17e725bcd85dee6369a42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fmzrp" Aug 19 08:18:01.670464 kubelet[2703]: E0819 08:18:01.670397 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fmzrp_calico-system(af59fbd1-ef40-4fa1-8148-7cb071fdc3e9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fmzrp_calico-system(af59fbd1-ef40-4fa1-8148-7cb071fdc3e9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5a656754ee7526b78bd7909a2fb0f63fe1b9630bc8c17e725bcd85dee6369a42\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fmzrp" podUID="af59fbd1-ef40-4fa1-8148-7cb071fdc3e9" Aug 19 08:18:01.938533 systemd[1]: run-netns-cni\x2dad9b1b37\x2d3ca7\x2d53f7\x2d0c08\x2d8f9086f35dcc.mount: Deactivated successfully. Aug 19 08:18:01.938819 systemd[1]: run-netns-cni\x2de721d2bb\x2dffdc\x2dd994\x2de723\x2dd9068495ef06.mount: Deactivated successfully. Aug 19 08:18:01.939049 systemd[1]: run-netns-cni\x2d38f19660\x2d3ac1\x2d278d\x2de573\x2d711ea9b708cc.mount: Deactivated successfully. Aug 19 08:18:05.029618 systemd[1]: Started sshd@7-10.0.0.123:22-10.0.0.1:51960.service - OpenSSH per-connection server daemon (10.0.0.1:51960). Aug 19 08:18:05.135896 sshd[3757]: Accepted publickey for core from 10.0.0.1 port 51960 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:05.139101 sshd-session[3757]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:05.148189 systemd-logind[1541]: New session 8 of user core. Aug 19 08:18:05.155734 systemd[1]: Started session-8.scope - Session 8 of User core. Aug 19 08:18:05.309312 sshd[3760]: Connection closed by 10.0.0.1 port 51960 Aug 19 08:18:05.309748 sshd-session[3757]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:05.315158 systemd[1]: sshd@7-10.0.0.123:22-10.0.0.1:51960.service: Deactivated successfully. Aug 19 08:18:05.318132 systemd[1]: session-8.scope: Deactivated successfully. Aug 19 08:18:05.321084 systemd-logind[1541]: Session 8 logged out. Waiting for processes to exit. Aug 19 08:18:05.324537 systemd-logind[1541]: Removed session 8. Aug 19 08:18:10.325317 systemd[1]: Started sshd@8-10.0.0.123:22-10.0.0.1:53490.service - OpenSSH per-connection server daemon (10.0.0.1:53490). Aug 19 08:18:10.413904 sshd[3781]: Accepted publickey for core from 10.0.0.1 port 53490 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:10.416431 sshd-session[3781]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:10.423309 systemd-logind[1541]: New session 9 of user core. Aug 19 08:18:10.431969 systemd[1]: Started session-9.scope - Session 9 of User core. 
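Every CNI ADD and DEL in the entries above fails with the same "stat /var/lib/calico/nodename: no such file or directory" error, so the kubelet keeps retrying sandbox creation for these pods until calico-node starts and writes that file. The error wording matches Go's os.Stat path-error format; the snippet below is a minimal, hypothetical illustration of how a CNI plugin could surface this failure mode, not the actual Calico plugin source.

```go
package main

import (
	"fmt"
	"os"
)

// nodenameFile is the path the calico/node container is expected to write
// once it has started and bind-mounted /var/lib/calico/ from the host.
const nodenameFile = "/var/lib/calico/nodename"

// readNodename mimics the failure seen in the log: until calico/node has
// created the file, os.Stat returns a *fs.PathError whose Error() string is
// "stat /var/lib/calico/nodename: no such file or directory".
func readNodename() (string, error) {
	if _, err := os.Stat(nodenameFile); err != nil {
		return "", fmt.Errorf("%w: check that the calico/node container is running and has mounted /var/lib/calico/", err)
	}
	data, err := os.ReadFile(nodenameFile)
	if err != nil {
		return "", err
	}
	return string(data), nil
}

func main() {
	name, err := readNodename()
	if err != nil {
		// A CNI plugin would return this error to containerd, which wraps it
		// in the "failed to setup network for sandbox ..." messages above.
		fmt.Println("error:", err)
		return
	}
	fmt.Println("node name:", name)
}
```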
Aug 19 08:18:10.599057 sshd[3784]: Connection closed by 10.0.0.1 port 53490 Aug 19 08:18:10.600182 sshd-session[3781]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:10.608366 systemd-logind[1541]: Session 9 logged out. Waiting for processes to exit. Aug 19 08:18:10.609269 systemd[1]: sshd@8-10.0.0.123:22-10.0.0.1:53490.service: Deactivated successfully. Aug 19 08:18:10.613313 systemd[1]: session-9.scope: Deactivated successfully. Aug 19 08:18:10.616936 systemd-logind[1541]: Removed session 9. Aug 19 08:18:12.284259 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1432291052.mount: Deactivated successfully. Aug 19 08:18:12.507580 containerd[1567]: time="2025-08-19T08:18:12.507527034Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-kf7tj,Uid:7495ee8b-4990-47af-bdd8-8d350506c7a6,Namespace:calico-system,Attempt:0,}" Aug 19 08:18:12.508114 containerd[1567]: time="2025-08-19T08:18:12.507527946Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fc54c9b-6gqqp,Uid:c6aeac62-957c-47bb-881b-da7164d33ac3,Namespace:calico-system,Attempt:0,}" Aug 19 08:18:12.508114 containerd[1567]: time="2025-08-19T08:18:12.507526984Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-qtqtf,Uid:161d8145-a00d-4936-8415-8fb415a18f07,Namespace:kube-system,Attempt:0,}" Aug 19 08:18:13.085829 containerd[1567]: time="2025-08-19T08:18:13.085534880Z" level=error msg="Failed to destroy network for sandbox \"3cc6d931a7cfd0bb507562025282ff2c292cfdc585a3dd776dc5982844084e53\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:13.086552 containerd[1567]: time="2025-08-19T08:18:13.086487889Z" level=error msg="Failed to destroy network for sandbox \"0d449078a1c8821a176ac3dfc15f23fda1016d0cf2ee1c21bd4ca1228aeeee7a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:13.089162 systemd[1]: run-netns-cni\x2dfcb79572\x2d3344\x2d20a5\x2da906\x2df5ce20b59088.mount: Deactivated successfully. Aug 19 08:18:13.089303 systemd[1]: run-netns-cni\x2d365a6880\x2dc409\x2df9e2\x2d83b4\x2d627b0e1808a0.mount: Deactivated successfully. 
Aug 19 08:18:13.098981 containerd[1567]: time="2025-08-19T08:18:13.098896069Z" level=error msg="Failed to destroy network for sandbox \"f1a5adff87f7e284192531d5dd64f81ad5fc652be62e48255fe92ff0a78c8303\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:13.271803 containerd[1567]: time="2025-08-19T08:18:13.271645240Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fc54c9b-6gqqp,Uid:c6aeac62-957c-47bb-881b-da7164d33ac3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3cc6d931a7cfd0bb507562025282ff2c292cfdc585a3dd776dc5982844084e53\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:13.272096 kubelet[2703]: E0819 08:18:13.272048 2703 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3cc6d931a7cfd0bb507562025282ff2c292cfdc585a3dd776dc5982844084e53\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:13.272656 kubelet[2703]: E0819 08:18:13.272134 2703 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3cc6d931a7cfd0bb507562025282ff2c292cfdc585a3dd776dc5982844084e53\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697fc54c9b-6gqqp" Aug 19 08:18:13.272656 kubelet[2703]: E0819 08:18:13.272167 2703 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3cc6d931a7cfd0bb507562025282ff2c292cfdc585a3dd776dc5982844084e53\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697fc54c9b-6gqqp" Aug 19 08:18:13.272656 kubelet[2703]: E0819 08:18:13.272215 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-697fc54c9b-6gqqp_calico-system(c6aeac62-957c-47bb-881b-da7164d33ac3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-697fc54c9b-6gqqp_calico-system(c6aeac62-957c-47bb-881b-da7164d33ac3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3cc6d931a7cfd0bb507562025282ff2c292cfdc585a3dd776dc5982844084e53\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-697fc54c9b-6gqqp" podUID="c6aeac62-957c-47bb-881b-da7164d33ac3" Aug 19 08:18:13.274064 containerd[1567]: time="2025-08-19T08:18:13.273854787Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-qtqtf,Uid:161d8145-a00d-4936-8415-8fb415a18f07,Namespace:kube-system,Attempt:0,} failed, error" 
error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d449078a1c8821a176ac3dfc15f23fda1016d0cf2ee1c21bd4ca1228aeeee7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:13.274299 kubelet[2703]: E0819 08:18:13.274183 2703 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d449078a1c8821a176ac3dfc15f23fda1016d0cf2ee1c21bd4ca1228aeeee7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:13.274299 kubelet[2703]: E0819 08:18:13.274246 2703 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d449078a1c8821a176ac3dfc15f23fda1016d0cf2ee1c21bd4ca1228aeeee7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-qtqtf" Aug 19 08:18:13.274299 kubelet[2703]: E0819 08:18:13.274268 2703 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d449078a1c8821a176ac3dfc15f23fda1016d0cf2ee1c21bd4ca1228aeeee7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-qtqtf" Aug 19 08:18:13.274434 kubelet[2703]: E0819 08:18:13.274305 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-qtqtf_kube-system(161d8145-a00d-4936-8415-8fb415a18f07)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-qtqtf_kube-system(161d8145-a00d-4936-8415-8fb415a18f07)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0d449078a1c8821a176ac3dfc15f23fda1016d0cf2ee1c21bd4ca1228aeeee7a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-qtqtf" podUID="161d8145-a00d-4936-8415-8fb415a18f07" Aug 19 08:18:13.277607 containerd[1567]: time="2025-08-19T08:18:13.277343225Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-kf7tj,Uid:7495ee8b-4990-47af-bdd8-8d350506c7a6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f1a5adff87f7e284192531d5dd64f81ad5fc652be62e48255fe92ff0a78c8303\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:13.278015 kubelet[2703]: E0819 08:18:13.277948 2703 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f1a5adff87f7e284192531d5dd64f81ad5fc652be62e48255fe92ff0a78c8303\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Aug 19 08:18:13.278242 kubelet[2703]: E0819 08:18:13.278081 2703 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f1a5adff87f7e284192531d5dd64f81ad5fc652be62e48255fe92ff0a78c8303\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-kf7tj" Aug 19 08:18:13.278242 kubelet[2703]: E0819 08:18:13.278106 2703 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f1a5adff87f7e284192531d5dd64f81ad5fc652be62e48255fe92ff0a78c8303\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-kf7tj" Aug 19 08:18:13.278242 kubelet[2703]: E0819 08:18:13.278145 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-kf7tj_calico-system(7495ee8b-4990-47af-bdd8-8d350506c7a6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-kf7tj_calico-system(7495ee8b-4990-47af-bdd8-8d350506c7a6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f1a5adff87f7e284192531d5dd64f81ad5fc652be62e48255fe92ff0a78c8303\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-kf7tj" podUID="7495ee8b-4990-47af-bdd8-8d350506c7a6" Aug 19 08:18:13.282036 containerd[1567]: time="2025-08-19T08:18:13.281320020Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:13.284629 systemd[1]: run-netns-cni\x2dac043a2b\x2d230c\x2d2deb\x2dc174\x2d656b790b2f18.mount: Deactivated successfully. 
Aug 19 08:18:13.285201 containerd[1567]: time="2025-08-19T08:18:13.285109022Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Aug 19 08:18:13.289482 containerd[1567]: time="2025-08-19T08:18:13.289300920Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:13.293361 containerd[1567]: time="2025-08-19T08:18:13.292710630Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:13.293361 containerd[1567]: time="2025-08-19T08:18:13.293216801Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 12.635148119s" Aug 19 08:18:13.293361 containerd[1567]: time="2025-08-19T08:18:13.293249552Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Aug 19 08:18:13.307020 containerd[1567]: time="2025-08-19T08:18:13.306963043Z" level=info msg="CreateContainer within sandbox \"ac3b6fa6618651b8f7a4d17593bb88b7f8183c89950aeca8fec409fd3e3a3737\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Aug 19 08:18:13.335127 containerd[1567]: time="2025-08-19T08:18:13.333897269Z" level=info msg="Container e618d8761d2dc2cd6ed84bb627d26381ea77483e57cbe259654c129c7dae7514: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:13.505088 containerd[1567]: time="2025-08-19T08:18:13.505023353Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6kgnl,Uid:67ce1d8b-65f3-4306-aa87-3575d50b5ebe,Namespace:kube-system,Attempt:0,}" Aug 19 08:18:13.879552 containerd[1567]: time="2025-08-19T08:18:13.879497994Z" level=info msg="CreateContainer within sandbox \"ac3b6fa6618651b8f7a4d17593bb88b7f8183c89950aeca8fec409fd3e3a3737\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"e618d8761d2dc2cd6ed84bb627d26381ea77483e57cbe259654c129c7dae7514\"" Aug 19 08:18:13.881434 containerd[1567]: time="2025-08-19T08:18:13.881390296Z" level=info msg="StartContainer for \"e618d8761d2dc2cd6ed84bb627d26381ea77483e57cbe259654c129c7dae7514\"" Aug 19 08:18:13.885435 containerd[1567]: time="2025-08-19T08:18:13.885394252Z" level=info msg="connecting to shim e618d8761d2dc2cd6ed84bb627d26381ea77483e57cbe259654c129c7dae7514" address="unix:///run/containerd/s/f9c2a4c7fc07c101fd3942919994487616b8a48c64b112d018c2d5dc5f2ddb22" protocol=ttrpc version=3 Aug 19 08:18:13.903065 containerd[1567]: time="2025-08-19T08:18:13.902938983Z" level=error msg="Failed to destroy network for sandbox \"bb6ceacef6df58c7cdc2124aa6f3500fea72a7efdad35b24e2c29861b60965f5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:13.926962 systemd[1]: Started cri-containerd-e618d8761d2dc2cd6ed84bb627d26381ea77483e57cbe259654c129c7dae7514.scope - libcontainer container 
e618d8761d2dc2cd6ed84bb627d26381ea77483e57cbe259654c129c7dae7514. Aug 19 08:18:13.950357 containerd[1567]: time="2025-08-19T08:18:13.950258832Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6kgnl,Uid:67ce1d8b-65f3-4306-aa87-3575d50b5ebe,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb6ceacef6df58c7cdc2124aa6f3500fea72a7efdad35b24e2c29861b60965f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:13.950734 kubelet[2703]: E0819 08:18:13.950683 2703 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb6ceacef6df58c7cdc2124aa6f3500fea72a7efdad35b24e2c29861b60965f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:13.950796 kubelet[2703]: E0819 08:18:13.950759 2703 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb6ceacef6df58c7cdc2124aa6f3500fea72a7efdad35b24e2c29861b60965f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6kgnl" Aug 19 08:18:13.950796 kubelet[2703]: E0819 08:18:13.950781 2703 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb6ceacef6df58c7cdc2124aa6f3500fea72a7efdad35b24e2c29861b60965f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6kgnl" Aug 19 08:18:13.950861 kubelet[2703]: E0819 08:18:13.950832 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-6kgnl_kube-system(67ce1d8b-65f3-4306-aa87-3575d50b5ebe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-6kgnl_kube-system(67ce1d8b-65f3-4306-aa87-3575d50b5ebe)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bb6ceacef6df58c7cdc2124aa6f3500fea72a7efdad35b24e2c29861b60965f5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-6kgnl" podUID="67ce1d8b-65f3-4306-aa87-3575d50b5ebe" Aug 19 08:18:14.258875 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Aug 19 08:18:14.259641 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Aug 19 08:18:14.282617 containerd[1567]: time="2025-08-19T08:18:14.282570551Z" level=info msg="StartContainer for \"e618d8761d2dc2cd6ed84bb627d26381ea77483e57cbe259654c129c7dae7514\" returns successfully" Aug 19 08:18:14.504697 containerd[1567]: time="2025-08-19T08:18:14.504633486Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5865b8886b-h88ng,Uid:1bc364cb-8d71-4f96-9547-19e65218a8e6,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:18:14.999432 containerd[1567]: time="2025-08-19T08:18:14.999372775Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e618d8761d2dc2cd6ed84bb627d26381ea77483e57cbe259654c129c7dae7514\" id:\"98a69b631033f6e1f2dde529ce0b4335eb65555ecbaffe6cc672f40feafe4404\" pid:3987 exit_status:1 exited_at:{seconds:1755591494 nanos:999022608}" Aug 19 08:18:15.064254 kubelet[2703]: I0819 08:18:15.063523 2703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-nz6kt" podStartSLOduration=2.955574308 podStartE2EDuration="34.063498008s" podCreationTimestamp="2025-08-19 08:17:41 +0000 UTC" firstStartedPulling="2025-08-19 08:17:42.186493292 +0000 UTC m=+19.802839932" lastFinishedPulling="2025-08-19 08:18:13.294416993 +0000 UTC m=+50.910763632" observedRunningTime="2025-08-19 08:18:15.060319724 +0000 UTC m=+52.676666363" watchObservedRunningTime="2025-08-19 08:18:15.063498008 +0000 UTC m=+52.679844647" Aug 19 08:18:15.102078 containerd[1567]: time="2025-08-19T08:18:15.101992927Z" level=error msg="Failed to destroy network for sandbox \"863b03e351553abbd7a46bdce3d1d3a02c8af53443db769245a059288ba22304\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:15.104780 systemd[1]: run-netns-cni\x2df1387345\x2dd689\x2d930b\x2d8ecf\x2d4ec3f4e79fd5.mount: Deactivated successfully. 
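A quick consistency check on the pod_startup_latency_tracker entry above for calico-node-nz6kt, using the kubelet's monotonic m=+ offsets from the log:

  image pull time      = 50.910763632 - 19.802839932 = 31.107923700 s
  podStartE2EDuration  = 34.063498008 s (pod creation to observed running)
  podStartSLOduration  = 34.063498008 - 31.107923700 = 2.955574308 s

The result matches the logged podStartSLOduration exactly, consistent with the SLO figure excluding the time spent pulling the ghcr.io/flatcar/calico/node:v3.30.2 image (reported earlier as 12.635148119s for the final layer pull plus the waiting time since firstStartedPulling).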
Aug 19 08:18:15.148088 containerd[1567]: time="2025-08-19T08:18:15.147954410Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5865b8886b-h88ng,Uid:1bc364cb-8d71-4f96-9547-19e65218a8e6,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"863b03e351553abbd7a46bdce3d1d3a02c8af53443db769245a059288ba22304\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:15.148531 kubelet[2703]: E0819 08:18:15.148432 2703 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"863b03e351553abbd7a46bdce3d1d3a02c8af53443db769245a059288ba22304\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:18:15.148634 kubelet[2703]: E0819 08:18:15.148566 2703 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"863b03e351553abbd7a46bdce3d1d3a02c8af53443db769245a059288ba22304\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5865b8886b-h88ng" Aug 19 08:18:15.148634 kubelet[2703]: E0819 08:18:15.148598 2703 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"863b03e351553abbd7a46bdce3d1d3a02c8af53443db769245a059288ba22304\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5865b8886b-h88ng" Aug 19 08:18:15.148710 kubelet[2703]: E0819 08:18:15.148666 2703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5865b8886b-h88ng_calico-apiserver(1bc364cb-8d71-4f96-9547-19e65218a8e6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5865b8886b-h88ng_calico-apiserver(1bc364cb-8d71-4f96-9547-19e65218a8e6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"863b03e351553abbd7a46bdce3d1d3a02c8af53443db769245a059288ba22304\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5865b8886b-h88ng" podUID="1bc364cb-8d71-4f96-9547-19e65218a8e6" Aug 19 08:18:15.504471 containerd[1567]: time="2025-08-19T08:18:15.504377472Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84fb8898cd-bc727,Uid:50a4115f-7018-4a8c-a733-cdd9909ab0b4,Namespace:calico-system,Attempt:0,}" Aug 19 08:18:15.504471 containerd[1567]: time="2025-08-19T08:18:15.504469445Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fmzrp,Uid:af59fbd1-ef40-4fa1-8148-7cb071fdc3e9,Namespace:calico-system,Attempt:0,}" Aug 19 08:18:15.504724 containerd[1567]: time="2025-08-19T08:18:15.504378133Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-5865b8886b-dzrf9,Uid:8c0f1f13-1397-41d8-bdc1-892dcc18456b,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:18:15.619019 systemd[1]: Started sshd@9-10.0.0.123:22-10.0.0.1:53498.service - OpenSSH per-connection server daemon (10.0.0.1:53498). Aug 19 08:18:15.701587 sshd[4074]: Accepted publickey for core from 10.0.0.1 port 53498 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:15.704727 sshd-session[4074]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:15.717438 systemd-logind[1541]: New session 10 of user core. Aug 19 08:18:15.720938 systemd[1]: Started session-10.scope - Session 10 of User core. Aug 19 08:18:15.825713 containerd[1567]: time="2025-08-19T08:18:15.825220416Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e618d8761d2dc2cd6ed84bb627d26381ea77483e57cbe259654c129c7dae7514\" id:\"d4ee95d2c0336932eb10b86a090bf1baba6c8f82fd0446c22b22082ff9795d00\" pid:4109 exit_status:1 exited_at:{seconds:1755591495 nanos:824236850}" Aug 19 08:18:16.126225 sshd[4108]: Connection closed by 10.0.0.1 port 53498 Aug 19 08:18:16.126534 sshd-session[4074]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:16.132208 systemd[1]: sshd@9-10.0.0.123:22-10.0.0.1:53498.service: Deactivated successfully. Aug 19 08:18:16.132673 systemd-logind[1541]: Session 10 logged out. Waiting for processes to exit. Aug 19 08:18:16.135939 systemd[1]: session-10.scope: Deactivated successfully. Aug 19 08:18:16.139506 systemd-logind[1541]: Removed session 10. Aug 19 08:18:16.698818 systemd-networkd[1480]: calic0037ce8586: Link UP Aug 19 08:18:16.703483 systemd-networkd[1480]: calic0037ce8586: Gained carrier Aug 19 08:18:16.738196 containerd[1567]: 2025-08-19 08:18:15.638 [INFO][4060] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 08:18:16.738196 containerd[1567]: 2025-08-19 08:18:15.708 [INFO][4060] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5865b8886b--dzrf9-eth0 calico-apiserver-5865b8886b- calico-apiserver 8c0f1f13-1397-41d8-bdc1-892dcc18456b 819 0 2025-08-19 08:17:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5865b8886b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5865b8886b-dzrf9 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic0037ce8586 [] [] }} ContainerID="1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-dzrf9" WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--dzrf9-" Aug 19 08:18:16.738196 containerd[1567]: 2025-08-19 08:18:15.710 [INFO][4060] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-dzrf9" WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--dzrf9-eth0" Aug 19 08:18:16.738196 containerd[1567]: 2025-08-19 08:18:16.381 [INFO][4106] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" 
HandleID="k8s-pod-network.1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" Workload="localhost-k8s-calico--apiserver--5865b8886b--dzrf9-eth0" Aug 19 08:18:16.739586 containerd[1567]: 2025-08-19 08:18:16.382 [INFO][4106] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" HandleID="k8s-pod-network.1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" Workload="localhost-k8s-calico--apiserver--5865b8886b--dzrf9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e580), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5865b8886b-dzrf9", "timestamp":"2025-08-19 08:18:16.381434059 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:18:16.739586 containerd[1567]: 2025-08-19 08:18:16.382 [INFO][4106] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:18:16.739586 containerd[1567]: 2025-08-19 08:18:16.382 [INFO][4106] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:18:16.739586 containerd[1567]: 2025-08-19 08:18:16.382 [INFO][4106] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:18:16.739586 containerd[1567]: 2025-08-19 08:18:16.613 [INFO][4106] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" host="localhost" Aug 19 08:18:16.739586 containerd[1567]: 2025-08-19 08:18:16.624 [INFO][4106] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:18:16.739586 containerd[1567]: 2025-08-19 08:18:16.630 [INFO][4106] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:18:16.739586 containerd[1567]: 2025-08-19 08:18:16.635 [INFO][4106] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:16.739586 containerd[1567]: 2025-08-19 08:18:16.640 [INFO][4106] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:16.739586 containerd[1567]: 2025-08-19 08:18:16.641 [INFO][4106] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" host="localhost" Aug 19 08:18:16.739943 containerd[1567]: 2025-08-19 08:18:16.645 [INFO][4106] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8 Aug 19 08:18:16.739943 containerd[1567]: 2025-08-19 08:18:16.654 [INFO][4106] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" host="localhost" Aug 19 08:18:16.739943 containerd[1567]: 2025-08-19 08:18:16.666 [INFO][4106] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" host="localhost" Aug 19 08:18:16.739943 containerd[1567]: 2025-08-19 08:18:16.667 [INFO][4106] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] 
handle="k8s-pod-network.1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" host="localhost" Aug 19 08:18:16.739943 containerd[1567]: 2025-08-19 08:18:16.667 [INFO][4106] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:18:16.739943 containerd[1567]: 2025-08-19 08:18:16.667 [INFO][4106] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" HandleID="k8s-pod-network.1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" Workload="localhost-k8s-calico--apiserver--5865b8886b--dzrf9-eth0" Aug 19 08:18:16.740172 containerd[1567]: 2025-08-19 08:18:16.679 [INFO][4060] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-dzrf9" WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--dzrf9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5865b8886b--dzrf9-eth0", GenerateName:"calico-apiserver-5865b8886b-", Namespace:"calico-apiserver", SelfLink:"", UID:"8c0f1f13-1397-41d8-bdc1-892dcc18456b", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5865b8886b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5865b8886b-dzrf9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic0037ce8586", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:16.740257 containerd[1567]: 2025-08-19 08:18:16.680 [INFO][4060] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-dzrf9" WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--dzrf9-eth0" Aug 19 08:18:16.740257 containerd[1567]: 2025-08-19 08:18:16.680 [INFO][4060] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic0037ce8586 ContainerID="1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-dzrf9" WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--dzrf9-eth0" Aug 19 08:18:16.740257 containerd[1567]: 2025-08-19 08:18:16.704 [INFO][4060] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-dzrf9" 
WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--dzrf9-eth0" Aug 19 08:18:16.740371 containerd[1567]: 2025-08-19 08:18:16.704 [INFO][4060] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-dzrf9" WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--dzrf9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5865b8886b--dzrf9-eth0", GenerateName:"calico-apiserver-5865b8886b-", Namespace:"calico-apiserver", SelfLink:"", UID:"8c0f1f13-1397-41d8-bdc1-892dcc18456b", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5865b8886b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8", Pod:"calico-apiserver-5865b8886b-dzrf9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic0037ce8586", MAC:"36:ac:81:79:15:20", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:16.740447 containerd[1567]: 2025-08-19 08:18:16.731 [INFO][4060] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-dzrf9" WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--dzrf9-eth0" Aug 19 08:18:16.770269 systemd-networkd[1480]: calidc3d4c31899: Link UP Aug 19 08:18:16.771101 systemd-networkd[1480]: calidc3d4c31899: Gained carrier Aug 19 08:18:16.790278 containerd[1567]: 2025-08-19 08:18:15.652 [INFO][4046] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 08:18:16.790278 containerd[1567]: 2025-08-19 08:18:15.717 [INFO][4046] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--84fb8898cd--bc727-eth0 whisker-84fb8898cd- calico-system 50a4115f-7018-4a8c-a733-cdd9909ab0b4 966 0 2025-08-19 08:17:45 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:84fb8898cd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-84fb8898cd-bc727 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calidc3d4c31899 [] [] }} ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Namespace="calico-system" Pod="whisker-84fb8898cd-bc727" 
WorkloadEndpoint="localhost-k8s-whisker--84fb8898cd--bc727-" Aug 19 08:18:16.790278 containerd[1567]: 2025-08-19 08:18:15.717 [INFO][4046] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Namespace="calico-system" Pod="whisker-84fb8898cd-bc727" WorkloadEndpoint="localhost-k8s-whisker--84fb8898cd--bc727-eth0" Aug 19 08:18:16.790278 containerd[1567]: 2025-08-19 08:18:16.381 [INFO][4116] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" HandleID="k8s-pod-network.7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Workload="localhost-k8s-whisker--84fb8898cd--bc727-eth0" Aug 19 08:18:16.790917 containerd[1567]: 2025-08-19 08:18:16.381 [INFO][4116] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" HandleID="k8s-pod-network.7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Workload="localhost-k8s-whisker--84fb8898cd--bc727-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fb020), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-84fb8898cd-bc727", "timestamp":"2025-08-19 08:18:16.381387712 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:18:16.790917 containerd[1567]: 2025-08-19 08:18:16.382 [INFO][4116] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:18:16.790917 containerd[1567]: 2025-08-19 08:18:16.668 [INFO][4116] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:18:16.790917 containerd[1567]: 2025-08-19 08:18:16.670 [INFO][4116] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:18:16.790917 containerd[1567]: 2025-08-19 08:18:16.715 [INFO][4116] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" host="localhost" Aug 19 08:18:16.790917 containerd[1567]: 2025-08-19 08:18:16.726 [INFO][4116] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:18:16.790917 containerd[1567]: 2025-08-19 08:18:16.737 [INFO][4116] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:18:16.790917 containerd[1567]: 2025-08-19 08:18:16.742 [INFO][4116] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:16.790917 containerd[1567]: 2025-08-19 08:18:16.745 [INFO][4116] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:16.790917 containerd[1567]: 2025-08-19 08:18:16.745 [INFO][4116] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" host="localhost" Aug 19 08:18:16.791304 containerd[1567]: 2025-08-19 08:18:16.746 [INFO][4116] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918 Aug 19 08:18:16.791304 containerd[1567]: 2025-08-19 08:18:16.752 [INFO][4116] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" host="localhost" Aug 19 08:18:16.791304 containerd[1567]: 2025-08-19 08:18:16.757 [INFO][4116] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" host="localhost" Aug 19 08:18:16.791304 containerd[1567]: 2025-08-19 08:18:16.757 [INFO][4116] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" host="localhost" Aug 19 08:18:16.791304 containerd[1567]: 2025-08-19 08:18:16.758 [INFO][4116] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:18:16.791304 containerd[1567]: 2025-08-19 08:18:16.758 [INFO][4116] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" HandleID="k8s-pod-network.7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Workload="localhost-k8s-whisker--84fb8898cd--bc727-eth0" Aug 19 08:18:16.791487 containerd[1567]: 2025-08-19 08:18:16.765 [INFO][4046] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Namespace="calico-system" Pod="whisker-84fb8898cd-bc727" WorkloadEndpoint="localhost-k8s-whisker--84fb8898cd--bc727-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--84fb8898cd--bc727-eth0", GenerateName:"whisker-84fb8898cd-", Namespace:"calico-system", SelfLink:"", UID:"50a4115f-7018-4a8c-a733-cdd9909ab0b4", ResourceVersion:"966", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"84fb8898cd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-84fb8898cd-bc727", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calidc3d4c31899", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:16.791487 containerd[1567]: 2025-08-19 08:18:16.766 [INFO][4046] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Namespace="calico-system" Pod="whisker-84fb8898cd-bc727" WorkloadEndpoint="localhost-k8s-whisker--84fb8898cd--bc727-eth0" Aug 19 08:18:16.791571 containerd[1567]: 2025-08-19 08:18:16.766 [INFO][4046] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidc3d4c31899 ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Namespace="calico-system" Pod="whisker-84fb8898cd-bc727" WorkloadEndpoint="localhost-k8s-whisker--84fb8898cd--bc727-eth0" Aug 19 08:18:16.791571 containerd[1567]: 2025-08-19 08:18:16.769 [INFO][4046] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Namespace="calico-system" Pod="whisker-84fb8898cd-bc727" WorkloadEndpoint="localhost-k8s-whisker--84fb8898cd--bc727-eth0" Aug 19 08:18:16.791650 containerd[1567]: 2025-08-19 08:18:16.773 [INFO][4046] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Namespace="calico-system" Pod="whisker-84fb8898cd-bc727" WorkloadEndpoint="localhost-k8s-whisker--84fb8898cd--bc727-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--84fb8898cd--bc727-eth0", GenerateName:"whisker-84fb8898cd-", Namespace:"calico-system", SelfLink:"", UID:"50a4115f-7018-4a8c-a733-cdd9909ab0b4", ResourceVersion:"966", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"84fb8898cd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918", Pod:"whisker-84fb8898cd-bc727", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calidc3d4c31899", MAC:"32:33:86:74:32:0e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:16.791745 containerd[1567]: 2025-08-19 08:18:16.786 [INFO][4046] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Namespace="calico-system" Pod="whisker-84fb8898cd-bc727" WorkloadEndpoint="localhost-k8s-whisker--84fb8898cd--bc727-eth0" Aug 19 08:18:16.822777 containerd[1567]: time="2025-08-19T08:18:16.822678927Z" level=info msg="connecting to shim 1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8" address="unix:///run/containerd/s/c3bfd2f9d9a24e97f2a57d7cd1e9684c4f5379145c7e1a53dc08b98cb003e1aa" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:18:16.850345 containerd[1567]: time="2025-08-19T08:18:16.850198145Z" level=info msg="connecting to shim 7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" address="unix:///run/containerd/s/589603f3439c58959f116beb55d6e85b9688e975e6afd95f15800b4c65f24d99" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:18:16.866929 systemd[1]: Started cri-containerd-1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8.scope - libcontainer container 1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8. Aug 19 08:18:16.889672 systemd[1]: Started cri-containerd-7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918.scope - libcontainer container 7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918. 
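
The Calico IPAM entries above repeat the same sequence for every pod on this node: acquire the host-wide IPAM lock, confirm this host's affinity for the 192.168.88.128/26 block, claim the next free address from that block (.129 for the calico-apiserver pod, .130 for whisker), then release the lock. The Go sketch below is a deliberately simplified, hypothetical illustration of that block-affinity pattern; the names (blockAllocator, Claim) are invented here and this is not Calico's actual ipam package.

```go
// Hypothetical, simplified sketch of the block-affinity IPAM pattern the
// log shows: one host-wide lock, one /26 block affine to this host, and
// sequential claims of the next free address. Not Calico's real code.
package main

import (
	"fmt"
	"net"
	"sync"
)

type blockAllocator struct {
	mu    sync.Mutex        // stands in for the "host-wide IPAM lock"
	cidr  *net.IPNet        // e.g. 192.168.88.128/26, affine to this host
	next  int               // next host offset to try within the block
	inUse map[string]string // IP -> handle, e.g. k8s-pod-network.<containerID>
}

func newBlockAllocator(cidr string) (*blockAllocator, error) {
	_, ipnet, err := net.ParseCIDR(cidr)
	if err != nil {
		return nil, err
	}
	return &blockAllocator{cidr: ipnet, next: 1, inUse: map[string]string{}}, nil
}

// Claim hands out the next free address in the block and records the handle,
// mirroring "Attempting to assign 1 addresses from block ..." in the log.
func (b *blockAllocator) Claim(handle string) (net.IP, error) {
	b.mu.Lock()         // "Acquired host-wide IPAM lock."
	defer b.mu.Unlock() // "Released host-wide IPAM lock."

	base := b.cidr.IP.To4()
	ones, bits := b.cidr.Mask.Size()
	size := 1 << (bits - ones) // 64 addresses in a /26
	for ; b.next < size; b.next++ {
		ip := net.IPv4(base[0], base[1], base[2], base[3]+byte(b.next))
		if _, taken := b.inUse[ip.String()]; !taken {
			b.inUse[ip.String()] = handle
			return ip, nil
		}
	}
	return nil, fmt.Errorf("block %s exhausted", b.cidr)
}

func main() {
	alloc, _ := newBlockAllocator("192.168.88.128/26")
	for _, pod := range []string{"calico-apiserver", "whisker", "csi-node-driver"} {
		ip, _ := alloc.Claim("k8s-pod-network." + pod)
		fmt.Println(pod, "->", ip) // .129, .130, .131 -- the order seen in the log
	}
}
```
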
Aug 19 08:18:16.896760 systemd-resolved[1402]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:18:16.906338 systemd-networkd[1480]: cali345b0ac98da: Link UP Aug 19 08:18:16.907282 systemd-networkd[1480]: cali345b0ac98da: Gained carrier Aug 19 08:18:16.919767 systemd-resolved[1402]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:18:17.017694 containerd[1567]: time="2025-08-19T08:18:17.015086302Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5865b8886b-dzrf9,Uid:8c0f1f13-1397-41d8-bdc1-892dcc18456b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8\"" Aug 19 08:18:17.017694 containerd[1567]: time="2025-08-19T08:18:17.017045158Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 08:18:17.071200 containerd[1567]: 2025-08-19 08:18:15.623 [INFO][4033] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 08:18:17.071200 containerd[1567]: 2025-08-19 08:18:15.709 [INFO][4033] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--fmzrp-eth0 csi-node-driver- calico-system af59fbd1-ef40-4fa1-8148-7cb071fdc3e9 685 0 2025-08-19 08:17:41 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:8967bcb6f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-fmzrp eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali345b0ac98da [] [] }} ContainerID="68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" Namespace="calico-system" Pod="csi-node-driver-fmzrp" WorkloadEndpoint="localhost-k8s-csi--node--driver--fmzrp-" Aug 19 08:18:17.071200 containerd[1567]: 2025-08-19 08:18:15.709 [INFO][4033] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" Namespace="calico-system" Pod="csi-node-driver-fmzrp" WorkloadEndpoint="localhost-k8s-csi--node--driver--fmzrp-eth0" Aug 19 08:18:17.071200 containerd[1567]: 2025-08-19 08:18:16.381 [INFO][4105] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" HandleID="k8s-pod-network.68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" Workload="localhost-k8s-csi--node--driver--fmzrp-eth0" Aug 19 08:18:17.071676 containerd[1567]: 2025-08-19 08:18:16.382 [INFO][4105] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" HandleID="k8s-pod-network.68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" Workload="localhost-k8s-csi--node--driver--fmzrp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000bead0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-fmzrp", "timestamp":"2025-08-19 08:18:16.381387592 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 
08:18:17.071676 containerd[1567]: 2025-08-19 08:18:16.382 [INFO][4105] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:18:17.071676 containerd[1567]: 2025-08-19 08:18:16.758 [INFO][4105] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:18:17.071676 containerd[1567]: 2025-08-19 08:18:16.758 [INFO][4105] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:18:17.071676 containerd[1567]: 2025-08-19 08:18:16.809 [INFO][4105] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" host="localhost" Aug 19 08:18:17.071676 containerd[1567]: 2025-08-19 08:18:16.826 [INFO][4105] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:18:17.071676 containerd[1567]: 2025-08-19 08:18:16.843 [INFO][4105] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:18:17.071676 containerd[1567]: 2025-08-19 08:18:16.848 [INFO][4105] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:17.071676 containerd[1567]: 2025-08-19 08:18:16.853 [INFO][4105] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:17.071676 containerd[1567]: 2025-08-19 08:18:16.853 [INFO][4105] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" host="localhost" Aug 19 08:18:17.072005 containerd[1567]: 2025-08-19 08:18:16.855 [INFO][4105] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff Aug 19 08:18:17.072005 containerd[1567]: 2025-08-19 08:18:16.865 [INFO][4105] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" host="localhost" Aug 19 08:18:17.072005 containerd[1567]: 2025-08-19 08:18:16.897 [INFO][4105] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" host="localhost" Aug 19 08:18:17.072005 containerd[1567]: 2025-08-19 08:18:16.897 [INFO][4105] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" host="localhost" Aug 19 08:18:17.072005 containerd[1567]: 2025-08-19 08:18:16.897 [INFO][4105] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:18:17.072005 containerd[1567]: 2025-08-19 08:18:16.897 [INFO][4105] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" HandleID="k8s-pod-network.68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" Workload="localhost-k8s-csi--node--driver--fmzrp-eth0" Aug 19 08:18:17.072165 containerd[1567]: 2025-08-19 08:18:16.903 [INFO][4033] cni-plugin/k8s.go 418: Populated endpoint ContainerID="68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" Namespace="calico-system" Pod="csi-node-driver-fmzrp" WorkloadEndpoint="localhost-k8s-csi--node--driver--fmzrp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--fmzrp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"af59fbd1-ef40-4fa1-8148-7cb071fdc3e9", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-fmzrp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali345b0ac98da", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:17.072246 containerd[1567]: 2025-08-19 08:18:16.903 [INFO][4033] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" Namespace="calico-system" Pod="csi-node-driver-fmzrp" WorkloadEndpoint="localhost-k8s-csi--node--driver--fmzrp-eth0" Aug 19 08:18:17.072246 containerd[1567]: 2025-08-19 08:18:16.903 [INFO][4033] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali345b0ac98da ContainerID="68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" Namespace="calico-system" Pod="csi-node-driver-fmzrp" WorkloadEndpoint="localhost-k8s-csi--node--driver--fmzrp-eth0" Aug 19 08:18:17.072246 containerd[1567]: 2025-08-19 08:18:16.908 [INFO][4033] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" Namespace="calico-system" Pod="csi-node-driver-fmzrp" WorkloadEndpoint="localhost-k8s-csi--node--driver--fmzrp-eth0" Aug 19 08:18:17.072325 containerd[1567]: 2025-08-19 08:18:16.908 [INFO][4033] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" Namespace="calico-system" Pod="csi-node-driver-fmzrp" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--fmzrp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--fmzrp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"af59fbd1-ef40-4fa1-8148-7cb071fdc3e9", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff", Pod:"csi-node-driver-fmzrp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali345b0ac98da", MAC:"1a:4a:e4:0a:80:46", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:17.072398 containerd[1567]: 2025-08-19 08:18:17.067 [INFO][4033] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" Namespace="calico-system" Pod="csi-node-driver-fmzrp" WorkloadEndpoint="localhost-k8s-csi--node--driver--fmzrp-eth0" Aug 19 08:18:17.168004 containerd[1567]: time="2025-08-19T08:18:17.167929665Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84fb8898cd-bc727,Uid:50a4115f-7018-4a8c-a733-cdd9909ab0b4,Namespace:calico-system,Attempt:0,} returns sandbox id \"7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918\"" Aug 19 08:18:17.602523 containerd[1567]: time="2025-08-19T08:18:17.602467779Z" level=info msg="connecting to shim 68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff" address="unix:///run/containerd/s/8edcd914c2457332ea2c8775808f63264465ce964bdb36bc427cd0ac2849a856" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:18:17.634699 systemd[1]: Started cri-containerd-68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff.scope - libcontainer container 68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff. 
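
Each sandbox above is reached over a ttrpc shim socket under /run/containerd/s/ and runs inside a cri-containerd-&lt;id&gt;.scope unit, always in the k8s.io namespace. As a minimal sketch (assuming the default containerd socket path and the official Go client module, neither of which appears in the log), the snippet below lists containers in that namespace so their IDs can be cross-checked against the shim and scope entries.

```go
// Minimal sketch: list containers in the k8s.io namespace used by the
// "connecting to shim ..." and cri-containerd-<id>.scope entries above.
// Assumes the default containerd socket and the containerd Go client module.
package main

import (
	"context"
	"fmt"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		panic(err)
	}
	defer client.Close()

	// The log's shim and scope entries all carry namespace=k8s.io.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	containers, err := client.Containers(ctx)
	if err != nil {
		panic(err)
	}
	for _, c := range containers {
		name := "<unknown image>"
		if img, err := c.Image(ctx); err == nil {
			name = img.Name()
		}
		// On this node the IDs should include sandbox IDs such as
		// 1c21e0a2bc60..., 7e41deee28c4..., 68e3c3369efc... from above.
		fmt.Println(c.ID(), name)
	}
}
```
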
Aug 19 08:18:17.649931 systemd-resolved[1402]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:18:17.709933 containerd[1567]: time="2025-08-19T08:18:17.709825438Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fmzrp,Uid:af59fbd1-ef40-4fa1-8148-7cb071fdc3e9,Namespace:calico-system,Attempt:0,} returns sandbox id \"68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff\"" Aug 19 08:18:17.771836 systemd-networkd[1480]: calic0037ce8586: Gained IPv6LL Aug 19 08:18:17.960796 systemd-networkd[1480]: calidc3d4c31899: Gained IPv6LL Aug 19 08:18:18.316705 systemd-networkd[1480]: vxlan.calico: Link UP Aug 19 08:18:18.316720 systemd-networkd[1480]: vxlan.calico: Gained carrier Aug 19 08:18:18.472682 systemd-networkd[1480]: cali345b0ac98da: Gained IPv6LL Aug 19 08:18:19.367648 systemd-networkd[1480]: vxlan.calico: Gained IPv6LL Aug 19 08:18:20.977867 containerd[1567]: time="2025-08-19T08:18:20.977770595Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:20.979732 containerd[1567]: time="2025-08-19T08:18:20.979678636Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Aug 19 08:18:20.982052 containerd[1567]: time="2025-08-19T08:18:20.982008188Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:21.032272 containerd[1567]: time="2025-08-19T08:18:21.032151021Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:21.033312 containerd[1567]: time="2025-08-19T08:18:21.033237128Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 4.016145734s" Aug 19 08:18:21.033312 containerd[1567]: time="2025-08-19T08:18:21.033293915Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 19 08:18:21.034738 containerd[1567]: time="2025-08-19T08:18:21.034654930Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Aug 19 08:18:21.037562 containerd[1567]: time="2025-08-19T08:18:21.037407104Z" level=info msg="CreateContainer within sandbox \"1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 08:18:21.048321 containerd[1567]: time="2025-08-19T08:18:21.048249860Z" level=info msg="Container 5111dcede8c971daa2ba0bce1f0143798c8c2f41dab306c7ab344de9fac10509: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:21.143613 systemd[1]: Started sshd@10-10.0.0.123:22-10.0.0.1:57942.service - OpenSSH per-connection server daemon (10.0.0.1:57942). 
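
By this point the node has three Calico veths (calic0037ce8586, calidc3d4c31899, cali345b0ac98da) plus the vxlan.calico device, each reported by systemd-networkd as up with an IPv6 link-local address, and the endpoint dumps above record their MACs (for example 36:ac:81:79:15:20 for calic0037ce8586). A small stdlib-only Go sketch like the one below, run on the node itself, would list those interfaces and MACs for cross-checking; it is illustrative and not part of anything in the log.

```go
// Stdlib-only sketch: list the Calico-managed interfaces mentioned in the
// log (cali* veths and vxlan.calico) with their MACs, so entries like
// "calic0037ce8586: Gained IPv6LL" and the MAC 36:ac:81:79:15:20 recorded
// in the endpoint dump can be cross-checked on the node.
package main

import (
	"fmt"
	"net"
	"strings"
)

func main() {
	ifaces, err := net.Interfaces()
	if err != nil {
		panic(err)
	}
	for _, ifc := range ifaces {
		if strings.HasPrefix(ifc.Name, "cali") || ifc.Name == "vxlan.calico" {
			up := ifc.Flags&net.FlagUp != 0
			fmt.Printf("%-16s up=%v mac=%s\n", ifc.Name, up, ifc.HardwareAddr)
		}
	}
}
```
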
Aug 19 08:18:21.169979 containerd[1567]: time="2025-08-19T08:18:21.169890294Z" level=info msg="CreateContainer within sandbox \"1c21e0a2bc608230b27fb981b3b6252ecdc6574433b759341e636b2daa2283e8\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"5111dcede8c971daa2ba0bce1f0143798c8c2f41dab306c7ab344de9fac10509\"" Aug 19 08:18:21.171525 containerd[1567]: time="2025-08-19T08:18:21.171306591Z" level=info msg="StartContainer for \"5111dcede8c971daa2ba0bce1f0143798c8c2f41dab306c7ab344de9fac10509\"" Aug 19 08:18:21.173492 containerd[1567]: time="2025-08-19T08:18:21.173390584Z" level=info msg="connecting to shim 5111dcede8c971daa2ba0bce1f0143798c8c2f41dab306c7ab344de9fac10509" address="unix:///run/containerd/s/c3bfd2f9d9a24e97f2a57d7cd1e9684c4f5379145c7e1a53dc08b98cb003e1aa" protocol=ttrpc version=3 Aug 19 08:18:21.221018 sshd[4565]: Accepted publickey for core from 10.0.0.1 port 57942 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:21.223608 sshd-session[4565]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:21.229392 systemd-logind[1541]: New session 11 of user core. Aug 19 08:18:21.238688 systemd[1]: Started session-11.scope - Session 11 of User core. Aug 19 08:18:21.244130 systemd[1]: Started cri-containerd-5111dcede8c971daa2ba0bce1f0143798c8c2f41dab306c7ab344de9fac10509.scope - libcontainer container 5111dcede8c971daa2ba0bce1f0143798c8c2f41dab306c7ab344de9fac10509. Aug 19 08:18:21.357360 containerd[1567]: time="2025-08-19T08:18:21.357279346Z" level=info msg="StartContainer for \"5111dcede8c971daa2ba0bce1f0143798c8c2f41dab306c7ab344de9fac10509\" returns successfully" Aug 19 08:18:21.447290 sshd[4580]: Connection closed by 10.0.0.1 port 57942 Aug 19 08:18:21.447767 sshd-session[4565]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:21.452334 systemd[1]: sshd@10-10.0.0.123:22-10.0.0.1:57942.service: Deactivated successfully. Aug 19 08:18:21.454944 systemd[1]: session-11.scope: Deactivated successfully. Aug 19 08:18:21.457498 systemd-logind[1541]: Session 11 logged out. Waiting for processes to exit. Aug 19 08:18:21.458665 systemd-logind[1541]: Removed session 11. 
Aug 19 08:18:23.186253 kubelet[2703]: I0819 08:18:23.185222 2703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5865b8886b-dzrf9" podStartSLOduration=42.167512608 podStartE2EDuration="46.185185057s" podCreationTimestamp="2025-08-19 08:17:37 +0000 UTC" firstStartedPulling="2025-08-19 08:18:17.016621333 +0000 UTC m=+54.632967972" lastFinishedPulling="2025-08-19 08:18:21.034293781 +0000 UTC m=+58.650640421" observedRunningTime="2025-08-19 08:18:21.850870822 +0000 UTC m=+59.467217451" watchObservedRunningTime="2025-08-19 08:18:23.185185057 +0000 UTC m=+60.801531696" Aug 19 08:18:24.589921 containerd[1567]: time="2025-08-19T08:18:24.589818615Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:24.614890 containerd[1567]: time="2025-08-19T08:18:24.614796849Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Aug 19 08:18:24.651283 containerd[1567]: time="2025-08-19T08:18:24.651159275Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:24.683490 containerd[1567]: time="2025-08-19T08:18:24.683373786Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:24.684090 containerd[1567]: time="2025-08-19T08:18:24.684050190Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 3.649347419s" Aug 19 08:18:24.684137 containerd[1567]: time="2025-08-19T08:18:24.684094515Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Aug 19 08:18:24.685481 containerd[1567]: time="2025-08-19T08:18:24.685385082Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Aug 19 08:18:24.686529 containerd[1567]: time="2025-08-19T08:18:24.686501033Z" level=info msg="CreateContainer within sandbox \"7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Aug 19 08:18:24.837062 containerd[1567]: time="2025-08-19T08:18:24.836995328Z" level=info msg="Container 4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:25.023488 containerd[1567]: time="2025-08-19T08:18:25.023313722Z" level=info msg="CreateContainer within sandbox \"7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\"" Aug 19 08:18:25.024271 containerd[1567]: time="2025-08-19T08:18:25.024035793Z" level=info msg="StartContainer for \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\"" Aug 19 08:18:25.025938 containerd[1567]: time="2025-08-19T08:18:25.025891066Z" level=info msg="connecting to shim 
4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5" address="unix:///run/containerd/s/589603f3439c58959f116beb55d6e85b9688e975e6afd95f15800b4c65f24d99" protocol=ttrpc version=3 Aug 19 08:18:25.051846 systemd[1]: Started cri-containerd-4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5.scope - libcontainer container 4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5. Aug 19 08:18:25.213428 containerd[1567]: time="2025-08-19T08:18:25.213373296Z" level=info msg="StartContainer for \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\" returns successfully" Aug 19 08:18:26.470129 systemd[1]: Started sshd@11-10.0.0.123:22-10.0.0.1:57950.service - OpenSSH per-connection server daemon (10.0.0.1:57950). Aug 19 08:18:26.504598 containerd[1567]: time="2025-08-19T08:18:26.504544578Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-qtqtf,Uid:161d8145-a00d-4936-8415-8fb415a18f07,Namespace:kube-system,Attempt:0,}" Aug 19 08:18:26.505029 containerd[1567]: time="2025-08-19T08:18:26.504545510Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6kgnl,Uid:67ce1d8b-65f3-4306-aa87-3575d50b5ebe,Namespace:kube-system,Attempt:0,}" Aug 19 08:18:26.555706 sshd[4674]: Accepted publickey for core from 10.0.0.1 port 57950 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:26.557983 sshd-session[4674]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:26.565499 systemd-logind[1541]: New session 12 of user core. Aug 19 08:18:26.570623 systemd[1]: Started session-12.scope - Session 12 of User core. Aug 19 08:18:26.657746 systemd-networkd[1480]: cali50cfa0596fc: Link UP Aug 19 08:18:26.658383 systemd-networkd[1480]: cali50cfa0596fc: Gained carrier Aug 19 08:18:26.678570 containerd[1567]: 2025-08-19 08:18:26.571 [INFO][4677] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--qtqtf-eth0 coredns-668d6bf9bc- kube-system 161d8145-a00d-4936-8415-8fb415a18f07 820 0 2025-08-19 08:17:27 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-qtqtf eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali50cfa0596fc [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" Namespace="kube-system" Pod="coredns-668d6bf9bc-qtqtf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qtqtf-" Aug 19 08:18:26.678570 containerd[1567]: 2025-08-19 08:18:26.571 [INFO][4677] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" Namespace="kube-system" Pod="coredns-668d6bf9bc-qtqtf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qtqtf-eth0" Aug 19 08:18:26.678570 containerd[1567]: 2025-08-19 08:18:26.603 [INFO][4708] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" HandleID="k8s-pod-network.1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" Workload="localhost-k8s-coredns--668d6bf9bc--qtqtf-eth0" Aug 19 08:18:26.679670 containerd[1567]: 2025-08-19 08:18:26.603 [INFO][4708] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" HandleID="k8s-pod-network.1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" Workload="localhost-k8s-coredns--668d6bf9bc--qtqtf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001393a0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-qtqtf", "timestamp":"2025-08-19 08:18:26.603086543 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:18:26.679670 containerd[1567]: 2025-08-19 08:18:26.603 [INFO][4708] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:18:26.679670 containerd[1567]: 2025-08-19 08:18:26.603 [INFO][4708] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:18:26.679670 containerd[1567]: 2025-08-19 08:18:26.603 [INFO][4708] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:18:26.679670 containerd[1567]: 2025-08-19 08:18:26.610 [INFO][4708] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" host="localhost" Aug 19 08:18:26.679670 containerd[1567]: 2025-08-19 08:18:26.616 [INFO][4708] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:18:26.679670 containerd[1567]: 2025-08-19 08:18:26.621 [INFO][4708] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:18:26.679670 containerd[1567]: 2025-08-19 08:18:26.623 [INFO][4708] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:26.679670 containerd[1567]: 2025-08-19 08:18:26.627 [INFO][4708] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:26.679670 containerd[1567]: 2025-08-19 08:18:26.627 [INFO][4708] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" host="localhost" Aug 19 08:18:26.679955 containerd[1567]: 2025-08-19 08:18:26.628 [INFO][4708] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610 Aug 19 08:18:26.679955 containerd[1567]: 2025-08-19 08:18:26.633 [INFO][4708] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" host="localhost" Aug 19 08:18:26.679955 containerd[1567]: 2025-08-19 08:18:26.641 [INFO][4708] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" host="localhost" Aug 19 08:18:26.679955 containerd[1567]: 2025-08-19 08:18:26.641 [INFO][4708] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" host="localhost" Aug 19 08:18:26.679955 containerd[1567]: 2025-08-19 08:18:26.641 [INFO][4708] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
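
The pod_startup_latency_tracker entry a few lines above reports three figures for the calico-apiserver pod: podStartE2EDuration=46.185185057s, podStartSLOduration=42.167512608, and an image-pull window bounded by firstStartedPulling and lastFinishedPulling. Those numbers are internally consistent: the E2E duration is observedRunningTime minus podCreationTimestamp, and subtracting the pull window reproduces the SLO figure to within a nanosecond of rounding. The short Go program below redoes that arithmetic with the timestamps copied from the log (monotonic "m=+..." suffixes dropped); the E2E-minus-pull relationship is an observation from these numbers, not a statement of kubelet's documented definition.

```go
// Worked example: recompute the startup-latency figures from the timestamps
// in the pod_startup_latency_tracker entry above. Timestamps are copied from
// the log with the monotonic "m=+..." suffix removed.
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-08-19 08:17:37 +0000 UTC")             // podCreationTimestamp
	firstPull := mustParse("2025-08-19 08:18:17.016621333 +0000 UTC") // firstStartedPulling
	lastPull := mustParse("2025-08-19 08:18:21.034293781 +0000 UTC")  // lastFinishedPulling
	running := mustParse("2025-08-19 08:18:23.185185057 +0000 UTC")   // observedRunningTime

	e2e := running.Sub(created)     // 46.185185057s == podStartE2EDuration
	pull := lastPull.Sub(firstPull) // ~4.0177s image-pull window
	slo := e2e - pull               // ~42.1675s, matching podStartSLOduration up to rounding

	fmt.Println("E2E:", e2e)
	fmt.Println("pull window:", pull)
	fmt.Println("E2E - pull:", slo)
}
```
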
Aug 19 08:18:26.679955 containerd[1567]: 2025-08-19 08:18:26.641 [INFO][4708] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" HandleID="k8s-pod-network.1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" Workload="localhost-k8s-coredns--668d6bf9bc--qtqtf-eth0" Aug 19 08:18:26.680125 containerd[1567]: 2025-08-19 08:18:26.650 [INFO][4677] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" Namespace="kube-system" Pod="coredns-668d6bf9bc-qtqtf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qtqtf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--qtqtf-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"161d8145-a00d-4936-8415-8fb415a18f07", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-qtqtf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali50cfa0596fc", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:26.680225 containerd[1567]: 2025-08-19 08:18:26.650 [INFO][4677] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" Namespace="kube-system" Pod="coredns-668d6bf9bc-qtqtf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qtqtf-eth0" Aug 19 08:18:26.680225 containerd[1567]: 2025-08-19 08:18:26.650 [INFO][4677] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali50cfa0596fc ContainerID="1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" Namespace="kube-system" Pod="coredns-668d6bf9bc-qtqtf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qtqtf-eth0" Aug 19 08:18:26.680225 containerd[1567]: 2025-08-19 08:18:26.659 [INFO][4677] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" Namespace="kube-system" Pod="coredns-668d6bf9bc-qtqtf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qtqtf-eth0" Aug 19 08:18:26.680316 
containerd[1567]: 2025-08-19 08:18:26.659 [INFO][4677] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" Namespace="kube-system" Pod="coredns-668d6bf9bc-qtqtf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qtqtf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--qtqtf-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"161d8145-a00d-4936-8415-8fb415a18f07", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610", Pod:"coredns-668d6bf9bc-qtqtf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali50cfa0596fc", MAC:"1e:7e:1b:d7:12:04", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:26.680316 containerd[1567]: 2025-08-19 08:18:26.670 [INFO][4677] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" Namespace="kube-system" Pod="coredns-668d6bf9bc-qtqtf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qtqtf-eth0" Aug 19 08:18:26.721603 containerd[1567]: time="2025-08-19T08:18:26.720801088Z" level=info msg="connecting to shim 1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610" address="unix:///run/containerd/s/8657c6db1ac1d2f090ff8628e1e409449a51bdc3ebb6c887e7c8261927637f14" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:18:26.778887 sshd[4706]: Connection closed by 10.0.0.1 port 57950 Aug 19 08:18:26.780639 sshd-session[4674]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:26.784275 systemd[1]: Started cri-containerd-1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610.scope - libcontainer container 1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610. Aug 19 08:18:26.791181 systemd[1]: sshd@11-10.0.0.123:22-10.0.0.1:57950.service: Deactivated successfully. Aug 19 08:18:26.794305 systemd[1]: session-12.scope: Deactivated successfully. Aug 19 08:18:26.797005 systemd-logind[1541]: Session 12 logged out. Waiting for processes to exit. 
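
In the coredns endpoint dump above, each port's Protocol is printed as numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"} and the port numbers appear in hex (0x35 is 53 for DNS, 0x23c1 is 9153 for metrics); the field evidently holds either a protocol number or a protocol name. The snippet below is a hypothetical, stripped-down re-creation of that number-or-string idea in Go, written only to show how such a field can round-trip through JSON. It is not Calico's numorstring package, and the sample input that uses the number 6 instead of "TCP" is invented to exercise the numeric branch.

```go
// Hypothetical miniature of the "number or string" pattern seen in the
// dumped Ports (numorstring.Protocol). Not Calico's real package; just a
// self-contained illustration of a field that accepts 6, "TCP", "UDP", etc.
package main

import (
	"encoding/json"
	"fmt"
)

type Protocol struct {
	IsString bool   // plays the role of the Type discriminator in the dump
	Num      uint8  // used when the JSON value is a number (e.g. 6)
	Str      string // used when the JSON value is a string (e.g. "TCP")
}

func (p *Protocol) UnmarshalJSON(b []byte) error {
	if len(b) > 0 && b[0] == '"' {
		p.IsString = true
		return json.Unmarshal(b, &p.Str)
	}
	p.IsString = false
	return json.Unmarshal(b, &p.Num)
}

func (p Protocol) String() string {
	if p.IsString {
		return p.Str
	}
	return fmt.Sprintf("%d", p.Num)
}

type Port struct {
	Name     string   `json:"name"`
	Protocol Protocol `json:"protocol"`
	Port     uint16   `json:"port"`
}

func main() {
	// Ports from the coredns endpoint above; "metrics" is given the number 6
	// (instead of "TCP") purely to demonstrate the numeric branch.
	raw := `[{"name":"dns","protocol":"UDP","port":53},
	         {"name":"dns-tcp","protocol":"TCP","port":53},
	         {"name":"metrics","protocol":6,"port":9153}]`
	var ports []Port
	if err := json.Unmarshal([]byte(raw), &ports); err != nil {
		panic(err)
	}
	for _, p := range ports {
		// 53 and 9153 are the decimal forms of the 0x35 / 0x23c1 shown above.
		fmt.Printf("%s %s/%d\n", p.Name, p.Protocol, p.Port)
	}
}
```
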
Aug 19 08:18:26.799789 systemd[1]: Started sshd@12-10.0.0.123:22-10.0.0.1:57964.service - OpenSSH per-connection server daemon (10.0.0.1:57964). Aug 19 08:18:26.802388 systemd-logind[1541]: Removed session 12. Aug 19 08:18:26.805078 systemd-networkd[1480]: calib2bd8cab91d: Link UP Aug 19 08:18:26.806538 systemd-networkd[1480]: calib2bd8cab91d: Gained carrier Aug 19 08:18:26.813236 systemd-resolved[1402]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.573 [INFO][4687] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--6kgnl-eth0 coredns-668d6bf9bc- kube-system 67ce1d8b-65f3-4306-aa87-3575d50b5ebe 809 0 2025-08-19 08:17:27 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-6kgnl eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib2bd8cab91d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6kgnl" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6kgnl-" Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.573 [INFO][4687] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6kgnl" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6kgnl-eth0" Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.603 [INFO][4710] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" HandleID="k8s-pod-network.cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" Workload="localhost-k8s-coredns--668d6bf9bc--6kgnl-eth0" Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.603 [INFO][4710] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" HandleID="k8s-pod-network.cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" Workload="localhost-k8s-coredns--668d6bf9bc--6kgnl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c7110), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-6kgnl", "timestamp":"2025-08-19 08:18:26.603628336 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.603 [INFO][4710] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.641 [INFO][4710] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.642 [INFO][4710] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.714 [INFO][4710] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" host="localhost" Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.729 [INFO][4710] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.737 [INFO][4710] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.741 [INFO][4710] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.744 [INFO][4710] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.744 [INFO][4710] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" host="localhost" Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.747 [INFO][4710] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4 Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.769 [INFO][4710] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" host="localhost" Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.788 [INFO][4710] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" host="localhost" Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.788 [INFO][4710] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" host="localhost" Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.788 [INFO][4710] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:18:26.832663 containerd[1567]: 2025-08-19 08:18:26.788 [INFO][4710] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" HandleID="k8s-pod-network.cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" Workload="localhost-k8s-coredns--668d6bf9bc--6kgnl-eth0" Aug 19 08:18:26.833341 containerd[1567]: 2025-08-19 08:18:26.798 [INFO][4687] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6kgnl" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6kgnl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--6kgnl-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"67ce1d8b-65f3-4306-aa87-3575d50b5ebe", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-6kgnl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib2bd8cab91d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:26.833341 containerd[1567]: 2025-08-19 08:18:26.800 [INFO][4687] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6kgnl" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6kgnl-eth0" Aug 19 08:18:26.833341 containerd[1567]: 2025-08-19 08:18:26.801 [INFO][4687] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib2bd8cab91d ContainerID="cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6kgnl" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6kgnl-eth0" Aug 19 08:18:26.833341 containerd[1567]: 2025-08-19 08:18:26.807 [INFO][4687] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6kgnl" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6kgnl-eth0" Aug 19 08:18:26.833341 
containerd[1567]: 2025-08-19 08:18:26.808 [INFO][4687] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6kgnl" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6kgnl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--6kgnl-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"67ce1d8b-65f3-4306-aa87-3575d50b5ebe", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4", Pod:"coredns-668d6bf9bc-6kgnl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib2bd8cab91d", MAC:"a2:b2:12:69:e5:cb", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:26.833341 containerd[1567]: 2025-08-19 08:18:26.824 [INFO][4687] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" Namespace="kube-system" Pod="coredns-668d6bf9bc-6kgnl" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--6kgnl-eth0" Aug 19 08:18:26.862673 containerd[1567]: time="2025-08-19T08:18:26.862624313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-qtqtf,Uid:161d8145-a00d-4936-8415-8fb415a18f07,Namespace:kube-system,Attempt:0,} returns sandbox id \"1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610\"" Aug 19 08:18:26.865749 sshd[4791]: Accepted publickey for core from 10.0.0.1 port 57964 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:26.868812 sshd-session[4791]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:26.869547 containerd[1567]: time="2025-08-19T08:18:26.869509173Z" level=info msg="CreateContainer within sandbox \"1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 19 08:18:26.870778 containerd[1567]: time="2025-08-19T08:18:26.870425196Z" level=info msg="connecting to shim cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4" 
address="unix:///run/containerd/s/e5f5532c42761df6bbdda5c8894a33be9b31c523334da51601f07827d7f2f817" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:18:26.876017 systemd-logind[1541]: New session 13 of user core. Aug 19 08:18:26.883769 systemd[1]: Started session-13.scope - Session 13 of User core. Aug 19 08:18:26.890614 containerd[1567]: time="2025-08-19T08:18:26.890535861Z" level=info msg="Container e24132d784e930d694a7370989c431c87a92489baf87eb24c49bff8c8307ef26: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:26.895648 systemd[1]: Started cri-containerd-cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4.scope - libcontainer container cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4. Aug 19 08:18:26.902193 containerd[1567]: time="2025-08-19T08:18:26.902070892Z" level=info msg="CreateContainer within sandbox \"1c5cdb515c0aeaca52aea2285c2b0fd2b9a3f32333fa44cb90cda1b60131a610\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e24132d784e930d694a7370989c431c87a92489baf87eb24c49bff8c8307ef26\"" Aug 19 08:18:26.905434 containerd[1567]: time="2025-08-19T08:18:26.903540750Z" level=info msg="StartContainer for \"e24132d784e930d694a7370989c431c87a92489baf87eb24c49bff8c8307ef26\"" Aug 19 08:18:26.906144 containerd[1567]: time="2025-08-19T08:18:26.906118472Z" level=info msg="connecting to shim e24132d784e930d694a7370989c431c87a92489baf87eb24c49bff8c8307ef26" address="unix:///run/containerd/s/8657c6db1ac1d2f090ff8628e1e409449a51bdc3ebb6c887e7c8261927637f14" protocol=ttrpc version=3 Aug 19 08:18:26.913936 systemd-resolved[1402]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:18:26.932976 systemd[1]: Started cri-containerd-e24132d784e930d694a7370989c431c87a92489baf87eb24c49bff8c8307ef26.scope - libcontainer container e24132d784e930d694a7370989c431c87a92489baf87eb24c49bff8c8307ef26. 
Aug 19 08:18:26.970297 containerd[1567]: time="2025-08-19T08:18:26.970241073Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6kgnl,Uid:67ce1d8b-65f3-4306-aa87-3575d50b5ebe,Namespace:kube-system,Attempt:0,} returns sandbox id \"cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4\"" Aug 19 08:18:26.979962 containerd[1567]: time="2025-08-19T08:18:26.979804629Z" level=info msg="CreateContainer within sandbox \"cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 19 08:18:27.019959 containerd[1567]: time="2025-08-19T08:18:27.019794148Z" level=info msg="StartContainer for \"e24132d784e930d694a7370989c431c87a92489baf87eb24c49bff8c8307ef26\" returns successfully" Aug 19 08:18:27.028494 containerd[1567]: time="2025-08-19T08:18:27.027673932Z" level=info msg="Container 36104c9a8d4f0b6aa2497a7f583010f56bf06cea6f98850c5cbdec27a1dfcb2c: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:27.038787 containerd[1567]: time="2025-08-19T08:18:27.038719536Z" level=info msg="CreateContainer within sandbox \"cf9c241c1197aba8733800038f58ad4fb6156f7c20c946dfaaf1311079a759d4\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"36104c9a8d4f0b6aa2497a7f583010f56bf06cea6f98850c5cbdec27a1dfcb2c\"" Aug 19 08:18:27.039712 containerd[1567]: time="2025-08-19T08:18:27.039675064Z" level=info msg="StartContainer for \"36104c9a8d4f0b6aa2497a7f583010f56bf06cea6f98850c5cbdec27a1dfcb2c\"" Aug 19 08:18:27.045509 containerd[1567]: time="2025-08-19T08:18:27.044413658Z" level=info msg="connecting to shim 36104c9a8d4f0b6aa2497a7f583010f56bf06cea6f98850c5cbdec27a1dfcb2c" address="unix:///run/containerd/s/e5f5532c42761df6bbdda5c8894a33be9b31c523334da51601f07827d7f2f817" protocol=ttrpc version=3 Aug 19 08:18:27.073072 systemd[1]: Started cri-containerd-36104c9a8d4f0b6aa2497a7f583010f56bf06cea6f98850c5cbdec27a1dfcb2c.scope - libcontainer container 36104c9a8d4f0b6aa2497a7f583010f56bf06cea6f98850c5cbdec27a1dfcb2c. Aug 19 08:18:27.124998 sshd[4841]: Connection closed by 10.0.0.1 port 57964 Aug 19 08:18:27.122489 sshd-session[4791]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:27.139792 systemd[1]: sshd@12-10.0.0.123:22-10.0.0.1:57964.service: Deactivated successfully. Aug 19 08:18:27.144600 systemd[1]: session-13.scope: Deactivated successfully. Aug 19 08:18:27.150484 containerd[1567]: time="2025-08-19T08:18:27.148500835Z" level=info msg="StartContainer for \"36104c9a8d4f0b6aa2497a7f583010f56bf06cea6f98850c5cbdec27a1dfcb2c\" returns successfully" Aug 19 08:18:27.150838 systemd-logind[1541]: Session 13 logged out. Waiting for processes to exit. Aug 19 08:18:27.154346 systemd[1]: Started sshd@13-10.0.0.123:22-10.0.0.1:57974.service - OpenSSH per-connection server daemon (10.0.0.1:57974). Aug 19 08:18:27.157280 systemd-logind[1541]: Removed session 13. Aug 19 08:18:27.233793 sshd[4931]: Accepted publickey for core from 10.0.0.1 port 57974 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:27.237369 sshd-session[4931]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:27.248012 systemd-logind[1541]: New session 14 of user core. Aug 19 08:18:27.253635 systemd[1]: Started session-14.scope - Session 14 of User core. 
Aug 19 08:18:27.439689 sshd[4935]: Connection closed by 10.0.0.1 port 57974 Aug 19 08:18:27.440157 sshd-session[4931]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:27.446303 systemd[1]: sshd@13-10.0.0.123:22-10.0.0.1:57974.service: Deactivated successfully. Aug 19 08:18:27.449152 systemd[1]: session-14.scope: Deactivated successfully. Aug 19 08:18:27.450215 systemd-logind[1541]: Session 14 logged out. Waiting for processes to exit. Aug 19 08:18:27.451963 systemd-logind[1541]: Removed session 14. Aug 19 08:18:27.504429 containerd[1567]: time="2025-08-19T08:18:27.504266110Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-kf7tj,Uid:7495ee8b-4990-47af-bdd8-8d350506c7a6,Namespace:calico-system,Attempt:0,}" Aug 19 08:18:27.573285 containerd[1567]: time="2025-08-19T08:18:27.573238722Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:27.578672 containerd[1567]: time="2025-08-19T08:18:27.578327410Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190" Aug 19 08:18:27.587096 containerd[1567]: time="2025-08-19T08:18:27.586999619Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:27.599596 containerd[1567]: time="2025-08-19T08:18:27.599514818Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:27.602119 containerd[1567]: time="2025-08-19T08:18:27.601720911Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 2.916291744s" Aug 19 08:18:27.602119 containerd[1567]: time="2025-08-19T08:18:27.602034755Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Aug 19 08:18:27.607102 containerd[1567]: time="2025-08-19T08:18:27.607035433Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Aug 19 08:18:27.610449 containerd[1567]: time="2025-08-19T08:18:27.610372742Z" level=info msg="CreateContainer within sandbox \"68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Aug 19 08:18:27.728116 containerd[1567]: time="2025-08-19T08:18:27.728050188Z" level=info msg="Container 9b40dbbfa9407019fc26beca478ebe3476eaa3389ffa87691a04ae709cf17066: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:27.756038 containerd[1567]: time="2025-08-19T08:18:27.755865864Z" level=info msg="CreateContainer within sandbox \"68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"9b40dbbfa9407019fc26beca478ebe3476eaa3389ffa87691a04ae709cf17066\"" Aug 19 08:18:27.757325 containerd[1567]: time="2025-08-19T08:18:27.757188919Z" level=info msg="StartContainer for \"9b40dbbfa9407019fc26beca478ebe3476eaa3389ffa87691a04ae709cf17066\"" Aug 19 
08:18:27.759492 containerd[1567]: time="2025-08-19T08:18:27.759324526Z" level=info msg="connecting to shim 9b40dbbfa9407019fc26beca478ebe3476eaa3389ffa87691a04ae709cf17066" address="unix:///run/containerd/s/8edcd914c2457332ea2c8775808f63264465ce964bdb36bc427cd0ac2849a856" protocol=ttrpc version=3 Aug 19 08:18:27.789306 kubelet[2703]: I0819 08:18:27.789211 2703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-6kgnl" podStartSLOduration=60.789195456 podStartE2EDuration="1m0.789195456s" podCreationTimestamp="2025-08-19 08:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:18:27.788499898 +0000 UTC m=+65.404846538" watchObservedRunningTime="2025-08-19 08:18:27.789195456 +0000 UTC m=+65.405542095" Aug 19 08:18:27.790962 systemd[1]: Started cri-containerd-9b40dbbfa9407019fc26beca478ebe3476eaa3389ffa87691a04ae709cf17066.scope - libcontainer container 9b40dbbfa9407019fc26beca478ebe3476eaa3389ffa87691a04ae709cf17066. Aug 19 08:18:27.794621 systemd-networkd[1480]: calia49177b4e5d: Link UP Aug 19 08:18:27.794875 systemd-networkd[1480]: calia49177b4e5d: Gained carrier Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.636 [INFO][4953] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--768f4c5c69--kf7tj-eth0 goldmane-768f4c5c69- calico-system 7495ee8b-4990-47af-bdd8-8d350506c7a6 816 0 2025-08-19 08:17:41 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:768f4c5c69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-768f4c5c69-kf7tj eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calia49177b4e5d [] [] }} ContainerID="5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" Namespace="calico-system" Pod="goldmane-768f4c5c69-kf7tj" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--kf7tj-" Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.636 [INFO][4953] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" Namespace="calico-system" Pod="goldmane-768f4c5c69-kf7tj" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--kf7tj-eth0" Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.699 [INFO][4971] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" HandleID="k8s-pod-network.5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" Workload="localhost-k8s-goldmane--768f4c5c69--kf7tj-eth0" Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.699 [INFO][4971] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" HandleID="k8s-pod-network.5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" Workload="localhost-k8s-goldmane--768f4c5c69--kf7tj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003500b0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-768f4c5c69-kf7tj", "timestamp":"2025-08-19 08:18:27.699502563 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.699 [INFO][4971] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.699 [INFO][4971] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.699 [INFO][4971] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.722 [INFO][4971] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" host="localhost" Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.730 [INFO][4971] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.746 [INFO][4971] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.749 [INFO][4971] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.752 [INFO][4971] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.752 [INFO][4971] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" host="localhost" Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.755 [INFO][4971] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.762 [INFO][4971] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" host="localhost" Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.782 [INFO][4971] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" host="localhost" Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.782 [INFO][4971] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" host="localhost" Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.783 [INFO][4971] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:18:27.831213 containerd[1567]: 2025-08-19 08:18:27.783 [INFO][4971] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" HandleID="k8s-pod-network.5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" Workload="localhost-k8s-goldmane--768f4c5c69--kf7tj-eth0" Aug 19 08:18:27.831974 containerd[1567]: 2025-08-19 08:18:27.786 [INFO][4953] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" Namespace="calico-system" Pod="goldmane-768f4c5c69-kf7tj" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--kf7tj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--768f4c5c69--kf7tj-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"7495ee8b-4990-47af-bdd8-8d350506c7a6", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-768f4c5c69-kf7tj", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia49177b4e5d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:27.831974 containerd[1567]: 2025-08-19 08:18:27.787 [INFO][4953] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" Namespace="calico-system" Pod="goldmane-768f4c5c69-kf7tj" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--kf7tj-eth0" Aug 19 08:18:27.831974 containerd[1567]: 2025-08-19 08:18:27.787 [INFO][4953] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia49177b4e5d ContainerID="5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" Namespace="calico-system" Pod="goldmane-768f4c5c69-kf7tj" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--kf7tj-eth0" Aug 19 08:18:27.831974 containerd[1567]: 2025-08-19 08:18:27.796 [INFO][4953] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" Namespace="calico-system" Pod="goldmane-768f4c5c69-kf7tj" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--kf7tj-eth0" Aug 19 08:18:27.831974 containerd[1567]: 2025-08-19 08:18:27.797 [INFO][4953] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" Namespace="calico-system" Pod="goldmane-768f4c5c69-kf7tj" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--kf7tj-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--768f4c5c69--kf7tj-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"7495ee8b-4990-47af-bdd8-8d350506c7a6", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a", Pod:"goldmane-768f4c5c69-kf7tj", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia49177b4e5d", MAC:"42:1c:b5:f3:6d:71", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:27.831974 containerd[1567]: 2025-08-19 08:18:27.819 [INFO][4953] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" Namespace="calico-system" Pod="goldmane-768f4c5c69-kf7tj" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--kf7tj-eth0" Aug 19 08:18:27.833811 kubelet[2703]: I0819 08:18:27.833665 2703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-qtqtf" podStartSLOduration=60.833638913 podStartE2EDuration="1m0.833638913s" podCreationTimestamp="2025-08-19 08:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:18:27.824814341 +0000 UTC m=+65.441161020" watchObservedRunningTime="2025-08-19 08:18:27.833638913 +0000 UTC m=+65.449985552" Aug 19 08:18:27.884051 containerd[1567]: time="2025-08-19T08:18:27.883794687Z" level=info msg="connecting to shim 5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a" address="unix:///run/containerd/s/c691e5fe47aee8620d3799171edfeaec2ca8627ab03cd374c640ab37236027ad" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:18:27.947783 systemd[1]: Started cri-containerd-5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a.scope - libcontainer container 5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a. 
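The IPAM entries above all resolve against the same host-affine block: the plugin confirms the affinity of "localhost" for 192.168.88.128/26 and then claims one address per workload (192.168.88.133 for coredns-668d6bf9bc-6kgnl, 192.168.88.134 for goldmane-768f4c5c69-kf7tj). A minimal Go sketch, purely illustrative and not Calico's implementation, that checks those claimed addresses against the block:

// block_check.go (hypothetical file name): confirm the addresses handed out in the
// log entries above fall inside the affine block 192.168.88.128/26 (covers .128-.191).
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.88.128/26") // block "localhost" holds the affinity for
	// Addresses reported by ipam_plugin.go 283 in the entries above.
	for _, s := range []string{"192.168.88.133", "192.168.88.134"} {
		addr := netip.MustParseAddr(s)
		fmt.Printf("%s in %s: %v\n", addr, block, block.Contains(addr))
	}
}

Both checks print true; the .135 and .136 addresses claimed further below come out of the same /26.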
Aug 19 08:18:27.965707 systemd-resolved[1402]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:18:27.971030 containerd[1567]: time="2025-08-19T08:18:27.970988664Z" level=info msg="StartContainer for \"9b40dbbfa9407019fc26beca478ebe3476eaa3389ffa87691a04ae709cf17066\" returns successfully" Aug 19 08:18:28.007773 systemd-networkd[1480]: cali50cfa0596fc: Gained IPv6LL Aug 19 08:18:28.015298 containerd[1567]: time="2025-08-19T08:18:28.015242018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-kf7tj,Uid:7495ee8b-4990-47af-bdd8-8d350506c7a6,Namespace:calico-system,Attempt:0,} returns sandbox id \"5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a\"" Aug 19 08:18:28.135717 systemd-networkd[1480]: calib2bd8cab91d: Gained IPv6LL Aug 19 08:18:28.505155 containerd[1567]: time="2025-08-19T08:18:28.505097399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fc54c9b-6gqqp,Uid:c6aeac62-957c-47bb-881b-da7164d33ac3,Namespace:calico-system,Attempt:0,}" Aug 19 08:18:28.638332 systemd-networkd[1480]: calif247c85ea82: Link UP Aug 19 08:18:28.638937 systemd-networkd[1480]: calif247c85ea82: Gained carrier Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.550 [INFO][5073] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--697fc54c9b--6gqqp-eth0 calico-kube-controllers-697fc54c9b- calico-system c6aeac62-957c-47bb-881b-da7164d33ac3 821 0 2025-08-19 08:17:41 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:697fc54c9b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-697fc54c9b-6gqqp eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calif247c85ea82 [] [] }} ContainerID="08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" Namespace="calico-system" Pod="calico-kube-controllers-697fc54c9b-6gqqp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fc54c9b--6gqqp-" Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.550 [INFO][5073] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" Namespace="calico-system" Pod="calico-kube-controllers-697fc54c9b-6gqqp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fc54c9b--6gqqp-eth0" Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.585 [INFO][5087] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" HandleID="k8s-pod-network.08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" Workload="localhost-k8s-calico--kube--controllers--697fc54c9b--6gqqp-eth0" Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.586 [INFO][5087] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" HandleID="k8s-pod-network.08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" Workload="localhost-k8s-calico--kube--controllers--697fc54c9b--6gqqp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002df720), Attrs:map[string]string{"namespace":"calico-system", 
"node":"localhost", "pod":"calico-kube-controllers-697fc54c9b-6gqqp", "timestamp":"2025-08-19 08:18:28.58594681 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.586 [INFO][5087] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.586 [INFO][5087] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.586 [INFO][5087] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.596 [INFO][5087] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" host="localhost" Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.602 [INFO][5087] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.608 [INFO][5087] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.610 [INFO][5087] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.613 [INFO][5087] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.613 [INFO][5087] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" host="localhost" Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.615 [INFO][5087] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6 Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.621 [INFO][5087] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" host="localhost" Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.630 [INFO][5087] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" host="localhost" Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.630 [INFO][5087] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" host="localhost" Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.630 [INFO][5087] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:18:28.665209 containerd[1567]: 2025-08-19 08:18:28.630 [INFO][5087] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" HandleID="k8s-pod-network.08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" Workload="localhost-k8s-calico--kube--controllers--697fc54c9b--6gqqp-eth0" Aug 19 08:18:28.666142 containerd[1567]: 2025-08-19 08:18:28.634 [INFO][5073] cni-plugin/k8s.go 418: Populated endpoint ContainerID="08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" Namespace="calico-system" Pod="calico-kube-controllers-697fc54c9b-6gqqp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fc54c9b--6gqqp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--697fc54c9b--6gqqp-eth0", GenerateName:"calico-kube-controllers-697fc54c9b-", Namespace:"calico-system", SelfLink:"", UID:"c6aeac62-957c-47bb-881b-da7164d33ac3", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"697fc54c9b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-697fc54c9b-6gqqp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif247c85ea82", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:28.666142 containerd[1567]: 2025-08-19 08:18:28.635 [INFO][5073] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" Namespace="calico-system" Pod="calico-kube-controllers-697fc54c9b-6gqqp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fc54c9b--6gqqp-eth0" Aug 19 08:18:28.666142 containerd[1567]: 2025-08-19 08:18:28.635 [INFO][5073] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif247c85ea82 ContainerID="08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" Namespace="calico-system" Pod="calico-kube-controllers-697fc54c9b-6gqqp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fc54c9b--6gqqp-eth0" Aug 19 08:18:28.666142 containerd[1567]: 2025-08-19 08:18:28.640 [INFO][5073] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" Namespace="calico-system" Pod="calico-kube-controllers-697fc54c9b-6gqqp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fc54c9b--6gqqp-eth0" Aug 19 08:18:28.666142 containerd[1567]: 2025-08-19 08:18:28.640 [INFO][5073] cni-plugin/k8s.go 446: Added Mac, 
interface name, and active container ID to endpoint ContainerID="08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" Namespace="calico-system" Pod="calico-kube-controllers-697fc54c9b-6gqqp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fc54c9b--6gqqp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--697fc54c9b--6gqqp-eth0", GenerateName:"calico-kube-controllers-697fc54c9b-", Namespace:"calico-system", SelfLink:"", UID:"c6aeac62-957c-47bb-881b-da7164d33ac3", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"697fc54c9b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6", Pod:"calico-kube-controllers-697fc54c9b-6gqqp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif247c85ea82", MAC:"5a:c7:23:e1:13:94", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:28.666142 containerd[1567]: 2025-08-19 08:18:28.659 [INFO][5073] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" Namespace="calico-system" Pod="calico-kube-controllers-697fc54c9b-6gqqp" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697fc54c9b--6gqqp-eth0" Aug 19 08:18:28.691642 containerd[1567]: time="2025-08-19T08:18:28.691565787Z" level=info msg="connecting to shim 08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6" address="unix:///run/containerd/s/24ab49569db1adefb72f38bd88b3738e418cd9ae7550c1a6ff000d76e1278e9f" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:18:28.724961 systemd[1]: Started cri-containerd-08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6.scope - libcontainer container 08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6. Aug 19 08:18:28.742090 systemd-resolved[1402]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:18:28.778957 containerd[1567]: time="2025-08-19T08:18:28.778800861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697fc54c9b-6gqqp,Uid:c6aeac62-957c-47bb-881b-da7164d33ac3,Namespace:calico-system,Attempt:0,} returns sandbox id \"08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6\"" Aug 19 08:18:29.543726 systemd-networkd[1480]: calia49177b4e5d: Gained IPv6LL Aug 19 08:18:30.365375 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3379364475.mount: Deactivated successfully. 
Aug 19 08:18:30.439730 systemd-networkd[1480]: calif247c85ea82: Gained IPv6LL Aug 19 08:18:30.505393 containerd[1567]: time="2025-08-19T08:18:30.505168939Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5865b8886b-h88ng,Uid:1bc364cb-8d71-4f96-9547-19e65218a8e6,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:18:30.783229 systemd-networkd[1480]: cali413f31ee0eb: Link UP Aug 19 08:18:30.784095 systemd-networkd[1480]: cali413f31ee0eb: Gained carrier Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.647 [INFO][5159] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5865b8886b--h88ng-eth0 calico-apiserver-5865b8886b- calico-apiserver 1bc364cb-8d71-4f96-9547-19e65218a8e6 813 0 2025-08-19 08:17:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5865b8886b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5865b8886b-h88ng eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali413f31ee0eb [] [] }} ContainerID="8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-h88ng" WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--h88ng-" Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.647 [INFO][5159] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-h88ng" WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--h88ng-eth0" Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.683 [INFO][5173] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" HandleID="k8s-pod-network.8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" Workload="localhost-k8s-calico--apiserver--5865b8886b--h88ng-eth0" Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.683 [INFO][5173] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" HandleID="k8s-pod-network.8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" Workload="localhost-k8s-calico--apiserver--5865b8886b--h88ng-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002df5f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5865b8886b-h88ng", "timestamp":"2025-08-19 08:18:30.683441634 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.683 [INFO][5173] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.683 [INFO][5173] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.683 [INFO][5173] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.694 [INFO][5173] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" host="localhost" Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.702 [INFO][5173] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.710 [INFO][5173] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.713 [INFO][5173] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.718 [INFO][5173] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.718 [INFO][5173] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" host="localhost" Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.720 [INFO][5173] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7 Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.738 [INFO][5173] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" host="localhost" Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.776 [INFO][5173] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" host="localhost" Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.776 [INFO][5173] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" host="localhost" Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.776 [INFO][5173] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:18:30.857934 containerd[1567]: 2025-08-19 08:18:30.776 [INFO][5173] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" HandleID="k8s-pod-network.8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" Workload="localhost-k8s-calico--apiserver--5865b8886b--h88ng-eth0" Aug 19 08:18:30.858801 containerd[1567]: 2025-08-19 08:18:30.780 [INFO][5159] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-h88ng" WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--h88ng-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5865b8886b--h88ng-eth0", GenerateName:"calico-apiserver-5865b8886b-", Namespace:"calico-apiserver", SelfLink:"", UID:"1bc364cb-8d71-4f96-9547-19e65218a8e6", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5865b8886b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5865b8886b-h88ng", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali413f31ee0eb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:30.858801 containerd[1567]: 2025-08-19 08:18:30.780 [INFO][5159] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-h88ng" WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--h88ng-eth0" Aug 19 08:18:30.858801 containerd[1567]: 2025-08-19 08:18:30.780 [INFO][5159] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali413f31ee0eb ContainerID="8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-h88ng" WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--h88ng-eth0" Aug 19 08:18:30.858801 containerd[1567]: 2025-08-19 08:18:30.784 [INFO][5159] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-h88ng" WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--h88ng-eth0" Aug 19 08:18:30.858801 containerd[1567]: 2025-08-19 08:18:30.784 [INFO][5159] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-h88ng" WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--h88ng-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5865b8886b--h88ng-eth0", GenerateName:"calico-apiserver-5865b8886b-", Namespace:"calico-apiserver", SelfLink:"", UID:"1bc364cb-8d71-4f96-9547-19e65218a8e6", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 17, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5865b8886b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7", Pod:"calico-apiserver-5865b8886b-h88ng", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali413f31ee0eb", MAC:"ce:bd:4b:d6:23:3a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:18:30.858801 containerd[1567]: 2025-08-19 08:18:30.854 [INFO][5159] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" Namespace="calico-apiserver" Pod="calico-apiserver-5865b8886b-h88ng" WorkloadEndpoint="localhost-k8s-calico--apiserver--5865b8886b--h88ng-eth0" Aug 19 08:18:30.949646 containerd[1567]: time="2025-08-19T08:18:30.949529001Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:30.969355 containerd[1567]: time="2025-08-19T08:18:30.969273420Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Aug 19 08:18:30.989300 containerd[1567]: time="2025-08-19T08:18:30.989196362Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:31.006205 containerd[1567]: time="2025-08-19T08:18:31.006109130Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:31.007296 containerd[1567]: time="2025-08-19T08:18:31.007204191Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest 
\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 3.400106769s" Aug 19 08:18:31.007296 containerd[1567]: time="2025-08-19T08:18:31.007271370Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Aug 19 08:18:31.011930 containerd[1567]: time="2025-08-19T08:18:31.011855575Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Aug 19 08:18:31.012935 containerd[1567]: time="2025-08-19T08:18:31.012911711Z" level=info msg="CreateContainer within sandbox \"7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Aug 19 08:18:31.024316 containerd[1567]: time="2025-08-19T08:18:31.024240253Z" level=info msg="connecting to shim 8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7" address="unix:///run/containerd/s/5d5a1235622302198558cd5e57ef7052bd0562e41fa5b3e1a74c47eb5a0f51f3" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:18:31.037957 containerd[1567]: time="2025-08-19T08:18:31.037764617Z" level=info msg="Container 247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:31.061748 containerd[1567]: time="2025-08-19T08:18:31.061655728Z" level=info msg="CreateContainer within sandbox \"7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\"" Aug 19 08:18:31.063358 containerd[1567]: time="2025-08-19T08:18:31.062715430Z" level=info msg="StartContainer for \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\"" Aug 19 08:18:31.065513 containerd[1567]: time="2025-08-19T08:18:31.065275882Z" level=info msg="connecting to shim 247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b" address="unix:///run/containerd/s/589603f3439c58959f116beb55d6e85b9688e975e6afd95f15800b4c65f24d99" protocol=ttrpc version=3 Aug 19 08:18:31.069853 systemd[1]: Started cri-containerd-8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7.scope - libcontainer container 8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7. Aug 19 08:18:31.098858 systemd[1]: Started cri-containerd-247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b.scope - libcontainer container 247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b. 
Aug 19 08:18:31.106565 systemd-resolved[1402]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:18:31.158117 containerd[1567]: time="2025-08-19T08:18:31.158054171Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5865b8886b-h88ng,Uid:1bc364cb-8d71-4f96-9547-19e65218a8e6,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7\"" Aug 19 08:18:31.164334 containerd[1567]: time="2025-08-19T08:18:31.164264996Z" level=info msg="CreateContainer within sandbox \"8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 08:18:31.177411 containerd[1567]: time="2025-08-19T08:18:31.177345318Z" level=info msg="StartContainer for \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\" returns successfully" Aug 19 08:18:31.187982 containerd[1567]: time="2025-08-19T08:18:31.187903632Z" level=info msg="Container 680c7d3e784d7cb513a3bc602f427ff43208597a263039b058d3b61673f8d022: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:31.202947 containerd[1567]: time="2025-08-19T08:18:31.202867888Z" level=info msg="CreateContainer within sandbox \"8ef012b2c4b40852350abcdf59eb3a4523f93cc6fb6cb96d684961b05ef72fb7\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"680c7d3e784d7cb513a3bc602f427ff43208597a263039b058d3b61673f8d022\"" Aug 19 08:18:31.204507 containerd[1567]: time="2025-08-19T08:18:31.203969091Z" level=info msg="StartContainer for \"680c7d3e784d7cb513a3bc602f427ff43208597a263039b058d3b61673f8d022\"" Aug 19 08:18:31.205992 containerd[1567]: time="2025-08-19T08:18:31.205949360Z" level=info msg="connecting to shim 680c7d3e784d7cb513a3bc602f427ff43208597a263039b058d3b61673f8d022" address="unix:///run/containerd/s/5d5a1235622302198558cd5e57ef7052bd0562e41fa5b3e1a74c47eb5a0f51f3" protocol=ttrpc version=3 Aug 19 08:18:31.241792 systemd[1]: Started cri-containerd-680c7d3e784d7cb513a3bc602f427ff43208597a263039b058d3b61673f8d022.scope - libcontainer container 680c7d3e784d7cb513a3bc602f427ff43208597a263039b058d3b61673f8d022. Aug 19 08:18:31.296864 containerd[1567]: time="2025-08-19T08:18:31.296669050Z" level=info msg="StartContainer for \"680c7d3e784d7cb513a3bc602f427ff43208597a263039b058d3b61673f8d022\" returns successfully" Aug 19 08:18:31.801553 containerd[1567]: time="2025-08-19T08:18:31.801495171Z" level=info msg="StopContainer for \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\" with timeout 30 (s)" Aug 19 08:18:31.802131 containerd[1567]: time="2025-08-19T08:18:31.801783293Z" level=info msg="StopContainer for \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\" with timeout 30 (s)" Aug 19 08:18:31.802865 containerd[1567]: time="2025-08-19T08:18:31.802528823Z" level=info msg="Stop container \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\" with signal terminated" Aug 19 08:18:31.802865 containerd[1567]: time="2025-08-19T08:18:31.802532621Z" level=info msg="Stop container \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\" with signal terminated" Aug 19 08:18:31.815125 systemd[1]: cri-containerd-247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b.scope: Deactivated successfully. 
Aug 19 08:18:31.817492 containerd[1567]: time="2025-08-19T08:18:31.817215016Z" level=info msg="TaskExit event in podsandbox handler container_id:\"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\" id:\"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\" pid:5247 exit_status:2 exited_at:{seconds:1755591511 nanos:816247641}" Aug 19 08:18:31.817492 containerd[1567]: time="2025-08-19T08:18:31.817325669Z" level=info msg="received exit event container_id:\"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\" id:\"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\" pid:5247 exit_status:2 exited_at:{seconds:1755591511 nanos:816247641}" Aug 19 08:18:31.827845 systemd[1]: cri-containerd-4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5.scope: Deactivated successfully. Aug 19 08:18:31.828778 containerd[1567]: time="2025-08-19T08:18:31.828727832Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\" id:\"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\" pid:4651 exited_at:{seconds:1755591511 nanos:828210019}" Aug 19 08:18:31.828947 containerd[1567]: time="2025-08-19T08:18:31.828876367Z" level=info msg="received exit event container_id:\"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\" id:\"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\" pid:4651 exited_at:{seconds:1755591511 nanos:828210019}" Aug 19 08:18:32.036105 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b-rootfs.mount: Deactivated successfully. Aug 19 08:18:32.036223 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5-rootfs.mount: Deactivated successfully. Aug 19 08:18:32.454863 systemd[1]: Started sshd@14-10.0.0.123:22-10.0.0.1:49792.service - OpenSSH per-connection server daemon (10.0.0.1:49792). Aug 19 08:18:32.487705 systemd-networkd[1480]: cali413f31ee0eb: Gained IPv6LL Aug 19 08:18:32.641067 sshd[5351]: Accepted publickey for core from 10.0.0.1 port 49792 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:32.642660 sshd-session[5351]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:32.647036 systemd-logind[1541]: New session 15 of user core. Aug 19 08:18:32.653580 systemd[1]: Started session-15.scope - Session 15 of User core. 
Aug 19 08:18:32.691862 kubelet[2703]: I0819 08:18:32.691741 2703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-84fb8898cd-bc727" podStartSLOduration=33.850365964 podStartE2EDuration="47.691719683s" podCreationTimestamp="2025-08-19 08:17:45 +0000 UTC" firstStartedPulling="2025-08-19 08:18:17.169236398 +0000 UTC m=+54.785583037" lastFinishedPulling="2025-08-19 08:18:31.010590117 +0000 UTC m=+68.626936756" observedRunningTime="2025-08-19 08:18:31.975251402 +0000 UTC m=+69.591598041" watchObservedRunningTime="2025-08-19 08:18:32.691719683 +0000 UTC m=+70.308066312" Aug 19 08:18:32.692560 kubelet[2703]: I0819 08:18:32.691878 2703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5865b8886b-h88ng" podStartSLOduration=55.691873086 podStartE2EDuration="55.691873086s" podCreationTimestamp="2025-08-19 08:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:18:32.691267515 +0000 UTC m=+70.307614155" watchObservedRunningTime="2025-08-19 08:18:32.691873086 +0000 UTC m=+70.308219725" Aug 19 08:18:33.076165 sshd[5355]: Connection closed by 10.0.0.1 port 49792 Aug 19 08:18:33.077711 sshd-session[5351]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:33.082223 containerd[1567]: time="2025-08-19T08:18:33.081784600Z" level=info msg="StopContainer for \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\" returns successfully" Aug 19 08:18:33.082181 systemd[1]: sshd@14-10.0.0.123:22-10.0.0.1:49792.service: Deactivated successfully. Aug 19 08:18:33.083529 containerd[1567]: time="2025-08-19T08:18:33.083428099Z" level=info msg="StopContainer for \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\" returns successfully" Aug 19 08:18:33.085911 systemd[1]: session-15.scope: Deactivated successfully. Aug 19 08:18:33.088563 systemd-logind[1541]: Session 15 logged out. Waiting for processes to exit. Aug 19 08:18:33.092139 systemd-logind[1541]: Removed session 15. Aug 19 08:18:33.104928 containerd[1567]: time="2025-08-19T08:18:33.104861377Z" level=info msg="StopPodSandbox for \"7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918\"" Aug 19 08:18:33.111897 containerd[1567]: time="2025-08-19T08:18:33.111828576Z" level=info msg="Container to stop \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Aug 19 08:18:33.111897 containerd[1567]: time="2025-08-19T08:18:33.111892479Z" level=info msg="Container to stop \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Aug 19 08:18:33.121876 systemd[1]: cri-containerd-7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918.scope: Deactivated successfully. Aug 19 08:18:33.123067 containerd[1567]: time="2025-08-19T08:18:33.123023427Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918\" id:\"7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918\" pid:4282 exit_status:137 exited_at:{seconds:1755591513 nanos:122332703}" Aug 19 08:18:33.159149 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918-rootfs.mount: Deactivated successfully. 
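The two pod_startup_latency_tracker entries above are arithmetically consistent with the timestamps they carry: for whisker-84fb8898cd-bc727, podStartE2EDuration (47.691719683s) equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration (33.850365964s) equals that E2E figure minus the image-pull window (firstStartedPulling to lastFinishedPulling). A small Go sketch reproducing the figures from the logged timestamps (illustrative arithmetic only, not kubelet code):

package main

import (
	"fmt"
	"time"
)

func main() {
	// time.Parse accepts the fractional seconds in the inputs even though the layout omits them.
	const layout = "2006-01-02 15:04:05 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	// Timestamps copied from the whisker-84fb8898cd-bc727 entry above.
	created := parse("2025-08-19 08:17:45 +0000 UTC")            // podCreationTimestamp
	observed := parse("2025-08-19 08:18:32.691719683 +0000 UTC") // watchObservedRunningTime
	pullStart := parse("2025-08-19 08:18:17.169236398 +0000 UTC")
	pullEnd := parse("2025-08-19 08:18:31.010590117 +0000 UTC")

	e2e := observed.Sub(created)        // 47.691719683s, matching podStartE2EDuration
	slo := e2e - pullEnd.Sub(pullStart) // 33.850365964s, matching podStartSLOduration
	fmt.Println(e2e, slo)
}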
Aug 19 08:18:33.159896 containerd[1567]: time="2025-08-19T08:18:33.159809983Z" level=info msg="shim disconnected" id=7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918 namespace=k8s.io Aug 19 08:18:33.160260 containerd[1567]: time="2025-08-19T08:18:33.160019384Z" level=warning msg="cleaning up after shim disconnected" id=7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918 namespace=k8s.io Aug 19 08:18:33.160260 containerd[1567]: time="2025-08-19T08:18:33.160039032Z" level=info msg="cleaning up dead shim" namespace=k8s.io Aug 19 08:18:33.234206 containerd[1567]: time="2025-08-19T08:18:33.234149483Z" level=info msg="received exit event sandbox_id:\"7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918\" exit_status:137 exited_at:{seconds:1755591513 nanos:122332703}" Aug 19 08:18:33.234612 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918-shm.mount: Deactivated successfully. Aug 19 08:18:33.285807 systemd-networkd[1480]: calidc3d4c31899: Link DOWN Aug 19 08:18:33.285821 systemd-networkd[1480]: calidc3d4c31899: Lost carrier Aug 19 08:18:33.396514 containerd[1567]: 2025-08-19 08:18:33.282 [INFO][5420] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Aug 19 08:18:33.396514 containerd[1567]: 2025-08-19 08:18:33.283 [INFO][5420] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" iface="eth0" netns="/var/run/netns/cni-b73f9b19-304f-7f79-8168-a852c50531d4" Aug 19 08:18:33.396514 containerd[1567]: 2025-08-19 08:18:33.283 [INFO][5420] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" iface="eth0" netns="/var/run/netns/cni-b73f9b19-304f-7f79-8168-a852c50531d4" Aug 19 08:18:33.396514 containerd[1567]: 2025-08-19 08:18:33.293 [INFO][5420] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" after=10.436878ms iface="eth0" netns="/var/run/netns/cni-b73f9b19-304f-7f79-8168-a852c50531d4" Aug 19 08:18:33.396514 containerd[1567]: 2025-08-19 08:18:33.293 [INFO][5420] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Aug 19 08:18:33.396514 containerd[1567]: 2025-08-19 08:18:33.293 [INFO][5420] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Aug 19 08:18:33.396514 containerd[1567]: 2025-08-19 08:18:33.328 [INFO][5430] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" HandleID="k8s-pod-network.7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Workload="localhost-k8s-whisker--84fb8898cd--bc727-eth0" Aug 19 08:18:33.396514 containerd[1567]: 2025-08-19 08:18:33.328 [INFO][5430] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:18:33.396514 containerd[1567]: 2025-08-19 08:18:33.328 [INFO][5430] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:18:33.396514 containerd[1567]: 2025-08-19 08:18:33.387 [INFO][5430] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" HandleID="k8s-pod-network.7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Workload="localhost-k8s-whisker--84fb8898cd--bc727-eth0" Aug 19 08:18:33.396514 containerd[1567]: 2025-08-19 08:18:33.387 [INFO][5430] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" HandleID="k8s-pod-network.7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Workload="localhost-k8s-whisker--84fb8898cd--bc727-eth0" Aug 19 08:18:33.396514 containerd[1567]: 2025-08-19 08:18:33.389 [INFO][5430] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:18:33.396514 containerd[1567]: 2025-08-19 08:18:33.392 [INFO][5420] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918" Aug 19 08:18:33.401422 systemd[1]: run-netns-cni\x2db73f9b19\x2d304f\x2d7f79\x2d8168\x2da852c50531d4.mount: Deactivated successfully. Aug 19 08:18:33.406098 containerd[1567]: time="2025-08-19T08:18:33.406021554Z" level=info msg="TearDown network for sandbox \"7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918\" successfully" Aug 19 08:18:33.406098 containerd[1567]: time="2025-08-19T08:18:33.406087540Z" level=info msg="StopPodSandbox for \"7e41deee28c4a9a3716cf6381967528d1e522bbc3866732753897f1907ab0918\" returns successfully" Aug 19 08:18:33.507967 kubelet[2703]: I0819 08:18:33.507886 2703 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/50a4115f-7018-4a8c-a733-cdd9909ab0b4-whisker-backend-key-pair\") pod \"50a4115f-7018-4a8c-a733-cdd9909ab0b4\" (UID: \"50a4115f-7018-4a8c-a733-cdd9909ab0b4\") " Aug 19 08:18:33.507967 kubelet[2703]: I0819 08:18:33.507961 2703 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsllf\" (UniqueName: \"kubernetes.io/projected/50a4115f-7018-4a8c-a733-cdd9909ab0b4-kube-api-access-nsllf\") pod \"50a4115f-7018-4a8c-a733-cdd9909ab0b4\" (UID: \"50a4115f-7018-4a8c-a733-cdd9909ab0b4\") " Aug 19 08:18:33.513823 kubelet[2703]: I0819 08:18:33.513713 2703 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a4115f-7018-4a8c-a733-cdd9909ab0b4-kube-api-access-nsllf" (OuterVolumeSpecName: "kube-api-access-nsllf") pod "50a4115f-7018-4a8c-a733-cdd9909ab0b4" (UID: "50a4115f-7018-4a8c-a733-cdd9909ab0b4"). InnerVolumeSpecName "kube-api-access-nsllf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Aug 19 08:18:33.515916 kubelet[2703]: I0819 08:18:33.515831 2703 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50a4115f-7018-4a8c-a733-cdd9909ab0b4-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "50a4115f-7018-4a8c-a733-cdd9909ab0b4" (UID: "50a4115f-7018-4a8c-a733-cdd9909ab0b4"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Aug 19 08:18:33.517579 systemd[1]: var-lib-kubelet-pods-50a4115f\x2d7018\x2d4a8c\x2da733\x2dcdd9909ab0b4-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dnsllf.mount: Deactivated successfully. 
Aug 19 08:18:33.521242 systemd[1]: var-lib-kubelet-pods-50a4115f\x2d7018\x2d4a8c\x2da733\x2dcdd9909ab0b4-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Aug 19 08:18:33.608318 kubelet[2703]: I0819 08:18:33.608242 2703 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50a4115f-7018-4a8c-a733-cdd9909ab0b4-whisker-ca-bundle\") pod \"50a4115f-7018-4a8c-a733-cdd9909ab0b4\" (UID: \"50a4115f-7018-4a8c-a733-cdd9909ab0b4\") " Aug 19 08:18:33.608571 kubelet[2703]: I0819 08:18:33.608356 2703 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/50a4115f-7018-4a8c-a733-cdd9909ab0b4-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Aug 19 08:18:33.608571 kubelet[2703]: I0819 08:18:33.608372 2703 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nsllf\" (UniqueName: \"kubernetes.io/projected/50a4115f-7018-4a8c-a733-cdd9909ab0b4-kube-api-access-nsllf\") on node \"localhost\" DevicePath \"\"" Aug 19 08:18:33.609050 kubelet[2703]: I0819 08:18:33.608958 2703 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50a4115f-7018-4a8c-a733-cdd9909ab0b4-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "50a4115f-7018-4a8c-a733-cdd9909ab0b4" (UID: "50a4115f-7018-4a8c-a733-cdd9909ab0b4"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Aug 19 08:18:33.708842 kubelet[2703]: I0819 08:18:33.708657 2703 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50a4115f-7018-4a8c-a733-cdd9909ab0b4-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Aug 19 08:18:33.811117 kubelet[2703]: I0819 08:18:33.811067 2703 scope.go:117] "RemoveContainer" containerID="247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b" Aug 19 08:18:33.813271 containerd[1567]: time="2025-08-19T08:18:33.813226771Z" level=info msg="RemoveContainer for \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\"" Aug 19 08:18:33.822234 systemd[1]: Removed slice kubepods-besteffort-pod50a4115f_7018_4a8c_a733_cdd9909ab0b4.slice - libcontainer container kubepods-besteffort-pod50a4115f_7018_4a8c_a733_cdd9909ab0b4.slice. 
Aug 19 08:18:33.987896 containerd[1567]: time="2025-08-19T08:18:33.987690497Z" level=info msg="RemoveContainer for \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\" returns successfully" Aug 19 08:18:33.993828 kubelet[2703]: I0819 08:18:33.993744 2703 scope.go:117] "RemoveContainer" containerID="4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5" Aug 19 08:18:33.997871 containerd[1567]: time="2025-08-19T08:18:33.997800118Z" level=info msg="RemoveContainer for \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\"" Aug 19 08:18:34.008372 containerd[1567]: time="2025-08-19T08:18:34.008305610Z" level=info msg="RemoveContainer for \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\" returns successfully" Aug 19 08:18:34.008757 kubelet[2703]: I0819 08:18:34.008713 2703 scope.go:117] "RemoveContainer" containerID="247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b" Aug 19 08:18:34.009178 containerd[1567]: time="2025-08-19T08:18:34.009128335Z" level=error msg="ContainerStatus for \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\": not found" Aug 19 08:18:34.009408 kubelet[2703]: E0819 08:18:34.009357 2703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\": not found" containerID="247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b" Aug 19 08:18:34.009639 kubelet[2703]: I0819 08:18:34.009408 2703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b"} err="failed to get container status \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\": rpc error: code = NotFound desc = an error occurred when try to find container \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\": not found" Aug 19 08:18:34.009639 kubelet[2703]: I0819 08:18:34.009613 2703 scope.go:117] "RemoveContainer" containerID="4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5" Aug 19 08:18:34.010339 containerd[1567]: time="2025-08-19T08:18:34.009946342Z" level=error msg="ContainerStatus for \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\": not found" Aug 19 08:18:34.010439 kubelet[2703]: E0819 08:18:34.010297 2703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\": not found" containerID="4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5" Aug 19 08:18:34.010439 kubelet[2703]: I0819 08:18:34.010324 2703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5"} err="failed to get container status \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\": rpc error: code = NotFound desc = an error occurred when try to find container 
\"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\": not found" Aug 19 08:18:34.010439 kubelet[2703]: I0819 08:18:34.010343 2703 scope.go:117] "RemoveContainer" containerID="247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b" Aug 19 08:18:34.011395 containerd[1567]: time="2025-08-19T08:18:34.010509901Z" level=error msg="ContainerStatus for \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\": not found" Aug 19 08:18:34.011537 kubelet[2703]: I0819 08:18:34.011398 2703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b"} err="failed to get container status \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\": rpc error: code = NotFound desc = an error occurred when try to find container \"247e45b0044c2b93b7a2951abfc19244a849390aa44c97eba6107952e665284b\": not found" Aug 19 08:18:34.011537 kubelet[2703]: I0819 08:18:34.011525 2703 scope.go:117] "RemoveContainer" containerID="4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5" Aug 19 08:18:34.012648 containerd[1567]: time="2025-08-19T08:18:34.012588662Z" level=error msg="ContainerStatus for \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\": not found" Aug 19 08:18:34.013325 kubelet[2703]: I0819 08:18:34.012855 2703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5"} err="failed to get container status \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\": rpc error: code = NotFound desc = an error occurred when try to find container \"4af13613bda278fbec6130e8a4306460303c4009f63e74adb49fd213765fbff5\": not found" Aug 19 08:18:34.508004 kubelet[2703]: I0819 08:18:34.507927 2703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a4115f-7018-4a8c-a733-cdd9909ab0b4" path="/var/lib/kubelet/pods/50a4115f-7018-4a8c-a733-cdd9909ab0b4/volumes" Aug 19 08:18:34.795956 containerd[1567]: time="2025-08-19T08:18:34.795746996Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:34.798042 containerd[1567]: time="2025-08-19T08:18:34.797946779Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784" Aug 19 08:18:34.802124 containerd[1567]: time="2025-08-19T08:18:34.802031331Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:34.806675 containerd[1567]: time="2025-08-19T08:18:34.806588858Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:34.807231 containerd[1567]: time="2025-08-19T08:18:34.807184288Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 3.795281474s" Aug 19 08:18:34.807297 containerd[1567]: time="2025-08-19T08:18:34.807233292Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\"" Aug 19 08:18:34.810863 containerd[1567]: time="2025-08-19T08:18:34.810657289Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Aug 19 08:18:34.810863 containerd[1567]: time="2025-08-19T08:18:34.810840359Z" level=info msg="CreateContainer within sandbox \"68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Aug 19 08:18:34.825027 containerd[1567]: time="2025-08-19T08:18:34.824953697Z" level=info msg="Container 98931b7a10c99c83dce0770dc4d791fb997fe70a277ff8fc28f4b854eef5bfa7: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:34.843441 containerd[1567]: time="2025-08-19T08:18:34.843348734Z" level=info msg="CreateContainer within sandbox \"68e3c3369efc79318288945be5699410b3296713b179c6a20b7aa124b5871bff\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"98931b7a10c99c83dce0770dc4d791fb997fe70a277ff8fc28f4b854eef5bfa7\"" Aug 19 08:18:34.844313 containerd[1567]: time="2025-08-19T08:18:34.844259117Z" level=info msg="StartContainer for \"98931b7a10c99c83dce0770dc4d791fb997fe70a277ff8fc28f4b854eef5bfa7\"" Aug 19 08:18:34.846563 containerd[1567]: time="2025-08-19T08:18:34.846530847Z" level=info msg="connecting to shim 98931b7a10c99c83dce0770dc4d791fb997fe70a277ff8fc28f4b854eef5bfa7" address="unix:///run/containerd/s/8edcd914c2457332ea2c8775808f63264465ce964bdb36bc427cd0ac2849a856" protocol=ttrpc version=3 Aug 19 08:18:34.878751 systemd[1]: Started cri-containerd-98931b7a10c99c83dce0770dc4d791fb997fe70a277ff8fc28f4b854eef5bfa7.scope - libcontainer container 98931b7a10c99c83dce0770dc4d791fb997fe70a277ff8fc28f4b854eef5bfa7. 
Aug 19 08:18:35.202181 containerd[1567]: time="2025-08-19T08:18:35.202106069Z" level=info msg="StartContainer for \"98931b7a10c99c83dce0770dc4d791fb997fe70a277ff8fc28f4b854eef5bfa7\" returns successfully" Aug 19 08:18:35.601595 kubelet[2703]: I0819 08:18:35.601403 2703 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Aug 19 08:18:35.601595 kubelet[2703]: I0819 08:18:35.601473 2703 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Aug 19 08:18:35.841939 kubelet[2703]: I0819 08:18:35.841791 2703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-fmzrp" podStartSLOduration=37.745495996 podStartE2EDuration="54.841773201s" podCreationTimestamp="2025-08-19 08:17:41 +0000 UTC" firstStartedPulling="2025-08-19 08:18:17.712325079 +0000 UTC m=+55.328671718" lastFinishedPulling="2025-08-19 08:18:34.808602284 +0000 UTC m=+72.424948923" observedRunningTime="2025-08-19 08:18:35.841659684 +0000 UTC m=+73.458006323" watchObservedRunningTime="2025-08-19 08:18:35.841773201 +0000 UTC m=+73.458119830" Aug 19 08:18:38.092881 systemd[1]: Started sshd@15-10.0.0.123:22-10.0.0.1:49796.service - OpenSSH per-connection server daemon (10.0.0.1:49796). Aug 19 08:18:38.177639 sshd[5492]: Accepted publickey for core from 10.0.0.1 port 49796 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:38.181256 sshd-session[5492]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:38.188812 systemd-logind[1541]: New session 16 of user core. Aug 19 08:18:38.199911 systemd[1]: Started session-16.scope - Session 16 of User core. Aug 19 08:18:38.346297 sshd[5495]: Connection closed by 10.0.0.1 port 49796 Aug 19 08:18:38.346814 sshd-session[5492]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:38.353144 systemd[1]: sshd@15-10.0.0.123:22-10.0.0.1:49796.service: Deactivated successfully. Aug 19 08:18:38.355689 systemd[1]: session-16.scope: Deactivated successfully. Aug 19 08:18:38.356703 systemd-logind[1541]: Session 16 logged out. Waiting for processes to exit. Aug 19 08:18:38.358169 systemd-logind[1541]: Removed session 16. Aug 19 08:18:39.558009 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2042392971.mount: Deactivated successfully. 
Aug 19 08:18:40.151538 containerd[1567]: time="2025-08-19T08:18:40.151424990Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:40.152362 containerd[1567]: time="2025-08-19T08:18:40.152332773Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308" Aug 19 08:18:40.154751 containerd[1567]: time="2025-08-19T08:18:40.154678261Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:40.159688 containerd[1567]: time="2025-08-19T08:18:40.159586452Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:40.160798 containerd[1567]: time="2025-08-19T08:18:40.160740025Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 5.350039032s" Aug 19 08:18:40.160798 containerd[1567]: time="2025-08-19T08:18:40.160792505Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Aug 19 08:18:40.163871 containerd[1567]: time="2025-08-19T08:18:40.163806620Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Aug 19 08:18:40.166535 containerd[1567]: time="2025-08-19T08:18:40.166489061Z" level=info msg="CreateContainer within sandbox \"5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Aug 19 08:18:40.182726 containerd[1567]: time="2025-08-19T08:18:40.182644225Z" level=info msg="Container 6ec0144600165c7265dfe839c07ae73ed22e9ae48da6a8befca5e21085b7f84c: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:40.197612 containerd[1567]: time="2025-08-19T08:18:40.197542920Z" level=info msg="CreateContainer within sandbox \"5ac04579b1d2d60f0d4be6290ec84b574e0910bcd21d49cf61d8b10885dd2d6a\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"6ec0144600165c7265dfe839c07ae73ed22e9ae48da6a8befca5e21085b7f84c\"" Aug 19 08:18:40.198258 containerd[1567]: time="2025-08-19T08:18:40.198186179Z" level=info msg="StartContainer for \"6ec0144600165c7265dfe839c07ae73ed22e9ae48da6a8befca5e21085b7f84c\"" Aug 19 08:18:40.199715 containerd[1567]: time="2025-08-19T08:18:40.199674891Z" level=info msg="connecting to shim 6ec0144600165c7265dfe839c07ae73ed22e9ae48da6a8befca5e21085b7f84c" address="unix:///run/containerd/s/c691e5fe47aee8620d3799171edfeaec2ca8627ab03cd374c640ab37236027ad" protocol=ttrpc version=3 Aug 19 08:18:40.236630 systemd[1]: Started cri-containerd-6ec0144600165c7265dfe839c07ae73ed22e9ae48da6a8befca5e21085b7f84c.scope - libcontainer container 6ec0144600165c7265dfe839c07ae73ed22e9ae48da6a8befca5e21085b7f84c. 
Aug 19 08:18:40.358836 containerd[1567]: time="2025-08-19T08:18:40.358772207Z" level=info msg="StartContainer for \"6ec0144600165c7265dfe839c07ae73ed22e9ae48da6a8befca5e21085b7f84c\" returns successfully" Aug 19 08:18:40.887402 kubelet[2703]: I0819 08:18:40.887160 2703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-768f4c5c69-kf7tj" podStartSLOduration=47.740143989 podStartE2EDuration="59.887138308s" podCreationTimestamp="2025-08-19 08:17:41 +0000 UTC" firstStartedPulling="2025-08-19 08:18:28.016434569 +0000 UTC m=+65.632781208" lastFinishedPulling="2025-08-19 08:18:40.163428888 +0000 UTC m=+77.779775527" observedRunningTime="2025-08-19 08:18:40.886870887 +0000 UTC m=+78.503217526" watchObservedRunningTime="2025-08-19 08:18:40.887138308 +0000 UTC m=+78.503484947" Aug 19 08:18:42.000650 containerd[1567]: time="2025-08-19T08:18:42.000595044Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6ec0144600165c7265dfe839c07ae73ed22e9ae48da6a8befca5e21085b7f84c\" id:\"3922eaf222f5a78bb8649f14de884e941d5417513cd3335904596499a3caef1d\" pid:5579 exit_status:1 exited_at:{seconds:1755591522 nanos:75642}" Aug 19 08:18:43.059019 containerd[1567]: time="2025-08-19T08:18:43.058950460Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6ec0144600165c7265dfe839c07ae73ed22e9ae48da6a8befca5e21085b7f84c\" id:\"b76caaeea456aae527812f19f55032139ef8110e9c9eb10b5306c0664ba171a4\" pid:5605 exit_status:1 exited_at:{seconds:1755591523 nanos:58590684}" Aug 19 08:18:43.363163 systemd[1]: Started sshd@16-10.0.0.123:22-10.0.0.1:35318.service - OpenSSH per-connection server daemon (10.0.0.1:35318). Aug 19 08:18:43.457431 sshd[5622]: Accepted publickey for core from 10.0.0.1 port 35318 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:43.460011 sshd-session[5622]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:43.466871 systemd-logind[1541]: New session 17 of user core. Aug 19 08:18:43.473778 systemd[1]: Started session-17.scope - Session 17 of User core. Aug 19 08:18:43.633104 sshd[5626]: Connection closed by 10.0.0.1 port 35318 Aug 19 08:18:43.633606 sshd-session[5622]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:43.639363 systemd[1]: sshd@16-10.0.0.123:22-10.0.0.1:35318.service: Deactivated successfully. Aug 19 08:18:43.642862 systemd[1]: session-17.scope: Deactivated successfully. Aug 19 08:18:43.644057 systemd-logind[1541]: Session 17 logged out. Waiting for processes to exit. Aug 19 08:18:43.647321 systemd-logind[1541]: Removed session 17. 
Aug 19 08:18:45.597214 containerd[1567]: time="2025-08-19T08:18:45.597106890Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:45.598027 containerd[1567]: time="2025-08-19T08:18:45.597967590Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688" Aug 19 08:18:45.599373 containerd[1567]: time="2025-08-19T08:18:45.599332460Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:45.601590 containerd[1567]: time="2025-08-19T08:18:45.601534105Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:18:45.602278 containerd[1567]: time="2025-08-19T08:18:45.602226734Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 5.438358036s" Aug 19 08:18:45.602278 containerd[1567]: time="2025-08-19T08:18:45.602276720Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\"" Aug 19 08:18:45.623138 containerd[1567]: time="2025-08-19T08:18:45.623076328Z" level=info msg="CreateContainer within sandbox \"08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Aug 19 08:18:45.632209 containerd[1567]: time="2025-08-19T08:18:45.632155685Z" level=info msg="Container eb23adf8aaadb88f25d9552c72c9dc0702fe26b80b3d7a0a64542a33ecfc60ea: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:18:45.645721 containerd[1567]: time="2025-08-19T08:18:45.645670474Z" level=info msg="CreateContainer within sandbox \"08c81e6a4da943e590f60f832a8f7b54d7efb3e848656d5dee1a717328a6a4b6\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"eb23adf8aaadb88f25d9552c72c9dc0702fe26b80b3d7a0a64542a33ecfc60ea\"" Aug 19 08:18:45.646566 containerd[1567]: time="2025-08-19T08:18:45.646530072Z" level=info msg="StartContainer for \"eb23adf8aaadb88f25d9552c72c9dc0702fe26b80b3d7a0a64542a33ecfc60ea\"" Aug 19 08:18:45.647569 containerd[1567]: time="2025-08-19T08:18:45.647546959Z" level=info msg="connecting to shim eb23adf8aaadb88f25d9552c72c9dc0702fe26b80b3d7a0a64542a33ecfc60ea" address="unix:///run/containerd/s/24ab49569db1adefb72f38bd88b3738e418cd9ae7550c1a6ff000d76e1278e9f" protocol=ttrpc version=3 Aug 19 08:18:45.672996 systemd[1]: Started cri-containerd-eb23adf8aaadb88f25d9552c72c9dc0702fe26b80b3d7a0a64542a33ecfc60ea.scope - libcontainer container eb23adf8aaadb88f25d9552c72c9dc0702fe26b80b3d7a0a64542a33ecfc60ea. 
Aug 19 08:18:45.741730 containerd[1567]: time="2025-08-19T08:18:45.741667900Z" level=info msg="StartContainer for \"eb23adf8aaadb88f25d9552c72c9dc0702fe26b80b3d7a0a64542a33ecfc60ea\" returns successfully" Aug 19 08:18:45.805921 containerd[1567]: time="2025-08-19T08:18:45.805877884Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e618d8761d2dc2cd6ed84bb627d26381ea77483e57cbe259654c129c7dae7514\" id:\"8d30614d3932a3a8df17e0f86c5317649c17d8ca1d0e122c3c37af7a81c3b72c\" pid:5678 exited_at:{seconds:1755591525 nanos:805548336}" Aug 19 08:18:45.909823 containerd[1567]: time="2025-08-19T08:18:45.909762827Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eb23adf8aaadb88f25d9552c72c9dc0702fe26b80b3d7a0a64542a33ecfc60ea\" id:\"121739dd23652b2a8bc04215b533b4c1bfa15edab3487930be2d0baa54ede459\" pid:5725 exit_status:1 exited_at:{seconds:1755591525 nanos:909436916}" Aug 19 08:18:46.072938 kubelet[2703]: I0819 08:18:46.072859 2703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-697fc54c9b-6gqqp" podStartSLOduration=48.253170905 podStartE2EDuration="1m5.072837587s" podCreationTimestamp="2025-08-19 08:17:41 +0000 UTC" firstStartedPulling="2025-08-19 08:18:28.78352975 +0000 UTC m=+66.399876389" lastFinishedPulling="2025-08-19 08:18:45.603196432 +0000 UTC m=+83.219543071" observedRunningTime="2025-08-19 08:18:46.072297057 +0000 UTC m=+83.688643696" watchObservedRunningTime="2025-08-19 08:18:46.072837587 +0000 UTC m=+83.689184226" Aug 19 08:18:46.918714 containerd[1567]: time="2025-08-19T08:18:46.918585702Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eb23adf8aaadb88f25d9552c72c9dc0702fe26b80b3d7a0a64542a33ecfc60ea\" id:\"f01eb299a87c43d96432c48be3f05fe1f12ca1b1ac4d23a9b0c4b820bfce9077\" pid:5754 exited_at:{seconds:1755591526 nanos:918375813}" Aug 19 08:18:48.652916 systemd[1]: Started sshd@17-10.0.0.123:22-10.0.0.1:35328.service - OpenSSH per-connection server daemon (10.0.0.1:35328). Aug 19 08:18:48.730009 sshd[5766]: Accepted publickey for core from 10.0.0.1 port 35328 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:48.732215 sshd-session[5766]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:48.737490 systemd-logind[1541]: New session 18 of user core. Aug 19 08:18:48.748647 systemd[1]: Started session-18.scope - Session 18 of User core. Aug 19 08:18:48.878943 sshd[5769]: Connection closed by 10.0.0.1 port 35328 Aug 19 08:18:48.879352 sshd-session[5766]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:48.886132 systemd[1]: sshd@17-10.0.0.123:22-10.0.0.1:35328.service: Deactivated successfully. Aug 19 08:18:48.888536 systemd[1]: session-18.scope: Deactivated successfully. Aug 19 08:18:48.890171 systemd-logind[1541]: Session 18 logged out. Waiting for processes to exit. Aug 19 08:18:48.892214 systemd-logind[1541]: Removed session 18. Aug 19 08:18:53.895789 systemd[1]: Started sshd@18-10.0.0.123:22-10.0.0.1:36742.service - OpenSSH per-connection server daemon (10.0.0.1:36742). Aug 19 08:18:53.961194 sshd[5785]: Accepted publickey for core from 10.0.0.1 port 36742 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:53.963079 sshd-session[5785]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:53.968321 systemd-logind[1541]: New session 19 of user core. Aug 19 08:18:53.979753 systemd[1]: Started session-19.scope - Session 19 of User core. 
Aug 19 08:18:54.116032 sshd[5788]: Connection closed by 10.0.0.1 port 36742 Aug 19 08:18:54.116562 sshd-session[5785]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:54.126806 systemd[1]: sshd@18-10.0.0.123:22-10.0.0.1:36742.service: Deactivated successfully. Aug 19 08:18:54.129362 systemd[1]: session-19.scope: Deactivated successfully. Aug 19 08:18:54.130413 systemd-logind[1541]: Session 19 logged out. Waiting for processes to exit. Aug 19 08:18:54.133991 systemd[1]: Started sshd@19-10.0.0.123:22-10.0.0.1:36744.service - OpenSSH per-connection server daemon (10.0.0.1:36744). Aug 19 08:18:54.135273 systemd-logind[1541]: Removed session 19. Aug 19 08:18:54.193853 sshd[5801]: Accepted publickey for core from 10.0.0.1 port 36744 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:54.195288 sshd-session[5801]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:54.200126 systemd-logind[1541]: New session 20 of user core. Aug 19 08:18:54.209587 systemd[1]: Started session-20.scope - Session 20 of User core. Aug 19 08:18:54.509503 sshd[5804]: Connection closed by 10.0.0.1 port 36744 Aug 19 08:18:54.510188 sshd-session[5801]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:54.520427 systemd[1]: sshd@19-10.0.0.123:22-10.0.0.1:36744.service: Deactivated successfully. Aug 19 08:18:54.522631 systemd[1]: session-20.scope: Deactivated successfully. Aug 19 08:18:54.523582 systemd-logind[1541]: Session 20 logged out. Waiting for processes to exit. Aug 19 08:18:54.527724 systemd[1]: Started sshd@20-10.0.0.123:22-10.0.0.1:36748.service - OpenSSH per-connection server daemon (10.0.0.1:36748). Aug 19 08:18:54.528682 systemd-logind[1541]: Removed session 20. Aug 19 08:18:54.599832 sshd[5815]: Accepted publickey for core from 10.0.0.1 port 36748 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:54.601368 sshd-session[5815]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:54.606415 systemd-logind[1541]: New session 21 of user core. Aug 19 08:18:54.615586 systemd[1]: Started session-21.scope - Session 21 of User core. Aug 19 08:18:55.277847 sshd[5818]: Connection closed by 10.0.0.1 port 36748 Aug 19 08:18:55.278752 sshd-session[5815]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:55.292939 systemd[1]: sshd@20-10.0.0.123:22-10.0.0.1:36748.service: Deactivated successfully. Aug 19 08:18:55.295332 systemd[1]: session-21.scope: Deactivated successfully. Aug 19 08:18:55.300151 systemd-logind[1541]: Session 21 logged out. Waiting for processes to exit. Aug 19 08:18:55.302835 systemd[1]: Started sshd@21-10.0.0.123:22-10.0.0.1:36760.service - OpenSSH per-connection server daemon (10.0.0.1:36760). Aug 19 08:18:55.305904 systemd-logind[1541]: Removed session 21. Aug 19 08:18:55.378527 sshd[5838]: Accepted publickey for core from 10.0.0.1 port 36760 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:55.380900 sshd-session[5838]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:55.386359 systemd-logind[1541]: New session 22 of user core. Aug 19 08:18:55.393692 systemd[1]: Started session-22.scope - Session 22 of User core. 
Aug 19 08:18:55.685344 sshd[5842]: Connection closed by 10.0.0.1 port 36760 Aug 19 08:18:55.685861 sshd-session[5838]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:55.696568 systemd[1]: sshd@21-10.0.0.123:22-10.0.0.1:36760.service: Deactivated successfully. Aug 19 08:18:55.698551 systemd[1]: session-22.scope: Deactivated successfully. Aug 19 08:18:55.699277 systemd-logind[1541]: Session 22 logged out. Waiting for processes to exit. Aug 19 08:18:55.702381 systemd[1]: Started sshd@22-10.0.0.123:22-10.0.0.1:36762.service - OpenSSH per-connection server daemon (10.0.0.1:36762). Aug 19 08:18:55.703648 systemd-logind[1541]: Removed session 22. Aug 19 08:18:55.760243 sshd[5853]: Accepted publickey for core from 10.0.0.1 port 36762 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:18:55.762753 sshd-session[5853]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:18:55.768978 systemd-logind[1541]: New session 23 of user core. Aug 19 08:18:55.777581 systemd[1]: Started session-23.scope - Session 23 of User core. Aug 19 08:18:55.900608 sshd[5856]: Connection closed by 10.0.0.1 port 36762 Aug 19 08:18:55.901024 sshd-session[5853]: pam_unix(sshd:session): session closed for user core Aug 19 08:18:55.906379 systemd[1]: sshd@22-10.0.0.123:22-10.0.0.1:36762.service: Deactivated successfully. Aug 19 08:18:55.908713 systemd[1]: session-23.scope: Deactivated successfully. Aug 19 08:18:55.909541 systemd-logind[1541]: Session 23 logged out. Waiting for processes to exit. Aug 19 08:18:55.910922 systemd-logind[1541]: Removed session 23. Aug 19 08:19:00.913690 systemd[1]: Started sshd@23-10.0.0.123:22-10.0.0.1:51374.service - OpenSSH per-connection server daemon (10.0.0.1:51374). Aug 19 08:19:00.982067 sshd[5878]: Accepted publickey for core from 10.0.0.1 port 51374 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:19:00.983865 sshd-session[5878]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:19:00.989069 systemd-logind[1541]: New session 24 of user core. Aug 19 08:19:00.998591 systemd[1]: Started session-24.scope - Session 24 of User core. Aug 19 08:19:01.123010 sshd[5881]: Connection closed by 10.0.0.1 port 51374 Aug 19 08:19:01.123416 sshd-session[5878]: pam_unix(sshd:session): session closed for user core Aug 19 08:19:01.127966 systemd[1]: sshd@23-10.0.0.123:22-10.0.0.1:51374.service: Deactivated successfully. Aug 19 08:19:01.130129 systemd[1]: session-24.scope: Deactivated successfully. Aug 19 08:19:01.131176 systemd-logind[1541]: Session 24 logged out. Waiting for processes to exit. Aug 19 08:19:01.133037 systemd-logind[1541]: Removed session 24. Aug 19 08:19:06.141331 systemd[1]: Started sshd@24-10.0.0.123:22-10.0.0.1:51378.service - OpenSSH per-connection server daemon (10.0.0.1:51378). Aug 19 08:19:06.207401 sshd[5896]: Accepted publickey for core from 10.0.0.1 port 51378 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:19:06.209322 sshd-session[5896]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:19:06.214719 systemd-logind[1541]: New session 25 of user core. Aug 19 08:19:06.225651 systemd[1]: Started session-25.scope - Session 25 of User core. 
Aug 19 08:19:06.358189 sshd[5899]: Connection closed by 10.0.0.1 port 51378 Aug 19 08:19:06.358646 sshd-session[5896]: pam_unix(sshd:session): session closed for user core Aug 19 08:19:06.363064 systemd[1]: sshd@24-10.0.0.123:22-10.0.0.1:51378.service: Deactivated successfully. Aug 19 08:19:06.365571 systemd[1]: session-25.scope: Deactivated successfully. Aug 19 08:19:06.366651 systemd-logind[1541]: Session 25 logged out. Waiting for processes to exit. Aug 19 08:19:06.367932 systemd-logind[1541]: Removed session 25. Aug 19 08:19:11.383486 systemd[1]: Started sshd@25-10.0.0.123:22-10.0.0.1:60254.service - OpenSSH per-connection server daemon (10.0.0.1:60254). Aug 19 08:19:11.464506 sshd[5912]: Accepted publickey for core from 10.0.0.1 port 60254 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:19:11.467129 sshd-session[5912]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:19:11.476504 systemd-logind[1541]: New session 26 of user core. Aug 19 08:19:11.481783 systemd[1]: Started session-26.scope - Session 26 of User core. Aug 19 08:19:11.719600 sshd[5915]: Connection closed by 10.0.0.1 port 60254 Aug 19 08:19:11.721882 sshd-session[5912]: pam_unix(sshd:session): session closed for user core Aug 19 08:19:11.726894 systemd[1]: sshd@25-10.0.0.123:22-10.0.0.1:60254.service: Deactivated successfully. Aug 19 08:19:11.729015 systemd[1]: session-26.scope: Deactivated successfully. Aug 19 08:19:11.730109 systemd-logind[1541]: Session 26 logged out. Waiting for processes to exit. Aug 19 08:19:11.731559 systemd-logind[1541]: Removed session 26. Aug 19 08:19:12.994096 containerd[1567]: time="2025-08-19T08:19:12.993997225Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6ec0144600165c7265dfe839c07ae73ed22e9ae48da6a8befca5e21085b7f84c\" id:\"e50e462aff7fd2a30532078d838b6e5935a2509977e6b54661eec80d89b179ba\" pid:5941 exited_at:{seconds:1755591552 nanos:993443237}" Aug 19 08:19:15.842437 containerd[1567]: time="2025-08-19T08:19:15.842374659Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e618d8761d2dc2cd6ed84bb627d26381ea77483e57cbe259654c129c7dae7514\" id:\"09e450c4c3ac784bb28feead99212e046f785e6825be3b64a5d9d707d509f8a1\" pid:5969 exited_at:{seconds:1755591555 nanos:827739035}" Aug 19 08:19:16.746718 systemd[1]: Started sshd@26-10.0.0.123:22-10.0.0.1:60264.service - OpenSSH per-connection server daemon (10.0.0.1:60264). Aug 19 08:19:16.829417 sshd[5981]: Accepted publickey for core from 10.0.0.1 port 60264 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:19:16.831368 sshd-session[5981]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:19:16.836971 systemd-logind[1541]: New session 27 of user core. Aug 19 08:19:16.845652 systemd[1]: Started session-27.scope - Session 27 of User core. Aug 19 08:19:16.947828 containerd[1567]: time="2025-08-19T08:19:16.947774353Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eb23adf8aaadb88f25d9552c72c9dc0702fe26b80b3d7a0a64542a33ecfc60ea\" id:\"054556baf3e08c545209aa8b20cd7ed2e588ef466965d097a075a954b9af8d64\" pid:5997 exited_at:{seconds:1755591556 nanos:946742703}" Aug 19 08:19:17.073434 sshd[5984]: Connection closed by 10.0.0.1 port 60264 Aug 19 08:19:17.074952 sshd-session[5981]: pam_unix(sshd:session): session closed for user core Aug 19 08:19:17.079767 systemd[1]: sshd@26-10.0.0.123:22-10.0.0.1:60264.service: Deactivated successfully. 
Aug 19 08:19:17.082798 systemd[1]: session-27.scope: Deactivated successfully. Aug 19 08:19:17.084494 systemd-logind[1541]: Session 27 logged out. Waiting for processes to exit. Aug 19 08:19:17.085998 systemd-logind[1541]: Removed session 27.