Aug 19 08:20:06.879874 kernel: Linux version 6.12.41-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Mon Aug 18 22:19:37 -00 2025 Aug 19 08:20:06.879910 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=cc23dd01793203541561c15ffc568736bb5dae0d652141296dd11bf777bdf42f Aug 19 08:20:06.879919 kernel: BIOS-provided physical RAM map: Aug 19 08:20:06.879926 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable Aug 19 08:20:06.879933 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved Aug 19 08:20:06.879940 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved Aug 19 08:20:06.879947 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000009cfdbfff] usable Aug 19 08:20:06.879954 kernel: BIOS-e820: [mem 0x000000009cfdc000-0x000000009cffffff] reserved Aug 19 08:20:06.879967 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved Aug 19 08:20:06.879975 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved Aug 19 08:20:06.879983 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Aug 19 08:20:06.879991 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved Aug 19 08:20:06.879999 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Aug 19 08:20:06.880008 kernel: NX (Execute Disable) protection: active Aug 19 08:20:06.880020 kernel: APIC: Static calls initialized Aug 19 08:20:06.880030 kernel: SMBIOS 2.8 present. 
Aug 19 08:20:06.880042 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 1.16.2-debian-1.16.2-1 04/01/2014 Aug 19 08:20:06.880051 kernel: DMI: Memory slots populated: 1/1 Aug 19 08:20:06.880060 kernel: Hypervisor detected: KVM Aug 19 08:20:06.880069 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Aug 19 08:20:06.880078 kernel: kvm-clock: using sched offset of 4147202095 cycles Aug 19 08:20:06.880088 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Aug 19 08:20:06.880098 kernel: tsc: Detected 2794.748 MHz processor Aug 19 08:20:06.880110 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Aug 19 08:20:06.880120 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Aug 19 08:20:06.880129 kernel: last_pfn = 0x9cfdc max_arch_pfn = 0x400000000 Aug 19 08:20:06.880138 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs Aug 19 08:20:06.880148 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Aug 19 08:20:06.880157 kernel: Using GB pages for direct mapping Aug 19 08:20:06.880166 kernel: ACPI: Early table checksum verification disabled Aug 19 08:20:06.880175 kernel: ACPI: RSDP 0x00000000000F59D0 000014 (v00 BOCHS ) Aug 19 08:20:06.880184 kernel: ACPI: RSDT 0x000000009CFE241A 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Aug 19 08:20:06.880196 kernel: ACPI: FACP 0x000000009CFE21FA 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Aug 19 08:20:06.880204 kernel: ACPI: DSDT 0x000000009CFE0040 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001) Aug 19 08:20:06.880212 kernel: ACPI: FACS 0x000000009CFE0000 000040 Aug 19 08:20:06.880219 kernel: ACPI: APIC 0x000000009CFE22EE 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001) Aug 19 08:20:06.880227 kernel: ACPI: HPET 0x000000009CFE237E 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Aug 19 08:20:06.880234 kernel: ACPI: MCFG 0x000000009CFE23B6 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Aug 19 08:20:06.880241 kernel: ACPI: WAET 0x000000009CFE23F2 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Aug 19 08:20:06.880249 kernel: ACPI: Reserving FACP table memory at [mem 0x9cfe21fa-0x9cfe22ed] Aug 19 08:20:06.880263 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cfe0040-0x9cfe21f9] Aug 19 08:20:06.880270 kernel: ACPI: Reserving FACS table memory at [mem 0x9cfe0000-0x9cfe003f] Aug 19 08:20:06.880278 kernel: ACPI: Reserving APIC table memory at [mem 0x9cfe22ee-0x9cfe237d] Aug 19 08:20:06.880285 kernel: ACPI: Reserving HPET table memory at [mem 0x9cfe237e-0x9cfe23b5] Aug 19 08:20:06.880293 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cfe23b6-0x9cfe23f1] Aug 19 08:20:06.880300 kernel: ACPI: Reserving WAET table memory at [mem 0x9cfe23f2-0x9cfe2419] Aug 19 08:20:06.880310 kernel: No NUMA configuration found Aug 19 08:20:06.880318 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cfdbfff] Aug 19 08:20:06.880325 kernel: NODE_DATA(0) allocated [mem 0x9cfd4dc0-0x9cfdbfff] Aug 19 08:20:06.880333 kernel: Zone ranges: Aug 19 08:20:06.880340 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Aug 19 08:20:06.880348 kernel: DMA32 [mem 0x0000000001000000-0x000000009cfdbfff] Aug 19 08:20:06.880355 kernel: Normal empty Aug 19 08:20:06.880363 kernel: Device empty Aug 19 08:20:06.880370 kernel: Movable zone start for each node Aug 19 08:20:06.880378 kernel: Early memory node ranges Aug 19 08:20:06.880388 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] Aug 19 08:20:06.880395 kernel: node 0: [mem 
0x0000000000100000-0x000000009cfdbfff] Aug 19 08:20:06.880403 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cfdbfff] Aug 19 08:20:06.880410 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Aug 19 08:20:06.880418 kernel: On node 0, zone DMA: 97 pages in unavailable ranges Aug 19 08:20:06.880425 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges Aug 19 08:20:06.880433 kernel: ACPI: PM-Timer IO Port: 0x608 Aug 19 08:20:06.880443 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Aug 19 08:20:06.880451 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Aug 19 08:20:06.880460 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Aug 19 08:20:06.880468 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Aug 19 08:20:06.880478 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Aug 19 08:20:06.880485 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Aug 19 08:20:06.880493 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Aug 19 08:20:06.880501 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Aug 19 08:20:06.880508 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Aug 19 08:20:06.880516 kernel: TSC deadline timer available Aug 19 08:20:06.880523 kernel: CPU topo: Max. logical packages: 1 Aug 19 08:20:06.880533 kernel: CPU topo: Max. logical dies: 1 Aug 19 08:20:06.880541 kernel: CPU topo: Max. dies per package: 1 Aug 19 08:20:06.880548 kernel: CPU topo: Max. threads per core: 1 Aug 19 08:20:06.880556 kernel: CPU topo: Num. cores per package: 4 Aug 19 08:20:06.880563 kernel: CPU topo: Num. threads per package: 4 Aug 19 08:20:06.880571 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs Aug 19 08:20:06.880579 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Aug 19 08:20:06.880586 kernel: kvm-guest: KVM setup pv remote TLB flush Aug 19 08:20:06.880594 kernel: kvm-guest: setup PV sched yield Aug 19 08:20:06.880603 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices Aug 19 08:20:06.880611 kernel: Booting paravirtualized kernel on KVM Aug 19 08:20:06.880619 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Aug 19 08:20:06.880627 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1 Aug 19 08:20:06.880634 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288 Aug 19 08:20:06.880642 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152 Aug 19 08:20:06.880649 kernel: pcpu-alloc: [0] 0 1 2 3 Aug 19 08:20:06.880657 kernel: kvm-guest: PV spinlocks enabled Aug 19 08:20:06.880664 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Aug 19 08:20:06.880676 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=cc23dd01793203541561c15ffc568736bb5dae0d652141296dd11bf777bdf42f Aug 19 08:20:06.880684 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
Aug 19 08:20:06.880692 kernel: random: crng init done Aug 19 08:20:06.880706 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Aug 19 08:20:06.880714 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Aug 19 08:20:06.880721 kernel: Fallback order for Node 0: 0 Aug 19 08:20:06.880729 kernel: Built 1 zonelists, mobility grouping on. Total pages: 642938 Aug 19 08:20:06.880736 kernel: Policy zone: DMA32 Aug 19 08:20:06.880744 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Aug 19 08:20:06.880769 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Aug 19 08:20:06.880777 kernel: ftrace: allocating 40101 entries in 157 pages Aug 19 08:20:06.880785 kernel: ftrace: allocated 157 pages with 5 groups Aug 19 08:20:06.880792 kernel: Dynamic Preempt: voluntary Aug 19 08:20:06.880800 kernel: rcu: Preemptible hierarchical RCU implementation. Aug 19 08:20:06.880808 kernel: rcu: RCU event tracing is enabled. Aug 19 08:20:06.880816 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Aug 19 08:20:06.880824 kernel: Trampoline variant of Tasks RCU enabled. Aug 19 08:20:06.880834 kernel: Rude variant of Tasks RCU enabled. Aug 19 08:20:06.880845 kernel: Tracing variant of Tasks RCU enabled. Aug 19 08:20:06.880853 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Aug 19 08:20:06.880861 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Aug 19 08:20:06.880868 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Aug 19 08:20:06.880876 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Aug 19 08:20:06.880884 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Aug 19 08:20:06.880891 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16 Aug 19 08:20:06.880900 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Aug 19 08:20:06.880918 kernel: Console: colour VGA+ 80x25 Aug 19 08:20:06.880928 kernel: printk: legacy console [ttyS0] enabled Aug 19 08:20:06.880938 kernel: ACPI: Core revision 20240827 Aug 19 08:20:06.880948 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns Aug 19 08:20:06.880960 kernel: APIC: Switch to symmetric I/O mode setup Aug 19 08:20:06.880970 kernel: x2apic enabled Aug 19 08:20:06.880980 kernel: APIC: Switched APIC routing to: physical x2apic Aug 19 08:20:06.880993 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask() Aug 19 08:20:06.881003 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself() Aug 19 08:20:06.881016 kernel: kvm-guest: setup PV IPIs Aug 19 08:20:06.881026 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Aug 19 08:20:06.881036 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns Aug 19 08:20:06.881046 kernel: Calibrating delay loop (skipped) preset value.. 
5589.49 BogoMIPS (lpj=2794748) Aug 19 08:20:06.881054 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Aug 19 08:20:06.881062 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 Aug 19 08:20:06.881070 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 Aug 19 08:20:06.881078 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Aug 19 08:20:06.881088 kernel: Spectre V2 : Mitigation: Retpolines Aug 19 08:20:06.881096 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Aug 19 08:20:06.881104 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls Aug 19 08:20:06.881112 kernel: RETBleed: Mitigation: untrained return thunk Aug 19 08:20:06.881120 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Aug 19 08:20:06.881128 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Aug 19 08:20:06.881136 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied! Aug 19 08:20:06.881144 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options. Aug 19 08:20:06.881152 kernel: x86/bugs: return thunk changed Aug 19 08:20:06.881162 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode Aug 19 08:20:06.881170 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Aug 19 08:20:06.881178 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Aug 19 08:20:06.881186 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Aug 19 08:20:06.881194 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Aug 19 08:20:06.881202 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Aug 19 08:20:06.881209 kernel: Freeing SMP alternatives memory: 32K Aug 19 08:20:06.881217 kernel: pid_max: default: 32768 minimum: 301 Aug 19 08:20:06.881242 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Aug 19 08:20:06.881250 kernel: landlock: Up and running. Aug 19 08:20:06.881257 kernel: SELinux: Initializing. Aug 19 08:20:06.881276 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Aug 19 08:20:06.881296 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Aug 19 08:20:06.881304 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0) Aug 19 08:20:06.881312 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. Aug 19 08:20:06.881320 kernel: ... version: 0 Aug 19 08:20:06.881328 kernel: ... bit width: 48 Aug 19 08:20:06.881347 kernel: ... generic registers: 6 Aug 19 08:20:06.881358 kernel: ... value mask: 0000ffffffffffff Aug 19 08:20:06.881383 kernel: ... max period: 00007fffffffffff Aug 19 08:20:06.881393 kernel: ... fixed-purpose events: 0 Aug 19 08:20:06.881401 kernel: ... event mask: 000000000000003f Aug 19 08:20:06.881409 kernel: signal: max sigframe size: 1776 Aug 19 08:20:06.881417 kernel: rcu: Hierarchical SRCU implementation. Aug 19 08:20:06.881425 kernel: rcu: Max phase no-delay instances is 400. Aug 19 08:20:06.881433 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Aug 19 08:20:06.881441 kernel: smp: Bringing up secondary CPUs ... Aug 19 08:20:06.881453 kernel: smpboot: x86: Booting SMP configuration: Aug 19 08:20:06.881461 kernel: .... 
node #0, CPUs: #1 #2 #3 Aug 19 08:20:06.881468 kernel: smp: Brought up 1 node, 4 CPUs Aug 19 08:20:06.881476 kernel: smpboot: Total of 4 processors activated (22357.98 BogoMIPS) Aug 19 08:20:06.881485 kernel: Memory: 2428908K/2571752K available (14336K kernel code, 2430K rwdata, 9960K rodata, 54040K init, 2928K bss, 136904K reserved, 0K cma-reserved) Aug 19 08:20:06.881493 kernel: devtmpfs: initialized Aug 19 08:20:06.881500 kernel: x86/mm: Memory block size: 128MB Aug 19 08:20:06.881508 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Aug 19 08:20:06.881516 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Aug 19 08:20:06.881527 kernel: pinctrl core: initialized pinctrl subsystem Aug 19 08:20:06.881535 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Aug 19 08:20:06.881545 kernel: audit: initializing netlink subsys (disabled) Aug 19 08:20:06.881553 kernel: audit: type=2000 audit(1755591604.170:1): state=initialized audit_enabled=0 res=1 Aug 19 08:20:06.881561 kernel: thermal_sys: Registered thermal governor 'step_wise' Aug 19 08:20:06.881569 kernel: thermal_sys: Registered thermal governor 'user_space' Aug 19 08:20:06.881577 kernel: cpuidle: using governor menu Aug 19 08:20:06.881585 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Aug 19 08:20:06.881593 kernel: dca service started, version 1.12.1 Aug 19 08:20:06.881603 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff] Aug 19 08:20:06.881611 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry Aug 19 08:20:06.881619 kernel: PCI: Using configuration type 1 for base access Aug 19 08:20:06.881627 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Aug 19 08:20:06.881635 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Aug 19 08:20:06.881643 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Aug 19 08:20:06.881651 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Aug 19 08:20:06.881658 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Aug 19 08:20:06.881669 kernel: ACPI: Added _OSI(Module Device) Aug 19 08:20:06.881677 kernel: ACPI: Added _OSI(Processor Device) Aug 19 08:20:06.881684 kernel: ACPI: Added _OSI(Processor Aggregator Device) Aug 19 08:20:06.881692 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Aug 19 08:20:06.881708 kernel: ACPI: Interpreter enabled Aug 19 08:20:06.881716 kernel: ACPI: PM: (supports S0 S3 S5) Aug 19 08:20:06.881724 kernel: ACPI: Using IOAPIC for interrupt routing Aug 19 08:20:06.881732 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Aug 19 08:20:06.881741 kernel: PCI: Using E820 reservations for host bridge windows Aug 19 08:20:06.881749 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Aug 19 08:20:06.881775 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Aug 19 08:20:06.881980 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Aug 19 08:20:06.882133 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Aug 19 08:20:06.882273 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Aug 19 08:20:06.882283 kernel: PCI host bridge to bus 0000:00 Aug 19 08:20:06.882420 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Aug 19 08:20:06.882574 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Aug 19 08:20:06.882899 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Aug 19 08:20:06.883102 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xafffffff window] Aug 19 08:20:06.883263 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Aug 19 08:20:06.883391 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window] Aug 19 08:20:06.883512 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Aug 19 08:20:06.883694 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Aug 19 08:20:06.883963 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint Aug 19 08:20:06.884170 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfd000000-0xfdffffff pref] Aug 19 08:20:06.884342 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfebd0000-0xfebd0fff] Aug 19 08:20:06.884511 kernel: pci 0000:00:01.0: ROM [mem 0xfebc0000-0xfebcffff pref] Aug 19 08:20:06.884681 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Aug 19 08:20:06.884919 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint Aug 19 08:20:06.885094 kernel: pci 0000:00:02.0: BAR 0 [io 0xc0c0-0xc0df] Aug 19 08:20:06.885264 kernel: pci 0000:00:02.0: BAR 1 [mem 0xfebd1000-0xfebd1fff] Aug 19 08:20:06.885523 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfe000000-0xfe003fff 64bit pref] Aug 19 08:20:06.885730 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint Aug 19 08:20:06.885934 kernel: pci 0000:00:03.0: BAR 0 [io 0xc000-0xc07f] Aug 19 08:20:06.886102 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfebd2000-0xfebd2fff] Aug 19 08:20:06.886261 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe004000-0xfe007fff 64bit 
pref] Aug 19 08:20:06.886465 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint Aug 19 08:20:06.886647 kernel: pci 0000:00:04.0: BAR 0 [io 0xc0e0-0xc0ff] Aug 19 08:20:06.886841 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfebd3000-0xfebd3fff] Aug 19 08:20:06.886994 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe008000-0xfe00bfff 64bit pref] Aug 19 08:20:06.887137 kernel: pci 0000:00:04.0: ROM [mem 0xfeb80000-0xfebbffff pref] Aug 19 08:20:06.887270 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Aug 19 08:20:06.887402 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Aug 19 08:20:06.887572 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Aug 19 08:20:06.887772 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc100-0xc11f] Aug 19 08:20:06.887961 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfebd4000-0xfebd4fff] Aug 19 08:20:06.888267 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Aug 19 08:20:06.888447 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f] Aug 19 08:20:06.888466 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Aug 19 08:20:06.888478 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Aug 19 08:20:06.888497 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Aug 19 08:20:06.888508 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Aug 19 08:20:06.888519 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Aug 19 08:20:06.888530 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Aug 19 08:20:06.888541 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Aug 19 08:20:06.888552 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Aug 19 08:20:06.888562 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Aug 19 08:20:06.888574 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Aug 19 08:20:06.888585 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Aug 19 08:20:06.888601 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Aug 19 08:20:06.888611 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Aug 19 08:20:06.888622 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Aug 19 08:20:06.888632 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Aug 19 08:20:06.888642 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Aug 19 08:20:06.888652 kernel: iommu: Default domain type: Translated Aug 19 08:20:06.888662 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Aug 19 08:20:06.888672 kernel: PCI: Using ACPI for IRQ routing Aug 19 08:20:06.888682 kernel: PCI: pci_cache_line_size set to 64 bytes Aug 19 08:20:06.888707 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] Aug 19 08:20:06.888719 kernel: e820: reserve RAM buffer [mem 0x9cfdc000-0x9fffffff] Aug 19 08:20:06.888932 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Aug 19 08:20:06.889124 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Aug 19 08:20:06.889298 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Aug 19 08:20:06.889316 kernel: vgaarb: loaded Aug 19 08:20:06.889328 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 Aug 19 08:20:06.889339 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter Aug 19 08:20:06.889356 kernel: clocksource: Switched to clocksource kvm-clock Aug 19 08:20:06.889367 kernel: VFS: Disk quotas dquot_6.6.0 Aug 19 
08:20:06.889379 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Aug 19 08:20:06.889390 kernel: pnp: PnP ACPI init Aug 19 08:20:06.889578 kernel: system 00:05: [mem 0xb0000000-0xbfffffff window] has been reserved Aug 19 08:20:06.889597 kernel: pnp: PnP ACPI: found 6 devices Aug 19 08:20:06.889608 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Aug 19 08:20:06.889619 kernel: NET: Registered PF_INET protocol family Aug 19 08:20:06.889630 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Aug 19 08:20:06.889646 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Aug 19 08:20:06.889658 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Aug 19 08:20:06.889668 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Aug 19 08:20:06.889679 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Aug 19 08:20:06.889690 kernel: TCP: Hash tables configured (established 32768 bind 32768) Aug 19 08:20:06.889712 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Aug 19 08:20:06.889724 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Aug 19 08:20:06.889735 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Aug 19 08:20:06.889749 kernel: NET: Registered PF_XDP protocol family Aug 19 08:20:06.889922 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Aug 19 08:20:06.890074 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Aug 19 08:20:06.890196 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Aug 19 08:20:06.890351 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xafffffff window] Aug 19 08:20:06.890508 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] Aug 19 08:20:06.890662 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window] Aug 19 08:20:06.890680 kernel: PCI: CLS 0 bytes, default 64 Aug 19 08:20:06.890692 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns Aug 19 08:20:06.890722 kernel: Initialise system trusted keyrings Aug 19 08:20:06.890733 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Aug 19 08:20:06.890744 kernel: Key type asymmetric registered Aug 19 08:20:06.890772 kernel: Asymmetric key parser 'x509' registered Aug 19 08:20:06.890784 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Aug 19 08:20:06.890795 kernel: io scheduler mq-deadline registered Aug 19 08:20:06.890805 kernel: io scheduler kyber registered Aug 19 08:20:06.890816 kernel: io scheduler bfq registered Aug 19 08:20:06.890827 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Aug 19 08:20:06.890844 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Aug 19 08:20:06.890855 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Aug 19 08:20:06.890866 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Aug 19 08:20:06.890877 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Aug 19 08:20:06.890888 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Aug 19 08:20:06.890899 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Aug 19 08:20:06.890911 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Aug 19 08:20:06.890922 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Aug 19 08:20:06.891095 kernel: rtc_cmos 00:04: RTC can wake from S4 Aug 19 
08:20:06.891116 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Aug 19 08:20:06.891261 kernel: rtc_cmos 00:04: registered as rtc0 Aug 19 08:20:06.891408 kernel: rtc_cmos 00:04: setting system clock to 2025-08-19T08:20:06 UTC (1755591606) Aug 19 08:20:06.891552 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Aug 19 08:20:06.891568 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Aug 19 08:20:06.891579 kernel: NET: Registered PF_INET6 protocol family Aug 19 08:20:06.891590 kernel: Segment Routing with IPv6 Aug 19 08:20:06.891605 kernel: In-situ OAM (IOAM) with IPv6 Aug 19 08:20:06.891616 kernel: NET: Registered PF_PACKET protocol family Aug 19 08:20:06.891628 kernel: Key type dns_resolver registered Aug 19 08:20:06.891638 kernel: IPI shorthand broadcast: enabled Aug 19 08:20:06.891649 kernel: sched_clock: Marking stable (3400001849, 117778749)->(3538528708, -20748110) Aug 19 08:20:06.891660 kernel: registered taskstats version 1 Aug 19 08:20:06.891670 kernel: Loading compiled-in X.509 certificates Aug 19 08:20:06.891681 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.41-flatcar: 93a065b103c00d4b81cc5822e4e7f9674e63afaf' Aug 19 08:20:06.891692 kernel: Demotion targets for Node 0: null Aug 19 08:20:06.891712 kernel: Key type .fscrypt registered Aug 19 08:20:06.891727 kernel: Key type fscrypt-provisioning registered Aug 19 08:20:06.891738 kernel: ima: No TPM chip found, activating TPM-bypass! Aug 19 08:20:06.891748 kernel: ima: Allocated hash algorithm: sha1 Aug 19 08:20:06.891774 kernel: ima: No architecture policies found Aug 19 08:20:06.891784 kernel: clk: Disabling unused clocks Aug 19 08:20:06.891795 kernel: Warning: unable to open an initial console. Aug 19 08:20:06.891807 kernel: Freeing unused kernel image (initmem) memory: 54040K Aug 19 08:20:06.891818 kernel: Write protecting the kernel read-only data: 24576k Aug 19 08:20:06.891832 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K Aug 19 08:20:06.891843 kernel: Run /init as init process Aug 19 08:20:06.891854 kernel: with arguments: Aug 19 08:20:06.891864 kernel: /init Aug 19 08:20:06.891874 kernel: with environment: Aug 19 08:20:06.891885 kernel: HOME=/ Aug 19 08:20:06.891896 kernel: TERM=linux Aug 19 08:20:06.891907 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Aug 19 08:20:06.891918 systemd[1]: Successfully made /usr/ read-only. Aug 19 08:20:06.891939 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Aug 19 08:20:06.891970 systemd[1]: Detected virtualization kvm. Aug 19 08:20:06.891981 systemd[1]: Detected architecture x86-64. Aug 19 08:20:06.891993 systemd[1]: Running in initrd. Aug 19 08:20:06.892004 systemd[1]: No hostname configured, using default hostname. Aug 19 08:20:06.892019 systemd[1]: Hostname set to . Aug 19 08:20:06.892031 systemd[1]: Initializing machine ID from VM UUID. Aug 19 08:20:06.892043 systemd[1]: Queued start job for default target initrd.target. Aug 19 08:20:06.892055 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Aug 19 08:20:06.892068 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 19 08:20:06.892080 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Aug 19 08:20:06.892092 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Aug 19 08:20:06.892104 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Aug 19 08:20:06.892121 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Aug 19 08:20:06.892135 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Aug 19 08:20:06.892147 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Aug 19 08:20:06.892158 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 19 08:20:06.892170 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 19 08:20:06.892182 systemd[1]: Reached target paths.target - Path Units. Aug 19 08:20:06.892194 systemd[1]: Reached target slices.target - Slice Units. Aug 19 08:20:06.892209 systemd[1]: Reached target swap.target - Swaps. Aug 19 08:20:06.892221 systemd[1]: Reached target timers.target - Timer Units. Aug 19 08:20:06.892233 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Aug 19 08:20:06.892244 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Aug 19 08:20:06.892256 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Aug 19 08:20:06.892268 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Aug 19 08:20:06.892280 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Aug 19 08:20:06.892292 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Aug 19 08:20:06.892304 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Aug 19 08:20:06.892318 systemd[1]: Reached target sockets.target - Socket Units. Aug 19 08:20:06.892330 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Aug 19 08:20:06.892342 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Aug 19 08:20:06.892354 systemd[1]: Finished network-cleanup.service - Network Cleanup. Aug 19 08:20:06.892367 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Aug 19 08:20:06.892384 systemd[1]: Starting systemd-fsck-usr.service... Aug 19 08:20:06.892396 systemd[1]: Starting systemd-journald.service - Journal Service... Aug 19 08:20:06.892408 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Aug 19 08:20:06.892420 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 19 08:20:06.892431 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Aug 19 08:20:06.892444 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Aug 19 08:20:06.892459 systemd[1]: Finished systemd-fsck-usr.service. Aug 19 08:20:06.892505 systemd-journald[219]: Collecting audit messages is disabled. Aug 19 08:20:06.892543 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... 
Aug 19 08:20:06.892556 systemd-journald[219]: Journal started Aug 19 08:20:06.892581 systemd-journald[219]: Runtime Journal (/run/log/journal/39a22897a2474f84b484b6f1e02538d3) is 6M, max 48.6M, 42.5M free. Aug 19 08:20:06.884059 systemd-modules-load[222]: Inserted module 'overlay' Aug 19 08:20:06.930742 systemd[1]: Started systemd-journald.service - Journal Service. Aug 19 08:20:06.930801 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Aug 19 08:20:06.930826 kernel: Bridge firewalling registered Aug 19 08:20:06.913279 systemd-modules-load[222]: Inserted module 'br_netfilter' Aug 19 08:20:06.929819 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Aug 19 08:20:06.931399 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 08:20:06.934546 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Aug 19 08:20:06.937880 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Aug 19 08:20:06.942454 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Aug 19 08:20:06.954575 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Aug 19 08:20:06.955717 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Aug 19 08:20:06.970110 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Aug 19 08:20:06.971282 systemd-tmpfiles[241]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Aug 19 08:20:06.974941 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 19 08:20:06.975740 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 19 08:20:06.978397 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 19 08:20:06.983199 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Aug 19 08:20:06.987361 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 19 08:20:07.019286 dracut-cmdline[261]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=cc23dd01793203541561c15ffc568736bb5dae0d652141296dd11bf777bdf42f Aug 19 08:20:07.039709 systemd-resolved[262]: Positive Trust Anchors: Aug 19 08:20:07.039734 systemd-resolved[262]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 19 08:20:07.039780 systemd-resolved[262]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 19 08:20:07.042830 systemd-resolved[262]: Defaulting to hostname 'linux'. 
Aug 19 08:20:07.044156 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 19 08:20:07.050418 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 19 08:20:07.148813 kernel: SCSI subsystem initialized Aug 19 08:20:07.157798 kernel: Loading iSCSI transport class v2.0-870. Aug 19 08:20:07.169795 kernel: iscsi: registered transport (tcp) Aug 19 08:20:07.197798 kernel: iscsi: registered transport (qla4xxx) Aug 19 08:20:07.197859 kernel: QLogic iSCSI HBA Driver Aug 19 08:20:07.221322 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Aug 19 08:20:07.238303 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Aug 19 08:20:07.242055 systemd[1]: Reached target network-pre.target - Preparation for Network. Aug 19 08:20:07.305034 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Aug 19 08:20:07.308590 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Aug 19 08:20:07.368789 kernel: raid6: avx2x4 gen() 28454 MB/s Aug 19 08:20:07.385781 kernel: raid6: avx2x2 gen() 26003 MB/s Aug 19 08:20:07.403052 kernel: raid6: avx2x1 gen() 22832 MB/s Aug 19 08:20:07.403093 kernel: raid6: using algorithm avx2x4 gen() 28454 MB/s Aug 19 08:20:07.420879 kernel: raid6: .... xor() 6856 MB/s, rmw enabled Aug 19 08:20:07.420945 kernel: raid6: using avx2x2 recovery algorithm Aug 19 08:20:07.441806 kernel: xor: automatically using best checksumming function avx Aug 19 08:20:07.618818 kernel: Btrfs loaded, zoned=no, fsverity=no Aug 19 08:20:07.628459 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Aug 19 08:20:07.632310 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 19 08:20:07.667347 systemd-udevd[472]: Using default interface naming scheme 'v255'. Aug 19 08:20:07.673239 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 19 08:20:07.674969 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Aug 19 08:20:07.706560 dracut-pre-trigger[478]: rd.md=0: removing MD RAID activation Aug 19 08:20:07.739651 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Aug 19 08:20:07.743383 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Aug 19 08:20:07.823249 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Aug 19 08:20:07.826724 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Aug 19 08:20:07.865372 kernel: cryptd: max_cpu_qlen set to 1000 Aug 19 08:20:07.866817 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Aug 19 08:20:07.871786 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Aug 19 08:20:07.876434 kernel: AES CTR mode by8 optimization enabled Aug 19 08:20:07.887040 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Aug 19 08:20:07.887064 kernel: GPT:9289727 != 19775487 Aug 19 08:20:07.887076 kernel: GPT:Alternate GPT header not at the end of the disk. Aug 19 08:20:07.888080 kernel: GPT:9289727 != 19775487 Aug 19 08:20:07.888101 kernel: GPT: Use GNU Parted to correct GPT errors. Aug 19 08:20:07.889217 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Aug 19 08:20:07.907784 kernel: libata version 3.00 loaded. Aug 19 08:20:07.909464 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Aug 19 08:20:07.910653 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Aug 19 08:20:07.909597 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 08:20:07.912000 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Aug 19 08:20:07.916669 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 19 08:20:07.918142 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Aug 19 08:20:07.931783 kernel: ahci 0000:00:1f.2: version 3.0 Aug 19 08:20:07.931999 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Aug 19 08:20:07.933784 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Aug 19 08:20:07.934946 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Aug 19 08:20:07.935125 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Aug 19 08:20:07.940801 kernel: scsi host0: ahci Aug 19 08:20:07.942772 kernel: scsi host1: ahci Aug 19 08:20:07.944799 kernel: scsi host2: ahci Aug 19 08:20:07.948783 kernel: scsi host3: ahci Aug 19 08:20:07.950700 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Aug 19 08:20:07.954961 kernel: scsi host4: ahci Aug 19 08:20:07.955147 kernel: scsi host5: ahci Aug 19 08:20:07.955301 kernel: ata1: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4100 irq 34 lpm-pol 0 Aug 19 08:20:07.956933 kernel: ata2: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4180 irq 34 lpm-pol 0 Aug 19 08:20:07.956967 kernel: ata3: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4200 irq 34 lpm-pol 0 Aug 19 08:20:07.959610 kernel: ata4: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4280 irq 34 lpm-pol 0 Aug 19 08:20:07.959649 kernel: ata5: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4300 irq 34 lpm-pol 0 Aug 19 08:20:07.959660 kernel: ata6: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4380 irq 34 lpm-pol 0 Aug 19 08:20:07.965913 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Aug 19 08:20:07.973264 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Aug 19 08:20:07.973528 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Aug 19 08:20:07.982377 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Aug 19 08:20:07.984813 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Aug 19 08:20:08.173237 disk-uuid[631]: Primary Header is updated. Aug 19 08:20:08.173237 disk-uuid[631]: Secondary Entries is updated. Aug 19 08:20:08.173237 disk-uuid[631]: Secondary Header is updated. Aug 19 08:20:08.192563 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Aug 19 08:20:08.192304 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Aug 19 08:20:08.269346 kernel: ata6: SATA link down (SStatus 0 SControl 300) Aug 19 08:20:08.269407 kernel: ata5: SATA link down (SStatus 0 SControl 300) Aug 19 08:20:08.269419 kernel: ata2: SATA link down (SStatus 0 SControl 300) Aug 19 08:20:08.269430 kernel: ata1: SATA link down (SStatus 0 SControl 300) Aug 19 08:20:08.270422 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Aug 19 08:20:08.272048 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Aug 19 08:20:08.272065 kernel: ata3.00: applying bridge limits Aug 19 08:20:08.272787 kernel: ata3.00: configured for UDMA/100 Aug 19 08:20:08.273788 kernel: ata4: SATA link down (SStatus 0 SControl 300) Aug 19 08:20:08.273815 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Aug 19 08:20:08.312787 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Aug 19 08:20:08.313032 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Aug 19 08:20:08.326783 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Aug 19 08:20:08.716107 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Aug 19 08:20:08.719008 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Aug 19 08:20:08.721553 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 19 08:20:08.724098 systemd[1]: Reached target remote-fs.target - Remote File Systems. Aug 19 08:20:08.727356 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Aug 19 08:20:08.766489 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Aug 19 08:20:09.182793 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Aug 19 08:20:09.182878 disk-uuid[632]: The operation has completed successfully. Aug 19 08:20:09.212690 systemd[1]: disk-uuid.service: Deactivated successfully. Aug 19 08:20:09.212865 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Aug 19 08:20:09.248912 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Aug 19 08:20:09.277504 sh[663]: Success Aug 19 08:20:09.300815 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Aug 19 08:20:09.300963 kernel: device-mapper: uevent: version 1.0.3 Aug 19 08:20:09.302788 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Aug 19 08:20:09.313798 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Aug 19 08:20:09.346071 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Aug 19 08:20:09.348799 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Aug 19 08:20:09.367325 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Aug 19 08:20:09.371802 kernel: BTRFS: device fsid 99050df3-5e04-4f37-acde-dec46aab7896 devid 1 transid 39 /dev/mapper/usr (253:0) scanned by mount (675) Aug 19 08:20:09.371833 kernel: BTRFS info (device dm-0): first mount of filesystem 99050df3-5e04-4f37-acde-dec46aab7896 Aug 19 08:20:09.373855 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Aug 19 08:20:09.373886 kernel: BTRFS info (device dm-0): using free-space-tree Aug 19 08:20:09.379631 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Aug 19 08:20:09.380884 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. 
Aug 19 08:20:09.381743 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Aug 19 08:20:09.385550 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Aug 19 08:20:09.388353 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Aug 19 08:20:09.423370 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (708) Aug 19 08:20:09.425882 kernel: BTRFS info (device vda6): first mount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f Aug 19 08:20:09.425980 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Aug 19 08:20:09.426000 kernel: BTRFS info (device vda6): using free-space-tree Aug 19 08:20:09.437833 kernel: BTRFS info (device vda6): last unmount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f Aug 19 08:20:09.440611 systemd[1]: Finished ignition-setup.service - Ignition (setup). Aug 19 08:20:09.442311 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Aug 19 08:20:09.629812 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Aug 19 08:20:09.636627 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 19 08:20:09.709937 ignition[751]: Ignition 2.21.0 Aug 19 08:20:09.709953 ignition[751]: Stage: fetch-offline Aug 19 08:20:09.709993 ignition[751]: no configs at "/usr/lib/ignition/base.d" Aug 19 08:20:09.710003 ignition[751]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Aug 19 08:20:09.710127 ignition[751]: parsed url from cmdline: "" Aug 19 08:20:09.710131 ignition[751]: no config URL provided Aug 19 08:20:09.710137 ignition[751]: reading system config file "/usr/lib/ignition/user.ign" Aug 19 08:20:09.710146 ignition[751]: no config at "/usr/lib/ignition/user.ign" Aug 19 08:20:09.710175 ignition[751]: op(1): [started] loading QEMU firmware config module Aug 19 08:20:09.710181 ignition[751]: op(1): executing: "modprobe" "qemu_fw_cfg" Aug 19 08:20:09.720515 ignition[751]: op(1): [finished] loading QEMU firmware config module Aug 19 08:20:09.726159 systemd-networkd[856]: lo: Link UP Aug 19 08:20:09.726172 systemd-networkd[856]: lo: Gained carrier Aug 19 08:20:09.727919 systemd-networkd[856]: Enumeration completed Aug 19 08:20:09.728078 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 19 08:20:09.728624 systemd[1]: Reached target network.target - Network. Aug 19 08:20:09.729483 systemd-networkd[856]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 08:20:09.729489 systemd-networkd[856]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 19 08:20:09.730084 systemd-networkd[856]: eth0: Link UP Aug 19 08:20:09.730939 systemd-networkd[856]: eth0: Gained carrier Aug 19 08:20:09.730950 systemd-networkd[856]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Aug 19 08:20:09.764901 systemd-networkd[856]: eth0: DHCPv4 address 10.0.0.150/16, gateway 10.0.0.1 acquired from 10.0.0.1 Aug 19 08:20:09.773562 ignition[751]: parsing config with SHA512: 6ebca37aa8d8f237fca1faf1259683048a91063f4abd23963484c9f0efb6d31cd21bfdfb2d33a4ded65851e61f312117d6a8dd62caca4d20d1e3c699f69ffd80 Aug 19 08:20:09.778567 unknown[751]: fetched base config from "system" Aug 19 08:20:09.778580 unknown[751]: fetched user config from "qemu" Aug 19 08:20:09.780841 ignition[751]: fetch-offline: fetch-offline passed Aug 19 08:20:09.781815 ignition[751]: Ignition finished successfully Aug 19 08:20:09.787274 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Aug 19 08:20:09.787801 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Aug 19 08:20:09.790305 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Aug 19 08:20:09.850738 ignition[864]: Ignition 2.21.0 Aug 19 08:20:09.850752 ignition[864]: Stage: kargs Aug 19 08:20:09.850933 ignition[864]: no configs at "/usr/lib/ignition/base.d" Aug 19 08:20:09.850945 ignition[864]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Aug 19 08:20:09.854698 ignition[864]: kargs: kargs passed Aug 19 08:20:09.854778 ignition[864]: Ignition finished successfully Aug 19 08:20:09.860479 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Aug 19 08:20:09.863939 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Aug 19 08:20:09.930394 ignition[872]: Ignition 2.21.0 Aug 19 08:20:09.930409 ignition[872]: Stage: disks Aug 19 08:20:09.930636 ignition[872]: no configs at "/usr/lib/ignition/base.d" Aug 19 08:20:09.930651 ignition[872]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Aug 19 08:20:09.931806 ignition[872]: disks: disks passed Aug 19 08:20:09.932712 ignition[872]: Ignition finished successfully Aug 19 08:20:09.939732 systemd[1]: Finished ignition-disks.service - Ignition (disks). Aug 19 08:20:09.940232 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Aug 19 08:20:09.942329 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Aug 19 08:20:09.944710 systemd[1]: Reached target local-fs.target - Local File Systems. Aug 19 08:20:09.945074 systemd[1]: Reached target sysinit.target - System Initialization. Aug 19 08:20:09.949429 systemd[1]: Reached target basic.target - Basic System. Aug 19 08:20:09.952740 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Aug 19 08:20:09.980865 systemd-fsck[882]: ROOT: clean, 15/553520 files, 52789/553472 blocks Aug 19 08:20:10.025880 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Aug 19 08:20:10.030109 systemd[1]: Mounting sysroot.mount - /sysroot... Aug 19 08:20:10.230857 kernel: EXT4-fs (vda9): mounted filesystem 41966107-04fa-426e-9830-6b4efa50e27b r/w with ordered data mode. Quota mode: none. Aug 19 08:20:10.231976 systemd[1]: Mounted sysroot.mount - /sysroot. Aug 19 08:20:10.233643 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Aug 19 08:20:10.236570 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Aug 19 08:20:10.239183 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Aug 19 08:20:10.239883 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. 
Aug 19 08:20:10.239929 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Aug 19 08:20:10.239957 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Aug 19 08:20:10.255793 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (890) Aug 19 08:20:10.255850 kernel: BTRFS info (device vda6): first mount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f Aug 19 08:20:10.256000 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Aug 19 08:20:10.260061 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Aug 19 08:20:10.260079 kernel: BTRFS info (device vda6): using free-space-tree Aug 19 08:20:10.260148 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Aug 19 08:20:10.265995 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Aug 19 08:20:10.303305 initrd-setup-root[914]: cut: /sysroot/etc/passwd: No such file or directory Aug 19 08:20:10.308977 initrd-setup-root[921]: cut: /sysroot/etc/group: No such file or directory Aug 19 08:20:10.313786 initrd-setup-root[928]: cut: /sysroot/etc/shadow: No such file or directory Aug 19 08:20:10.318467 initrd-setup-root[935]: cut: /sysroot/etc/gshadow: No such file or directory Aug 19 08:20:10.433686 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Aug 19 08:20:10.436224 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Aug 19 08:20:10.437739 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Aug 19 08:20:10.456668 systemd[1]: sysroot-oem.mount: Deactivated successfully. Aug 19 08:20:10.458087 kernel: BTRFS info (device vda6): last unmount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f Aug 19 08:20:10.472147 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Aug 19 08:20:10.490934 ignition[1003]: INFO : Ignition 2.21.0 Aug 19 08:20:10.490934 ignition[1003]: INFO : Stage: mount Aug 19 08:20:10.493017 ignition[1003]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 19 08:20:10.493017 ignition[1003]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Aug 19 08:20:10.495196 ignition[1003]: INFO : mount: mount passed Aug 19 08:20:10.495196 ignition[1003]: INFO : Ignition finished successfully Aug 19 08:20:10.496813 systemd[1]: Finished ignition-mount.service - Ignition (mount). Aug 19 08:20:10.499111 systemd[1]: Starting ignition-files.service - Ignition (files)... Aug 19 08:20:10.528515 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Aug 19 08:20:10.555563 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1016) Aug 19 08:20:10.555622 kernel: BTRFS info (device vda6): first mount of filesystem 43dd0637-5e0b-4b8d-a544-a82ca0652f6f Aug 19 08:20:10.555635 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Aug 19 08:20:10.556423 kernel: BTRFS info (device vda6): using free-space-tree Aug 19 08:20:10.560717 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Aug 19 08:20:10.595925 ignition[1033]: INFO : Ignition 2.21.0 Aug 19 08:20:10.595925 ignition[1033]: INFO : Stage: files Aug 19 08:20:10.598893 ignition[1033]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 19 08:20:10.598893 ignition[1033]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Aug 19 08:20:10.601699 ignition[1033]: DEBUG : files: compiled without relabeling support, skipping Aug 19 08:20:10.603554 ignition[1033]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Aug 19 08:20:10.603554 ignition[1033]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Aug 19 08:20:10.607773 ignition[1033]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Aug 19 08:20:10.609340 ignition[1033]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Aug 19 08:20:10.610710 ignition[1033]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Aug 19 08:20:10.610286 unknown[1033]: wrote ssh authorized keys file for user: core Aug 19 08:20:10.613417 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Aug 19 08:20:10.613417 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Aug 19 08:20:10.701929 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Aug 19 08:20:10.793660 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Aug 19 08:20:10.795941 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Aug 19 08:20:10.795941 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Aug 19 08:20:10.795941 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Aug 19 08:20:10.795941 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Aug 19 08:20:10.795941 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Aug 19 08:20:10.795941 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Aug 19 08:20:10.795941 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Aug 19 08:20:10.795941 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Aug 19 08:20:10.810113 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Aug 19 08:20:10.810113 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Aug 19 08:20:10.810113 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Aug 19 08:20:10.810113 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(9): 
[finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Aug 19 08:20:10.810113 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Aug 19 08:20:10.810113 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Aug 19 08:20:11.128259 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Aug 19 08:20:11.787508 systemd-networkd[856]: eth0: Gained IPv6LL Aug 19 08:20:11.792194 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Aug 19 08:20:11.792194 ignition[1033]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Aug 19 08:20:11.795981 ignition[1033]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Aug 19 08:20:11.800169 ignition[1033]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Aug 19 08:20:11.800169 ignition[1033]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Aug 19 08:20:11.800169 ignition[1033]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Aug 19 08:20:11.804610 ignition[1033]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Aug 19 08:20:11.804610 ignition[1033]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Aug 19 08:20:11.804610 ignition[1033]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Aug 19 08:20:11.804610 ignition[1033]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Aug 19 08:20:11.826579 ignition[1033]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Aug 19 08:20:11.832243 ignition[1033]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Aug 19 08:20:11.833837 ignition[1033]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Aug 19 08:20:11.833837 ignition[1033]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Aug 19 08:20:11.833837 ignition[1033]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Aug 19 08:20:11.833837 ignition[1033]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Aug 19 08:20:11.833837 ignition[1033]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Aug 19 08:20:11.833837 ignition[1033]: INFO : files: files passed Aug 19 08:20:11.833837 ignition[1033]: INFO : Ignition finished successfully Aug 19 08:20:11.837992 systemd[1]: Finished ignition-files.service - Ignition (files). Aug 19 08:20:11.840209 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Aug 19 08:20:11.843811 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
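The files stage above downloads the Helm tarball and the Kubernetes sysext image, creates the activation symlink /sysroot/etc/extensions/kubernetes.raw, and finally records a result file at /sysroot/etc/.ignition-result.json. The sketch below replays the two bookkeeping operations under the same /sysroot prefix; it is an illustration of what the log records, not Ignition's own code, and the result-file schema is not shown in the log, so it is loaded as opaque JSON.

```python
#!/usr/bin/env python3
"""Sketch of two bookkeeping steps from the Ignition files stage above:
the sysext activation symlink (op(9)) and the result file (op(12))."""
import json
import os

SYSROOT = "/sysroot"  # the new root as mounted in the initramfs


def link_kubernetes_sysext(root: str = SYSROOT) -> None:
    # /etc/extensions/kubernetes.raw -> the image written under /opt,
    # exactly the link op(9) reports in the journal.
    link = os.path.join(root, "etc/extensions/kubernetes.raw")
    target = "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
    os.makedirs(os.path.dirname(link), exist_ok=True)
    if not os.path.islink(link):
        os.symlink(target, link)


def read_ignition_result(root: str = SYSROOT) -> dict:
    # op(12) writes /etc/.ignition-result.json; load it without assuming
    # any particular keys.
    with open(os.path.join(root, "etc/.ignition-result.json")) as fh:
        return json.load(fh)


if __name__ == "__main__":
    link_kubernetes_sysext()
    print(read_ignition_result())
```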
Aug 19 08:20:11.860273 systemd[1]: ignition-quench.service: Deactivated successfully. Aug 19 08:20:11.860405 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Aug 19 08:20:11.863662 initrd-setup-root-after-ignition[1062]: grep: /sysroot/oem/oem-release: No such file or directory Aug 19 08:20:11.867975 initrd-setup-root-after-ignition[1064]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Aug 19 08:20:11.867975 initrd-setup-root-after-ignition[1064]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Aug 19 08:20:11.871368 initrd-setup-root-after-ignition[1068]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Aug 19 08:20:11.873697 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Aug 19 08:20:11.874438 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Aug 19 08:20:11.878455 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Aug 19 08:20:11.960655 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Aug 19 08:20:11.960854 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Aug 19 08:20:11.962451 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Aug 19 08:20:11.966001 systemd[1]: Reached target initrd.target - Initrd Default Target. Aug 19 08:20:11.966642 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Aug 19 08:20:11.967970 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Aug 19 08:20:12.000744 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Aug 19 08:20:12.003226 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Aug 19 08:20:12.034674 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Aug 19 08:20:12.035101 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 19 08:20:12.035450 systemd[1]: Stopped target timers.target - Timer Units. Aug 19 08:20:12.035801 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Aug 19 08:20:12.035963 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Aug 19 08:20:12.036605 systemd[1]: Stopped target initrd.target - Initrd Default Target. Aug 19 08:20:12.037103 systemd[1]: Stopped target basic.target - Basic System. Aug 19 08:20:12.037433 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Aug 19 08:20:12.037778 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Aug 19 08:20:12.038304 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Aug 19 08:20:12.038612 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Aug 19 08:20:12.039090 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Aug 19 08:20:12.039402 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Aug 19 08:20:12.039742 systemd[1]: Stopped target sysinit.target - System Initialization. Aug 19 08:20:12.040216 systemd[1]: Stopped target local-fs.target - Local File Systems. Aug 19 08:20:12.040525 systemd[1]: Stopped target swap.target - Swaps. Aug 19 08:20:12.040986 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Aug 19 08:20:12.041097 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. 
Aug 19 08:20:12.065721 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Aug 19 08:20:12.066318 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 19 08:20:12.066589 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Aug 19 08:20:12.066739 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 19 08:20:12.071401 systemd[1]: dracut-initqueue.service: Deactivated successfully. Aug 19 08:20:12.071572 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Aug 19 08:20:12.075352 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Aug 19 08:20:12.075508 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Aug 19 08:20:12.076138 systemd[1]: Stopped target paths.target - Path Units. Aug 19 08:20:12.076367 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Aug 19 08:20:12.079868 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 19 08:20:12.080406 systemd[1]: Stopped target slices.target - Slice Units. Aug 19 08:20:12.080729 systemd[1]: Stopped target sockets.target - Socket Units. Aug 19 08:20:12.081258 systemd[1]: iscsid.socket: Deactivated successfully. Aug 19 08:20:12.081353 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Aug 19 08:20:12.086228 systemd[1]: iscsiuio.socket: Deactivated successfully. Aug 19 08:20:12.086321 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Aug 19 08:20:12.088891 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Aug 19 08:20:12.089063 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Aug 19 08:20:12.090675 systemd[1]: ignition-files.service: Deactivated successfully. Aug 19 08:20:12.090867 systemd[1]: Stopped ignition-files.service - Ignition (files). Aug 19 08:20:12.094216 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Aug 19 08:20:12.094626 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Aug 19 08:20:12.094863 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Aug 19 08:20:12.098113 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Aug 19 08:20:12.102818 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Aug 19 08:20:12.103006 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Aug 19 08:20:12.104805 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Aug 19 08:20:12.104922 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Aug 19 08:20:12.113815 systemd[1]: initrd-cleanup.service: Deactivated successfully. Aug 19 08:20:12.114329 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Aug 19 08:20:12.129830 ignition[1088]: INFO : Ignition 2.21.0 Aug 19 08:20:12.129830 ignition[1088]: INFO : Stage: umount Aug 19 08:20:12.131822 ignition[1088]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 19 08:20:12.131822 ignition[1088]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Aug 19 08:20:12.131822 ignition[1088]: INFO : umount: umount passed Aug 19 08:20:12.131822 ignition[1088]: INFO : Ignition finished successfully Aug 19 08:20:12.133983 systemd[1]: ignition-mount.service: Deactivated successfully. Aug 19 08:20:12.134127 systemd[1]: Stopped ignition-mount.service - Ignition (mount). 
Aug 19 08:20:12.134618 systemd[1]: Stopped target network.target - Network. Aug 19 08:20:12.135223 systemd[1]: ignition-disks.service: Deactivated successfully. Aug 19 08:20:12.135277 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Aug 19 08:20:12.135561 systemd[1]: ignition-kargs.service: Deactivated successfully. Aug 19 08:20:12.135614 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Aug 19 08:20:12.136055 systemd[1]: ignition-setup.service: Deactivated successfully. Aug 19 08:20:12.136104 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Aug 19 08:20:12.136377 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Aug 19 08:20:12.136418 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Aug 19 08:20:12.136847 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Aug 19 08:20:12.137323 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Aug 19 08:20:12.138657 systemd[1]: sysroot-boot.mount: Deactivated successfully. Aug 19 08:20:12.153256 systemd[1]: systemd-resolved.service: Deactivated successfully. Aug 19 08:20:12.153415 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Aug 19 08:20:12.158214 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Aug 19 08:20:12.159456 systemd[1]: systemd-networkd.service: Deactivated successfully. Aug 19 08:20:12.159626 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Aug 19 08:20:12.163224 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Aug 19 08:20:12.163965 systemd[1]: Stopped target network-pre.target - Preparation for Network. Aug 19 08:20:12.167866 systemd[1]: systemd-networkd.socket: Deactivated successfully. Aug 19 08:20:12.167924 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Aug 19 08:20:12.170884 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Aug 19 08:20:12.172038 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Aug 19 08:20:12.172096 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Aug 19 08:20:12.174388 systemd[1]: systemd-sysctl.service: Deactivated successfully. Aug 19 08:20:12.174438 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Aug 19 08:20:12.177861 systemd[1]: systemd-modules-load.service: Deactivated successfully. Aug 19 08:20:12.177911 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Aug 19 08:20:12.178471 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Aug 19 08:20:12.178513 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 19 08:20:12.182925 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 19 08:20:12.185188 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Aug 19 08:20:12.185261 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Aug 19 08:20:12.198380 systemd[1]: network-cleanup.service: Deactivated successfully. Aug 19 08:20:12.198523 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Aug 19 08:20:12.207642 systemd[1]: systemd-udevd.service: Deactivated successfully. Aug 19 08:20:12.207852 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. 
Aug 19 08:20:12.209372 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Aug 19 08:20:12.209422 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Aug 19 08:20:12.211388 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Aug 19 08:20:12.211431 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Aug 19 08:20:12.213447 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Aug 19 08:20:12.213498 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Aug 19 08:20:12.214254 systemd[1]: dracut-cmdline.service: Deactivated successfully. Aug 19 08:20:12.214318 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Aug 19 08:20:12.215070 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Aug 19 08:20:12.215143 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 19 08:20:12.225427 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Aug 19 08:20:12.225696 systemd[1]: systemd-network-generator.service: Deactivated successfully. Aug 19 08:20:12.225796 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Aug 19 08:20:12.229675 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Aug 19 08:20:12.229727 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 19 08:20:12.233312 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 19 08:20:12.233406 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 08:20:12.237701 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Aug 19 08:20:12.237792 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Aug 19 08:20:12.237847 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Aug 19 08:20:12.256420 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Aug 19 08:20:12.256563 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Aug 19 08:20:12.340928 systemd[1]: sysroot-boot.service: Deactivated successfully. Aug 19 08:20:12.341149 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Aug 19 08:20:12.342492 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Aug 19 08:20:12.344317 systemd[1]: initrd-setup-root.service: Deactivated successfully. Aug 19 08:20:12.344377 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Aug 19 08:20:12.346229 systemd[1]: Starting initrd-switch-root.service - Switch Root... Aug 19 08:20:12.369152 systemd[1]: Switching root. Aug 19 08:20:12.414707 systemd-journald[219]: Journal stopped Aug 19 08:20:13.992832 systemd-journald[219]: Received SIGTERM from PID 1 (systemd). 
Aug 19 08:20:13.992913 kernel: SELinux: policy capability network_peer_controls=1 Aug 19 08:20:13.992952 kernel: SELinux: policy capability open_perms=1 Aug 19 08:20:13.992966 kernel: SELinux: policy capability extended_socket_class=1 Aug 19 08:20:13.992984 kernel: SELinux: policy capability always_check_network=0 Aug 19 08:20:13.992998 kernel: SELinux: policy capability cgroup_seclabel=1 Aug 19 08:20:13.993012 kernel: SELinux: policy capability nnp_nosuid_transition=1 Aug 19 08:20:13.993025 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Aug 19 08:20:13.993038 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Aug 19 08:20:13.993052 kernel: SELinux: policy capability userspace_initial_context=0 Aug 19 08:20:13.993065 kernel: audit: type=1403 audit(1755591613.132:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Aug 19 08:20:13.993088 systemd[1]: Successfully loaded SELinux policy in 63.719ms. Aug 19 08:20:13.993115 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 8.888ms. Aug 19 08:20:13.993132 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Aug 19 08:20:13.993147 systemd[1]: Detected virtualization kvm. Aug 19 08:20:13.993162 systemd[1]: Detected architecture x86-64. Aug 19 08:20:13.993177 systemd[1]: Detected first boot. Aug 19 08:20:13.993196 systemd[1]: Initializing machine ID from VM UUID. Aug 19 08:20:13.993210 zram_generator::config[1134]: No configuration found. Aug 19 08:20:13.993226 kernel: Guest personality initialized and is inactive Aug 19 08:20:13.993246 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Aug 19 08:20:13.993261 kernel: Initialized host personality Aug 19 08:20:13.993275 kernel: NET: Registered PF_VSOCK protocol family Aug 19 08:20:13.993289 systemd[1]: Populated /etc with preset unit settings. Aug 19 08:20:13.993327 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Aug 19 08:20:13.993346 systemd[1]: initrd-switch-root.service: Deactivated successfully. Aug 19 08:20:13.993361 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Aug 19 08:20:13.993403 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Aug 19 08:20:13.993451 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Aug 19 08:20:13.993479 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Aug 19 08:20:13.993493 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Aug 19 08:20:13.993510 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Aug 19 08:20:13.993525 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Aug 19 08:20:13.993553 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Aug 19 08:20:13.993568 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Aug 19 08:20:13.993583 systemd[1]: Created slice user.slice - User and Session Slice. Aug 19 08:20:13.993598 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
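"Detected first boot" followed by "Initializing machine ID from VM UUID" means systemd seeded /etc/machine-id from the UUID the hypervisor exposes over DMI. A rough check of that relationship is sketched below; it must run as root to read the DMI node, and older SMBIOS encodings can flip the byte order of the first UUID fields, so a mismatch is not necessarily an error.

```python
#!/usr/bin/env python3
"""Rough comparison of /etc/machine-id with the DMI product UUID,
illustrating the "Initializing machine ID from VM UUID" step above."""


def read_stripped(path: str) -> str:
    with open(path) as fh:
        return fh.read().strip()


if __name__ == "__main__":
    machine_id = read_stripped("/etc/machine-id")
    vm_uuid = read_stripped("/sys/class/dmi/id/product_uuid")
    # machine-id is 32 lowercase hex characters without dashes.
    normalized = vm_uuid.replace("-", "").lower()
    print(f"machine-id : {machine_id}")
    print(f"vm uuid    : {normalized}")
    print("match" if machine_id == normalized else "differs")
```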
Aug 19 08:20:13.993623 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 19 08:20:13.993638 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Aug 19 08:20:13.993653 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Aug 19 08:20:13.993673 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Aug 19 08:20:13.993688 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Aug 19 08:20:13.993702 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Aug 19 08:20:13.993717 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 19 08:20:13.993731 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 19 08:20:13.993769 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Aug 19 08:20:13.993787 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Aug 19 08:20:13.993803 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Aug 19 08:20:13.993818 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Aug 19 08:20:13.993833 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 19 08:20:13.993849 systemd[1]: Reached target remote-fs.target - Remote File Systems. Aug 19 08:20:13.993865 systemd[1]: Reached target slices.target - Slice Units. Aug 19 08:20:13.993881 systemd[1]: Reached target swap.target - Swaps. Aug 19 08:20:13.993896 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Aug 19 08:20:13.993924 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Aug 19 08:20:13.993940 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Aug 19 08:20:13.993955 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Aug 19 08:20:13.993969 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Aug 19 08:20:13.993989 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Aug 19 08:20:13.994004 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Aug 19 08:20:13.994018 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Aug 19 08:20:13.994033 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Aug 19 08:20:13.994047 systemd[1]: Mounting media.mount - External Media Directory... Aug 19 08:20:13.994069 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:20:13.994084 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Aug 19 08:20:13.994100 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Aug 19 08:20:13.994114 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Aug 19 08:20:13.994130 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Aug 19 08:20:13.994145 systemd[1]: Reached target machines.target - Containers. Aug 19 08:20:13.994160 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
Aug 19 08:20:13.994175 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 08:20:13.994199 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Aug 19 08:20:13.994214 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Aug 19 08:20:13.994229 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 19 08:20:13.994243 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Aug 19 08:20:13.994258 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 19 08:20:13.994272 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Aug 19 08:20:13.994287 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 19 08:20:13.994301 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Aug 19 08:20:13.994316 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Aug 19 08:20:13.994338 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Aug 19 08:20:13.994354 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Aug 19 08:20:13.994368 systemd[1]: Stopped systemd-fsck-usr.service. Aug 19 08:20:13.994384 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 08:20:13.994399 systemd[1]: Starting systemd-journald.service - Journal Service... Aug 19 08:20:13.994415 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Aug 19 08:20:13.994460 systemd-journald[1198]: Collecting audit messages is disabled. Aug 19 08:20:13.994495 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Aug 19 08:20:13.994511 systemd-journald[1198]: Journal started Aug 19 08:20:13.994546 systemd-journald[1198]: Runtime Journal (/run/log/journal/39a22897a2474f84b484b6f1e02538d3) is 6M, max 48.6M, 42.5M free. Aug 19 08:20:13.997165 kernel: fuse: init (API version 7.41) Aug 19 08:20:13.690720 systemd[1]: Queued start job for default target multi-user.target. Aug 19 08:20:13.710965 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Aug 19 08:20:13.711439 systemd[1]: systemd-journald.service: Deactivated successfully. Aug 19 08:20:14.000252 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Aug 19 08:20:14.004212 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Aug 19 08:20:14.005786 kernel: loop: module loaded Aug 19 08:20:14.008281 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Aug 19 08:20:14.010143 systemd[1]: verity-setup.service: Deactivated successfully. Aug 19 08:20:14.010179 systemd[1]: Stopped verity-setup.service. Aug 19 08:20:14.012798 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:20:14.021726 systemd[1]: Started systemd-journald.service - Journal Service. Aug 19 08:20:14.022433 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. 
Aug 19 08:20:14.023687 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Aug 19 08:20:14.024879 systemd[1]: Mounted media.mount - External Media Directory. Aug 19 08:20:14.025949 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Aug 19 08:20:14.027128 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Aug 19 08:20:14.028397 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Aug 19 08:20:14.029641 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Aug 19 08:20:14.031183 systemd[1]: modprobe@configfs.service: Deactivated successfully. Aug 19 08:20:14.031399 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Aug 19 08:20:14.032911 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 19 08:20:14.033205 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 19 08:20:14.034823 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 19 08:20:14.035058 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 19 08:20:14.036524 systemd[1]: modprobe@fuse.service: Deactivated successfully. Aug 19 08:20:14.036752 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Aug 19 08:20:14.038121 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 19 08:20:14.038346 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 19 08:20:14.039729 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Aug 19 08:20:14.041142 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Aug 19 08:20:14.042665 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Aug 19 08:20:14.044207 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Aug 19 08:20:14.047806 kernel: ACPI: bus type drm_connector registered Aug 19 08:20:14.049165 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 19 08:20:14.049396 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 19 08:20:14.063104 systemd[1]: Reached target network-pre.target - Preparation for Network. Aug 19 08:20:14.065900 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Aug 19 08:20:14.068327 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Aug 19 08:20:14.069642 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Aug 19 08:20:14.069735 systemd[1]: Reached target local-fs.target - Local File Systems. Aug 19 08:20:14.072183 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Aug 19 08:20:14.081889 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Aug 19 08:20:14.083614 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 08:20:14.085905 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Aug 19 08:20:14.088893 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Aug 19 08:20:14.090083 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
Aug 19 08:20:14.093887 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Aug 19 08:20:14.095047 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 19 08:20:14.096381 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Aug 19 08:20:14.098513 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Aug 19 08:20:14.101859 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Aug 19 08:20:14.103243 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Aug 19 08:20:14.111796 systemd-journald[1198]: Time spent on flushing to /var/log/journal/39a22897a2474f84b484b6f1e02538d3 is 21.746ms for 975 entries. Aug 19 08:20:14.111796 systemd-journald[1198]: System Journal (/var/log/journal/39a22897a2474f84b484b6f1e02538d3) is 8M, max 195.6M, 187.6M free. Aug 19 08:20:14.181164 systemd-journald[1198]: Received client request to flush runtime journal. Aug 19 08:20:14.181217 kernel: loop0: detected capacity change from 0 to 111000 Aug 19 08:20:14.181249 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Aug 19 08:20:14.181268 kernel: loop1: detected capacity change from 0 to 229808 Aug 19 08:20:14.113115 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Aug 19 08:20:14.143224 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Aug 19 08:20:14.151364 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Aug 19 08:20:14.153809 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Aug 19 08:20:14.158485 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Aug 19 08:20:14.183520 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Aug 19 08:20:14.193191 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Aug 19 08:20:14.196968 systemd[1]: Starting systemd-sysusers.service - Create System Users... Aug 19 08:20:14.198398 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Aug 19 08:20:14.351797 kernel: loop2: detected capacity change from 0 to 128016 Aug 19 08:20:14.377369 systemd[1]: Finished systemd-sysusers.service - Create System Users. Aug 19 08:20:14.380773 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Aug 19 08:20:14.383776 kernel: loop3: detected capacity change from 0 to 111000 Aug 19 08:20:14.400792 kernel: loop4: detected capacity change from 0 to 229808 Aug 19 08:20:14.413747 kernel: loop5: detected capacity change from 0 to 128016 Aug 19 08:20:14.417155 systemd-tmpfiles[1275]: ACLs are not supported, ignoring. Aug 19 08:20:14.417570 systemd-tmpfiles[1275]: ACLs are not supported, ignoring. Aug 19 08:20:14.422436 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 19 08:20:14.424638 (sd-merge)[1276]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Aug 19 08:20:14.425728 (sd-merge)[1276]: Merged extensions into '/usr'. Aug 19 08:20:14.430478 systemd[1]: Reload requested from client PID 1231 ('systemd-sysext') (unit systemd-sysext.service)... Aug 19 08:20:14.430501 systemd[1]: Reloading... Aug 19 08:20:14.559823 zram_generator::config[1301]: No configuration found. 
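The (sd-merge) lines above show systemd-sysext picking up the 'containerd-flatcar', 'docker-flatcar' and 'kubernetes' extensions and merging them into /usr, which then triggers the reload that follows. The sketch below lists candidate extension images; the search directories follow the systemd-sysext documentation and are an assumption rather than something shown in this log.

```python
#!/usr/bin/env python3
"""List candidate system-extension images, like the ones sd-merge reports
merging into /usr in the journal above."""
import os

# Directories systemd-sysext is documented to scan (assumed here).
SEARCH_DIRS = (
    "/etc/extensions",
    "/run/extensions",
    "/var/lib/extensions",
    "/usr/lib/extensions",
)


def list_extensions() -> list[str]:
    found = []
    for directory in SEARCH_DIRS:
        if not os.path.isdir(directory):
            continue
        for entry in sorted(os.listdir(directory)):
            # Raw disk images (like kubernetes.raw written by Ignition above)
            # and plain directory trees are both valid extension formats.
            found.append(os.path.join(directory, entry))
    return found


if __name__ == "__main__":
    for path in list_extensions():
        print(path)
```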
Aug 19 08:20:14.800351 ldconfig[1226]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Aug 19 08:20:14.858185 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Aug 19 08:20:14.858829 systemd[1]: Reloading finished in 427 ms. Aug 19 08:20:14.893057 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Aug 19 08:20:14.894614 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Aug 19 08:20:15.122753 systemd[1]: Starting ensure-sysext.service... Aug 19 08:20:15.124873 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Aug 19 08:20:15.147346 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Aug 19 08:20:15.147386 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Aug 19 08:20:15.147748 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Aug 19 08:20:15.148091 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Aug 19 08:20:15.149151 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Aug 19 08:20:15.149428 systemd-tmpfiles[1342]: ACLs are not supported, ignoring. Aug 19 08:20:15.149509 systemd-tmpfiles[1342]: ACLs are not supported, ignoring. Aug 19 08:20:15.157019 systemd-tmpfiles[1342]: Detected autofs mount point /boot during canonicalization of boot. Aug 19 08:20:15.157037 systemd-tmpfiles[1342]: Skipping /boot Aug 19 08:20:15.161496 systemd[1]: Reload requested from client PID 1341 ('systemctl') (unit ensure-sysext.service)... Aug 19 08:20:15.161525 systemd[1]: Reloading... Aug 19 08:20:15.168724 systemd-tmpfiles[1342]: Detected autofs mount point /boot during canonicalization of boot. Aug 19 08:20:15.168902 systemd-tmpfiles[1342]: Skipping /boot Aug 19 08:20:15.204804 zram_generator::config[1370]: No configuration found. Aug 19 08:20:15.389359 systemd[1]: Reloading finished in 227 ms. Aug 19 08:20:15.414661 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Aug 19 08:20:15.431370 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 19 08:20:15.440651 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 19 08:20:15.443303 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Aug 19 08:20:15.445662 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Aug 19 08:20:15.453720 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 19 08:20:15.457774 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 19 08:20:15.462043 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Aug 19 08:20:15.467967 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:20:15.468155 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 08:20:15.473956 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
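The systemd-tmpfiles warnings above ("Duplicate line for path ..., ignoring") come from the same path being declared in more than one tmpfiles.d fragment. A rough Python diagnostic that reproduces the check is sketched below; it ignores quoting, specifiers and override precedence, so it only approximates what systemd-tmpfiles actually does.

```python
#!/usr/bin/env python3
"""Scan tmpfiles.d fragments and report paths configured more than once,
approximating the "Duplicate line for path ..." warnings above."""
import glob
from collections import defaultdict


def duplicate_paths(pattern: str = "/usr/lib/tmpfiles.d/*.conf") -> dict[str, list[str]]:
    seen = defaultdict(list)
    for conf in sorted(glob.glob(pattern)):
        with open(conf) as fh:
            for line in fh:
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                fields = line.split()
                if len(fields) >= 2:
                    # Field layout: Type Path Mode User Group Age Argument
                    seen[fields[1]].append(conf)
    return {path: files for path, files in seen.items() if len(files) > 1}


if __name__ == "__main__":
    for path, files in duplicate_paths().items():
        print(f"{path}: {', '.join(files)}")
```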
Aug 19 08:20:15.477622 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 19 08:20:15.480852 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 19 08:20:15.482055 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 08:20:15.482165 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 08:20:15.485558 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Aug 19 08:20:15.486608 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:20:15.492204 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Aug 19 08:20:15.495656 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 19 08:20:15.496056 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 19 08:20:15.497927 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 19 08:20:15.498197 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 19 08:20:15.500052 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 19 08:20:15.500274 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 19 08:20:15.512753 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Aug 19 08:20:15.515830 systemd-udevd[1412]: Using default interface naming scheme 'v255'. Aug 19 08:20:15.518847 systemd[1]: Finished ensure-sysext.service. Aug 19 08:20:15.521013 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:20:15.521273 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 08:20:15.523566 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 19 08:20:15.546659 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Aug 19 08:20:15.552734 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 19 08:20:15.555329 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 19 08:20:15.557971 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 08:20:15.558019 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 08:20:15.560039 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Aug 19 08:20:15.564025 systemd[1]: Starting systemd-update-done.service - Update is Completed... Aug 19 08:20:15.565203 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Aug 19 08:20:15.565606 systemd[1]: Started systemd-userdbd.service - User Database Manager. Aug 19 08:20:15.567303 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. 
Aug 19 08:20:15.567588 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 19 08:20:15.569595 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 19 08:20:15.569844 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 19 08:20:15.572272 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 19 08:20:15.572514 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 19 08:20:15.574115 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 19 08:20:15.574500 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 19 08:20:15.579998 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 19 08:20:15.580090 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 19 08:20:15.583354 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 19 08:20:15.585099 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Aug 19 08:20:15.591769 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 19 08:20:15.592884 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Aug 19 08:20:15.673111 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Aug 19 08:20:15.690240 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Aug 19 08:20:15.693044 systemd[1]: Finished systemd-update-done.service - Update is Completed. Aug 19 08:20:15.702525 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Aug 19 08:20:15.719160 augenrules[1498]: No rules Aug 19 08:20:15.721394 systemd[1]: audit-rules.service: Deactivated successfully. Aug 19 08:20:15.722631 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 19 08:20:15.724789 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Aug 19 08:20:15.730787 kernel: ACPI: button: Power Button [PWRF] Aug 19 08:20:15.731830 kernel: mousedev: PS/2 mouse device common for all mice Aug 19 08:20:15.736236 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Aug 19 08:20:15.768935 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Aug 19 08:20:15.769293 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Aug 19 08:20:15.794290 systemd-networkd[1467]: lo: Link UP Aug 19 08:20:15.794304 systemd-networkd[1467]: lo: Gained carrier Aug 19 08:20:15.796090 systemd-networkd[1467]: Enumeration completed Aug 19 08:20:15.796218 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 19 08:20:15.796497 systemd-networkd[1467]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 08:20:15.796510 systemd-networkd[1467]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 19 08:20:15.800381 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... 
Aug 19 08:20:15.800983 systemd-networkd[1467]: eth0: Link UP Aug 19 08:20:15.801141 systemd-networkd[1467]: eth0: Gained carrier Aug 19 08:20:15.801165 systemd-networkd[1467]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 08:20:15.803891 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Aug 19 08:20:15.816835 systemd-networkd[1467]: eth0: DHCPv4 address 10.0.0.150/16, gateway 10.0.0.1 acquired from 10.0.0.1 Aug 19 08:20:15.818504 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Aug 19 08:20:15.819872 systemd[1]: Reached target time-set.target - System Time Set. Aug 19 08:20:16.916329 systemd-timesyncd[1448]: Contacted time server 10.0.0.1:123 (10.0.0.1). Aug 19 08:20:16.916458 systemd-timesyncd[1448]: Initial clock synchronization to Tue 2025-08-19 08:20:16.916122 UTC. Aug 19 08:20:16.921733 systemd-resolved[1411]: Positive Trust Anchors: Aug 19 08:20:16.921742 systemd-resolved[1411]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 19 08:20:16.921773 systemd-resolved[1411]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 19 08:20:16.932384 systemd-resolved[1411]: Defaulting to hostname 'linux'. Aug 19 08:20:16.934048 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 19 08:20:16.935615 systemd[1]: Reached target network.target - Network. Aug 19 08:20:16.938156 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 19 08:20:16.939361 systemd[1]: Reached target sysinit.target - System Initialization. Aug 19 08:20:16.941303 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Aug 19 08:20:16.942695 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Aug 19 08:20:16.944009 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Aug 19 08:20:16.946387 systemd[1]: Started logrotate.timer - Daily rotation of log files. Aug 19 08:20:16.947816 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Aug 19 08:20:16.949146 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Aug 19 08:20:16.951170 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Aug 19 08:20:16.951205 systemd[1]: Reached target paths.target - Path Units. Aug 19 08:20:16.952204 systemd[1]: Reached target timers.target - Timer Units. Aug 19 08:20:16.954401 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Aug 19 08:20:16.957650 systemd[1]: Starting docker.socket - Docker Socket for the API... Aug 19 08:20:16.962324 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). 
Aug 19 08:20:16.964205 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Aug 19 08:20:16.966208 systemd[1]: Reached target ssh-access.target - SSH Access Available. Aug 19 08:20:16.983213 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Aug 19 08:20:16.985301 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Aug 19 08:20:16.987610 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Aug 19 08:20:16.989252 systemd[1]: Listening on docker.socket - Docker Socket for the API. Aug 19 08:20:16.995576 systemd[1]: Reached target sockets.target - Socket Units. Aug 19 08:20:16.996689 systemd[1]: Reached target basic.target - Basic System. Aug 19 08:20:16.997896 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Aug 19 08:20:16.997996 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Aug 19 08:20:17.001304 systemd[1]: Starting containerd.service - containerd container runtime... Aug 19 08:20:17.003579 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Aug 19 08:20:17.006364 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Aug 19 08:20:17.014447 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Aug 19 08:20:17.016716 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Aug 19 08:20:17.017824 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Aug 19 08:20:17.019152 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Aug 19 08:20:17.028608 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Aug 19 08:20:17.031598 jq[1532]: false Aug 19 08:20:17.033369 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Aug 19 08:20:17.035483 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Aug 19 08:20:17.042137 google_oslogin_nss_cache[1534]: oslogin_cache_refresh[1534]: Refreshing passwd entry cache Aug 19 08:20:17.041158 oslogin_cache_refresh[1534]: Refreshing passwd entry cache Aug 19 08:20:17.044362 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Aug 19 08:20:17.055391 systemd[1]: Starting systemd-logind.service - User Login Management... Aug 19 08:20:17.056982 google_oslogin_nss_cache[1534]: oslogin_cache_refresh[1534]: Failure getting users, quitting Aug 19 08:20:17.056982 google_oslogin_nss_cache[1534]: oslogin_cache_refresh[1534]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Aug 19 08:20:17.056982 google_oslogin_nss_cache[1534]: oslogin_cache_refresh[1534]: Refreshing group entry cache Aug 19 08:20:17.056577 oslogin_cache_refresh[1534]: Failure getting users, quitting Aug 19 08:20:17.056595 oslogin_cache_refresh[1534]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Aug 19 08:20:17.056640 oslogin_cache_refresh[1534]: Refreshing group entry cache Aug 19 08:20:17.057336 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Aug 19 08:20:17.057978 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. 
See cgroup-compat debug messages for details. Aug 19 08:20:17.058700 systemd[1]: Starting update-engine.service - Update Engine... Aug 19 08:20:17.061348 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Aug 19 08:20:17.066261 google_oslogin_nss_cache[1534]: oslogin_cache_refresh[1534]: Failure getting groups, quitting Aug 19 08:20:17.066261 google_oslogin_nss_cache[1534]: oslogin_cache_refresh[1534]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Aug 19 08:20:17.066252 oslogin_cache_refresh[1534]: Failure getting groups, quitting Aug 19 08:20:17.066270 oslogin_cache_refresh[1534]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Aug 19 08:20:17.067107 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Aug 19 08:20:17.067233 extend-filesystems[1533]: Found /dev/vda6 Aug 19 08:20:17.069487 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Aug 19 08:20:17.069790 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Aug 19 08:20:17.070700 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Aug 19 08:20:17.071538 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Aug 19 08:20:17.076001 extend-filesystems[1533]: Found /dev/vda9 Aug 19 08:20:17.077639 extend-filesystems[1533]: Checking size of /dev/vda9 Aug 19 08:20:17.081235 kernel: kvm_amd: TSC scaling supported Aug 19 08:20:17.081281 kernel: kvm_amd: Nested Virtualization enabled Aug 19 08:20:17.081299 kernel: kvm_amd: Nested Paging enabled Aug 19 08:20:17.081313 kernel: kvm_amd: LBR virtualization supported Aug 19 08:20:17.083439 systemd[1]: motdgen.service: Deactivated successfully. Aug 19 08:20:17.083701 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Aug 19 08:20:17.085246 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Aug 19 08:20:17.085497 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Aug 19 08:20:17.097279 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported Aug 19 08:20:17.097318 kernel: kvm_amd: Virtual GIF supported Aug 19 08:20:17.098044 update_engine[1548]: I20250819 08:20:17.097896 1548 main.cc:92] Flatcar Update Engine starting Aug 19 08:20:17.099433 (ntainerd)[1559]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Aug 19 08:20:17.104386 jq[1550]: true Aug 19 08:20:17.118786 extend-filesystems[1533]: Resized partition /dev/vda9 Aug 19 08:20:17.118846 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 19 08:20:17.123846 extend-filesystems[1574]: resize2fs 1.47.2 (1-Jan-2025) Aug 19 08:20:17.128301 tar[1555]: linux-amd64/LICENSE Aug 19 08:20:17.128301 tar[1555]: linux-amd64/helm Aug 19 08:20:17.132411 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Aug 19 08:20:17.138656 jq[1568]: true Aug 19 08:20:17.151196 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Aug 19 08:20:17.152626 dbus-daemon[1530]: [system] SELinux support is enabled Aug 19 08:20:17.152849 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
Aug 19 08:20:17.179734 update_engine[1548]: I20250819 08:20:17.167189 1548 update_check_scheduler.cc:74] Next update check in 5m26s Aug 19 08:20:17.160112 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Aug 19 08:20:17.160134 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Aug 19 08:20:17.161538 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Aug 19 08:20:17.161554 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Aug 19 08:20:17.171120 systemd[1]: Started update-engine.service - Update Engine. Aug 19 08:20:17.178455 systemd[1]: Started locksmithd.service - Cluster reboot manager. Aug 19 08:20:17.184102 extend-filesystems[1574]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Aug 19 08:20:17.184102 extend-filesystems[1574]: old_desc_blocks = 1, new_desc_blocks = 1 Aug 19 08:20:17.184102 extend-filesystems[1574]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Aug 19 08:20:17.185234 extend-filesystems[1533]: Resized filesystem in /dev/vda9 Aug 19 08:20:17.188137 systemd[1]: extend-filesystems.service: Deactivated successfully. Aug 19 08:20:17.189012 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Aug 19 08:20:17.219948 systemd-logind[1540]: Watching system buttons on /dev/input/event2 (Power Button) Aug 19 08:20:17.219980 systemd-logind[1540]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Aug 19 08:20:17.221325 bash[1598]: Updated "/home/core/.ssh/authorized_keys" Aug 19 08:20:17.221412 systemd-logind[1540]: New seat seat0. Aug 19 08:20:17.246123 kernel: EDAC MC: Ver: 3.0.0 Aug 19 08:20:17.254789 sshd_keygen[1553]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Aug 19 08:20:17.289117 locksmithd[1579]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Aug 19 08:20:17.399413 systemd[1]: Started systemd-logind.service - User Login Management. Aug 19 08:20:17.401534 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Aug 19 08:20:17.403408 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 08:20:17.405486 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Aug 19 08:20:17.437097 systemd[1]: Starting issuegen.service - Generate /run/issue... Aug 19 08:20:17.438710 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Aug 19 08:20:17.465283 systemd[1]: issuegen.service: Deactivated successfully. Aug 19 08:20:17.465632 systemd[1]: Finished issuegen.service - Generate /run/issue. Aug 19 08:20:17.470159 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Aug 19 08:20:17.499138 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Aug 19 08:20:17.503904 systemd[1]: Started getty@tty1.service - Getty on tty1. Aug 19 08:20:17.507443 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Aug 19 08:20:17.508743 systemd[1]: Reached target getty.target - Login Prompts. 
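For reference, the extend-filesystems/EXT4 entries above record an online resize of /dev/vda9 from 553472 to 1864699 blocks at a 4 KiB block size. A minimal sketch of the implied sizes (block counts and block size are taken from the log; the GiB figures are derived, not logged):

```python
# Derive the before/after sizes of /dev/vda9 from the resize figures in the log.
BLOCK_SIZE = 4096          # bytes, "(4k) blocks" per extend-filesystems
OLD_BLOCKS = 553_472       # "resizing filesystem from 553472 ..."
NEW_BLOCKS = 1_864_699     # "... to 1864699 blocks"

def gib(blocks: int) -> float:
    """Convert an ext4 block count to GiB."""
    return blocks * BLOCK_SIZE / 2**30

print(f"before: {gib(OLD_BLOCKS):.2f} GiB")                  # ~2.11 GiB
print(f"after:  {gib(NEW_BLOCKS):.2f} GiB")                  # ~7.11 GiB
print(f"grown by {gib(NEW_BLOCKS - OLD_BLOCKS):.2f} GiB")    # ~5.00 GiB
```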
Aug 19 08:20:17.585257 tar[1555]: linux-amd64/README.md Aug 19 08:20:17.609392 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Aug 19 08:20:17.643625 containerd[1559]: time="2025-08-19T08:20:17Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Aug 19 08:20:17.644428 containerd[1559]: time="2025-08-19T08:20:17.644389281Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Aug 19 08:20:17.653475 containerd[1559]: time="2025-08-19T08:20:17.653359536Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="12.394µs" Aug 19 08:20:17.653663 containerd[1559]: time="2025-08-19T08:20:17.653542349Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Aug 19 08:20:17.653663 containerd[1559]: time="2025-08-19T08:20:17.653600278Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Aug 19 08:20:17.653842 containerd[1559]: time="2025-08-19T08:20:17.653804100Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Aug 19 08:20:17.653842 containerd[1559]: time="2025-08-19T08:20:17.653834467Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Aug 19 08:20:17.653888 containerd[1559]: time="2025-08-19T08:20:17.653865125Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 19 08:20:17.653965 containerd[1559]: time="2025-08-19T08:20:17.653938422Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 19 08:20:17.653965 containerd[1559]: time="2025-08-19T08:20:17.653955614Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 19 08:20:17.654245 containerd[1559]: time="2025-08-19T08:20:17.654214079Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 19 08:20:17.654245 containerd[1559]: time="2025-08-19T08:20:17.654233626Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 19 08:20:17.654303 containerd[1559]: time="2025-08-19T08:20:17.654245728Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 19 08:20:17.654303 containerd[1559]: time="2025-08-19T08:20:17.654255457Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Aug 19 08:20:17.654382 containerd[1559]: time="2025-08-19T08:20:17.654362518Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Aug 19 08:20:17.654643 containerd[1559]: time="2025-08-19T08:20:17.654612967Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 19 08:20:17.654668 containerd[1559]: time="2025-08-19T08:20:17.654650818Z" level=info msg="skip loading 
plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 19 08:20:17.654668 containerd[1559]: time="2025-08-19T08:20:17.654663101Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Aug 19 08:20:17.654720 containerd[1559]: time="2025-08-19T08:20:17.654704098Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Aug 19 08:20:17.655034 containerd[1559]: time="2025-08-19T08:20:17.654995154Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Aug 19 08:20:17.655196 containerd[1559]: time="2025-08-19T08:20:17.655169281Z" level=info msg="metadata content store policy set" policy=shared Aug 19 08:20:17.661032 containerd[1559]: time="2025-08-19T08:20:17.660993024Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Aug 19 08:20:17.661084 containerd[1559]: time="2025-08-19T08:20:17.661032589Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Aug 19 08:20:17.661084 containerd[1559]: time="2025-08-19T08:20:17.661046655Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Aug 19 08:20:17.661084 containerd[1559]: time="2025-08-19T08:20:17.661059820Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Aug 19 08:20:17.661157 containerd[1559]: time="2025-08-19T08:20:17.661087832Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Aug 19 08:20:17.661157 containerd[1559]: time="2025-08-19T08:20:17.661099805Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Aug 19 08:20:17.661157 containerd[1559]: time="2025-08-19T08:20:17.661112749Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Aug 19 08:20:17.661157 containerd[1559]: time="2025-08-19T08:20:17.661125573Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Aug 19 08:20:17.661157 containerd[1559]: time="2025-08-19T08:20:17.661138127Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Aug 19 08:20:17.661157 containerd[1559]: time="2025-08-19T08:20:17.661148165Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Aug 19 08:20:17.661157 containerd[1559]: time="2025-08-19T08:20:17.661157122Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Aug 19 08:20:17.661291 containerd[1559]: time="2025-08-19T08:20:17.661170557Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Aug 19 08:20:17.661314 containerd[1559]: time="2025-08-19T08:20:17.661297185Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Aug 19 08:20:17.661335 containerd[1559]: time="2025-08-19T08:20:17.661315830Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Aug 19 08:20:17.661335 containerd[1559]: time="2025-08-19T08:20:17.661329576Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content 
type=io.containerd.grpc.v1 Aug 19 08:20:17.661381 containerd[1559]: time="2025-08-19T08:20:17.661342199Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Aug 19 08:20:17.661381 containerd[1559]: time="2025-08-19T08:20:17.661352479Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Aug 19 08:20:17.661381 containerd[1559]: time="2025-08-19T08:20:17.661362237Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Aug 19 08:20:17.661381 containerd[1559]: time="2025-08-19T08:20:17.661381122Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Aug 19 08:20:17.661460 containerd[1559]: time="2025-08-19T08:20:17.661392354Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Aug 19 08:20:17.661460 containerd[1559]: time="2025-08-19T08:20:17.661403565Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Aug 19 08:20:17.661460 containerd[1559]: time="2025-08-19T08:20:17.661415066Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Aug 19 08:20:17.661460 containerd[1559]: time="2025-08-19T08:20:17.661425155Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Aug 19 08:20:17.661537 containerd[1559]: time="2025-08-19T08:20:17.661493163Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Aug 19 08:20:17.661537 containerd[1559]: time="2025-08-19T08:20:17.661506287Z" level=info msg="Start snapshots syncer" Aug 19 08:20:17.661577 containerd[1559]: time="2025-08-19T08:20:17.661539600Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Aug 19 08:20:17.661805 containerd[1559]: time="2025-08-19T08:20:17.661747840Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Aug 19 08:20:17.661924 containerd[1559]: time="2025-08-19T08:20:17.661810608Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Aug 19 08:20:17.661924 containerd[1559]: time="2025-08-19T08:20:17.661885709Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Aug 19 08:20:17.662021 containerd[1559]: time="2025-08-19T08:20:17.661990946Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Aug 19 08:20:17.662021 containerd[1559]: time="2025-08-19T08:20:17.662017115Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Aug 19 08:20:17.662086 containerd[1559]: time="2025-08-19T08:20:17.662035670Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Aug 19 08:20:17.662086 containerd[1559]: time="2025-08-19T08:20:17.662046721Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Aug 19 08:20:17.662086 containerd[1559]: time="2025-08-19T08:20:17.662058202Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Aug 19 08:20:17.662147 containerd[1559]: time="2025-08-19T08:20:17.662089321Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Aug 19 08:20:17.662147 containerd[1559]: time="2025-08-19T08:20:17.662100883Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Aug 19 08:20:17.662147 containerd[1559]: time="2025-08-19T08:20:17.662121531Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Aug 19 08:20:17.662147 containerd[1559]: 
time="2025-08-19T08:20:17.662131640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Aug 19 08:20:17.662147 containerd[1559]: time="2025-08-19T08:20:17.662141960Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Aug 19 08:20:17.662245 containerd[1559]: time="2025-08-19T08:20:17.662178077Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 19 08:20:17.662245 containerd[1559]: time="2025-08-19T08:20:17.662192585Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 19 08:20:17.662245 containerd[1559]: time="2025-08-19T08:20:17.662201411Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 19 08:20:17.662245 containerd[1559]: time="2025-08-19T08:20:17.662210368Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 19 08:20:17.662245 containerd[1559]: time="2025-08-19T08:20:17.662218343Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Aug 19 08:20:17.662245 containerd[1559]: time="2025-08-19T08:20:17.662227189Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Aug 19 08:20:17.662245 containerd[1559]: time="2025-08-19T08:20:17.662236607Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Aug 19 08:20:17.662391 containerd[1559]: time="2025-08-19T08:20:17.662255543Z" level=info msg="runtime interface created" Aug 19 08:20:17.662391 containerd[1559]: time="2025-08-19T08:20:17.662261885Z" level=info msg="created NRI interface" Aug 19 08:20:17.662391 containerd[1559]: time="2025-08-19T08:20:17.662268898Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Aug 19 08:20:17.662391 containerd[1559]: time="2025-08-19T08:20:17.662278746Z" level=info msg="Connect containerd service" Aug 19 08:20:17.662391 containerd[1559]: time="2025-08-19T08:20:17.662301569Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Aug 19 08:20:17.663202 containerd[1559]: time="2025-08-19T08:20:17.663168365Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 19 08:20:17.756483 containerd[1559]: time="2025-08-19T08:20:17.756415570Z" level=info msg="Start subscribing containerd event" Aug 19 08:20:17.756606 containerd[1559]: time="2025-08-19T08:20:17.756490320Z" level=info msg="Start recovering state" Aug 19 08:20:17.756677 containerd[1559]: time="2025-08-19T08:20:17.756635192Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Aug 19 08:20:17.756731 containerd[1559]: time="2025-08-19T08:20:17.756638889Z" level=info msg="Start event monitor" Aug 19 08:20:17.756731 containerd[1559]: time="2025-08-19T08:20:17.756709381Z" level=info msg="Start cni network conf syncer for default" Aug 19 08:20:17.756731 containerd[1559]: time="2025-08-19T08:20:17.756718308Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Aug 19 08:20:17.756798 containerd[1559]: time="2025-08-19T08:20:17.756719139Z" level=info msg="Start streaming server" Aug 19 08:20:17.756798 containerd[1559]: time="2025-08-19T08:20:17.756766408Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Aug 19 08:20:17.756798 containerd[1559]: time="2025-08-19T08:20:17.756776507Z" level=info msg="runtime interface starting up..." Aug 19 08:20:17.756798 containerd[1559]: time="2025-08-19T08:20:17.756794631Z" level=info msg="starting plugins..." Aug 19 08:20:17.756885 containerd[1559]: time="2025-08-19T08:20:17.756819397Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Aug 19 08:20:17.757923 containerd[1559]: time="2025-08-19T08:20:17.757402481Z" level=info msg="containerd successfully booted in 0.115796s" Aug 19 08:20:17.757619 systemd[1]: Started containerd.service - containerd container runtime. Aug 19 08:20:18.724744 systemd-networkd[1467]: eth0: Gained IPv6LL Aug 19 08:20:18.730343 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Aug 19 08:20:18.733030 systemd[1]: Reached target network-online.target - Network is Online. Aug 19 08:20:18.735917 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Aug 19 08:20:18.738788 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:20:18.741199 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Aug 19 08:20:18.786401 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Aug 19 08:20:18.789999 systemd[1]: coreos-metadata.service: Deactivated successfully. Aug 19 08:20:18.790403 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Aug 19 08:20:18.792068 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Aug 19 08:20:19.575492 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:20:19.577399 systemd[1]: Reached target multi-user.target - Multi-User System. Aug 19 08:20:19.578844 systemd[1]: Startup finished in 3.461s (kernel) + 6.498s (initrd) + 5.421s (userspace) = 15.380s. Aug 19 08:20:19.610525 (kubelet)[1671]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 08:20:20.303422 kubelet[1671]: E0819 08:20:20.303333 1671 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 08:20:20.307434 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 08:20:20.307644 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 08:20:20.308046 systemd[1]: kubelet.service: Consumed 1.355s CPU time, 268.4M memory peak. Aug 19 08:20:21.138309 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Aug 19 08:20:21.139595 systemd[1]: Started sshd@0-10.0.0.150:22-10.0.0.1:48498.service - OpenSSH per-connection server daemon (10.0.0.1:48498). 
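The boot summary above breaks the 15.380s startup into kernel, initrd, and userspace phases. A small parsing sketch that checks the phases add up to the reported total (the summary line is copied verbatim from the journal; the regex and variable names are illustrative only):

```python
import re

# systemd's boot summary, copied verbatim from the journal above.
LINE = "Startup finished in 3.461s (kernel) + 6.498s (initrd) + 5.421s (userspace) = 15.380s."

phases = {name: float(secs)
          for secs, name in re.findall(r"([\d.]+)s \((\w+)\)", LINE)}
total = float(re.search(r"= ([\d.]+)s", LINE).group(1))

print(phases)   # {'kernel': 3.461, 'initrd': 6.498, 'userspace': 5.421}
print(f"{sum(phases.values()):.3f}s vs reported {total:.3f}s")   # 15.380s vs reported 15.380s
```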
Aug 19 08:20:21.226372 sshd[1684]: Accepted publickey for core from 10.0.0.1 port 48498 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:20:21.228423 sshd-session[1684]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:20:21.235337 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Aug 19 08:20:21.236479 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Aug 19 08:20:21.243974 systemd-logind[1540]: New session 1 of user core. Aug 19 08:20:21.266891 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Aug 19 08:20:21.269947 systemd[1]: Starting user@500.service - User Manager for UID 500... Aug 19 08:20:21.282718 (systemd)[1689]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Aug 19 08:20:21.285112 systemd-logind[1540]: New session c1 of user core. Aug 19 08:20:21.442368 systemd[1689]: Queued start job for default target default.target. Aug 19 08:20:21.465308 systemd[1689]: Created slice app.slice - User Application Slice. Aug 19 08:20:21.465332 systemd[1689]: Reached target paths.target - Paths. Aug 19 08:20:21.465370 systemd[1689]: Reached target timers.target - Timers. Aug 19 08:20:21.466841 systemd[1689]: Starting dbus.socket - D-Bus User Message Bus Socket... Aug 19 08:20:21.477644 systemd[1689]: Listening on dbus.socket - D-Bus User Message Bus Socket. Aug 19 08:20:21.477770 systemd[1689]: Reached target sockets.target - Sockets. Aug 19 08:20:21.477810 systemd[1689]: Reached target basic.target - Basic System. Aug 19 08:20:21.477862 systemd[1689]: Reached target default.target - Main User Target. Aug 19 08:20:21.477892 systemd[1689]: Startup finished in 186ms. Aug 19 08:20:21.478067 systemd[1]: Started user@500.service - User Manager for UID 500. Aug 19 08:20:21.479595 systemd[1]: Started session-1.scope - Session 1 of User core. Aug 19 08:20:21.544402 systemd[1]: Started sshd@1-10.0.0.150:22-10.0.0.1:48502.service - OpenSSH per-connection server daemon (10.0.0.1:48502). Aug 19 08:20:21.590107 sshd[1700]: Accepted publickey for core from 10.0.0.1 port 48502 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:20:21.591424 sshd-session[1700]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:20:21.595432 systemd-logind[1540]: New session 2 of user core. Aug 19 08:20:21.609207 systemd[1]: Started session-2.scope - Session 2 of User core. Aug 19 08:20:21.661961 sshd[1703]: Connection closed by 10.0.0.1 port 48502 Aug 19 08:20:21.662342 sshd-session[1700]: pam_unix(sshd:session): session closed for user core Aug 19 08:20:21.672543 systemd[1]: sshd@1-10.0.0.150:22-10.0.0.1:48502.service: Deactivated successfully. Aug 19 08:20:21.674266 systemd[1]: session-2.scope: Deactivated successfully. Aug 19 08:20:21.675101 systemd-logind[1540]: Session 2 logged out. Waiting for processes to exit. Aug 19 08:20:21.677647 systemd[1]: Started sshd@2-10.0.0.150:22-10.0.0.1:48510.service - OpenSSH per-connection server daemon (10.0.0.1:48510). Aug 19 08:20:21.678450 systemd-logind[1540]: Removed session 2. Aug 19 08:20:21.760194 sshd[1709]: Accepted publickey for core from 10.0.0.1 port 48510 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:20:21.761545 sshd-session[1709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:20:21.765647 systemd-logind[1540]: New session 3 of user core. 
Aug 19 08:20:21.785237 systemd[1]: Started session-3.scope - Session 3 of User core. Aug 19 08:20:21.833636 sshd[1712]: Connection closed by 10.0.0.1 port 48510 Aug 19 08:20:21.833950 sshd-session[1709]: pam_unix(sshd:session): session closed for user core Aug 19 08:20:21.848483 systemd[1]: sshd@2-10.0.0.150:22-10.0.0.1:48510.service: Deactivated successfully. Aug 19 08:20:21.850200 systemd[1]: session-3.scope: Deactivated successfully. Aug 19 08:20:21.850857 systemd-logind[1540]: Session 3 logged out. Waiting for processes to exit. Aug 19 08:20:21.853384 systemd[1]: Started sshd@3-10.0.0.150:22-10.0.0.1:48516.service - OpenSSH per-connection server daemon (10.0.0.1:48516). Aug 19 08:20:21.853907 systemd-logind[1540]: Removed session 3. Aug 19 08:20:21.906692 sshd[1718]: Accepted publickey for core from 10.0.0.1 port 48516 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:20:21.908019 sshd-session[1718]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:20:21.912180 systemd-logind[1540]: New session 4 of user core. Aug 19 08:20:21.921183 systemd[1]: Started session-4.scope - Session 4 of User core. Aug 19 08:20:21.973342 sshd[1722]: Connection closed by 10.0.0.1 port 48516 Aug 19 08:20:21.973674 sshd-session[1718]: pam_unix(sshd:session): session closed for user core Aug 19 08:20:21.989398 systemd[1]: sshd@3-10.0.0.150:22-10.0.0.1:48516.service: Deactivated successfully. Aug 19 08:20:21.990953 systemd[1]: session-4.scope: Deactivated successfully. Aug 19 08:20:21.991619 systemd-logind[1540]: Session 4 logged out. Waiting for processes to exit. Aug 19 08:20:21.993889 systemd[1]: Started sshd@4-10.0.0.150:22-10.0.0.1:48520.service - OpenSSH per-connection server daemon (10.0.0.1:48520). Aug 19 08:20:21.994673 systemd-logind[1540]: Removed session 4. Aug 19 08:20:22.055847 sshd[1728]: Accepted publickey for core from 10.0.0.1 port 48520 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:20:22.057163 sshd-session[1728]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:20:22.061173 systemd-logind[1540]: New session 5 of user core. Aug 19 08:20:22.071195 systemd[1]: Started session-5.scope - Session 5 of User core. Aug 19 08:20:22.127505 sudo[1733]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Aug 19 08:20:22.127899 sudo[1733]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:20:22.141542 sudo[1733]: pam_unix(sudo:session): session closed for user root Aug 19 08:20:22.143015 sshd[1732]: Connection closed by 10.0.0.1 port 48520 Aug 19 08:20:22.143402 sshd-session[1728]: pam_unix(sshd:session): session closed for user core Aug 19 08:20:22.158375 systemd[1]: sshd@4-10.0.0.150:22-10.0.0.1:48520.service: Deactivated successfully. Aug 19 08:20:22.159919 systemd[1]: session-5.scope: Deactivated successfully. Aug 19 08:20:22.160691 systemd-logind[1540]: Session 5 logged out. Waiting for processes to exit. Aug 19 08:20:22.162990 systemd[1]: Started sshd@5-10.0.0.150:22-10.0.0.1:48522.service - OpenSSH per-connection server daemon (10.0.0.1:48522). Aug 19 08:20:22.163538 systemd-logind[1540]: Removed session 5. 
Aug 19 08:20:22.214724 sshd[1739]: Accepted publickey for core from 10.0.0.1 port 48522 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:20:22.216050 sshd-session[1739]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:20:22.220238 systemd-logind[1540]: New session 6 of user core. Aug 19 08:20:22.231205 systemd[1]: Started session-6.scope - Session 6 of User core. Aug 19 08:20:22.284492 sudo[1744]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Aug 19 08:20:22.284799 sudo[1744]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:20:22.290827 sudo[1744]: pam_unix(sudo:session): session closed for user root Aug 19 08:20:22.296648 sudo[1743]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Aug 19 08:20:22.296953 sudo[1743]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:20:22.306096 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 19 08:20:22.353850 augenrules[1766]: No rules Aug 19 08:20:22.355254 systemd[1]: audit-rules.service: Deactivated successfully. Aug 19 08:20:22.355529 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 19 08:20:22.356645 sudo[1743]: pam_unix(sudo:session): session closed for user root Aug 19 08:20:22.357956 sshd[1742]: Connection closed by 10.0.0.1 port 48522 Aug 19 08:20:22.358331 sshd-session[1739]: pam_unix(sshd:session): session closed for user core Aug 19 08:20:22.369454 systemd[1]: sshd@5-10.0.0.150:22-10.0.0.1:48522.service: Deactivated successfully. Aug 19 08:20:22.371138 systemd[1]: session-6.scope: Deactivated successfully. Aug 19 08:20:22.371844 systemd-logind[1540]: Session 6 logged out. Waiting for processes to exit. Aug 19 08:20:22.374502 systemd[1]: Started sshd@6-10.0.0.150:22-10.0.0.1:48538.service - OpenSSH per-connection server daemon (10.0.0.1:48538). Aug 19 08:20:22.375095 systemd-logind[1540]: Removed session 6. Aug 19 08:20:22.434331 sshd[1775]: Accepted publickey for core from 10.0.0.1 port 48538 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:20:22.435620 sshd-session[1775]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:20:22.439931 systemd-logind[1540]: New session 7 of user core. Aug 19 08:20:22.453206 systemd[1]: Started session-7.scope - Session 7 of User core. Aug 19 08:20:22.505340 sudo[1779]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Aug 19 08:20:22.505740 sudo[1779]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 08:20:22.801412 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Aug 19 08:20:22.819432 (dockerd)[1800]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Aug 19 08:20:23.032150 dockerd[1800]: time="2025-08-19T08:20:23.032042959Z" level=info msg="Starting up" Aug 19 08:20:23.033019 dockerd[1800]: time="2025-08-19T08:20:23.032985286Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Aug 19 08:20:23.045615 dockerd[1800]: time="2025-08-19T08:20:23.045575372Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Aug 19 08:20:23.606801 dockerd[1800]: time="2025-08-19T08:20:23.606746849Z" level=info msg="Loading containers: start." Aug 19 08:20:23.617111 kernel: Initializing XFRM netlink socket Aug 19 08:20:23.882621 systemd-networkd[1467]: docker0: Link UP Aug 19 08:20:23.887248 dockerd[1800]: time="2025-08-19T08:20:23.887210429Z" level=info msg="Loading containers: done." Aug 19 08:20:23.901128 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2209070770-merged.mount: Deactivated successfully. Aug 19 08:20:23.901767 dockerd[1800]: time="2025-08-19T08:20:23.901741395Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Aug 19 08:20:23.901818 dockerd[1800]: time="2025-08-19T08:20:23.901799854Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Aug 19 08:20:23.901887 dockerd[1800]: time="2025-08-19T08:20:23.901873573Z" level=info msg="Initializing buildkit" Aug 19 08:20:23.932194 dockerd[1800]: time="2025-08-19T08:20:23.932136958Z" level=info msg="Completed buildkit initialization" Aug 19 08:20:23.938127 dockerd[1800]: time="2025-08-19T08:20:23.938057223Z" level=info msg="Daemon has completed initialization" Aug 19 08:20:23.938263 dockerd[1800]: time="2025-08-19T08:20:23.938143114Z" level=info msg="API listen on /run/docker.sock" Aug 19 08:20:23.938311 systemd[1]: Started docker.service - Docker Application Container Engine. Aug 19 08:20:24.490773 containerd[1559]: time="2025-08-19T08:20:24.490707206Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\"" Aug 19 08:20:25.062175 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount28240878.mount: Deactivated successfully. 
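dockerd's own timestamps above run from "Starting up" at roughly 08:20:23.03 to "Daemon has completed initialization" at roughly 08:20:23.94. A quick sketch of turning two such timestamps into an elapsed time (values truncated to milliseconds from the log; the helper is illustrative):

```python
from datetime import datetime

# dockerd timestamps from the lines above, truncated to millisecond precision.
FMT = "%Y-%m-%dT%H:%M:%S.%f"
starting_up = datetime.strptime("2025-08-19T08:20:23.032", FMT)   # "Starting up"
initialized = datetime.strptime("2025-08-19T08:20:23.938", FMT)   # "Daemon has completed initialization"

elapsed = initialized - starting_up
print(f"dockerd start-up took ~{elapsed.total_seconds():.3f}s")   # ~0.906s
```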
Aug 19 08:20:26.146437 containerd[1559]: time="2025-08-19T08:20:26.146368814Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:26.147102 containerd[1559]: time="2025-08-19T08:20:26.147033451Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.4: active requests=0, bytes read=30078664" Aug 19 08:20:26.148190 containerd[1559]: time="2025-08-19T08:20:26.148149143Z" level=info msg="ImageCreate event name:\"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:26.150596 containerd[1559]: time="2025-08-19T08:20:26.150557571Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:26.151555 containerd[1559]: time="2025-08-19T08:20:26.151502273Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.4\" with image id \"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\", size \"30075464\" in 1.6607431s" Aug 19 08:20:26.151555 containerd[1559]: time="2025-08-19T08:20:26.151551435Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\" returns image reference \"sha256:1f41885d0a91155d5a5e670b2862eed338c7f12b0e8a5bbc88b1ab4a2d505ae8\"" Aug 19 08:20:26.152120 containerd[1559]: time="2025-08-19T08:20:26.152091278Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\"" Aug 19 08:20:28.078656 containerd[1559]: time="2025-08-19T08:20:28.078522612Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:28.079340 containerd[1559]: time="2025-08-19T08:20:28.079226543Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.4: active requests=0, bytes read=26018066" Aug 19 08:20:28.080437 containerd[1559]: time="2025-08-19T08:20:28.080403831Z" level=info msg="ImageCreate event name:\"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:28.082937 containerd[1559]: time="2025-08-19T08:20:28.082900063Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:28.083733 containerd[1559]: time="2025-08-19T08:20:28.083698591Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.4\" with image id \"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\", size \"27646961\" in 1.931575764s" Aug 19 08:20:28.083785 containerd[1559]: time="2025-08-19T08:20:28.083735310Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\" returns image reference \"sha256:358ab71c1a1ea4846ad0b3dff0d9db6b124236b64bc8a6b79dc874f65dc0d492\"" Aug 19 08:20:28.084493 
containerd[1559]: time="2025-08-19T08:20:28.084440903Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\"" Aug 19 08:20:29.770825 containerd[1559]: time="2025-08-19T08:20:29.770743154Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:29.771713 containerd[1559]: time="2025-08-19T08:20:29.771641890Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.4: active requests=0, bytes read=20153911" Aug 19 08:20:29.773133 containerd[1559]: time="2025-08-19T08:20:29.773064689Z" level=info msg="ImageCreate event name:\"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:29.775770 containerd[1559]: time="2025-08-19T08:20:29.775739767Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:29.776739 containerd[1559]: time="2025-08-19T08:20:29.776686232Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.4\" with image id \"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\", size \"21782824\" in 1.692201947s" Aug 19 08:20:29.776739 containerd[1559]: time="2025-08-19T08:20:29.776721418Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\" returns image reference \"sha256:ab4ad8a84c3c69c18494ef32fa087b32f7c44d71e6acba463d2c7dda798c3d66\"" Aug 19 08:20:29.777240 containerd[1559]: time="2025-08-19T08:20:29.777188254Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\"" Aug 19 08:20:30.558267 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Aug 19 08:20:30.560461 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:20:30.858291 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:20:30.869434 (kubelet)[2093]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 08:20:30.923921 kubelet[2093]: E0819 08:20:30.923811 2093 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 08:20:30.931044 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 08:20:30.931265 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 08:20:30.931750 systemd[1]: kubelet.service: Consumed 322ms CPU time, 110.6M memory peak. Aug 19 08:20:31.001143 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount206527216.mount: Deactivated successfully. 
Aug 19 08:20:32.456281 containerd[1559]: time="2025-08-19T08:20:32.456222877Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:32.457306 containerd[1559]: time="2025-08-19T08:20:32.457270061Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.4: active requests=0, bytes read=31899626" Aug 19 08:20:32.458724 containerd[1559]: time="2025-08-19T08:20:32.458631424Z" level=info msg="ImageCreate event name:\"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:32.460861 containerd[1559]: time="2025-08-19T08:20:32.460819950Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:32.461395 containerd[1559]: time="2025-08-19T08:20:32.461358460Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.4\" with image id \"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\", repo tag \"registry.k8s.io/kube-proxy:v1.33.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\", size \"31898645\" in 2.684143206s" Aug 19 08:20:32.461395 containerd[1559]: time="2025-08-19T08:20:32.461386142Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\" returns image reference \"sha256:1b2ea5e018dbbbd2efb8e5c540a6d3c463d77f250d3904429402ee057f09c64e\"" Aug 19 08:20:32.461877 containerd[1559]: time="2025-08-19T08:20:32.461853789Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Aug 19 08:20:32.998402 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3842423363.mount: Deactivated successfully. 
Aug 19 08:20:33.962088 containerd[1559]: time="2025-08-19T08:20:33.962000485Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:33.962890 containerd[1559]: time="2025-08-19T08:20:33.962848987Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942238" Aug 19 08:20:33.964227 containerd[1559]: time="2025-08-19T08:20:33.964193158Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:33.967093 containerd[1559]: time="2025-08-19T08:20:33.967005874Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:33.969511 containerd[1559]: time="2025-08-19T08:20:33.969429340Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.507538441s" Aug 19 08:20:33.969511 containerd[1559]: time="2025-08-19T08:20:33.969499882Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Aug 19 08:20:33.970027 containerd[1559]: time="2025-08-19T08:20:33.970000150Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Aug 19 08:20:34.467264 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount308128696.mount: Deactivated successfully. 
Aug 19 08:20:34.474142 containerd[1559]: time="2025-08-19T08:20:34.474093445Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 08:20:34.474953 containerd[1559]: time="2025-08-19T08:20:34.474929393Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Aug 19 08:20:34.476287 containerd[1559]: time="2025-08-19T08:20:34.476267122Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 08:20:34.478785 containerd[1559]: time="2025-08-19T08:20:34.478708912Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 08:20:34.479595 containerd[1559]: time="2025-08-19T08:20:34.479529031Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 509.499956ms" Aug 19 08:20:34.479595 containerd[1559]: time="2025-08-19T08:20:34.479580147Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Aug 19 08:20:34.480020 containerd[1559]: time="2025-08-19T08:20:34.479984475Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Aug 19 08:20:35.050350 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount306449548.mount: Deactivated successfully. 
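Each image pull in this sequence ends with a containerd entry of the form `Pulled image \"...\" ... in <duration>`, like the pause:3.10 line above. A small sketch for collecting those durations from a saved copy of this journal (the regex, helper, and sample text are illustrative, and target the escaped quotes exactly as containerd prints them):

```python
import re

# Extract (image, duration) pairs from containerd "Pulled image ..." entries.
PULLED = re.compile(r'Pulled image \\"(?P<image>[^"\\]+)\\".*? in (?P<dur>[\d.]+m?s)')

def pull_durations(journal_text: str) -> list[tuple[str, str]]:
    """Return [(image, duration), ...] in the order the pulls completed."""
    return [(m["image"], m["dur"]) for m in PULLED.finditer(journal_text)]

if __name__ == "__main__":
    sample = r'msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id ... in 509.499956ms"'
    print(pull_durations(sample))   # [('registry.k8s.io/pause:3.10', '509.499956ms')]
```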
Aug 19 08:20:37.679492 containerd[1559]: time="2025-08-19T08:20:37.679405524Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:37.680393 containerd[1559]: time="2025-08-19T08:20:37.680355887Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58377871" Aug 19 08:20:37.681867 containerd[1559]: time="2025-08-19T08:20:37.681798152Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:37.684723 containerd[1559]: time="2025-08-19T08:20:37.684684707Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:37.685949 containerd[1559]: time="2025-08-19T08:20:37.685905807Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 3.205888581s" Aug 19 08:20:37.685949 containerd[1559]: time="2025-08-19T08:20:37.685942937Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Aug 19 08:20:40.427725 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:20:40.427945 systemd[1]: kubelet.service: Consumed 322ms CPU time, 110.6M memory peak. Aug 19 08:20:40.430414 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:20:40.455145 systemd[1]: Reload requested from client PID 2248 ('systemctl') (unit session-7.scope)... Aug 19 08:20:40.455173 systemd[1]: Reloading... Aug 19 08:20:40.566369 zram_generator::config[2291]: No configuration found. Aug 19 08:20:40.991909 systemd[1]: Reloading finished in 536 ms. Aug 19 08:20:41.051849 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Aug 19 08:20:41.051983 systemd[1]: kubelet.service: Failed with result 'signal'. Aug 19 08:20:41.052343 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:20:41.052394 systemd[1]: kubelet.service: Consumed 170ms CPU time, 98.3M memory peak. Aug 19 08:20:41.054062 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:20:41.233006 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:20:41.238286 (kubelet)[2339]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 19 08:20:41.278092 kubelet[2339]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 08:20:41.278092 kubelet[2339]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Aug 19 08:20:41.278092 kubelet[2339]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 08:20:41.278559 kubelet[2339]: I0819 08:20:41.278059 2339 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 19 08:20:41.724825 kubelet[2339]: I0819 08:20:41.724704 2339 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Aug 19 08:20:41.724825 kubelet[2339]: I0819 08:20:41.724736 2339 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 19 08:20:41.725031 kubelet[2339]: I0819 08:20:41.725008 2339 server.go:956] "Client rotation is on, will bootstrap in background" Aug 19 08:20:41.752616 kubelet[2339]: E0819 08:20:41.752579 2339 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.150:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Aug 19 08:20:41.752938 kubelet[2339]: I0819 08:20:41.752914 2339 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 19 08:20:41.762269 kubelet[2339]: I0819 08:20:41.762232 2339 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 19 08:20:41.769085 kubelet[2339]: I0819 08:20:41.769034 2339 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Aug 19 08:20:41.769375 kubelet[2339]: I0819 08:20:41.769337 2339 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 19 08:20:41.769539 kubelet[2339]: I0819 08:20:41.769366 2339 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 19 08:20:41.769539 kubelet[2339]: I0819 08:20:41.769536 2339 topology_manager.go:138] "Creating 
topology manager with none policy" Aug 19 08:20:41.769719 kubelet[2339]: I0819 08:20:41.769546 2339 container_manager_linux.go:303] "Creating device plugin manager" Aug 19 08:20:41.770572 kubelet[2339]: I0819 08:20:41.770541 2339 state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:20:41.773310 kubelet[2339]: I0819 08:20:41.773275 2339 kubelet.go:480] "Attempting to sync node with API server" Aug 19 08:20:41.773350 kubelet[2339]: I0819 08:20:41.773312 2339 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 19 08:20:41.773350 kubelet[2339]: I0819 08:20:41.773345 2339 kubelet.go:386] "Adding apiserver pod source" Aug 19 08:20:41.774860 kubelet[2339]: I0819 08:20:41.774847 2339 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 19 08:20:41.780764 kubelet[2339]: I0819 08:20:41.780731 2339 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Aug 19 08:20:41.781257 kubelet[2339]: I0819 08:20:41.781232 2339 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Aug 19 08:20:41.782675 kubelet[2339]: E0819 08:20:41.782594 2339 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.150:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Aug 19 08:20:41.782831 kubelet[2339]: W0819 08:20:41.782799 2339 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
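The container-manager NodeConfig dump above embeds the hard-eviction thresholds this kubelet will enforce. Restated as data for readability (values copied from the HardEvictionThresholds JSON in the log; the percent forms are simply the logged fractions rewritten):

```python
# kubelet hard-eviction thresholds as reported in the NodeConfig dump above.
HARD_EVICTION = {
    "memory.available":   "100Mi",
    "nodefs.available":   "10%",
    "nodefs.inodesFree":  "5%",
    "imagefs.available":  "15%",
    "imagefs.inodesFree": "5%",
}

for signal, threshold in HARD_EVICTION.items():
    print(f"evict pods when {signal} < {threshold}")
```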
Aug 19 08:20:41.782900 kubelet[2339]: E0819 08:20:41.782869 2339 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.150:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Aug 19 08:20:41.787480 kubelet[2339]: I0819 08:20:41.787453 2339 watchdog_linux.go:99] "Systemd watchdog is not enabled" Aug 19 08:20:41.787581 kubelet[2339]: I0819 08:20:41.787541 2339 server.go:1289] "Started kubelet" Aug 19 08:20:41.788320 kubelet[2339]: I0819 08:20:41.787811 2339 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 19 08:20:41.789670 kubelet[2339]: I0819 08:20:41.789380 2339 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 19 08:20:41.789670 kubelet[2339]: I0819 08:20:41.789383 2339 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 19 08:20:41.789670 kubelet[2339]: I0819 08:20:41.789481 2339 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 19 08:20:41.791115 kubelet[2339]: E0819 08:20:41.790028 2339 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.150:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.150:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.185d1d4bf3630bcb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-08-19 08:20:41.787468747 +0000 UTC m=+0.541567387,LastTimestamp:2025-08-19 08:20:41.787468747 +0000 UTC m=+0.541567387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Aug 19 08:20:41.791115 kubelet[2339]: I0819 08:20:41.791110 2339 volume_manager.go:297] "Starting Kubelet Volume Manager" Aug 19 08:20:41.791676 kubelet[2339]: I0819 08:20:41.789375 2339 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Aug 19 08:20:41.792464 kubelet[2339]: E0819 08:20:41.792436 2339 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 08:20:41.792529 kubelet[2339]: E0819 08:20:41.792514 2339 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.150:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.150:6443: connect: connection refused" interval="200ms" Aug 19 08:20:41.792711 kubelet[2339]: I0819 08:20:41.792683 2339 factory.go:223] Registration of the systemd container factory successfully Aug 19 08:20:41.792843 kubelet[2339]: I0819 08:20:41.792758 2339 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 19 08:20:41.794097 kubelet[2339]: I0819 08:20:41.793037 2339 server.go:317] "Adding debug handlers to kubelet server" Aug 19 08:20:41.794097 kubelet[2339]: E0819 08:20:41.794032 2339 reflector.go:200] "Failed to watch" 
err="failed to list *v1.CSIDriver: Get \"https://10.0.0.150:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Aug 19 08:20:41.794296 kubelet[2339]: I0819 08:20:41.794269 2339 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Aug 19 08:20:41.794296 kubelet[2339]: I0819 08:20:41.794275 2339 factory.go:223] Registration of the containerd container factory successfully Aug 19 08:20:41.794436 kubelet[2339]: E0819 08:20:41.794392 2339 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 19 08:20:41.794601 kubelet[2339]: I0819 08:20:41.794584 2339 reconciler.go:26] "Reconciler: start to sync state" Aug 19 08:20:41.808191 kubelet[2339]: I0819 08:20:41.808145 2339 cpu_manager.go:221] "Starting CPU manager" policy="none" Aug 19 08:20:41.808191 kubelet[2339]: I0819 08:20:41.808177 2339 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Aug 19 08:20:41.808191 kubelet[2339]: I0819 08:20:41.808198 2339 state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:20:41.892622 kubelet[2339]: E0819 08:20:41.892560 2339 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 08:20:41.993717 kubelet[2339]: E0819 08:20:41.993503 2339 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 08:20:41.994485 kubelet[2339]: E0819 08:20:41.994423 2339 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.150:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.150:6443: connect: connection refused" interval="400ms" Aug 19 08:20:42.094027 kubelet[2339]: E0819 08:20:42.093967 2339 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 08:20:42.181871 kubelet[2339]: I0819 08:20:42.181795 2339 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Aug 19 08:20:42.183373 kubelet[2339]: I0819 08:20:42.183315 2339 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Aug 19 08:20:42.183373 kubelet[2339]: I0819 08:20:42.183362 2339 status_manager.go:230] "Starting to sync pod status with apiserver" Aug 19 08:20:42.183630 kubelet[2339]: I0819 08:20:42.183423 2339 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Aug 19 08:20:42.183630 kubelet[2339]: I0819 08:20:42.183454 2339 kubelet.go:2436] "Starting kubelet main sync loop" Aug 19 08:20:42.183630 kubelet[2339]: E0819 08:20:42.183552 2339 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 19 08:20:42.184669 kubelet[2339]: E0819 08:20:42.184547 2339 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.150:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Aug 19 08:20:42.194472 kubelet[2339]: E0819 08:20:42.194423 2339 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 08:20:42.214620 kubelet[2339]: I0819 08:20:42.214541 2339 policy_none.go:49] "None policy: Start" Aug 19 08:20:42.214620 kubelet[2339]: I0819 08:20:42.214616 2339 memory_manager.go:186] "Starting memorymanager" policy="None" Aug 19 08:20:42.214620 kubelet[2339]: I0819 08:20:42.214639 2339 state_mem.go:35] "Initializing new in-memory state store" Aug 19 08:20:42.284740 kubelet[2339]: E0819 08:20:42.284528 2339 kubelet.go:2460] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Aug 19 08:20:42.294974 kubelet[2339]: E0819 08:20:42.294910 2339 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Aug 19 08:20:42.317169 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Aug 19 08:20:42.335726 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Aug 19 08:20:42.339408 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Aug 19 08:20:42.350037 kubelet[2339]: E0819 08:20:42.349991 2339 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Aug 19 08:20:42.350336 kubelet[2339]: I0819 08:20:42.350305 2339 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 19 08:20:42.350479 kubelet[2339]: I0819 08:20:42.350327 2339 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 19 08:20:42.350723 kubelet[2339]: I0819 08:20:42.350694 2339 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 19 08:20:42.351762 kubelet[2339]: E0819 08:20:42.351736 2339 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Aug 19 08:20:42.351818 kubelet[2339]: E0819 08:20:42.351786 2339 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Aug 19 08:20:42.395680 kubelet[2339]: E0819 08:20:42.395617 2339 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.150:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.150:6443: connect: connection refused" interval="800ms" Aug 19 08:20:42.452420 kubelet[2339]: I0819 08:20:42.452379 2339 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Aug 19 08:20:42.452761 kubelet[2339]: E0819 08:20:42.452720 2339 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.150:6443/api/v1/nodes\": dial tcp 10.0.0.150:6443: connect: connection refused" node="localhost" Aug 19 08:20:42.497023 systemd[1]: Created slice kubepods-burstable-pod8de7187202bee21b84740a213836f615.slice - libcontainer container kubepods-burstable-pod8de7187202bee21b84740a213836f615.slice. Aug 19 08:20:42.498192 kubelet[2339]: I0819 08:20:42.498167 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:20:42.498822 kubelet[2339]: I0819 08:20:42.498198 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:20:42.498822 kubelet[2339]: I0819 08:20:42.498222 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:20:42.498822 kubelet[2339]: I0819 08:20:42.498260 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7a6485c2279998014873548dc69f9532-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"7a6485c2279998014873548dc69f9532\") " pod="kube-system/kube-apiserver-localhost" Aug 19 08:20:42.498822 kubelet[2339]: I0819 08:20:42.498324 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7a6485c2279998014873548dc69f9532-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"7a6485c2279998014873548dc69f9532\") " pod="kube-system/kube-apiserver-localhost" Aug 19 08:20:42.498822 kubelet[2339]: I0819 08:20:42.498367 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " 
pod="kube-system/kube-controller-manager-localhost" Aug 19 08:20:42.498947 kubelet[2339]: I0819 08:20:42.498394 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:20:42.498947 kubelet[2339]: I0819 08:20:42.498428 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d75e6f6978d9f275ea19380916c9cccd-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d75e6f6978d9f275ea19380916c9cccd\") " pod="kube-system/kube-scheduler-localhost" Aug 19 08:20:42.498947 kubelet[2339]: I0819 08:20:42.498461 2339 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7a6485c2279998014873548dc69f9532-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"7a6485c2279998014873548dc69f9532\") " pod="kube-system/kube-apiserver-localhost" Aug 19 08:20:42.505978 kubelet[2339]: E0819 08:20:42.505949 2339 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:20:42.508470 systemd[1]: Created slice kubepods-burstable-podd75e6f6978d9f275ea19380916c9cccd.slice - libcontainer container kubepods-burstable-podd75e6f6978d9f275ea19380916c9cccd.slice. Aug 19 08:20:42.527334 kubelet[2339]: E0819 08:20:42.527281 2339 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:20:42.530342 systemd[1]: Created slice kubepods-burstable-pod7a6485c2279998014873548dc69f9532.slice - libcontainer container kubepods-burstable-pod7a6485c2279998014873548dc69f9532.slice. 
Aug 19 08:20:42.532120 kubelet[2339]: E0819 08:20:42.532097 2339 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:20:42.585629 kubelet[2339]: E0819 08:20:42.585559 2339 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.150:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Aug 19 08:20:42.654896 kubelet[2339]: I0819 08:20:42.654854 2339 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Aug 19 08:20:42.655278 kubelet[2339]: E0819 08:20:42.655255 2339 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.150:6443/api/v1/nodes\": dial tcp 10.0.0.150:6443: connect: connection refused" node="localhost" Aug 19 08:20:42.807876 containerd[1559]: time="2025-08-19T08:20:42.807824493Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:8de7187202bee21b84740a213836f615,Namespace:kube-system,Attempt:0,}" Aug 19 08:20:42.828746 containerd[1559]: time="2025-08-19T08:20:42.828697925Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d75e6f6978d9f275ea19380916c9cccd,Namespace:kube-system,Attempt:0,}" Aug 19 08:20:42.833375 containerd[1559]: time="2025-08-19T08:20:42.833255003Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:7a6485c2279998014873548dc69f9532,Namespace:kube-system,Attempt:0,}" Aug 19 08:20:42.835891 containerd[1559]: time="2025-08-19T08:20:42.835757276Z" level=info msg="connecting to shim 41b26c85b38d73a8b5dccb643db9a0aec0eacda70d120cf3ec8577f0e40ab79a" address="unix:///run/containerd/s/a4d68434510aea012bc79a4192ee5980bc8afc60983b33f603ba5c24e38a905b" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:20:42.859725 kubelet[2339]: E0819 08:20:42.859217 2339 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.150:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Aug 19 08:20:42.861823 containerd[1559]: time="2025-08-19T08:20:42.861496085Z" level=info msg="connecting to shim 462768dad3f2fd1bc18f7d15cc96c71402eadd3133ec877315815200fa96f6d5" address="unix:///run/containerd/s/dae25556a4e946510ca0337d369177c5182f2edcf5d730f154b4889083897fe8" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:20:42.871253 systemd[1]: Started cri-containerd-41b26c85b38d73a8b5dccb643db9a0aec0eacda70d120cf3ec8577f0e40ab79a.scope - libcontainer container 41b26c85b38d73a8b5dccb643db9a0aec0eacda70d120cf3ec8577f0e40ab79a. Aug 19 08:20:42.883269 containerd[1559]: time="2025-08-19T08:20:42.883212359Z" level=info msg="connecting to shim d02ff8c8bd4e556cbb0ca124beebc734ac587ea79d8b5da0f3f9abd9e1e8be18" address="unix:///run/containerd/s/362831aee2a410df002503679a8cc3a8fd642556e8c64ea996d2934dfe90623a" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:20:42.898240 systemd[1]: Started cri-containerd-462768dad3f2fd1bc18f7d15cc96c71402eadd3133ec877315815200fa96f6d5.scope - libcontainer container 462768dad3f2fd1bc18f7d15cc96c71402eadd3133ec877315815200fa96f6d5. 
Aug 19 08:20:42.914429 systemd[1]: Started cri-containerd-d02ff8c8bd4e556cbb0ca124beebc734ac587ea79d8b5da0f3f9abd9e1e8be18.scope - libcontainer container d02ff8c8bd4e556cbb0ca124beebc734ac587ea79d8b5da0f3f9abd9e1e8be18. Aug 19 08:20:42.929990 containerd[1559]: time="2025-08-19T08:20:42.929906353Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:8de7187202bee21b84740a213836f615,Namespace:kube-system,Attempt:0,} returns sandbox id \"41b26c85b38d73a8b5dccb643db9a0aec0eacda70d120cf3ec8577f0e40ab79a\"" Aug 19 08:20:42.937187 containerd[1559]: time="2025-08-19T08:20:42.936568510Z" level=info msg="CreateContainer within sandbox \"41b26c85b38d73a8b5dccb643db9a0aec0eacda70d120cf3ec8577f0e40ab79a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Aug 19 08:20:42.945611 containerd[1559]: time="2025-08-19T08:20:42.945576516Z" level=info msg="Container 1a00ad34a3966389c86ae8dfdbabefd0450091cf213f53da527316c887094077: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:20:42.953285 containerd[1559]: time="2025-08-19T08:20:42.953206718Z" level=info msg="CreateContainer within sandbox \"41b26c85b38d73a8b5dccb643db9a0aec0eacda70d120cf3ec8577f0e40ab79a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"1a00ad34a3966389c86ae8dfdbabefd0450091cf213f53da527316c887094077\"" Aug 19 08:20:42.953871 containerd[1559]: time="2025-08-19T08:20:42.953790613Z" level=info msg="StartContainer for \"1a00ad34a3966389c86ae8dfdbabefd0450091cf213f53da527316c887094077\"" Aug 19 08:20:42.957014 containerd[1559]: time="2025-08-19T08:20:42.956979966Z" level=info msg="connecting to shim 1a00ad34a3966389c86ae8dfdbabefd0450091cf213f53da527316c887094077" address="unix:///run/containerd/s/a4d68434510aea012bc79a4192ee5980bc8afc60983b33f603ba5c24e38a905b" protocol=ttrpc version=3 Aug 19 08:20:42.967318 containerd[1559]: time="2025-08-19T08:20:42.967265869Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:7a6485c2279998014873548dc69f9532,Namespace:kube-system,Attempt:0,} returns sandbox id \"d02ff8c8bd4e556cbb0ca124beebc734ac587ea79d8b5da0f3f9abd9e1e8be18\"" Aug 19 08:20:42.973822 containerd[1559]: time="2025-08-19T08:20:42.973773716Z" level=info msg="CreateContainer within sandbox \"d02ff8c8bd4e556cbb0ca124beebc734ac587ea79d8b5da0f3f9abd9e1e8be18\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Aug 19 08:20:42.981911 containerd[1559]: time="2025-08-19T08:20:42.981877085Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d75e6f6978d9f275ea19380916c9cccd,Namespace:kube-system,Attempt:0,} returns sandbox id \"462768dad3f2fd1bc18f7d15cc96c71402eadd3133ec877315815200fa96f6d5\"" Aug 19 08:20:42.982247 systemd[1]: Started cri-containerd-1a00ad34a3966389c86ae8dfdbabefd0450091cf213f53da527316c887094077.scope - libcontainer container 1a00ad34a3966389c86ae8dfdbabefd0450091cf213f53da527316c887094077. 
Aug 19 08:20:42.986112 containerd[1559]: time="2025-08-19T08:20:42.986027010Z" level=info msg="Container 5f1ad23b4643868ef9eb494f3fe43a05e07e87d68f3d0bf16abd4cccfafa2149: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:20:42.987009 containerd[1559]: time="2025-08-19T08:20:42.986946534Z" level=info msg="CreateContainer within sandbox \"462768dad3f2fd1bc18f7d15cc96c71402eadd3133ec877315815200fa96f6d5\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Aug 19 08:20:42.993877 containerd[1559]: time="2025-08-19T08:20:42.993839363Z" level=info msg="CreateContainer within sandbox \"d02ff8c8bd4e556cbb0ca124beebc734ac587ea79d8b5da0f3f9abd9e1e8be18\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"5f1ad23b4643868ef9eb494f3fe43a05e07e87d68f3d0bf16abd4cccfafa2149\"" Aug 19 08:20:42.994381 containerd[1559]: time="2025-08-19T08:20:42.994362054Z" level=info msg="StartContainer for \"5f1ad23b4643868ef9eb494f3fe43a05e07e87d68f3d0bf16abd4cccfafa2149\"" Aug 19 08:20:42.995394 containerd[1559]: time="2025-08-19T08:20:42.995368622Z" level=info msg="connecting to shim 5f1ad23b4643868ef9eb494f3fe43a05e07e87d68f3d0bf16abd4cccfafa2149" address="unix:///run/containerd/s/362831aee2a410df002503679a8cc3a8fd642556e8c64ea996d2934dfe90623a" protocol=ttrpc version=3 Aug 19 08:20:42.995913 containerd[1559]: time="2025-08-19T08:20:42.995890210Z" level=info msg="Container 62297a41704fcd89ee00f048c6d8ed6cb2317f26af12dde80fd3f14e57c3f9e4: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:20:43.004996 containerd[1559]: time="2025-08-19T08:20:43.004942419Z" level=info msg="CreateContainer within sandbox \"462768dad3f2fd1bc18f7d15cc96c71402eadd3133ec877315815200fa96f6d5\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"62297a41704fcd89ee00f048c6d8ed6cb2317f26af12dde80fd3f14e57c3f9e4\"" Aug 19 08:20:43.005519 containerd[1559]: time="2025-08-19T08:20:43.005499855Z" level=info msg="StartContainer for \"62297a41704fcd89ee00f048c6d8ed6cb2317f26af12dde80fd3f14e57c3f9e4\"" Aug 19 08:20:43.006540 containerd[1559]: time="2025-08-19T08:20:43.006519447Z" level=info msg="connecting to shim 62297a41704fcd89ee00f048c6d8ed6cb2317f26af12dde80fd3f14e57c3f9e4" address="unix:///run/containerd/s/dae25556a4e946510ca0337d369177c5182f2edcf5d730f154b4889083897fe8" protocol=ttrpc version=3 Aug 19 08:20:43.017360 systemd[1]: Started cri-containerd-5f1ad23b4643868ef9eb494f3fe43a05e07e87d68f3d0bf16abd4cccfafa2149.scope - libcontainer container 5f1ad23b4643868ef9eb494f3fe43a05e07e87d68f3d0bf16abd4cccfafa2149. Aug 19 08:20:43.035263 systemd[1]: Started cri-containerd-62297a41704fcd89ee00f048c6d8ed6cb2317f26af12dde80fd3f14e57c3f9e4.scope - libcontainer container 62297a41704fcd89ee00f048c6d8ed6cb2317f26af12dde80fd3f14e57c3f9e4. 
Aug 19 08:20:43.042286 containerd[1559]: time="2025-08-19T08:20:43.042250529Z" level=info msg="StartContainer for \"1a00ad34a3966389c86ae8dfdbabefd0450091cf213f53da527316c887094077\" returns successfully" Aug 19 08:20:43.057539 kubelet[2339]: I0819 08:20:43.057486 2339 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Aug 19 08:20:43.058543 kubelet[2339]: E0819 08:20:43.058503 2339 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.150:6443/api/v1/nodes\": dial tcp 10.0.0.150:6443: connect: connection refused" node="localhost" Aug 19 08:20:43.097169 containerd[1559]: time="2025-08-19T08:20:43.097026232Z" level=info msg="StartContainer for \"62297a41704fcd89ee00f048c6d8ed6cb2317f26af12dde80fd3f14e57c3f9e4\" returns successfully" Aug 19 08:20:43.098280 containerd[1559]: time="2025-08-19T08:20:43.098229449Z" level=info msg="StartContainer for \"5f1ad23b4643868ef9eb494f3fe43a05e07e87d68f3d0bf16abd4cccfafa2149\" returns successfully" Aug 19 08:20:43.201250 kubelet[2339]: E0819 08:20:43.201210 2339 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:20:43.207938 kubelet[2339]: E0819 08:20:43.207909 2339 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:20:43.213775 kubelet[2339]: E0819 08:20:43.213750 2339 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:20:43.860945 kubelet[2339]: I0819 08:20:43.860898 2339 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Aug 19 08:20:44.218926 kubelet[2339]: E0819 08:20:44.218575 2339 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:20:44.221094 kubelet[2339]: E0819 08:20:44.219116 2339 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Aug 19 08:20:44.792998 kubelet[2339]: E0819 08:20:44.792943 2339 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Aug 19 08:20:44.886929 kubelet[2339]: I0819 08:20:44.886853 2339 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Aug 19 08:20:44.886929 kubelet[2339]: E0819 08:20:44.886908 2339 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Aug 19 08:20:44.901894 kubelet[2339]: I0819 08:20:44.901848 2339 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Aug 19 08:20:44.920880 kubelet[2339]: E0819 08:20:44.920756 2339 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{localhost.185d1d4bf3630bcb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-08-19 08:20:41.787468747 +0000 UTC m=+0.541567387,LastTimestamp:2025-08-19 08:20:41.787468747 +0000 UTC 
m=+0.541567387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Aug 19 08:20:44.963776 kubelet[2339]: E0819 08:20:44.963697 2339 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Aug 19 08:20:44.963776 kubelet[2339]: I0819 08:20:44.963756 2339 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Aug 19 08:20:44.965669 kubelet[2339]: E0819 08:20:44.965638 2339 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Aug 19 08:20:44.965669 kubelet[2339]: I0819 08:20:44.965659 2339 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Aug 19 08:20:44.967342 kubelet[2339]: E0819 08:20:44.966947 2339 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Aug 19 08:20:45.217508 kubelet[2339]: I0819 08:20:45.217270 2339 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Aug 19 08:20:45.219749 kubelet[2339]: E0819 08:20:45.219719 2339 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Aug 19 08:20:45.782513 kubelet[2339]: I0819 08:20:45.782460 2339 apiserver.go:52] "Watching apiserver" Aug 19 08:20:45.795352 kubelet[2339]: I0819 08:20:45.795306 2339 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Aug 19 08:20:46.682246 systemd[1]: Reload requested from client PID 2623 ('systemctl') (unit session-7.scope)... Aug 19 08:20:46.682261 systemd[1]: Reloading... Aug 19 08:20:46.769125 zram_generator::config[2666]: No configuration found. Aug 19 08:20:46.993123 systemd[1]: Reloading finished in 310 ms. Aug 19 08:20:47.019577 kubelet[2339]: I0819 08:20:47.019506 2339 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 19 08:20:47.019600 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:20:47.044430 systemd[1]: kubelet.service: Deactivated successfully. Aug 19 08:20:47.044787 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:20:47.044850 systemd[1]: kubelet.service: Consumed 1.067s CPU time, 131.6M memory peak. Aug 19 08:20:47.046834 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 08:20:47.264665 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 08:20:47.269844 (kubelet)[2711]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 19 08:20:47.311830 kubelet[2711]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Aug 19 08:20:47.311830 kubelet[2711]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Aug 19 08:20:47.311830 kubelet[2711]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 08:20:47.312406 kubelet[2711]: I0819 08:20:47.311900 2711 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 19 08:20:47.318683 kubelet[2711]: I0819 08:20:47.318649 2711 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Aug 19 08:20:47.318683 kubelet[2711]: I0819 08:20:47.318670 2711 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 19 08:20:47.318866 kubelet[2711]: I0819 08:20:47.318844 2711 server.go:956] "Client rotation is on, will bootstrap in background" Aug 19 08:20:47.320091 kubelet[2711]: I0819 08:20:47.320055 2711 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Aug 19 08:20:47.322803 kubelet[2711]: I0819 08:20:47.322734 2711 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 19 08:20:47.327947 kubelet[2711]: I0819 08:20:47.327910 2711 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 19 08:20:47.332708 kubelet[2711]: I0819 08:20:47.332680 2711 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Aug 19 08:20:47.332954 kubelet[2711]: I0819 08:20:47.332912 2711 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 19 08:20:47.333135 kubelet[2711]: I0819 08:20:47.332942 2711 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 19 08:20:47.333135 kubelet[2711]: I0819 08:20:47.333133 2711 topology_manager.go:138] "Creating topology manager with none policy" Aug 19 08:20:47.333273 kubelet[2711]: I0819 08:20:47.333142 2711 container_manager_linux.go:303] "Creating device plugin manager" Aug 19 08:20:47.333273 kubelet[2711]: I0819 08:20:47.333205 2711 state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:20:47.333380 kubelet[2711]: I0819 08:20:47.333361 2711 kubelet.go:480] "Attempting to sync node with API server" Aug 19 08:20:47.333416 kubelet[2711]: I0819 08:20:47.333398 2711 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 19 08:20:47.333441 kubelet[2711]: I0819 08:20:47.333427 2711 kubelet.go:386] "Adding apiserver pod source" Aug 19 08:20:47.333463 kubelet[2711]: I0819 08:20:47.333441 2711 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 19 08:20:47.334595 kubelet[2711]: I0819 08:20:47.334564 2711 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Aug 19 08:20:47.335004 kubelet[2711]: I0819 08:20:47.334979 2711 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Aug 19 08:20:47.338209 kubelet[2711]: I0819 08:20:47.338169 2711 watchdog_linux.go:99] "Systemd watchdog is not enabled" Aug 19 08:20:47.338517 kubelet[2711]: I0819 08:20:47.338254 2711 server.go:1289] "Started kubelet" Aug 19 08:20:47.339859 kubelet[2711]: I0819 08:20:47.339825 2711 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 19 08:20:47.340689 kubelet[2711]: I0819 08:20:47.340353 2711 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 19 08:20:47.340689 kubelet[2711]: I0819 08:20:47.340632 2711 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 19 08:20:47.340689 kubelet[2711]: I0819 08:20:47.340683 
2711 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Aug 19 08:20:47.341892 kubelet[2711]: I0819 08:20:47.341867 2711 server.go:317] "Adding debug handlers to kubelet server" Aug 19 08:20:47.350176 kubelet[2711]: I0819 08:20:47.349708 2711 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 19 08:20:47.352593 kubelet[2711]: I0819 08:20:47.352547 2711 volume_manager.go:297] "Starting Kubelet Volume Manager" Aug 19 08:20:47.353062 kubelet[2711]: I0819 08:20:47.352959 2711 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Aug 19 08:20:47.353247 kubelet[2711]: I0819 08:20:47.353215 2711 reconciler.go:26] "Reconciler: start to sync state" Aug 19 08:20:47.354129 kubelet[2711]: I0819 08:20:47.354101 2711 factory.go:223] Registration of the systemd container factory successfully Aug 19 08:20:47.354236 kubelet[2711]: I0819 08:20:47.354214 2711 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 19 08:20:47.355783 kubelet[2711]: E0819 08:20:47.355753 2711 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 19 08:20:47.356037 kubelet[2711]: I0819 08:20:47.356009 2711 factory.go:223] Registration of the containerd container factory successfully Aug 19 08:20:47.359881 kubelet[2711]: I0819 08:20:47.359471 2711 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Aug 19 08:20:47.361267 kubelet[2711]: I0819 08:20:47.361218 2711 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Aug 19 08:20:47.361267 kubelet[2711]: I0819 08:20:47.361245 2711 status_manager.go:230] "Starting to sync pod status with apiserver" Aug 19 08:20:47.361267 kubelet[2711]: I0819 08:20:47.361269 2711 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
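The nodeConfig dump above spells out the hard eviction thresholds this kubelet runs with: memory.available < 100Mi, nodefs.available < 10%, imagefs.available < 15%, and nodefs/imagefs inodesFree < 5%. A simplified sketch of how a threshold expressed either as an absolute quantity or as a percentage of capacity can be evaluated (illustrative types only, not the kubelet's eviction manager):

```go
// Simplified sketch, not the kubelet's eviction manager: evaluate hard eviction
// thresholds like the ones in the nodeConfig dump above.
package main

import "fmt"

type threshold struct {
	signal     string
	quantity   int64   // absolute bytes (e.g. 100Mi); 0 when a percentage is used
	percentage float64 // fraction of capacity; 0 when a quantity is used
}

// crossed reports whether available falls below the threshold for the given capacity.
func crossed(t threshold, available, capacity int64) bool {
	limit := t.quantity
	if t.percentage > 0 {
		limit = int64(t.percentage * float64(capacity))
	}
	return available < limit
}

func main() {
	const Mi, Gi = int64(1) << 20, int64(1) << 30

	memory := threshold{signal: "memory.available", quantity: 100 * Mi}
	nodefs := threshold{signal: "nodefs.available", percentage: 0.10}

	fmt.Println(crossed(memory, 80*Mi, 8*Gi))   // true: only 80Mi free, below the 100Mi floor
	fmt.Println(crossed(nodefs, 50*Gi, 100*Gi)) // false: 50% of the filesystem is still free
}
```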
Aug 19 08:20:47.361375 kubelet[2711]: I0819 08:20:47.361278 2711 kubelet.go:2436] "Starting kubelet main sync loop" Aug 19 08:20:47.361375 kubelet[2711]: E0819 08:20:47.361316 2711 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 19 08:20:47.389137 kubelet[2711]: I0819 08:20:47.389065 2711 cpu_manager.go:221] "Starting CPU manager" policy="none" Aug 19 08:20:47.389137 kubelet[2711]: I0819 08:20:47.389101 2711 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Aug 19 08:20:47.389137 kubelet[2711]: I0819 08:20:47.389119 2711 state_mem.go:36] "Initialized new in-memory state store" Aug 19 08:20:47.389380 kubelet[2711]: I0819 08:20:47.389245 2711 state_mem.go:88] "Updated default CPUSet" cpuSet="" Aug 19 08:20:47.389380 kubelet[2711]: I0819 08:20:47.389255 2711 state_mem.go:96] "Updated CPUSet assignments" assignments={} Aug 19 08:20:47.389380 kubelet[2711]: I0819 08:20:47.389272 2711 policy_none.go:49] "None policy: Start" Aug 19 08:20:47.389380 kubelet[2711]: I0819 08:20:47.389280 2711 memory_manager.go:186] "Starting memorymanager" policy="None" Aug 19 08:20:47.389380 kubelet[2711]: I0819 08:20:47.389290 2711 state_mem.go:35] "Initializing new in-memory state store" Aug 19 08:20:47.389380 kubelet[2711]: I0819 08:20:47.389374 2711 state_mem.go:75] "Updated machine memory state" Aug 19 08:20:47.393782 kubelet[2711]: E0819 08:20:47.393723 2711 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Aug 19 08:20:47.393930 kubelet[2711]: I0819 08:20:47.393920 2711 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 19 08:20:47.393957 kubelet[2711]: I0819 08:20:47.393931 2711 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 19 08:20:47.394215 kubelet[2711]: I0819 08:20:47.394185 2711 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 19 08:20:47.396415 kubelet[2711]: E0819 08:20:47.395514 2711 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Aug 19 08:20:47.462315 kubelet[2711]: I0819 08:20:47.462273 2711 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Aug 19 08:20:47.462545 kubelet[2711]: I0819 08:20:47.462397 2711 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Aug 19 08:20:47.462684 kubelet[2711]: I0819 08:20:47.462414 2711 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Aug 19 08:20:47.499860 kubelet[2711]: I0819 08:20:47.499804 2711 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Aug 19 08:20:47.505911 kubelet[2711]: I0819 08:20:47.505872 2711 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Aug 19 08:20:47.506100 kubelet[2711]: I0819 08:20:47.505949 2711 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Aug 19 08:20:47.653850 kubelet[2711]: I0819 08:20:47.653686 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7a6485c2279998014873548dc69f9532-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"7a6485c2279998014873548dc69f9532\") " pod="kube-system/kube-apiserver-localhost" Aug 19 08:20:47.653850 kubelet[2711]: I0819 08:20:47.653742 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:20:47.653850 kubelet[2711]: I0819 08:20:47.653771 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:20:47.653850 kubelet[2711]: I0819 08:20:47.653842 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d75e6f6978d9f275ea19380916c9cccd-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d75e6f6978d9f275ea19380916c9cccd\") " pod="kube-system/kube-scheduler-localhost" Aug 19 08:20:47.654052 kubelet[2711]: I0819 08:20:47.653872 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7a6485c2279998014873548dc69f9532-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"7a6485c2279998014873548dc69f9532\") " pod="kube-system/kube-apiserver-localhost" Aug 19 08:20:47.654052 kubelet[2711]: I0819 08:20:47.653894 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7a6485c2279998014873548dc69f9532-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"7a6485c2279998014873548dc69f9532\") " pod="kube-system/kube-apiserver-localhost" Aug 19 08:20:47.654052 kubelet[2711]: I0819 08:20:47.653914 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:20:47.654052 kubelet[2711]: I0819 08:20:47.653934 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:20:47.654052 kubelet[2711]: I0819 08:20:47.653994 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Aug 19 08:20:48.333720 kubelet[2711]: I0819 08:20:48.333669 2711 apiserver.go:52] "Watching apiserver" Aug 19 08:20:48.353496 kubelet[2711]: I0819 08:20:48.353446 2711 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Aug 19 08:20:48.374091 kubelet[2711]: I0819 08:20:48.373366 2711 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Aug 19 08:20:48.374091 kubelet[2711]: I0819 08:20:48.373615 2711 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Aug 19 08:20:48.608603 kubelet[2711]: E0819 08:20:48.607800 2711 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Aug 19 08:20:48.609189 kubelet[2711]: E0819 08:20:48.609128 2711 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Aug 19 08:20:48.664947 kubelet[2711]: I0819 08:20:48.664878 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.664738559 podStartE2EDuration="1.664738559s" podCreationTimestamp="2025-08-19 08:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:20:48.60845663 +0000 UTC m=+1.334241967" watchObservedRunningTime="2025-08-19 08:20:48.664738559 +0000 UTC m=+1.390523876" Aug 19 08:20:48.665310 kubelet[2711]: I0819 08:20:48.665280 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.665273923 podStartE2EDuration="1.665273923s" podCreationTimestamp="2025-08-19 08:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:20:48.663774591 +0000 UTC m=+1.389559908" watchObservedRunningTime="2025-08-19 08:20:48.665273923 +0000 UTC m=+1.391059240" Aug 19 08:20:48.674244 kubelet[2711]: I0819 08:20:48.674188 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.674179307 podStartE2EDuration="1.674179307s" podCreationTimestamp="2025-08-19 08:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:20:48.67402078 +0000 UTC m=+1.399806097" watchObservedRunningTime="2025-08-19 08:20:48.674179307 +0000 UTC m=+1.399964624" Aug 19 08:20:53.453557 kubelet[2711]: I0819 08:20:53.453442 2711 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Aug 19 08:20:53.454620 kubelet[2711]: I0819 08:20:53.454248 2711 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Aug 19 08:20:53.454703 containerd[1559]: time="2025-08-19T08:20:53.453933268Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Aug 19 08:20:53.660505 systemd[1]: Created slice kubepods-besteffort-poda642f3d5_e76a_45e7_abc2_985f117d5986.slice - libcontainer container kubepods-besteffort-poda642f3d5_e76a_45e7_abc2_985f117d5986.slice. Aug 19 08:20:53.690463 kubelet[2711]: I0819 08:20:53.690421 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/a642f3d5-e76a-45e7-abc2-985f117d5986-kube-proxy\") pod \"kube-proxy-zzwqk\" (UID: \"a642f3d5-e76a-45e7-abc2-985f117d5986\") " pod="kube-system/kube-proxy-zzwqk" Aug 19 08:20:53.690463 kubelet[2711]: I0819 08:20:53.690462 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a642f3d5-e76a-45e7-abc2-985f117d5986-lib-modules\") pod \"kube-proxy-zzwqk\" (UID: \"a642f3d5-e76a-45e7-abc2-985f117d5986\") " pod="kube-system/kube-proxy-zzwqk" Aug 19 08:20:53.690463 kubelet[2711]: I0819 08:20:53.690479 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qct6p\" (UniqueName: \"kubernetes.io/projected/a642f3d5-e76a-45e7-abc2-985f117d5986-kube-api-access-qct6p\") pod \"kube-proxy-zzwqk\" (UID: \"a642f3d5-e76a-45e7-abc2-985f117d5986\") " pod="kube-system/kube-proxy-zzwqk" Aug 19 08:20:53.690641 kubelet[2711]: I0819 08:20:53.690501 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a642f3d5-e76a-45e7-abc2-985f117d5986-xtables-lock\") pod \"kube-proxy-zzwqk\" (UID: \"a642f3d5-e76a-45e7-abc2-985f117d5986\") " pod="kube-system/kube-proxy-zzwqk" Aug 19 08:20:53.796607 kubelet[2711]: E0819 08:20:53.796421 2711 projected.go:289] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Aug 19 08:20:53.796607 kubelet[2711]: E0819 08:20:53.796449 2711 projected.go:194] Error preparing data for projected volume kube-api-access-qct6p for pod kube-system/kube-proxy-zzwqk: configmap "kube-root-ca.crt" not found Aug 19 08:20:53.796607 kubelet[2711]: E0819 08:20:53.796539 2711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a642f3d5-e76a-45e7-abc2-985f117d5986-kube-api-access-qct6p podName:a642f3d5-e76a-45e7-abc2-985f117d5986 nodeName:}" failed. No retries permitted until 2025-08-19 08:20:54.296515346 +0000 UTC m=+7.022300663 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qct6p" (UniqueName: "kubernetes.io/projected/a642f3d5-e76a-45e7-abc2-985f117d5986-kube-api-access-qct6p") pod "kube-proxy-zzwqk" (UID: "a642f3d5-e76a-45e7-abc2-985f117d5986") : configmap "kube-root-ca.crt" not found Aug 19 08:20:54.395556 kubelet[2711]: E0819 08:20:54.395503 2711 projected.go:289] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Aug 19 08:20:54.395556 kubelet[2711]: E0819 08:20:54.395533 2711 projected.go:194] Error preparing data for projected volume kube-api-access-qct6p for pod kube-system/kube-proxy-zzwqk: configmap "kube-root-ca.crt" not found Aug 19 08:20:54.395796 kubelet[2711]: E0819 08:20:54.395597 2711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a642f3d5-e76a-45e7-abc2-985f117d5986-kube-api-access-qct6p podName:a642f3d5-e76a-45e7-abc2-985f117d5986 nodeName:}" failed. No retries permitted until 2025-08-19 08:20:55.395574674 +0000 UTC m=+8.121359991 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qct6p" (UniqueName: "kubernetes.io/projected/a642f3d5-e76a-45e7-abc2-985f117d5986-kube-api-access-qct6p") pod "kube-proxy-zzwqk" (UID: "a642f3d5-e76a-45e7-abc2-985f117d5986") : configmap "kube-root-ca.crt" not found Aug 19 08:20:54.665781 systemd[1]: Created slice kubepods-besteffort-podeef7b641_4ccd_49e1_bb8d_fe8a0eb90b2c.slice - libcontainer container kubepods-besteffort-podeef7b641_4ccd_49e1_bb8d_fe8a0eb90b2c.slice. Aug 19 08:20:54.696124 kubelet[2711]: I0819 08:20:54.696064 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw89t\" (UniqueName: \"kubernetes.io/projected/eef7b641-4ccd-49e1-bb8d-fe8a0eb90b2c-kube-api-access-gw89t\") pod \"tigera-operator-747864d56d-jm7t8\" (UID: \"eef7b641-4ccd-49e1-bb8d-fe8a0eb90b2c\") " pod="tigera-operator/tigera-operator-747864d56d-jm7t8" Aug 19 08:20:54.696524 kubelet[2711]: I0819 08:20:54.696135 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/eef7b641-4ccd-49e1-bb8d-fe8a0eb90b2c-var-lib-calico\") pod \"tigera-operator-747864d56d-jm7t8\" (UID: \"eef7b641-4ccd-49e1-bb8d-fe8a0eb90b2c\") " pod="tigera-operator/tigera-operator-747864d56d-jm7t8" Aug 19 08:20:54.969040 containerd[1559]: time="2025-08-19T08:20:54.968998805Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-jm7t8,Uid:eef7b641-4ccd-49e1-bb8d-fe8a0eb90b2c,Namespace:tigera-operator,Attempt:0,}" Aug 19 08:20:54.990520 containerd[1559]: time="2025-08-19T08:20:54.990466076Z" level=info msg="connecting to shim 10bcba356f73ed434083424ffc2e4dbd375eb1b471c9a4602bcb37a7ae2d109b" address="unix:///run/containerd/s/1e27170d26c93ae7f3b99472d0372c17c4a0cae1edd49dd07851867817ef38bd" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:20:55.025227 systemd[1]: Started cri-containerd-10bcba356f73ed434083424ffc2e4dbd375eb1b471c9a4602bcb37a7ae2d109b.scope - libcontainer container 10bcba356f73ed434083424ffc2e4dbd375eb1b471c9a4602bcb37a7ae2d109b. 
Aug 19 08:20:55.068518 containerd[1559]: time="2025-08-19T08:20:55.068472387Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-jm7t8,Uid:eef7b641-4ccd-49e1-bb8d-fe8a0eb90b2c,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"10bcba356f73ed434083424ffc2e4dbd375eb1b471c9a4602bcb37a7ae2d109b\"" Aug 19 08:20:55.070476 containerd[1559]: time="2025-08-19T08:20:55.070238328Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Aug 19 08:20:55.472813 containerd[1559]: time="2025-08-19T08:20:55.472745488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-zzwqk,Uid:a642f3d5-e76a-45e7-abc2-985f117d5986,Namespace:kube-system,Attempt:0,}" Aug 19 08:20:55.494229 containerd[1559]: time="2025-08-19T08:20:55.494166594Z" level=info msg="connecting to shim 95a1f8da6059aa76f17448c3f8180fa0dbd8c76db376ef10fd3e67e6a17098a6" address="unix:///run/containerd/s/dc6f7749ff72715f99edd42d99afade54e803953757cedd3aab5b5c8dc87f718" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:20:55.530261 systemd[1]: Started cri-containerd-95a1f8da6059aa76f17448c3f8180fa0dbd8c76db376ef10fd3e67e6a17098a6.scope - libcontainer container 95a1f8da6059aa76f17448c3f8180fa0dbd8c76db376ef10fd3e67e6a17098a6. Aug 19 08:20:55.558279 containerd[1559]: time="2025-08-19T08:20:55.558231442Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-zzwqk,Uid:a642f3d5-e76a-45e7-abc2-985f117d5986,Namespace:kube-system,Attempt:0,} returns sandbox id \"95a1f8da6059aa76f17448c3f8180fa0dbd8c76db376ef10fd3e67e6a17098a6\"" Aug 19 08:20:55.563547 containerd[1559]: time="2025-08-19T08:20:55.563502060Z" level=info msg="CreateContainer within sandbox \"95a1f8da6059aa76f17448c3f8180fa0dbd8c76db376ef10fd3e67e6a17098a6\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Aug 19 08:20:55.575088 containerd[1559]: time="2025-08-19T08:20:55.575032767Z" level=info msg="Container 291d231b7a3c2f224e343e0576d8a790194db02f80a29d7daaed616164377434: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:20:55.582647 containerd[1559]: time="2025-08-19T08:20:55.582618439Z" level=info msg="CreateContainer within sandbox \"95a1f8da6059aa76f17448c3f8180fa0dbd8c76db376ef10fd3e67e6a17098a6\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"291d231b7a3c2f224e343e0576d8a790194db02f80a29d7daaed616164377434\"" Aug 19 08:20:55.583045 containerd[1559]: time="2025-08-19T08:20:55.583002771Z" level=info msg="StartContainer for \"291d231b7a3c2f224e343e0576d8a790194db02f80a29d7daaed616164377434\"" Aug 19 08:20:55.584384 containerd[1559]: time="2025-08-19T08:20:55.584358570Z" level=info msg="connecting to shim 291d231b7a3c2f224e343e0576d8a790194db02f80a29d7daaed616164377434" address="unix:///run/containerd/s/dc6f7749ff72715f99edd42d99afade54e803953757cedd3aab5b5c8dc87f718" protocol=ttrpc version=3 Aug 19 08:20:55.608240 systemd[1]: Started cri-containerd-291d231b7a3c2f224e343e0576d8a790194db02f80a29d7daaed616164377434.scope - libcontainer container 291d231b7a3c2f224e343e0576d8a790194db02f80a29d7daaed616164377434. 
Aug 19 08:20:55.648903 containerd[1559]: time="2025-08-19T08:20:55.648864638Z" level=info msg="StartContainer for \"291d231b7a3c2f224e343e0576d8a790194db02f80a29d7daaed616164377434\" returns successfully" Aug 19 08:20:56.397870 kubelet[2711]: I0819 08:20:56.397799 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-zzwqk" podStartSLOduration=3.397784609 podStartE2EDuration="3.397784609s" podCreationTimestamp="2025-08-19 08:20:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:20:56.397765263 +0000 UTC m=+9.123550580" watchObservedRunningTime="2025-08-19 08:20:56.397784609 +0000 UTC m=+9.123569926" Aug 19 08:20:56.898823 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1202693829.mount: Deactivated successfully. Aug 19 08:20:57.454990 containerd[1559]: time="2025-08-19T08:20:57.454933707Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:57.455792 containerd[1559]: time="2025-08-19T08:20:57.455757873Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543" Aug 19 08:20:57.457159 containerd[1559]: time="2025-08-19T08:20:57.457112285Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:57.459312 containerd[1559]: time="2025-08-19T08:20:57.459276596Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:20:57.459869 containerd[1559]: time="2025-08-19T08:20:57.459825268Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 2.389559138s" Aug 19 08:20:57.459869 containerd[1559]: time="2025-08-19T08:20:57.459864934Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\"" Aug 19 08:20:57.464448 containerd[1559]: time="2025-08-19T08:20:57.464420546Z" level=info msg="CreateContainer within sandbox \"10bcba356f73ed434083424ffc2e4dbd375eb1b471c9a4602bcb37a7ae2d109b\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Aug 19 08:20:57.473095 containerd[1559]: time="2025-08-19T08:20:57.473057541Z" level=info msg="Container 210d00dc8e1e7b29a56b85c2035ac15c5b6eae5ecba3d558d3cc4840a6bac5df: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:20:57.479475 containerd[1559]: time="2025-08-19T08:20:57.479437178Z" level=info msg="CreateContainer within sandbox \"10bcba356f73ed434083424ffc2e4dbd375eb1b471c9a4602bcb37a7ae2d109b\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"210d00dc8e1e7b29a56b85c2035ac15c5b6eae5ecba3d558d3cc4840a6bac5df\"" Aug 19 08:20:57.479808 containerd[1559]: time="2025-08-19T08:20:57.479778997Z" level=info msg="StartContainer for \"210d00dc8e1e7b29a56b85c2035ac15c5b6eae5ecba3d558d3cc4840a6bac5df\"" Aug 19 08:20:57.480718 containerd[1559]: 
time="2025-08-19T08:20:57.480673035Z" level=info msg="connecting to shim 210d00dc8e1e7b29a56b85c2035ac15c5b6eae5ecba3d558d3cc4840a6bac5df" address="unix:///run/containerd/s/1e27170d26c93ae7f3b99472d0372c17c4a0cae1edd49dd07851867817ef38bd" protocol=ttrpc version=3 Aug 19 08:20:57.533237 systemd[1]: Started cri-containerd-210d00dc8e1e7b29a56b85c2035ac15c5b6eae5ecba3d558d3cc4840a6bac5df.scope - libcontainer container 210d00dc8e1e7b29a56b85c2035ac15c5b6eae5ecba3d558d3cc4840a6bac5df. Aug 19 08:20:57.562580 containerd[1559]: time="2025-08-19T08:20:57.562523235Z" level=info msg="StartContainer for \"210d00dc8e1e7b29a56b85c2035ac15c5b6eae5ecba3d558d3cc4840a6bac5df\" returns successfully" Aug 19 08:20:59.537819 systemd[1]: cri-containerd-210d00dc8e1e7b29a56b85c2035ac15c5b6eae5ecba3d558d3cc4840a6bac5df.scope: Deactivated successfully. Aug 19 08:20:59.540977 containerd[1559]: time="2025-08-19T08:20:59.540945630Z" level=info msg="TaskExit event in podsandbox handler container_id:\"210d00dc8e1e7b29a56b85c2035ac15c5b6eae5ecba3d558d3cc4840a6bac5df\" id:\"210d00dc8e1e7b29a56b85c2035ac15c5b6eae5ecba3d558d3cc4840a6bac5df\" pid:3060 exit_status:1 exited_at:{seconds:1755591659 nanos:539907562}" Aug 19 08:20:59.541588 containerd[1559]: time="2025-08-19T08:20:59.541568382Z" level=info msg="received exit event container_id:\"210d00dc8e1e7b29a56b85c2035ac15c5b6eae5ecba3d558d3cc4840a6bac5df\" id:\"210d00dc8e1e7b29a56b85c2035ac15c5b6eae5ecba3d558d3cc4840a6bac5df\" pid:3060 exit_status:1 exited_at:{seconds:1755591659 nanos:539907562}" Aug 19 08:20:59.602741 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-210d00dc8e1e7b29a56b85c2035ac15c5b6eae5ecba3d558d3cc4840a6bac5df-rootfs.mount: Deactivated successfully. Aug 19 08:21:00.399497 kubelet[2711]: I0819 08:21:00.399150 2711 scope.go:117] "RemoveContainer" containerID="210d00dc8e1e7b29a56b85c2035ac15c5b6eae5ecba3d558d3cc4840a6bac5df" Aug 19 08:21:00.401990 containerd[1559]: time="2025-08-19T08:21:00.401925980Z" level=info msg="CreateContainer within sandbox \"10bcba356f73ed434083424ffc2e4dbd375eb1b471c9a4602bcb37a7ae2d109b\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Aug 19 08:21:00.413542 containerd[1559]: time="2025-08-19T08:21:00.412587586Z" level=info msg="Container c24ed29556838fcfcede1ed5a594c9eff51b4a3f0b7eacb048f310fcbf900b22: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:21:00.420244 containerd[1559]: time="2025-08-19T08:21:00.420198229Z" level=info msg="CreateContainer within sandbox \"10bcba356f73ed434083424ffc2e4dbd375eb1b471c9a4602bcb37a7ae2d109b\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"c24ed29556838fcfcede1ed5a594c9eff51b4a3f0b7eacb048f310fcbf900b22\"" Aug 19 08:21:00.420925 containerd[1559]: time="2025-08-19T08:21:00.420806752Z" level=info msg="StartContainer for \"c24ed29556838fcfcede1ed5a594c9eff51b4a3f0b7eacb048f310fcbf900b22\"" Aug 19 08:21:00.422682 containerd[1559]: time="2025-08-19T08:21:00.421965919Z" level=info msg="connecting to shim c24ed29556838fcfcede1ed5a594c9eff51b4a3f0b7eacb048f310fcbf900b22" address="unix:///run/containerd/s/1e27170d26c93ae7f3b99472d0372c17c4a0cae1edd49dd07851867817ef38bd" protocol=ttrpc version=3 Aug 19 08:21:00.443203 systemd[1]: Started cri-containerd-c24ed29556838fcfcede1ed5a594c9eff51b4a3f0b7eacb048f310fcbf900b22.scope - libcontainer container c24ed29556838fcfcede1ed5a594c9eff51b4a3f0b7eacb048f310fcbf900b22. 
Aug 19 08:21:00.474887 containerd[1559]: time="2025-08-19T08:21:00.474844783Z" level=info msg="StartContainer for \"c24ed29556838fcfcede1ed5a594c9eff51b4a3f0b7eacb048f310fcbf900b22\" returns successfully" Aug 19 08:21:01.410749 kubelet[2711]: I0819 08:21:01.410678 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-747864d56d-jm7t8" podStartSLOduration=5.020055891 podStartE2EDuration="7.410662158s" podCreationTimestamp="2025-08-19 08:20:54 +0000 UTC" firstStartedPulling="2025-08-19 08:20:55.069908861 +0000 UTC m=+7.795694168" lastFinishedPulling="2025-08-19 08:20:57.460515118 +0000 UTC m=+10.186300435" observedRunningTime="2025-08-19 08:20:58.401244598 +0000 UTC m=+11.127029915" watchObservedRunningTime="2025-08-19 08:21:01.410662158 +0000 UTC m=+14.136447475" Aug 19 08:21:02.199971 update_engine[1548]: I20250819 08:21:02.199890 1548 update_attempter.cc:509] Updating boot flags... Aug 19 08:21:02.833657 sudo[1779]: pam_unix(sudo:session): session closed for user root Aug 19 08:21:02.835163 sshd[1778]: Connection closed by 10.0.0.1 port 48538 Aug 19 08:21:02.835629 sshd-session[1775]: pam_unix(sshd:session): session closed for user core Aug 19 08:21:02.840585 systemd[1]: sshd@6-10.0.0.150:22-10.0.0.1:48538.service: Deactivated successfully. Aug 19 08:21:02.843067 systemd[1]: session-7.scope: Deactivated successfully. Aug 19 08:21:02.843329 systemd[1]: session-7.scope: Consumed 5.184s CPU time, 227.5M memory peak. Aug 19 08:21:02.844582 systemd-logind[1540]: Session 7 logged out. Waiting for processes to exit. Aug 19 08:21:02.845924 systemd-logind[1540]: Removed session 7. Aug 19 08:21:05.847015 systemd[1]: Created slice kubepods-besteffort-pod5fa7b311_9cae_4428_8ead_de2770b2db83.slice - libcontainer container kubepods-besteffort-pod5fa7b311_9cae_4428_8ead_de2770b2db83.slice. 
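Annotation: the pod_startup_latency_tracker entry above can be reproduced by hand: podStartE2EDuration is the observed running time minus the pod creation timestamp, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling) from it; for kube-proxy earlier both pull timestamps were zero, so its two durations were equal. A minimal Go sketch of that arithmetic, using timestamps copied from the tigera-operator entry (the formula is stated here as an assumption about kubelet's definition, not quoted from its source, and the result matches the logged values to within nanosecond-level rounding):

package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Timestamps copied from the tigera-operator pod_startup_latency_tracker entry.
	created := mustParse("2025-08-19 08:20:54 +0000 UTC")
	firstPull := mustParse("2025-08-19 08:20:55.069908861 +0000 UTC")
	lastPull := mustParse("2025-08-19 08:20:57.460515118 +0000 UTC")
	running := mustParse("2025-08-19 08:21:01.410662158 +0000 UTC")

	e2e := running.Sub(created)          // ≈ 7.410662158s, the logged podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // ≈ 5.02s, the logged podStartSLOduration (pull time excluded)

	fmt.Println("E2E:", e2e)
	fmt.Println("SLO:", slo)
}
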
Aug 19 08:21:05.867524 kubelet[2711]: I0819 08:21:05.867467 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlmgc\" (UniqueName: \"kubernetes.io/projected/5fa7b311-9cae-4428-8ead-de2770b2db83-kube-api-access-zlmgc\") pod \"calico-typha-7f68dff95f-dnb9z\" (UID: \"5fa7b311-9cae-4428-8ead-de2770b2db83\") " pod="calico-system/calico-typha-7f68dff95f-dnb9z" Aug 19 08:21:05.867524 kubelet[2711]: I0819 08:21:05.867516 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fa7b311-9cae-4428-8ead-de2770b2db83-tigera-ca-bundle\") pod \"calico-typha-7f68dff95f-dnb9z\" (UID: \"5fa7b311-9cae-4428-8ead-de2770b2db83\") " pod="calico-system/calico-typha-7f68dff95f-dnb9z" Aug 19 08:21:05.867905 kubelet[2711]: I0819 08:21:05.867533 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/5fa7b311-9cae-4428-8ead-de2770b2db83-typha-certs\") pod \"calico-typha-7f68dff95f-dnb9z\" (UID: \"5fa7b311-9cae-4428-8ead-de2770b2db83\") " pod="calico-system/calico-typha-7f68dff95f-dnb9z" Aug 19 08:21:06.153033 containerd[1559]: time="2025-08-19T08:21:06.152904986Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7f68dff95f-dnb9z,Uid:5fa7b311-9cae-4428-8ead-de2770b2db83,Namespace:calico-system,Attempt:0,}" Aug 19 08:21:06.189603 containerd[1559]: time="2025-08-19T08:21:06.189563531Z" level=info msg="connecting to shim 27dedfaa97a4ffb6798e863ae70a4d74d9f55fba6ce3cced02446139c3e7b431" address="unix:///run/containerd/s/b589f53a2997090ca2062b0f82554cd66a3ab37e4b5817a18dbd7d5c1a4f0192" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:21:06.225533 systemd[1]: Started cri-containerd-27dedfaa97a4ffb6798e863ae70a4d74d9f55fba6ce3cced02446139c3e7b431.scope - libcontainer container 27dedfaa97a4ffb6798e863ae70a4d74d9f55fba6ce3cced02446139c3e7b431. Aug 19 08:21:06.245870 systemd[1]: Created slice kubepods-besteffort-poddd1f0714_4174_44ae_babb_01eeea5a5ace.slice - libcontainer container kubepods-besteffort-poddd1f0714_4174_44ae_babb_01eeea5a5ace.slice. 
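Annotation: the "Created slice" entries show how kubelet's systemd cgroup driver names pod slices: the QoS class plus the pod UID with dashes replaced by underscores (compare the slice above with the calico-node pod UID dd1f0714-4174-44ae-babb-01eeea5a5ace in the volume entries that follow). A minimal Go sketch of that naming convention, inferred from the journal entries rather than quoted from kubelet source:

package main

import (
	"fmt"
	"strings"
)

// sliceName reconstructs the systemd slice name seen in the log from a pod UID
// and its QoS class; the convention is inferred from the "Created slice" entries.
func sliceName(qos, podUID string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qos, strings.ReplaceAll(podUID, "-", "_"))
}

func main() {
	// Prints kubepods-besteffort-poddd1f0714_4174_44ae_babb_01eeea5a5ace.slice,
	// matching the "Created slice" entry for calico-node-mg4mp.
	fmt.Println(sliceName("besteffort", "dd1f0714-4174-44ae-babb-01eeea5a5ace"))
}
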
Aug 19 08:21:06.270675 kubelet[2711]: I0819 08:21:06.270594 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/dd1f0714-4174-44ae-babb-01eeea5a5ace-node-certs\") pod \"calico-node-mg4mp\" (UID: \"dd1f0714-4174-44ae-babb-01eeea5a5ace\") " pod="calico-system/calico-node-mg4mp" Aug 19 08:21:06.270675 kubelet[2711]: I0819 08:21:06.270631 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/dd1f0714-4174-44ae-babb-01eeea5a5ace-flexvol-driver-host\") pod \"calico-node-mg4mp\" (UID: \"dd1f0714-4174-44ae-babb-01eeea5a5ace\") " pod="calico-system/calico-node-mg4mp" Aug 19 08:21:06.270675 kubelet[2711]: I0819 08:21:06.270647 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/dd1f0714-4174-44ae-babb-01eeea5a5ace-var-run-calico\") pod \"calico-node-mg4mp\" (UID: \"dd1f0714-4174-44ae-babb-01eeea5a5ace\") " pod="calico-system/calico-node-mg4mp" Aug 19 08:21:06.270675 kubelet[2711]: I0819 08:21:06.270661 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/dd1f0714-4174-44ae-babb-01eeea5a5ace-cni-log-dir\") pod \"calico-node-mg4mp\" (UID: \"dd1f0714-4174-44ae-babb-01eeea5a5ace\") " pod="calico-system/calico-node-mg4mp" Aug 19 08:21:06.270675 kubelet[2711]: I0819 08:21:06.270675 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd1f0714-4174-44ae-babb-01eeea5a5ace-lib-modules\") pod \"calico-node-mg4mp\" (UID: \"dd1f0714-4174-44ae-babb-01eeea5a5ace\") " pod="calico-system/calico-node-mg4mp" Aug 19 08:21:06.270867 kubelet[2711]: I0819 08:21:06.270689 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/dd1f0714-4174-44ae-babb-01eeea5a5ace-var-lib-calico\") pod \"calico-node-mg4mp\" (UID: \"dd1f0714-4174-44ae-babb-01eeea5a5ace\") " pod="calico-system/calico-node-mg4mp" Aug 19 08:21:06.270867 kubelet[2711]: I0819 08:21:06.270706 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/dd1f0714-4174-44ae-babb-01eeea5a5ace-cni-bin-dir\") pod \"calico-node-mg4mp\" (UID: \"dd1f0714-4174-44ae-babb-01eeea5a5ace\") " pod="calico-system/calico-node-mg4mp" Aug 19 08:21:06.270867 kubelet[2711]: I0819 08:21:06.270721 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd1f0714-4174-44ae-babb-01eeea5a5ace-tigera-ca-bundle\") pod \"calico-node-mg4mp\" (UID: \"dd1f0714-4174-44ae-babb-01eeea5a5ace\") " pod="calico-system/calico-node-mg4mp" Aug 19 08:21:06.270867 kubelet[2711]: I0819 08:21:06.270738 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/dd1f0714-4174-44ae-babb-01eeea5a5ace-cni-net-dir\") pod \"calico-node-mg4mp\" (UID: \"dd1f0714-4174-44ae-babb-01eeea5a5ace\") " pod="calico-system/calico-node-mg4mp" Aug 19 08:21:06.270867 kubelet[2711]: I0819 08:21:06.270751 2711 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/dd1f0714-4174-44ae-babb-01eeea5a5ace-policysync\") pod \"calico-node-mg4mp\" (UID: \"dd1f0714-4174-44ae-babb-01eeea5a5ace\") " pod="calico-system/calico-node-mg4mp" Aug 19 08:21:06.270984 kubelet[2711]: I0819 08:21:06.270764 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/dd1f0714-4174-44ae-babb-01eeea5a5ace-xtables-lock\") pod \"calico-node-mg4mp\" (UID: \"dd1f0714-4174-44ae-babb-01eeea5a5ace\") " pod="calico-system/calico-node-mg4mp" Aug 19 08:21:06.270984 kubelet[2711]: I0819 08:21:06.270778 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdw7b\" (UniqueName: \"kubernetes.io/projected/dd1f0714-4174-44ae-babb-01eeea5a5ace-kube-api-access-jdw7b\") pod \"calico-node-mg4mp\" (UID: \"dd1f0714-4174-44ae-babb-01eeea5a5ace\") " pod="calico-system/calico-node-mg4mp" Aug 19 08:21:06.278019 containerd[1559]: time="2025-08-19T08:21:06.277951028Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7f68dff95f-dnb9z,Uid:5fa7b311-9cae-4428-8ead-de2770b2db83,Namespace:calico-system,Attempt:0,} returns sandbox id \"27dedfaa97a4ffb6798e863ae70a4d74d9f55fba6ce3cced02446139c3e7b431\"" Aug 19 08:21:06.280066 containerd[1559]: time="2025-08-19T08:21:06.280034704Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Aug 19 08:21:06.380019 kubelet[2711]: E0819 08:21:06.379929 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.380019 kubelet[2711]: W0819 08:21:06.379956 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.380914 kubelet[2711]: E0819 08:21:06.380793 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.381184 kubelet[2711]: E0819 08:21:06.381131 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.381184 kubelet[2711]: W0819 08:21:06.381156 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.381184 kubelet[2711]: E0819 08:21:06.381178 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.383435 kubelet[2711]: E0819 08:21:06.383414 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.383435 kubelet[2711]: W0819 08:21:06.383429 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.383513 kubelet[2711]: E0819 08:21:06.383441 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:06.476727 kubelet[2711]: E0819 08:21:06.476375 2711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4j8r" podUID="074be308-2f59-4eab-ad49-1f332ee9401f" Aug 19 08:21:06.551282 containerd[1559]: time="2025-08-19T08:21:06.551219150Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mg4mp,Uid:dd1f0714-4174-44ae-babb-01eeea5a5ace,Namespace:calico-system,Attempt:0,}" Aug 19 08:21:06.558631 kubelet[2711]: E0819 08:21:06.558466 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.558631 kubelet[2711]: W0819 08:21:06.558495 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.558631 kubelet[2711]: E0819 08:21:06.558517 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.559197 kubelet[2711]: E0819 08:21:06.559149 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.559197 kubelet[2711]: W0819 08:21:06.559168 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.559197 kubelet[2711]: E0819 08:21:06.559178 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.559448 kubelet[2711]: E0819 08:21:06.559413 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.559448 kubelet[2711]: W0819 08:21:06.559428 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.559448 kubelet[2711]: E0819 08:21:06.559438 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.559770 kubelet[2711]: E0819 08:21:06.559743 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.559770 kubelet[2711]: W0819 08:21:06.559759 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.559770 kubelet[2711]: E0819 08:21:06.559769 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:06.560435 kubelet[2711]: E0819 08:21:06.560358 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.560435 kubelet[2711]: W0819 08:21:06.560372 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.560435 kubelet[2711]: E0819 08:21:06.560383 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.561858 kubelet[2711]: E0819 08:21:06.561796 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.561858 kubelet[2711]: W0819 08:21:06.561824 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.561858 kubelet[2711]: E0819 08:21:06.561853 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.562169 kubelet[2711]: E0819 08:21:06.562150 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.562169 kubelet[2711]: W0819 08:21:06.562162 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.562241 kubelet[2711]: E0819 08:21:06.562173 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.562501 kubelet[2711]: E0819 08:21:06.562478 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.562501 kubelet[2711]: W0819 08:21:06.562493 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.562501 kubelet[2711]: E0819 08:21:06.562504 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.562753 kubelet[2711]: E0819 08:21:06.562734 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.562753 kubelet[2711]: W0819 08:21:06.562746 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.562823 kubelet[2711]: E0819 08:21:06.562755 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:06.562933 kubelet[2711]: E0819 08:21:06.562917 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.562933 kubelet[2711]: W0819 08:21:06.562927 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.562988 kubelet[2711]: E0819 08:21:06.562938 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.563855 kubelet[2711]: E0819 08:21:06.563531 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.563855 kubelet[2711]: W0819 08:21:06.563558 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.563855 kubelet[2711]: E0819 08:21:06.563582 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.563855 kubelet[2711]: E0819 08:21:06.563811 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.563855 kubelet[2711]: W0819 08:21:06.563819 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.563855 kubelet[2711]: E0819 08:21:06.563829 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.564165 kubelet[2711]: E0819 08:21:06.564024 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.564165 kubelet[2711]: W0819 08:21:06.564032 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.564165 kubelet[2711]: E0819 08:21:06.564040 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.564652 kubelet[2711]: E0819 08:21:06.564590 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.564652 kubelet[2711]: W0819 08:21:06.564605 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.564652 kubelet[2711]: E0819 08:21:06.564614 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:06.564927 kubelet[2711]: E0819 08:21:06.564839 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.564927 kubelet[2711]: W0819 08:21:06.564848 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.564927 kubelet[2711]: E0819 08:21:06.564856 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.565126 kubelet[2711]: E0819 08:21:06.565086 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.565126 kubelet[2711]: W0819 08:21:06.565096 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.565126 kubelet[2711]: E0819 08:21:06.565106 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.565403 kubelet[2711]: E0819 08:21:06.565373 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.565403 kubelet[2711]: W0819 08:21:06.565389 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.565403 kubelet[2711]: E0819 08:21:06.565398 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.565686 kubelet[2711]: E0819 08:21:06.565665 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.565686 kubelet[2711]: W0819 08:21:06.565677 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.565686 kubelet[2711]: E0819 08:21:06.565686 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.565961 kubelet[2711]: E0819 08:21:06.565944 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.565961 kubelet[2711]: W0819 08:21:06.565955 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.566021 kubelet[2711]: E0819 08:21:06.565965 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:06.566798 kubelet[2711]: E0819 08:21:06.566228 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.566798 kubelet[2711]: W0819 08:21:06.566243 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.566798 kubelet[2711]: E0819 08:21:06.566290 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.574506 kubelet[2711]: E0819 08:21:06.574449 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.574506 kubelet[2711]: W0819 08:21:06.574471 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.574506 kubelet[2711]: E0819 08:21:06.574488 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.574614 kubelet[2711]: I0819 08:21:06.574515 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/074be308-2f59-4eab-ad49-1f332ee9401f-kubelet-dir\") pod \"csi-node-driver-b4j8r\" (UID: \"074be308-2f59-4eab-ad49-1f332ee9401f\") " pod="calico-system/csi-node-driver-b4j8r" Aug 19 08:21:06.574872 kubelet[2711]: E0819 08:21:06.574666 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.574872 kubelet[2711]: W0819 08:21:06.574715 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.574872 kubelet[2711]: E0819 08:21:06.574725 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.574872 kubelet[2711]: I0819 08:21:06.574738 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/074be308-2f59-4eab-ad49-1f332ee9401f-socket-dir\") pod \"csi-node-driver-b4j8r\" (UID: \"074be308-2f59-4eab-ad49-1f332ee9401f\") " pod="calico-system/csi-node-driver-b4j8r" Aug 19 08:21:06.575065 kubelet[2711]: E0819 08:21:06.575046 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.575065 kubelet[2711]: W0819 08:21:06.575058 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.575065 kubelet[2711]: E0819 08:21:06.575084 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:06.575065 kubelet[2711]: I0819 08:21:06.575106 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/074be308-2f59-4eab-ad49-1f332ee9401f-registration-dir\") pod \"csi-node-driver-b4j8r\" (UID: \"074be308-2f59-4eab-ad49-1f332ee9401f\") " pod="calico-system/csi-node-driver-b4j8r" Aug 19 08:21:06.575484 kubelet[2711]: E0819 08:21:06.575411 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.575484 kubelet[2711]: W0819 08:21:06.575478 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.575657 kubelet[2711]: E0819 08:21:06.575489 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.575657 kubelet[2711]: I0819 08:21:06.575508 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/074be308-2f59-4eab-ad49-1f332ee9401f-varrun\") pod \"csi-node-driver-b4j8r\" (UID: \"074be308-2f59-4eab-ad49-1f332ee9401f\") " pod="calico-system/csi-node-driver-b4j8r" Aug 19 08:21:06.575997 kubelet[2711]: E0819 08:21:06.575967 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.575997 kubelet[2711]: W0819 08:21:06.575981 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.575997 kubelet[2711]: E0819 08:21:06.575990 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.576173 kubelet[2711]: I0819 08:21:06.576015 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6b65\" (UniqueName: \"kubernetes.io/projected/074be308-2f59-4eab-ad49-1f332ee9401f-kube-api-access-w6b65\") pod \"csi-node-driver-b4j8r\" (UID: \"074be308-2f59-4eab-ad49-1f332ee9401f\") " pod="calico-system/csi-node-driver-b4j8r" Aug 19 08:21:06.577111 kubelet[2711]: E0819 08:21:06.577016 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.577111 kubelet[2711]: W0819 08:21:06.577047 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.577111 kubelet[2711]: E0819 08:21:06.577064 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:06.577406 kubelet[2711]: E0819 08:21:06.577387 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.577406 kubelet[2711]: W0819 08:21:06.577399 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.577463 kubelet[2711]: E0819 08:21:06.577409 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.578294 kubelet[2711]: E0819 08:21:06.578272 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.578294 kubelet[2711]: W0819 08:21:06.578287 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.578294 kubelet[2711]: E0819 08:21:06.578297 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.578801 kubelet[2711]: E0819 08:21:06.578774 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.578841 kubelet[2711]: W0819 08:21:06.578792 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.578841 kubelet[2711]: E0819 08:21:06.578838 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.579313 kubelet[2711]: E0819 08:21:06.579282 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.579313 kubelet[2711]: W0819 08:21:06.579302 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.579313 kubelet[2711]: E0819 08:21:06.579311 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.579535 kubelet[2711]: E0819 08:21:06.579508 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.579535 kubelet[2711]: W0819 08:21:06.579526 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.579535 kubelet[2711]: E0819 08:21:06.579537 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:06.580980 kubelet[2711]: E0819 08:21:06.580310 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.580980 kubelet[2711]: W0819 08:21:06.580351 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.580980 kubelet[2711]: E0819 08:21:06.580363 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.580980 kubelet[2711]: E0819 08:21:06.580577 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.580980 kubelet[2711]: W0819 08:21:06.580584 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.580980 kubelet[2711]: E0819 08:21:06.580593 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.580980 kubelet[2711]: E0819 08:21:06.580800 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.580980 kubelet[2711]: W0819 08:21:06.580810 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.580980 kubelet[2711]: E0819 08:21:06.580820 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.581285 containerd[1559]: time="2025-08-19T08:21:06.580328185Z" level=info msg="connecting to shim 8527605099af7680e41108a225b7871dad986867e3372e0d9a533131c93a096a" address="unix:///run/containerd/s/761182aecb072ff491624e0010ae6e7bbb8bda0e9da0add66c2cddbb1ffcadda" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:21:06.581325 kubelet[2711]: E0819 08:21:06.581020 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.581325 kubelet[2711]: W0819 08:21:06.581028 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.581325 kubelet[2711]: E0819 08:21:06.581035 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.613231 systemd[1]: Started cri-containerd-8527605099af7680e41108a225b7871dad986867e3372e0d9a533131c93a096a.scope - libcontainer container 8527605099af7680e41108a225b7871dad986867e3372e0d9a533131c93a096a. 
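Annotation: the repeated driver-call errors above all share one cause: kubelet probes the Calico FlexVolume helper at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, the binary is not present yet ("executable file not found in $PATH"), so the call yields empty output, and unmarshalling an empty string as JSON fails with "unexpected end of JSON input". A minimal Go sketch reproducing just that unmarshal failure (driverStatus is a simplified stand-in, not kubelet's actual type):

package main

import (
	"encoding/json"
	"fmt"
)

// driverStatus is a simplified stand-in for the JSON a FlexVolume driver is
// expected to print on success, e.g. {"status":"Success"}.
type driverStatus struct {
	Status  string `json:"status"`
	Message string `json:"message,omitempty"`
}

func main() {
	// The driver binary was never executed, so its "output" is the empty string.
	output := ""

	var st driverStatus
	if err := json.Unmarshal([]byte(output), &st); err != nil {
		// Prints: unexpected end of JSON input — the same error kubelet logs
		// for every probe of the nodeagent~uds plugin directory above.
		fmt.Println(err)
	}
}
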
Aug 19 08:21:06.640889 containerd[1559]: time="2025-08-19T08:21:06.640851549Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mg4mp,Uid:dd1f0714-4174-44ae-babb-01eeea5a5ace,Namespace:calico-system,Attempt:0,} returns sandbox id \"8527605099af7680e41108a225b7871dad986867e3372e0d9a533131c93a096a\"" Aug 19 08:21:06.676996 kubelet[2711]: E0819 08:21:06.676963 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.676996 kubelet[2711]: W0819 08:21:06.676980 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.676996 kubelet[2711]: E0819 08:21:06.676999 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.677307 kubelet[2711]: E0819 08:21:06.677277 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.677307 kubelet[2711]: W0819 08:21:06.677299 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.677372 kubelet[2711]: E0819 08:21:06.677320 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.677660 kubelet[2711]: E0819 08:21:06.677633 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.677699 kubelet[2711]: W0819 08:21:06.677657 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.677699 kubelet[2711]: E0819 08:21:06.677679 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.677887 kubelet[2711]: E0819 08:21:06.677862 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.677887 kubelet[2711]: W0819 08:21:06.677875 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.677887 kubelet[2711]: E0819 08:21:06.677883 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:06.678131 kubelet[2711]: E0819 08:21:06.678113 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.678131 kubelet[2711]: W0819 08:21:06.678126 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.678192 kubelet[2711]: E0819 08:21:06.678137 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.678400 kubelet[2711]: E0819 08:21:06.678382 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.678400 kubelet[2711]: W0819 08:21:06.678394 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.678455 kubelet[2711]: E0819 08:21:06.678403 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.678618 kubelet[2711]: E0819 08:21:06.678602 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.678618 kubelet[2711]: W0819 08:21:06.678613 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.678674 kubelet[2711]: E0819 08:21:06.678621 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.678822 kubelet[2711]: E0819 08:21:06.678806 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.678822 kubelet[2711]: W0819 08:21:06.678818 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.678883 kubelet[2711]: E0819 08:21:06.678827 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.679038 kubelet[2711]: E0819 08:21:06.679022 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.679038 kubelet[2711]: W0819 08:21:06.679034 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.679107 kubelet[2711]: E0819 08:21:06.679044 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:06.679290 kubelet[2711]: E0819 08:21:06.679273 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.679290 kubelet[2711]: W0819 08:21:06.679284 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.679342 kubelet[2711]: E0819 08:21:06.679293 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.679492 kubelet[2711]: E0819 08:21:06.679477 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.679492 kubelet[2711]: W0819 08:21:06.679487 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.679549 kubelet[2711]: E0819 08:21:06.679495 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.679704 kubelet[2711]: E0819 08:21:06.679686 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.679704 kubelet[2711]: W0819 08:21:06.679698 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.679758 kubelet[2711]: E0819 08:21:06.679708 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.679951 kubelet[2711]: E0819 08:21:06.679934 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.679989 kubelet[2711]: W0819 08:21:06.679956 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.679989 kubelet[2711]: E0819 08:21:06.679966 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.680221 kubelet[2711]: E0819 08:21:06.680193 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.680221 kubelet[2711]: W0819 08:21:06.680216 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.680270 kubelet[2711]: E0819 08:21:06.680226 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:06.680440 kubelet[2711]: E0819 08:21:06.680423 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.680440 kubelet[2711]: W0819 08:21:06.680434 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.680574 kubelet[2711]: E0819 08:21:06.680442 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.680626 kubelet[2711]: E0819 08:21:06.680611 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.680626 kubelet[2711]: W0819 08:21:06.680621 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.680669 kubelet[2711]: E0819 08:21:06.680628 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.680827 kubelet[2711]: E0819 08:21:06.680812 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.680827 kubelet[2711]: W0819 08:21:06.680822 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.680876 kubelet[2711]: E0819 08:21:06.680830 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.681053 kubelet[2711]: E0819 08:21:06.681038 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.681053 kubelet[2711]: W0819 08:21:06.681048 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.681108 kubelet[2711]: E0819 08:21:06.681058 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.681264 kubelet[2711]: E0819 08:21:06.681247 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.681264 kubelet[2711]: W0819 08:21:06.681257 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.681316 kubelet[2711]: E0819 08:21:06.681267 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:06.681476 kubelet[2711]: E0819 08:21:06.681458 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.681476 kubelet[2711]: W0819 08:21:06.681471 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.681534 kubelet[2711]: E0819 08:21:06.681479 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.681704 kubelet[2711]: E0819 08:21:06.681688 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.681704 kubelet[2711]: W0819 08:21:06.681699 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.681758 kubelet[2711]: E0819 08:21:06.681706 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.682109 kubelet[2711]: E0819 08:21:06.681962 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.682109 kubelet[2711]: W0819 08:21:06.681980 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.682109 kubelet[2711]: E0819 08:21:06.681990 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.682318 kubelet[2711]: E0819 08:21:06.682293 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.682318 kubelet[2711]: W0819 08:21:06.682310 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.682423 kubelet[2711]: E0819 08:21:06.682323 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.682669 kubelet[2711]: E0819 08:21:06.682633 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.682669 kubelet[2711]: W0819 08:21:06.682646 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.682669 kubelet[2711]: E0819 08:21:06.682655 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:06.682875 kubelet[2711]: E0819 08:21:06.682859 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.682875 kubelet[2711]: W0819 08:21:06.682870 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.682950 kubelet[2711]: E0819 08:21:06.682877 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:06.689580 kubelet[2711]: E0819 08:21:06.689551 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:06.689580 kubelet[2711]: W0819 08:21:06.689564 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:06.689580 kubelet[2711]: E0819 08:21:06.689573 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:07.748721 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1775576306.mount: Deactivated successfully. Aug 19 08:21:08.362425 kubelet[2711]: E0819 08:21:08.362368 2711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4j8r" podUID="074be308-2f59-4eab-ad49-1f332ee9401f" Aug 19 08:21:09.163485 containerd[1559]: time="2025-08-19T08:21:09.163418509Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:09.164198 containerd[1559]: time="2025-08-19T08:21:09.164168063Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364" Aug 19 08:21:09.165605 containerd[1559]: time="2025-08-19T08:21:09.165557685Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:09.167496 containerd[1559]: time="2025-08-19T08:21:09.167466005Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:09.167991 containerd[1559]: time="2025-08-19T08:21:09.167959476Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 2.887897681s" Aug 19 08:21:09.168037 containerd[1559]: time="2025-08-19T08:21:09.167995434Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\"" Aug 19 
08:21:09.169268 containerd[1559]: time="2025-08-19T08:21:09.168984580Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Aug 19 08:21:09.179787 containerd[1559]: time="2025-08-19T08:21:09.179754959Z" level=info msg="CreateContainer within sandbox \"27dedfaa97a4ffb6798e863ae70a4d74d9f55fba6ce3cced02446139c3e7b431\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Aug 19 08:21:09.188307 containerd[1559]: time="2025-08-19T08:21:09.188262750Z" level=info msg="Container 0c1c9575e13cf75ce553e55565df51ff5fe2a5e284b91cf05e85c2f441b9e98f: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:21:09.197527 containerd[1559]: time="2025-08-19T08:21:09.197488725Z" level=info msg="CreateContainer within sandbox \"27dedfaa97a4ffb6798e863ae70a4d74d9f55fba6ce3cced02446139c3e7b431\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"0c1c9575e13cf75ce553e55565df51ff5fe2a5e284b91cf05e85c2f441b9e98f\"" Aug 19 08:21:09.198094 containerd[1559]: time="2025-08-19T08:21:09.198049414Z" level=info msg="StartContainer for \"0c1c9575e13cf75ce553e55565df51ff5fe2a5e284b91cf05e85c2f441b9e98f\"" Aug 19 08:21:09.199045 containerd[1559]: time="2025-08-19T08:21:09.198995990Z" level=info msg="connecting to shim 0c1c9575e13cf75ce553e55565df51ff5fe2a5e284b91cf05e85c2f441b9e98f" address="unix:///run/containerd/s/b589f53a2997090ca2062b0f82554cd66a3ab37e4b5817a18dbd7d5c1a4f0192" protocol=ttrpc version=3 Aug 19 08:21:09.225246 systemd[1]: Started cri-containerd-0c1c9575e13cf75ce553e55565df51ff5fe2a5e284b91cf05e85c2f441b9e98f.scope - libcontainer container 0c1c9575e13cf75ce553e55565df51ff5fe2a5e284b91cf05e85c2f441b9e98f. Aug 19 08:21:09.281177 containerd[1559]: time="2025-08-19T08:21:09.281107220Z" level=info msg="StartContainer for \"0c1c9575e13cf75ce553e55565df51ff5fe2a5e284b91cf05e85c2f441b9e98f\" returns successfully" Aug 19 08:21:09.443344 kubelet[2711]: I0819 08:21:09.443153 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7f68dff95f-dnb9z" podStartSLOduration=1.553807841 podStartE2EDuration="4.443138368s" podCreationTimestamp="2025-08-19 08:21:05 +0000 UTC" firstStartedPulling="2025-08-19 08:21:06.279375518 +0000 UTC m=+19.005160835" lastFinishedPulling="2025-08-19 08:21:09.168706045 +0000 UTC m=+21.894491362" observedRunningTime="2025-08-19 08:21:09.437150271 +0000 UTC m=+22.162935588" watchObservedRunningTime="2025-08-19 08:21:09.443138368 +0000 UTC m=+22.168923685" Aug 19 08:21:09.485092 kubelet[2711]: E0819 08:21:09.485026 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.485092 kubelet[2711]: W0819 08:21:09.485055 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.485092 kubelet[2711]: E0819 08:21:09.485090 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:09.485496 kubelet[2711]: E0819 08:21:09.485471 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.485496 kubelet[2711]: W0819 08:21:09.485488 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.485496 kubelet[2711]: E0819 08:21:09.485498 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.485730 kubelet[2711]: E0819 08:21:09.485706 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.485730 kubelet[2711]: W0819 08:21:09.485720 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.485730 kubelet[2711]: E0819 08:21:09.485729 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.486467 kubelet[2711]: E0819 08:21:09.486437 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.486467 kubelet[2711]: W0819 08:21:09.486459 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.486467 kubelet[2711]: E0819 08:21:09.486469 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.486750 kubelet[2711]: E0819 08:21:09.486721 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.486750 kubelet[2711]: W0819 08:21:09.486739 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.486750 kubelet[2711]: E0819 08:21:09.486750 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.487017 kubelet[2711]: E0819 08:21:09.486993 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.487017 kubelet[2711]: W0819 08:21:09.487008 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.487089 kubelet[2711]: E0819 08:21:09.487017 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:09.487319 kubelet[2711]: E0819 08:21:09.487294 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.487319 kubelet[2711]: W0819 08:21:09.487309 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.487319 kubelet[2711]: E0819 08:21:09.487318 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.487687 kubelet[2711]: E0819 08:21:09.487659 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.487800 kubelet[2711]: W0819 08:21:09.487678 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.487800 kubelet[2711]: E0819 08:21:09.487798 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.488263 kubelet[2711]: E0819 08:21:09.488238 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.488263 kubelet[2711]: W0819 08:21:09.488253 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.488263 kubelet[2711]: E0819 08:21:09.488262 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.488936 kubelet[2711]: E0819 08:21:09.488877 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.488936 kubelet[2711]: W0819 08:21:09.488895 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.488936 kubelet[2711]: E0819 08:21:09.488905 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.489349 kubelet[2711]: E0819 08:21:09.489320 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.489349 kubelet[2711]: W0819 08:21:09.489336 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.489349 kubelet[2711]: E0819 08:21:09.489346 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:09.489893 kubelet[2711]: E0819 08:21:09.489864 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.489893 kubelet[2711]: W0819 08:21:09.489878 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.489893 kubelet[2711]: E0819 08:21:09.489888 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.490562 kubelet[2711]: E0819 08:21:09.490447 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.490562 kubelet[2711]: W0819 08:21:09.490496 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.490562 kubelet[2711]: E0819 08:21:09.490523 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.491332 kubelet[2711]: E0819 08:21:09.491269 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.491332 kubelet[2711]: W0819 08:21:09.491281 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.491332 kubelet[2711]: E0819 08:21:09.491291 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.491755 kubelet[2711]: E0819 08:21:09.491662 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.491755 kubelet[2711]: W0819 08:21:09.491673 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.491755 kubelet[2711]: E0819 08:21:09.491683 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.500150 kubelet[2711]: E0819 08:21:09.500136 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.500240 kubelet[2711]: W0819 08:21:09.500228 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.500296 kubelet[2711]: E0819 08:21:09.500286 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:09.500614 kubelet[2711]: E0819 08:21:09.500583 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.500614 kubelet[2711]: W0819 08:21:09.500593 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.500614 kubelet[2711]: E0819 08:21:09.500602 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.501096 kubelet[2711]: E0819 08:21:09.501028 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.501174 kubelet[2711]: W0819 08:21:09.501161 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.501250 kubelet[2711]: E0819 08:21:09.501237 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.501602 kubelet[2711]: E0819 08:21:09.501570 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.501602 kubelet[2711]: W0819 08:21:09.501581 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.501602 kubelet[2711]: E0819 08:21:09.501590 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.501934 kubelet[2711]: E0819 08:21:09.501900 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.501934 kubelet[2711]: W0819 08:21:09.501912 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.501934 kubelet[2711]: E0819 08:21:09.501922 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.502358 kubelet[2711]: E0819 08:21:09.502326 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.502358 kubelet[2711]: W0819 08:21:09.502337 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.502358 kubelet[2711]: E0819 08:21:09.502346 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:09.502741 kubelet[2711]: E0819 08:21:09.502707 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.502741 kubelet[2711]: W0819 08:21:09.502718 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.502741 kubelet[2711]: E0819 08:21:09.502729 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.503113 kubelet[2711]: E0819 08:21:09.503046 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.503113 kubelet[2711]: W0819 08:21:09.503057 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.503199 kubelet[2711]: E0819 08:21:09.503065 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.503503 kubelet[2711]: E0819 08:21:09.503491 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.503566 kubelet[2711]: W0819 08:21:09.503555 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.503614 kubelet[2711]: E0819 08:21:09.503605 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.504275 kubelet[2711]: E0819 08:21:09.504242 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.504275 kubelet[2711]: W0819 08:21:09.504254 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.504275 kubelet[2711]: E0819 08:21:09.504263 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.504603 kubelet[2711]: E0819 08:21:09.504572 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.504603 kubelet[2711]: W0819 08:21:09.504582 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.504603 kubelet[2711]: E0819 08:21:09.504591 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:09.504947 kubelet[2711]: E0819 08:21:09.504917 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.504947 kubelet[2711]: W0819 08:21:09.504927 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.504947 kubelet[2711]: E0819 08:21:09.504936 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.505340 kubelet[2711]: E0819 08:21:09.505298 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.505340 kubelet[2711]: W0819 08:21:09.505319 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.505340 kubelet[2711]: E0819 08:21:09.505327 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.505722 kubelet[2711]: E0819 08:21:09.505688 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.505722 kubelet[2711]: W0819 08:21:09.505699 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.505722 kubelet[2711]: E0819 08:21:09.505709 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.506152 kubelet[2711]: E0819 08:21:09.506121 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.506152 kubelet[2711]: W0819 08:21:09.506131 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.506152 kubelet[2711]: E0819 08:21:09.506140 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.506571 kubelet[2711]: E0819 08:21:09.506540 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.506571 kubelet[2711]: W0819 08:21:09.506551 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.506571 kubelet[2711]: E0819 08:21:09.506559 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:09.507108 kubelet[2711]: E0819 08:21:09.507049 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.507108 kubelet[2711]: W0819 08:21:09.507060 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.507258 kubelet[2711]: E0819 08:21:09.507204 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:09.507518 kubelet[2711]: E0819 08:21:09.507479 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:09.507518 kubelet[2711]: W0819 08:21:09.507490 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:09.507518 kubelet[2711]: E0819 08:21:09.507500 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.361921 kubelet[2711]: E0819 08:21:10.361859 2711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4j8r" podUID="074be308-2f59-4eab-ad49-1f332ee9401f" Aug 19 08:21:10.498722 kubelet[2711]: E0819 08:21:10.498686 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.498722 kubelet[2711]: W0819 08:21:10.498705 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.498722 kubelet[2711]: E0819 08:21:10.498723 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.499292 kubelet[2711]: E0819 08:21:10.498963 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.499292 kubelet[2711]: W0819 08:21:10.498980 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.499292 kubelet[2711]: E0819 08:21:10.499002 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:10.499363 kubelet[2711]: E0819 08:21:10.499327 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.499363 kubelet[2711]: W0819 08:21:10.499339 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.499363 kubelet[2711]: E0819 08:21:10.499349 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.499607 kubelet[2711]: E0819 08:21:10.499589 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.499607 kubelet[2711]: W0819 08:21:10.499600 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.499665 kubelet[2711]: E0819 08:21:10.499623 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.499886 kubelet[2711]: E0819 08:21:10.499830 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.499886 kubelet[2711]: W0819 08:21:10.499866 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.499886 kubelet[2711]: E0819 08:21:10.499875 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.500048 kubelet[2711]: E0819 08:21:10.500033 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.500048 kubelet[2711]: W0819 08:21:10.500043 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.500113 kubelet[2711]: E0819 08:21:10.500050 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.500315 kubelet[2711]: E0819 08:21:10.500281 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.500315 kubelet[2711]: W0819 08:21:10.500293 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.500315 kubelet[2711]: E0819 08:21:10.500302 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:10.500757 kubelet[2711]: E0819 08:21:10.500538 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.500757 kubelet[2711]: W0819 08:21:10.500561 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.500757 kubelet[2711]: E0819 08:21:10.500593 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.500896 kubelet[2711]: E0819 08:21:10.500878 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.500896 kubelet[2711]: W0819 08:21:10.500892 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.500947 kubelet[2711]: E0819 08:21:10.500905 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.501117 kubelet[2711]: E0819 08:21:10.501101 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.501117 kubelet[2711]: W0819 08:21:10.501112 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.501188 kubelet[2711]: E0819 08:21:10.501120 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.501306 kubelet[2711]: E0819 08:21:10.501292 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.501306 kubelet[2711]: W0819 08:21:10.501302 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.501358 kubelet[2711]: E0819 08:21:10.501310 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.501494 kubelet[2711]: E0819 08:21:10.501480 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.501494 kubelet[2711]: W0819 08:21:10.501490 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.501550 kubelet[2711]: E0819 08:21:10.501497 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:10.501697 kubelet[2711]: E0819 08:21:10.501682 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.501697 kubelet[2711]: W0819 08:21:10.501693 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.501745 kubelet[2711]: E0819 08:21:10.501700 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.501911 kubelet[2711]: E0819 08:21:10.501896 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.501911 kubelet[2711]: W0819 08:21:10.501907 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.501958 kubelet[2711]: E0819 08:21:10.501914 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.502111 kubelet[2711]: E0819 08:21:10.502096 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.502111 kubelet[2711]: W0819 08:21:10.502107 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.502164 kubelet[2711]: E0819 08:21:10.502115 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.508391 kubelet[2711]: E0819 08:21:10.508373 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.508391 kubelet[2711]: W0819 08:21:10.508387 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.508459 kubelet[2711]: E0819 08:21:10.508397 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.508646 kubelet[2711]: E0819 08:21:10.508606 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.508646 kubelet[2711]: W0819 08:21:10.508620 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.508646 kubelet[2711]: E0819 08:21:10.508630 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:10.508856 kubelet[2711]: E0819 08:21:10.508825 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.508856 kubelet[2711]: W0819 08:21:10.508832 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.508856 kubelet[2711]: E0819 08:21:10.508840 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.509115 kubelet[2711]: E0819 08:21:10.509085 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.509115 kubelet[2711]: W0819 08:21:10.509099 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.509115 kubelet[2711]: E0819 08:21:10.509108 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.509323 kubelet[2711]: E0819 08:21:10.509307 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.509323 kubelet[2711]: W0819 08:21:10.509318 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.509372 kubelet[2711]: E0819 08:21:10.509326 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.509557 kubelet[2711]: E0819 08:21:10.509539 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.509557 kubelet[2711]: W0819 08:21:10.509552 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.509624 kubelet[2711]: E0819 08:21:10.509563 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.509779 kubelet[2711]: E0819 08:21:10.509764 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.509779 kubelet[2711]: W0819 08:21:10.509775 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.509823 kubelet[2711]: E0819 08:21:10.509782 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:10.510124 kubelet[2711]: E0819 08:21:10.510107 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.510124 kubelet[2711]: W0819 08:21:10.510121 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.510196 kubelet[2711]: E0819 08:21:10.510131 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.510338 kubelet[2711]: E0819 08:21:10.510322 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.510338 kubelet[2711]: W0819 08:21:10.510333 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.510386 kubelet[2711]: E0819 08:21:10.510342 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.510545 kubelet[2711]: E0819 08:21:10.510530 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.510545 kubelet[2711]: W0819 08:21:10.510540 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.510611 kubelet[2711]: E0819 08:21:10.510549 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.510732 kubelet[2711]: E0819 08:21:10.510716 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.510732 kubelet[2711]: W0819 08:21:10.510727 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.510778 kubelet[2711]: E0819 08:21:10.510735 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.510915 kubelet[2711]: E0819 08:21:10.510899 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.510915 kubelet[2711]: W0819 08:21:10.510911 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.510975 kubelet[2711]: E0819 08:21:10.510919 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:10.511144 kubelet[2711]: E0819 08:21:10.511129 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.511144 kubelet[2711]: W0819 08:21:10.511139 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.511215 kubelet[2711]: E0819 08:21:10.511148 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.511333 kubelet[2711]: E0819 08:21:10.511318 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.511333 kubelet[2711]: W0819 08:21:10.511328 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.511384 kubelet[2711]: E0819 08:21:10.511336 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.511561 kubelet[2711]: E0819 08:21:10.511545 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.511561 kubelet[2711]: W0819 08:21:10.511556 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.511622 kubelet[2711]: E0819 08:21:10.511565 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.511847 kubelet[2711]: E0819 08:21:10.511832 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.511847 kubelet[2711]: W0819 08:21:10.511843 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.511915 kubelet[2711]: E0819 08:21:10.511854 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.512195 kubelet[2711]: E0819 08:21:10.512178 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.512195 kubelet[2711]: W0819 08:21:10.512191 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.512241 kubelet[2711]: E0819 08:21:10.512201 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 08:21:10.512389 kubelet[2711]: E0819 08:21:10.512374 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 08:21:10.512389 kubelet[2711]: W0819 08:21:10.512384 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 08:21:10.512444 kubelet[2711]: E0819 08:21:10.512392 2711 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 08:21:10.626378 containerd[1559]: time="2025-08-19T08:21:10.626252171Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:10.628035 containerd[1559]: time="2025-08-19T08:21:10.627945575Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Aug 19 08:21:10.629096 containerd[1559]: time="2025-08-19T08:21:10.628914733Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:10.767328 containerd[1559]: time="2025-08-19T08:21:10.767235974Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:10.767751 containerd[1559]: time="2025-08-19T08:21:10.767703046Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 1.598655898s" Aug 19 08:21:10.767751 containerd[1559]: time="2025-08-19T08:21:10.767747841Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Aug 19 08:21:10.772849 containerd[1559]: time="2025-08-19T08:21:10.772803566Z" level=info msg="CreateContainer within sandbox \"8527605099af7680e41108a225b7871dad986867e3372e0d9a533131c93a096a\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Aug 19 08:21:10.781920 containerd[1559]: time="2025-08-19T08:21:10.781895051Z" level=info msg="Container 6b82027348ebdf7e9475e33df0e86a1c09c57576f8dacb1cad71d489b205ba27: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:21:10.791705 containerd[1559]: time="2025-08-19T08:21:10.791664795Z" level=info msg="CreateContainer within sandbox \"8527605099af7680e41108a225b7871dad986867e3372e0d9a533131c93a096a\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"6b82027348ebdf7e9475e33df0e86a1c09c57576f8dacb1cad71d489b205ba27\"" Aug 19 08:21:10.792133 containerd[1559]: time="2025-08-19T08:21:10.792097031Z" level=info msg="StartContainer for \"6b82027348ebdf7e9475e33df0e86a1c09c57576f8dacb1cad71d489b205ba27\"" Aug 19 08:21:10.793576 containerd[1559]: time="2025-08-19T08:21:10.793547447Z" level=info msg="connecting to 
shim 6b82027348ebdf7e9475e33df0e86a1c09c57576f8dacb1cad71d489b205ba27" address="unix:///run/containerd/s/761182aecb072ff491624e0010ae6e7bbb8bda0e9da0add66c2cddbb1ffcadda" protocol=ttrpc version=3 Aug 19 08:21:10.823349 systemd[1]: Started cri-containerd-6b82027348ebdf7e9475e33df0e86a1c09c57576f8dacb1cad71d489b205ba27.scope - libcontainer container 6b82027348ebdf7e9475e33df0e86a1c09c57576f8dacb1cad71d489b205ba27. Aug 19 08:21:10.867804 containerd[1559]: time="2025-08-19T08:21:10.867755466Z" level=info msg="StartContainer for \"6b82027348ebdf7e9475e33df0e86a1c09c57576f8dacb1cad71d489b205ba27\" returns successfully" Aug 19 08:21:10.877874 systemd[1]: cri-containerd-6b82027348ebdf7e9475e33df0e86a1c09c57576f8dacb1cad71d489b205ba27.scope: Deactivated successfully. Aug 19 08:21:10.880095 containerd[1559]: time="2025-08-19T08:21:10.879932191Z" level=info msg="received exit event container_id:\"6b82027348ebdf7e9475e33df0e86a1c09c57576f8dacb1cad71d489b205ba27\" id:\"6b82027348ebdf7e9475e33df0e86a1c09c57576f8dacb1cad71d489b205ba27\" pid:3519 exited_at:{seconds:1755591670 nanos:879210660}" Aug 19 08:21:10.880095 containerd[1559]: time="2025-08-19T08:21:10.879978117Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6b82027348ebdf7e9475e33df0e86a1c09c57576f8dacb1cad71d489b205ba27\" id:\"6b82027348ebdf7e9475e33df0e86a1c09c57576f8dacb1cad71d489b205ba27\" pid:3519 exited_at:{seconds:1755591670 nanos:879210660}" Aug 19 08:21:10.907489 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6b82027348ebdf7e9475e33df0e86a1c09c57576f8dacb1cad71d489b205ba27-rootfs.mount: Deactivated successfully. Aug 19 08:21:11.426658 containerd[1559]: time="2025-08-19T08:21:11.426608826Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Aug 19 08:21:12.361828 kubelet[2711]: E0819 08:21:12.361781 2711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4j8r" podUID="074be308-2f59-4eab-ad49-1f332ee9401f" Aug 19 08:21:14.317226 containerd[1559]: time="2025-08-19T08:21:14.317165634Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:14.317891 containerd[1559]: time="2025-08-19T08:21:14.317850565Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221" Aug 19 08:21:14.318897 containerd[1559]: time="2025-08-19T08:21:14.318861509Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:14.320686 containerd[1559]: time="2025-08-19T08:21:14.320653945Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:14.321328 containerd[1559]: time="2025-08-19T08:21:14.321298058Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 2.894648997s" Aug 19 
08:21:14.321328 containerd[1559]: time="2025-08-19T08:21:14.321324297Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\"" Aug 19 08:21:14.325536 containerd[1559]: time="2025-08-19T08:21:14.325501886Z" level=info msg="CreateContainer within sandbox \"8527605099af7680e41108a225b7871dad986867e3372e0d9a533131c93a096a\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Aug 19 08:21:14.334221 containerd[1559]: time="2025-08-19T08:21:14.334196624Z" level=info msg="Container b83ee663e49f085aed020601df20e2db3df5f17fbdbef53f8208cbf1f4e903cc: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:21:14.344344 containerd[1559]: time="2025-08-19T08:21:14.344298713Z" level=info msg="CreateContainer within sandbox \"8527605099af7680e41108a225b7871dad986867e3372e0d9a533131c93a096a\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"b83ee663e49f085aed020601df20e2db3df5f17fbdbef53f8208cbf1f4e903cc\"" Aug 19 08:21:14.344801 containerd[1559]: time="2025-08-19T08:21:14.344764579Z" level=info msg="StartContainer for \"b83ee663e49f085aed020601df20e2db3df5f17fbdbef53f8208cbf1f4e903cc\"" Aug 19 08:21:14.346285 containerd[1559]: time="2025-08-19T08:21:14.346256279Z" level=info msg="connecting to shim b83ee663e49f085aed020601df20e2db3df5f17fbdbef53f8208cbf1f4e903cc" address="unix:///run/containerd/s/761182aecb072ff491624e0010ae6e7bbb8bda0e9da0add66c2cddbb1ffcadda" protocol=ttrpc version=3 Aug 19 08:21:14.361648 kubelet[2711]: E0819 08:21:14.361593 2711 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4j8r" podUID="074be308-2f59-4eab-ad49-1f332ee9401f" Aug 19 08:21:14.364323 systemd[1]: Started cri-containerd-b83ee663e49f085aed020601df20e2db3df5f17fbdbef53f8208cbf1f4e903cc.scope - libcontainer container b83ee663e49f085aed020601df20e2db3df5f17fbdbef53f8208cbf1f4e903cc. Aug 19 08:21:14.406721 containerd[1559]: time="2025-08-19T08:21:14.406677218Z" level=info msg="StartContainer for \"b83ee663e49f085aed020601df20e2db3df5f17fbdbef53f8208cbf1f4e903cc\" returns successfully" Aug 19 08:21:15.837549 systemd[1]: cri-containerd-b83ee663e49f085aed020601df20e2db3df5f17fbdbef53f8208cbf1f4e903cc.scope: Deactivated successfully. Aug 19 08:21:15.837883 systemd[1]: cri-containerd-b83ee663e49f085aed020601df20e2db3df5f17fbdbef53f8208cbf1f4e903cc.scope: Consumed 607ms CPU time, 176.7M memory peak, 2.5M read from disk, 171.2M written to disk. 
Aug 19 08:21:15.838547 containerd[1559]: time="2025-08-19T08:21:15.838468401Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b83ee663e49f085aed020601df20e2db3df5f17fbdbef53f8208cbf1f4e903cc\" id:\"b83ee663e49f085aed020601df20e2db3df5f17fbdbef53f8208cbf1f4e903cc\" pid:3577 exited_at:{seconds:1755591675 nanos:838169068}" Aug 19 08:21:15.838547 containerd[1559]: time="2025-08-19T08:21:15.838476186Z" level=info msg="received exit event container_id:\"b83ee663e49f085aed020601df20e2db3df5f17fbdbef53f8208cbf1f4e903cc\" id:\"b83ee663e49f085aed020601df20e2db3df5f17fbdbef53f8208cbf1f4e903cc\" pid:3577 exited_at:{seconds:1755591675 nanos:838169068}" Aug 19 08:21:15.862086 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b83ee663e49f085aed020601df20e2db3df5f17fbdbef53f8208cbf1f4e903cc-rootfs.mount: Deactivated successfully. Aug 19 08:21:15.901185 kubelet[2711]: I0819 08:21:15.901154 2711 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Aug 19 08:21:16.109832 systemd[1]: Created slice kubepods-besteffort-pod7677a958_0a64_4b43_a17a_755951db9108.slice - libcontainer container kubepods-besteffort-pod7677a958_0a64_4b43_a17a_755951db9108.slice. Aug 19 08:21:16.124165 systemd[1]: Created slice kubepods-burstable-pod6c38262c_06e6_4c32_ba8a_36e0b07e2d95.slice - libcontainer container kubepods-burstable-pod6c38262c_06e6_4c32_ba8a_36e0b07e2d95.slice. Aug 19 08:21:16.136478 systemd[1]: Created slice kubepods-burstable-podfa0f8f47_19d7_4561_813e_30ce4c8bb154.slice - libcontainer container kubepods-burstable-podfa0f8f47_19d7_4561_813e_30ce4c8bb154.slice. Aug 19 08:21:16.144920 systemd[1]: Created slice kubepods-besteffort-pod01eef2fa_38e5_4788_9612_447f1a9d137b.slice - libcontainer container kubepods-besteffort-pod01eef2fa_38e5_4788_9612_447f1a9d137b.slice. 
Aug 19 08:21:16.148271 kubelet[2711]: I0819 08:21:16.147845 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpwss\" (UniqueName: \"kubernetes.io/projected/56a0933c-2685-4cd7-8b89-03d0cd7801cf-kube-api-access-cpwss\") pod \"goldmane-768f4c5c69-5qz84\" (UID: \"56a0933c-2685-4cd7-8b89-03d0cd7801cf\") " pod="calico-system/goldmane-768f4c5c69-5qz84" Aug 19 08:21:16.148271 kubelet[2711]: I0819 08:21:16.147879 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wh7v\" (UniqueName: \"kubernetes.io/projected/6c38262c-06e6-4c32-ba8a-36e0b07e2d95-kube-api-access-4wh7v\") pod \"coredns-674b8bbfcf-vbj6m\" (UID: \"6c38262c-06e6-4c32-ba8a-36e0b07e2d95\") " pod="kube-system/coredns-674b8bbfcf-vbj6m" Aug 19 08:21:16.148271 kubelet[2711]: I0819 08:21:16.147897 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56a0933c-2685-4cd7-8b89-03d0cd7801cf-goldmane-ca-bundle\") pod \"goldmane-768f4c5c69-5qz84\" (UID: \"56a0933c-2685-4cd7-8b89-03d0cd7801cf\") " pod="calico-system/goldmane-768f4c5c69-5qz84" Aug 19 08:21:16.148271 kubelet[2711]: I0819 08:21:16.147912 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4a25fc55-9c92-4016-bb5f-0473cc98f094-calico-apiserver-certs\") pod \"calico-apiserver-7459684667-rxhqh\" (UID: \"4a25fc55-9c92-4016-bb5f-0473cc98f094\") " pod="calico-apiserver/calico-apiserver-7459684667-rxhqh" Aug 19 08:21:16.148271 kubelet[2711]: I0819 08:21:16.147927 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/155e2b61-2afd-491c-953f-5f486780c300-calico-apiserver-certs\") pod \"calico-apiserver-56bb94f46f-lkqqb\" (UID: \"155e2b61-2afd-491c-953f-5f486780c300\") " pod="calico-apiserver/calico-apiserver-56bb94f46f-lkqqb" Aug 19 08:21:16.148465 kubelet[2711]: I0819 08:21:16.147941 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7677a958-0a64-4b43-a17a-755951db9108-tigera-ca-bundle\") pod \"calico-kube-controllers-7787566946-5j8pk\" (UID: \"7677a958-0a64-4b43-a17a-755951db9108\") " pod="calico-system/calico-kube-controllers-7787566946-5j8pk" Aug 19 08:21:16.148465 kubelet[2711]: I0819 08:21:16.147956 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42bn9\" (UniqueName: \"kubernetes.io/projected/7677a958-0a64-4b43-a17a-755951db9108-kube-api-access-42bn9\") pod \"calico-kube-controllers-7787566946-5j8pk\" (UID: \"7677a958-0a64-4b43-a17a-755951db9108\") " pod="calico-system/calico-kube-controllers-7787566946-5j8pk" Aug 19 08:21:16.148465 kubelet[2711]: I0819 08:21:16.147998 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa0f8f47-19d7-4561-813e-30ce4c8bb154-config-volume\") pod \"coredns-674b8bbfcf-2mvkn\" (UID: \"fa0f8f47-19d7-4561-813e-30ce4c8bb154\") " pod="kube-system/coredns-674b8bbfcf-2mvkn" Aug 19 08:21:16.148465 kubelet[2711]: I0819 08:21:16.148017 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wndb4\" (UniqueName: \"kubernetes.io/projected/c38c3c29-994f-49d0-9415-10147f16c433-kube-api-access-wndb4\") pod \"whisker-74d8c98d75-4wqhx\" (UID: \"c38c3c29-994f-49d0-9415-10147f16c433\") " pod="calico-system/whisker-74d8c98d75-4wqhx" Aug 19 08:21:16.148465 kubelet[2711]: I0819 08:21:16.148052 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/01eef2fa-38e5-4788-9612-447f1a9d137b-calico-apiserver-certs\") pod \"calico-apiserver-7459684667-2qvhr\" (UID: \"01eef2fa-38e5-4788-9612-447f1a9d137b\") " pod="calico-apiserver/calico-apiserver-7459684667-2qvhr" Aug 19 08:21:16.148583 kubelet[2711]: I0819 08:21:16.148084 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpt76\" (UniqueName: \"kubernetes.io/projected/01eef2fa-38e5-4788-9612-447f1a9d137b-kube-api-access-lpt76\") pod \"calico-apiserver-7459684667-2qvhr\" (UID: \"01eef2fa-38e5-4788-9612-447f1a9d137b\") " pod="calico-apiserver/calico-apiserver-7459684667-2qvhr" Aug 19 08:21:16.148583 kubelet[2711]: I0819 08:21:16.148098 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mmt7\" (UniqueName: \"kubernetes.io/projected/fa0f8f47-19d7-4561-813e-30ce4c8bb154-kube-api-access-9mmt7\") pod \"coredns-674b8bbfcf-2mvkn\" (UID: \"fa0f8f47-19d7-4561-813e-30ce4c8bb154\") " pod="kube-system/coredns-674b8bbfcf-2mvkn" Aug 19 08:21:16.148583 kubelet[2711]: I0819 08:21:16.148118 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c38c3c29-994f-49d0-9415-10147f16c433-whisker-backend-key-pair\") pod \"whisker-74d8c98d75-4wqhx\" (UID: \"c38c3c29-994f-49d0-9415-10147f16c433\") " pod="calico-system/whisker-74d8c98d75-4wqhx" Aug 19 08:21:16.148583 kubelet[2711]: I0819 08:21:16.148139 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c38c3c29-994f-49d0-9415-10147f16c433-whisker-ca-bundle\") pod \"whisker-74d8c98d75-4wqhx\" (UID: \"c38c3c29-994f-49d0-9415-10147f16c433\") " pod="calico-system/whisker-74d8c98d75-4wqhx" Aug 19 08:21:16.148583 kubelet[2711]: I0819 08:21:16.148153 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wx2l\" (UniqueName: \"kubernetes.io/projected/155e2b61-2afd-491c-953f-5f486780c300-kube-api-access-6wx2l\") pod \"calico-apiserver-56bb94f46f-lkqqb\" (UID: \"155e2b61-2afd-491c-953f-5f486780c300\") " pod="calico-apiserver/calico-apiserver-56bb94f46f-lkqqb" Aug 19 08:21:16.148719 kubelet[2711]: I0819 08:21:16.148168 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56a0933c-2685-4cd7-8b89-03d0cd7801cf-config\") pod \"goldmane-768f4c5c69-5qz84\" (UID: \"56a0933c-2685-4cd7-8b89-03d0cd7801cf\") " pod="calico-system/goldmane-768f4c5c69-5qz84" Aug 19 08:21:16.148719 kubelet[2711]: I0819 08:21:16.148182 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/56a0933c-2685-4cd7-8b89-03d0cd7801cf-goldmane-key-pair\") pod \"goldmane-768f4c5c69-5qz84\" (UID: \"56a0933c-2685-4cd7-8b89-03d0cd7801cf\") " 
pod="calico-system/goldmane-768f4c5c69-5qz84" Aug 19 08:21:16.148719 kubelet[2711]: I0819 08:21:16.148196 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9jnj\" (UniqueName: \"kubernetes.io/projected/4a25fc55-9c92-4016-bb5f-0473cc98f094-kube-api-access-s9jnj\") pod \"calico-apiserver-7459684667-rxhqh\" (UID: \"4a25fc55-9c92-4016-bb5f-0473cc98f094\") " pod="calico-apiserver/calico-apiserver-7459684667-rxhqh" Aug 19 08:21:16.148719 kubelet[2711]: I0819 08:21:16.148211 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c38262c-06e6-4c32-ba8a-36e0b07e2d95-config-volume\") pod \"coredns-674b8bbfcf-vbj6m\" (UID: \"6c38262c-06e6-4c32-ba8a-36e0b07e2d95\") " pod="kube-system/coredns-674b8bbfcf-vbj6m" Aug 19 08:21:16.151744 systemd[1]: Created slice kubepods-besteffort-podc38c3c29_994f_49d0_9415_10147f16c433.slice - libcontainer container kubepods-besteffort-podc38c3c29_994f_49d0_9415_10147f16c433.slice. Aug 19 08:21:16.159170 systemd[1]: Created slice kubepods-besteffort-pod155e2b61_2afd_491c_953f_5f486780c300.slice - libcontainer container kubepods-besteffort-pod155e2b61_2afd_491c_953f_5f486780c300.slice. Aug 19 08:21:16.164573 systemd[1]: Created slice kubepods-besteffort-pod4a25fc55_9c92_4016_bb5f_0473cc98f094.slice - libcontainer container kubepods-besteffort-pod4a25fc55_9c92_4016_bb5f_0473cc98f094.slice. Aug 19 08:21:16.169224 systemd[1]: Created slice kubepods-besteffort-pod56a0933c_2685_4cd7_8b89_03d0cd7801cf.slice - libcontainer container kubepods-besteffort-pod56a0933c_2685_4cd7_8b89_03d0cd7801cf.slice. Aug 19 08:21:16.366864 systemd[1]: Created slice kubepods-besteffort-pod074be308_2f59_4eab_ad49_1f332ee9401f.slice - libcontainer container kubepods-besteffort-pod074be308_2f59_4eab_ad49_1f332ee9401f.slice. 
Aug 19 08:21:16.369496 containerd[1559]: time="2025-08-19T08:21:16.369459272Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b4j8r,Uid:074be308-2f59-4eab-ad49-1f332ee9401f,Namespace:calico-system,Attempt:0,}" Aug 19 08:21:16.420593 containerd[1559]: time="2025-08-19T08:21:16.420539187Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7787566946-5j8pk,Uid:7677a958-0a64-4b43-a17a-755951db9108,Namespace:calico-system,Attempt:0,}" Aug 19 08:21:16.439900 containerd[1559]: time="2025-08-19T08:21:16.439840815Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vbj6m,Uid:6c38262c-06e6-4c32-ba8a-36e0b07e2d95,Namespace:kube-system,Attempt:0,}" Aug 19 08:21:16.440815 containerd[1559]: time="2025-08-19T08:21:16.440648797Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2mvkn,Uid:fa0f8f47-19d7-4561-813e-30ce4c8bb154,Namespace:kube-system,Attempt:0,}" Aug 19 08:21:16.447970 containerd[1559]: time="2025-08-19T08:21:16.447933451Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Aug 19 08:21:16.448887 containerd[1559]: time="2025-08-19T08:21:16.448858703Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7459684667-2qvhr,Uid:01eef2fa-38e5-4788-9612-447f1a9d137b,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:21:16.455614 containerd[1559]: time="2025-08-19T08:21:16.455517729Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-74d8c98d75-4wqhx,Uid:c38c3c29-994f-49d0-9415-10147f16c433,Namespace:calico-system,Attempt:0,}" Aug 19 08:21:16.464099 containerd[1559]: time="2025-08-19T08:21:16.463275525Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56bb94f46f-lkqqb,Uid:155e2b61-2afd-491c-953f-5f486780c300,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:21:16.468706 containerd[1559]: time="2025-08-19T08:21:16.468658619Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7459684667-rxhqh,Uid:4a25fc55-9c92-4016-bb5f-0473cc98f094,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:21:16.474155 containerd[1559]: time="2025-08-19T08:21:16.474068174Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-5qz84,Uid:56a0933c-2685-4cd7-8b89-03d0cd7801cf,Namespace:calico-system,Attempt:0,}" Aug 19 08:21:16.490472 containerd[1559]: time="2025-08-19T08:21:16.490416361Z" level=error msg="Failed to destroy network for sandbox \"b1edf00a9e0e163402f5c9915039269005879256a29bf586d6b8eeb733b2ea83\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.502050 containerd[1559]: time="2025-08-19T08:21:16.501763814Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7787566946-5j8pk,Uid:7677a958-0a64-4b43-a17a-755951db9108,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1edf00a9e0e163402f5c9915039269005879256a29bf586d6b8eeb733b2ea83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.511109 kubelet[2711]: E0819 08:21:16.511015 2711 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"b1edf00a9e0e163402f5c9915039269005879256a29bf586d6b8eeb733b2ea83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.511171 kubelet[2711]: E0819 08:21:16.511122 2711 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1edf00a9e0e163402f5c9915039269005879256a29bf586d6b8eeb733b2ea83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7787566946-5j8pk" Aug 19 08:21:16.511171 kubelet[2711]: E0819 08:21:16.511144 2711 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1edf00a9e0e163402f5c9915039269005879256a29bf586d6b8eeb733b2ea83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7787566946-5j8pk" Aug 19 08:21:16.511229 kubelet[2711]: E0819 08:21:16.511195 2711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7787566946-5j8pk_calico-system(7677a958-0a64-4b43-a17a-755951db9108)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7787566946-5j8pk_calico-system(7677a958-0a64-4b43-a17a-755951db9108)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b1edf00a9e0e163402f5c9915039269005879256a29bf586d6b8eeb733b2ea83\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7787566946-5j8pk" podUID="7677a958-0a64-4b43-a17a-755951db9108" Aug 19 08:21:16.522126 containerd[1559]: time="2025-08-19T08:21:16.522013757Z" level=error msg="Failed to destroy network for sandbox \"20639ce63fd55b63750ab163a0e29ed92993a824fc111fd487232d524eb558b2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.550595 containerd[1559]: time="2025-08-19T08:21:16.550507621Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vbj6m,Uid:6c38262c-06e6-4c32-ba8a-36e0b07e2d95,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"20639ce63fd55b63750ab163a0e29ed92993a824fc111fd487232d524eb558b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.550887 kubelet[2711]: E0819 08:21:16.550743 2711 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20639ce63fd55b63750ab163a0e29ed92993a824fc111fd487232d524eb558b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.550887 
kubelet[2711]: E0819 08:21:16.550800 2711 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20639ce63fd55b63750ab163a0e29ed92993a824fc111fd487232d524eb558b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-vbj6m" Aug 19 08:21:16.550887 kubelet[2711]: E0819 08:21:16.550829 2711 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20639ce63fd55b63750ab163a0e29ed92993a824fc111fd487232d524eb558b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-vbj6m" Aug 19 08:21:16.551209 kubelet[2711]: E0819 08:21:16.550877 2711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-vbj6m_kube-system(6c38262c-06e6-4c32-ba8a-36e0b07e2d95)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-vbj6m_kube-system(6c38262c-06e6-4c32-ba8a-36e0b07e2d95)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"20639ce63fd55b63750ab163a0e29ed92993a824fc111fd487232d524eb558b2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-vbj6m" podUID="6c38262c-06e6-4c32-ba8a-36e0b07e2d95" Aug 19 08:21:16.586909 containerd[1559]: time="2025-08-19T08:21:16.586855771Z" level=error msg="Failed to destroy network for sandbox \"c5ff9db3f694d714e4d0058a1c606b3e531cb7a037179ab2d7a705e74bcdc2bf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.591796 containerd[1559]: time="2025-08-19T08:21:16.591758722Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b4j8r,Uid:074be308-2f59-4eab-ad49-1f332ee9401f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5ff9db3f694d714e4d0058a1c606b3e531cb7a037179ab2d7a705e74bcdc2bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.592194 kubelet[2711]: E0819 08:21:16.592160 2711 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5ff9db3f694d714e4d0058a1c606b3e531cb7a037179ab2d7a705e74bcdc2bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.592313 kubelet[2711]: E0819 08:21:16.592297 2711 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5ff9db3f694d714e4d0058a1c606b3e531cb7a037179ab2d7a705e74bcdc2bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-b4j8r" Aug 19 08:21:16.592391 kubelet[2711]: E0819 08:21:16.592376 2711 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5ff9db3f694d714e4d0058a1c606b3e531cb7a037179ab2d7a705e74bcdc2bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-b4j8r" Aug 19 08:21:16.592516 kubelet[2711]: E0819 08:21:16.592478 2711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-b4j8r_calico-system(074be308-2f59-4eab-ad49-1f332ee9401f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-b4j8r_calico-system(074be308-2f59-4eab-ad49-1f332ee9401f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c5ff9db3f694d714e4d0058a1c606b3e531cb7a037179ab2d7a705e74bcdc2bf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-b4j8r" podUID="074be308-2f59-4eab-ad49-1f332ee9401f" Aug 19 08:21:16.598052 containerd[1559]: time="2025-08-19T08:21:16.597944387Z" level=error msg="Failed to destroy network for sandbox \"266d2354a4b197d9ae737d380c7ba04c82e735aff838124c650b1a627c7e34c9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.601047 containerd[1559]: time="2025-08-19T08:21:16.601009859Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2mvkn,Uid:fa0f8f47-19d7-4561-813e-30ce4c8bb154,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"266d2354a4b197d9ae737d380c7ba04c82e735aff838124c650b1a627c7e34c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.602258 kubelet[2711]: E0819 08:21:16.602202 2711 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"266d2354a4b197d9ae737d380c7ba04c82e735aff838124c650b1a627c7e34c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.602317 kubelet[2711]: E0819 08:21:16.602267 2711 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"266d2354a4b197d9ae737d380c7ba04c82e735aff838124c650b1a627c7e34c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-2mvkn" Aug 19 08:21:16.602317 kubelet[2711]: E0819 08:21:16.602288 2711 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"266d2354a4b197d9ae737d380c7ba04c82e735aff838124c650b1a627c7e34c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-2mvkn" Aug 19 08:21:16.602535 kubelet[2711]: E0819 08:21:16.602495 2711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-2mvkn_kube-system(fa0f8f47-19d7-4561-813e-30ce4c8bb154)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-2mvkn_kube-system(fa0f8f47-19d7-4561-813e-30ce4c8bb154)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"266d2354a4b197d9ae737d380c7ba04c82e735aff838124c650b1a627c7e34c9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-2mvkn" podUID="fa0f8f47-19d7-4561-813e-30ce4c8bb154" Aug 19 08:21:16.603415 containerd[1559]: time="2025-08-19T08:21:16.603338052Z" level=error msg="Failed to destroy network for sandbox \"29efe6d714bf71f26956a36fff8e50e2119f4c006cf96479d6358282088dd61f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.605603 containerd[1559]: time="2025-08-19T08:21:16.605543223Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-5qz84,Uid:56a0933c-2685-4cd7-8b89-03d0cd7801cf,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"29efe6d714bf71f26956a36fff8e50e2119f4c006cf96479d6358282088dd61f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.606057 kubelet[2711]: E0819 08:21:16.606023 2711 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"29efe6d714bf71f26956a36fff8e50e2119f4c006cf96479d6358282088dd61f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.606259 kubelet[2711]: E0819 08:21:16.606220 2711 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"29efe6d714bf71f26956a36fff8e50e2119f4c006cf96479d6358282088dd61f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-5qz84" Aug 19 08:21:16.606335 kubelet[2711]: E0819 08:21:16.606318 2711 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"29efe6d714bf71f26956a36fff8e50e2119f4c006cf96479d6358282088dd61f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-5qz84" Aug 19 08:21:16.606491 kubelet[2711]: E0819 08:21:16.606460 2711 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-5qz84_calico-system(56a0933c-2685-4cd7-8b89-03d0cd7801cf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-5qz84_calico-system(56a0933c-2685-4cd7-8b89-03d0cd7801cf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"29efe6d714bf71f26956a36fff8e50e2119f4c006cf96479d6358282088dd61f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-5qz84" podUID="56a0933c-2685-4cd7-8b89-03d0cd7801cf" Aug 19 08:21:16.628845 containerd[1559]: time="2025-08-19T08:21:16.628665415Z" level=error msg="Failed to destroy network for sandbox \"018d848599d0b6282db5ac8054b79e684e393873360643a1ce5358cdd5e9faf0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.631497 containerd[1559]: time="2025-08-19T08:21:16.631411705Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7459684667-2qvhr,Uid:01eef2fa-38e5-4788-9612-447f1a9d137b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"018d848599d0b6282db5ac8054b79e684e393873360643a1ce5358cdd5e9faf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.631686 kubelet[2711]: E0819 08:21:16.631647 2711 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"018d848599d0b6282db5ac8054b79e684e393873360643a1ce5358cdd5e9faf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.631742 kubelet[2711]: E0819 08:21:16.631703 2711 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"018d848599d0b6282db5ac8054b79e684e393873360643a1ce5358cdd5e9faf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7459684667-2qvhr" Aug 19 08:21:16.631742 kubelet[2711]: E0819 08:21:16.631724 2711 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"018d848599d0b6282db5ac8054b79e684e393873360643a1ce5358cdd5e9faf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7459684667-2qvhr" Aug 19 08:21:16.631812 kubelet[2711]: E0819 08:21:16.631766 2711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7459684667-2qvhr_calico-apiserver(01eef2fa-38e5-4788-9612-447f1a9d137b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-7459684667-2qvhr_calico-apiserver(01eef2fa-38e5-4788-9612-447f1a9d137b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"018d848599d0b6282db5ac8054b79e684e393873360643a1ce5358cdd5e9faf0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7459684667-2qvhr" podUID="01eef2fa-38e5-4788-9612-447f1a9d137b" Aug 19 08:21:16.634536 containerd[1559]: time="2025-08-19T08:21:16.634485682Z" level=error msg="Failed to destroy network for sandbox \"e0b557f214f64d0937992fb4f9ffe38ac6795232dbe2013e327610a4165dddbd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.638174 containerd[1559]: time="2025-08-19T08:21:16.638137366Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-74d8c98d75-4wqhx,Uid:c38c3c29-994f-49d0-9415-10147f16c433,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0b557f214f64d0937992fb4f9ffe38ac6795232dbe2013e327610a4165dddbd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.640370 kubelet[2711]: E0819 08:21:16.640310 2711 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0b557f214f64d0937992fb4f9ffe38ac6795232dbe2013e327610a4165dddbd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.640461 kubelet[2711]: E0819 08:21:16.640395 2711 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0b557f214f64d0937992fb4f9ffe38ac6795232dbe2013e327610a4165dddbd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-74d8c98d75-4wqhx" Aug 19 08:21:16.640461 kubelet[2711]: E0819 08:21:16.640416 2711 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0b557f214f64d0937992fb4f9ffe38ac6795232dbe2013e327610a4165dddbd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-74d8c98d75-4wqhx" Aug 19 08:21:16.640512 kubelet[2711]: E0819 08:21:16.640467 2711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-74d8c98d75-4wqhx_calico-system(c38c3c29-994f-49d0-9415-10147f16c433)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-74d8c98d75-4wqhx_calico-system(c38c3c29-994f-49d0-9415-10147f16c433)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e0b557f214f64d0937992fb4f9ffe38ac6795232dbe2013e327610a4165dddbd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-74d8c98d75-4wqhx" podUID="c38c3c29-994f-49d0-9415-10147f16c433" Aug 19 08:21:16.645097 containerd[1559]: time="2025-08-19T08:21:16.645009204Z" level=error msg="Failed to destroy network for sandbox \"fdfae0970a163fc434c27de19811a9d7ed42204e663c04bb07da0700b27d1576\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.647357 containerd[1559]: time="2025-08-19T08:21:16.647280039Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56bb94f46f-lkqqb,Uid:155e2b61-2afd-491c-953f-5f486780c300,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fdfae0970a163fc434c27de19811a9d7ed42204e663c04bb07da0700b27d1576\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.647845 kubelet[2711]: E0819 08:21:16.647805 2711 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fdfae0970a163fc434c27de19811a9d7ed42204e663c04bb07da0700b27d1576\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.647929 kubelet[2711]: E0819 08:21:16.647863 2711 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fdfae0970a163fc434c27de19811a9d7ed42204e663c04bb07da0700b27d1576\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56bb94f46f-lkqqb" Aug 19 08:21:16.647929 kubelet[2711]: E0819 08:21:16.647883 2711 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fdfae0970a163fc434c27de19811a9d7ed42204e663c04bb07da0700b27d1576\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56bb94f46f-lkqqb" Aug 19 08:21:16.647997 kubelet[2711]: E0819 08:21:16.647936 2711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-56bb94f46f-lkqqb_calico-apiserver(155e2b61-2afd-491c-953f-5f486780c300)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-56bb94f46f-lkqqb_calico-apiserver(155e2b61-2afd-491c-953f-5f486780c300)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fdfae0970a163fc434c27de19811a9d7ed42204e663c04bb07da0700b27d1576\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-56bb94f46f-lkqqb" podUID="155e2b61-2afd-491c-953f-5f486780c300" Aug 19 08:21:16.653152 containerd[1559]: time="2025-08-19T08:21:16.653122288Z" level=error msg="Failed to 
destroy network for sandbox \"a1cbf021a6ddc9170cc1e7bd43aff65a8658b3a3905d6d7c44fdbe42ba4db33e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.654287 containerd[1559]: time="2025-08-19T08:21:16.654250963Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7459684667-rxhqh,Uid:4a25fc55-9c92-4016-bb5f-0473cc98f094,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1cbf021a6ddc9170cc1e7bd43aff65a8658b3a3905d6d7c44fdbe42ba4db33e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.654470 kubelet[2711]: E0819 08:21:16.654425 2711 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1cbf021a6ddc9170cc1e7bd43aff65a8658b3a3905d6d7c44fdbe42ba4db33e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 08:21:16.654521 kubelet[2711]: E0819 08:21:16.654490 2711 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1cbf021a6ddc9170cc1e7bd43aff65a8658b3a3905d6d7c44fdbe42ba4db33e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7459684667-rxhqh" Aug 19 08:21:16.654521 kubelet[2711]: E0819 08:21:16.654511 2711 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1cbf021a6ddc9170cc1e7bd43aff65a8658b3a3905d6d7c44fdbe42ba4db33e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7459684667-rxhqh" Aug 19 08:21:16.654594 kubelet[2711]: E0819 08:21:16.654566 2711 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7459684667-rxhqh_calico-apiserver(4a25fc55-9c92-4016-bb5f-0473cc98f094)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7459684667-rxhqh_calico-apiserver(4a25fc55-9c92-4016-bb5f-0473cc98f094)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a1cbf021a6ddc9170cc1e7bd43aff65a8658b3a3905d6d7c44fdbe42ba4db33e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7459684667-rxhqh" podUID="4a25fc55-9c92-4016-bb5f-0473cc98f094" Aug 19 08:21:25.505881 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount829484915.mount: Deactivated successfully. 
Aug 19 08:21:26.758757 containerd[1559]: time="2025-08-19T08:21:26.758696681Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:26.760121 containerd[1559]: time="2025-08-19T08:21:26.760064692Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Aug 19 08:21:26.761500 containerd[1559]: time="2025-08-19T08:21:26.761433865Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:26.763374 containerd[1559]: time="2025-08-19T08:21:26.763326161Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:26.763813 containerd[1559]: time="2025-08-19T08:21:26.763784724Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 10.31563746s" Aug 19 08:21:26.763861 containerd[1559]: time="2025-08-19T08:21:26.763814950Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Aug 19 08:21:26.787377 containerd[1559]: time="2025-08-19T08:21:26.787306969Z" level=info msg="CreateContainer within sandbox \"8527605099af7680e41108a225b7871dad986867e3372e0d9a533131c93a096a\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Aug 19 08:21:26.796366 containerd[1559]: time="2025-08-19T08:21:26.796318896Z" level=info msg="Container 6875c65604070fc2d906cbd4ae4c34a3a04662dcdba2fc783111c8062f647eb6: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:21:26.807470 containerd[1559]: time="2025-08-19T08:21:26.807436020Z" level=info msg="CreateContainer within sandbox \"8527605099af7680e41108a225b7871dad986867e3372e0d9a533131c93a096a\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"6875c65604070fc2d906cbd4ae4c34a3a04662dcdba2fc783111c8062f647eb6\"" Aug 19 08:21:26.807875 containerd[1559]: time="2025-08-19T08:21:26.807846460Z" level=info msg="StartContainer for \"6875c65604070fc2d906cbd4ae4c34a3a04662dcdba2fc783111c8062f647eb6\"" Aug 19 08:21:26.809195 containerd[1559]: time="2025-08-19T08:21:26.809168165Z" level=info msg="connecting to shim 6875c65604070fc2d906cbd4ae4c34a3a04662dcdba2fc783111c8062f647eb6" address="unix:///run/containerd/s/761182aecb072ff491624e0010ae6e7bbb8bda0e9da0add66c2cddbb1ffcadda" protocol=ttrpc version=3 Aug 19 08:21:26.833212 systemd[1]: Started cri-containerd-6875c65604070fc2d906cbd4ae4c34a3a04662dcdba2fc783111c8062f647eb6.scope - libcontainer container 6875c65604070fc2d906cbd4ae4c34a3a04662dcdba2fc783111c8062f647eb6. Aug 19 08:21:26.878265 containerd[1559]: time="2025-08-19T08:21:26.878223283Z" level=info msg="StartContainer for \"6875c65604070fc2d906cbd4ae4c34a3a04662dcdba2fc783111c8062f647eb6\" returns successfully" Aug 19 08:21:26.973642 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Aug 19 08:21:26.973757 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. Aug 19 08:21:27.109288 kubelet[2711]: I0819 08:21:27.108759 2711 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wndb4\" (UniqueName: \"kubernetes.io/projected/c38c3c29-994f-49d0-9415-10147f16c433-kube-api-access-wndb4\") pod \"c38c3c29-994f-49d0-9415-10147f16c433\" (UID: \"c38c3c29-994f-49d0-9415-10147f16c433\") " Aug 19 08:21:27.109288 kubelet[2711]: I0819 08:21:27.108858 2711 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c38c3c29-994f-49d0-9415-10147f16c433-whisker-backend-key-pair\") pod \"c38c3c29-994f-49d0-9415-10147f16c433\" (UID: \"c38c3c29-994f-49d0-9415-10147f16c433\") " Aug 19 08:21:27.109288 kubelet[2711]: I0819 08:21:27.108876 2711 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c38c3c29-994f-49d0-9415-10147f16c433-whisker-ca-bundle\") pod \"c38c3c29-994f-49d0-9415-10147f16c433\" (UID: \"c38c3c29-994f-49d0-9415-10147f16c433\") " Aug 19 08:21:27.109777 kubelet[2711]: I0819 08:21:27.109733 2711 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c38c3c29-994f-49d0-9415-10147f16c433-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "c38c3c29-994f-49d0-9415-10147f16c433" (UID: "c38c3c29-994f-49d0-9415-10147f16c433"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Aug 19 08:21:27.113511 kubelet[2711]: I0819 08:21:27.113477 2711 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c38c3c29-994f-49d0-9415-10147f16c433-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "c38c3c29-994f-49d0-9415-10147f16c433" (UID: "c38c3c29-994f-49d0-9415-10147f16c433"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Aug 19 08:21:27.114387 kubelet[2711]: I0819 08:21:27.114303 2711 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c38c3c29-994f-49d0-9415-10147f16c433-kube-api-access-wndb4" (OuterVolumeSpecName: "kube-api-access-wndb4") pod "c38c3c29-994f-49d0-9415-10147f16c433" (UID: "c38c3c29-994f-49d0-9415-10147f16c433"). InnerVolumeSpecName "kube-api-access-wndb4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Aug 19 08:21:27.209493 kubelet[2711]: I0819 08:21:27.209443 2711 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wndb4\" (UniqueName: \"kubernetes.io/projected/c38c3c29-994f-49d0-9415-10147f16c433-kube-api-access-wndb4\") on node \"localhost\" DevicePath \"\"" Aug 19 08:21:27.209493 kubelet[2711]: I0819 08:21:27.209476 2711 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c38c3c29-994f-49d0-9415-10147f16c433-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Aug 19 08:21:27.209493 kubelet[2711]: I0819 08:21:27.209488 2711 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c38c3c29-994f-49d0-9415-10147f16c433-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Aug 19 08:21:27.371477 systemd[1]: Removed slice kubepods-besteffort-podc38c3c29_994f_49d0_9415_10147f16c433.slice - libcontainer container kubepods-besteffort-podc38c3c29_994f_49d0_9415_10147f16c433.slice. 
Aug 19 08:21:27.485246 kubelet[2711]: I0819 08:21:27.485136 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-mg4mp" podStartSLOduration=1.3620093070000001 podStartE2EDuration="21.484524123s" podCreationTimestamp="2025-08-19 08:21:06 +0000 UTC" firstStartedPulling="2025-08-19 08:21:06.642166242 +0000 UTC m=+19.367951559" lastFinishedPulling="2025-08-19 08:21:26.764681058 +0000 UTC m=+39.490466375" observedRunningTime="2025-08-19 08:21:27.484159428 +0000 UTC m=+40.209944745" watchObservedRunningTime="2025-08-19 08:21:27.484524123 +0000 UTC m=+40.210309440" Aug 19 08:21:27.534407 systemd[1]: Started sshd@7-10.0.0.150:22-10.0.0.1:54846.service - OpenSSH per-connection server daemon (10.0.0.1:54846). Aug 19 08:21:27.552648 systemd[1]: Created slice kubepods-besteffort-pod9795501f_2294_426d_92c9_a87c27d5ef4d.slice - libcontainer container kubepods-besteffort-pod9795501f_2294_426d_92c9_a87c27d5ef4d.slice. Aug 19 08:21:27.593684 containerd[1559]: time="2025-08-19T08:21:27.593615198Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6875c65604070fc2d906cbd4ae4c34a3a04662dcdba2fc783111c8062f647eb6\" id:\"bb0b1587e6040caca654cec63532dbae7286fcc7013274498880f56df5884933\" pid:4013 exit_status:1 exited_at:{seconds:1755591687 nanos:593290668}" Aug 19 08:21:27.600612 sshd[4021]: Accepted publickey for core from 10.0.0.1 port 54846 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:21:27.602571 sshd-session[4021]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:21:27.607140 systemd-logind[1540]: New session 8 of user core. Aug 19 08:21:27.612635 kubelet[2711]: I0819 08:21:27.612589 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhsp2\" (UniqueName: \"kubernetes.io/projected/9795501f-2294-426d-92c9-a87c27d5ef4d-kube-api-access-nhsp2\") pod \"whisker-fd6f5fc45-jvmlq\" (UID: \"9795501f-2294-426d-92c9-a87c27d5ef4d\") " pod="calico-system/whisker-fd6f5fc45-jvmlq" Aug 19 08:21:27.612714 kubelet[2711]: I0819 08:21:27.612642 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9795501f-2294-426d-92c9-a87c27d5ef4d-whisker-backend-key-pair\") pod \"whisker-fd6f5fc45-jvmlq\" (UID: \"9795501f-2294-426d-92c9-a87c27d5ef4d\") " pod="calico-system/whisker-fd6f5fc45-jvmlq" Aug 19 08:21:27.612714 kubelet[2711]: I0819 08:21:27.612664 2711 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9795501f-2294-426d-92c9-a87c27d5ef4d-whisker-ca-bundle\") pod \"whisker-fd6f5fc45-jvmlq\" (UID: \"9795501f-2294-426d-92c9-a87c27d5ef4d\") " pod="calico-system/whisker-fd6f5fc45-jvmlq" Aug 19 08:21:27.617199 systemd[1]: Started session-8.scope - Session 8 of User core. Aug 19 08:21:27.744471 sshd[4031]: Connection closed by 10.0.0.1 port 54846 Aug 19 08:21:27.744795 sshd-session[4021]: pam_unix(sshd:session): session closed for user core Aug 19 08:21:27.749452 systemd[1]: sshd@7-10.0.0.150:22-10.0.0.1:54846.service: Deactivated successfully. Aug 19 08:21:27.751516 systemd[1]: session-8.scope: Deactivated successfully. Aug 19 08:21:27.752290 systemd-logind[1540]: Session 8 logged out. Waiting for processes to exit. Aug 19 08:21:27.753463 systemd-logind[1540]: Removed session 8. 
Aug 19 08:21:27.772860 systemd[1]: var-lib-kubelet-pods-c38c3c29\x2d994f\x2d49d0\x2d9415\x2d10147f16c433-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dwndb4.mount: Deactivated successfully. Aug 19 08:21:27.772985 systemd[1]: var-lib-kubelet-pods-c38c3c29\x2d994f\x2d49d0\x2d9415\x2d10147f16c433-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Aug 19 08:21:27.856673 containerd[1559]: time="2025-08-19T08:21:27.856627950Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-fd6f5fc45-jvmlq,Uid:9795501f-2294-426d-92c9-a87c27d5ef4d,Namespace:calico-system,Attempt:0,}" Aug 19 08:21:28.002552 systemd-networkd[1467]: calif661ee5af6d: Link UP Aug 19 08:21:28.002783 systemd-networkd[1467]: calif661ee5af6d: Gained carrier Aug 19 08:21:28.017121 containerd[1559]: 2025-08-19 08:21:27.882 [INFO][4048] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 08:21:28.017121 containerd[1559]: 2025-08-19 08:21:27.900 [INFO][4048] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--fd6f5fc45--jvmlq-eth0 whisker-fd6f5fc45- calico-system 9795501f-2294-426d-92c9-a87c27d5ef4d 975 0 2025-08-19 08:21:27 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:fd6f5fc45 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-fd6f5fc45-jvmlq eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calif661ee5af6d [] [] }} ContainerID="1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" Namespace="calico-system" Pod="whisker-fd6f5fc45-jvmlq" WorkloadEndpoint="localhost-k8s-whisker--fd6f5fc45--jvmlq-" Aug 19 08:21:28.017121 containerd[1559]: 2025-08-19 08:21:27.900 [INFO][4048] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" Namespace="calico-system" Pod="whisker-fd6f5fc45-jvmlq" WorkloadEndpoint="localhost-k8s-whisker--fd6f5fc45--jvmlq-eth0" Aug 19 08:21:28.017121 containerd[1559]: 2025-08-19 08:21:27.960 [INFO][4062] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" HandleID="k8s-pod-network.1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" Workload="localhost-k8s-whisker--fd6f5fc45--jvmlq-eth0" Aug 19 08:21:28.017339 containerd[1559]: 2025-08-19 08:21:27.961 [INFO][4062] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" HandleID="k8s-pod-network.1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" Workload="localhost-k8s-whisker--fd6f5fc45--jvmlq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000123750), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-fd6f5fc45-jvmlq", "timestamp":"2025-08-19 08:21:27.960797478 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:21:28.017339 containerd[1559]: 2025-08-19 08:21:27.961 [INFO][4062] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Aug 19 08:21:28.017339 containerd[1559]: 2025-08-19 08:21:27.962 [INFO][4062] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:21:28.017339 containerd[1559]: 2025-08-19 08:21:27.962 [INFO][4062] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:21:28.017339 containerd[1559]: 2025-08-19 08:21:27.970 [INFO][4062] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" host="localhost" Aug 19 08:21:28.017339 containerd[1559]: 2025-08-19 08:21:27.974 [INFO][4062] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:21:28.017339 containerd[1559]: 2025-08-19 08:21:27.978 [INFO][4062] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:21:28.017339 containerd[1559]: 2025-08-19 08:21:27.980 [INFO][4062] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:28.017339 containerd[1559]: 2025-08-19 08:21:27.981 [INFO][4062] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:28.017339 containerd[1559]: 2025-08-19 08:21:27.981 [INFO][4062] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" host="localhost" Aug 19 08:21:28.017556 containerd[1559]: 2025-08-19 08:21:27.983 [INFO][4062] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63 Aug 19 08:21:28.017556 containerd[1559]: 2025-08-19 08:21:27.986 [INFO][4062] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" host="localhost" Aug 19 08:21:28.017556 containerd[1559]: 2025-08-19 08:21:27.991 [INFO][4062] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" host="localhost" Aug 19 08:21:28.017556 containerd[1559]: 2025-08-19 08:21:27.991 [INFO][4062] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" host="localhost" Aug 19 08:21:28.017556 containerd[1559]: 2025-08-19 08:21:27.991 [INFO][4062] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:21:28.017556 containerd[1559]: 2025-08-19 08:21:27.991 [INFO][4062] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" HandleID="k8s-pod-network.1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" Workload="localhost-k8s-whisker--fd6f5fc45--jvmlq-eth0" Aug 19 08:21:28.017740 containerd[1559]: 2025-08-19 08:21:27.994 [INFO][4048] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" Namespace="calico-system" Pod="whisker-fd6f5fc45-jvmlq" WorkloadEndpoint="localhost-k8s-whisker--fd6f5fc45--jvmlq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--fd6f5fc45--jvmlq-eth0", GenerateName:"whisker-fd6f5fc45-", Namespace:"calico-system", SelfLink:"", UID:"9795501f-2294-426d-92c9-a87c27d5ef4d", ResourceVersion:"975", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 21, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"fd6f5fc45", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-fd6f5fc45-jvmlq", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif661ee5af6d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:28.017740 containerd[1559]: 2025-08-19 08:21:27.994 [INFO][4048] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" Namespace="calico-system" Pod="whisker-fd6f5fc45-jvmlq" WorkloadEndpoint="localhost-k8s-whisker--fd6f5fc45--jvmlq-eth0" Aug 19 08:21:28.017820 containerd[1559]: 2025-08-19 08:21:27.995 [INFO][4048] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif661ee5af6d ContainerID="1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" Namespace="calico-system" Pod="whisker-fd6f5fc45-jvmlq" WorkloadEndpoint="localhost-k8s-whisker--fd6f5fc45--jvmlq-eth0" Aug 19 08:21:28.017820 containerd[1559]: 2025-08-19 08:21:28.002 [INFO][4048] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" Namespace="calico-system" Pod="whisker-fd6f5fc45-jvmlq" WorkloadEndpoint="localhost-k8s-whisker--fd6f5fc45--jvmlq-eth0" Aug 19 08:21:28.017869 containerd[1559]: 2025-08-19 08:21:28.004 [INFO][4048] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" Namespace="calico-system" Pod="whisker-fd6f5fc45-jvmlq" WorkloadEndpoint="localhost-k8s-whisker--fd6f5fc45--jvmlq-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--fd6f5fc45--jvmlq-eth0", GenerateName:"whisker-fd6f5fc45-", Namespace:"calico-system", SelfLink:"", UID:"9795501f-2294-426d-92c9-a87c27d5ef4d", ResourceVersion:"975", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 21, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"fd6f5fc45", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63", Pod:"whisker-fd6f5fc45-jvmlq", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif661ee5af6d", MAC:"42:52:8a:d8:35:c2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:28.017917 containerd[1559]: 2025-08-19 08:21:28.013 [INFO][4048] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" Namespace="calico-system" Pod="whisker-fd6f5fc45-jvmlq" WorkloadEndpoint="localhost-k8s-whisker--fd6f5fc45--jvmlq-eth0" Aug 19 08:21:28.152724 containerd[1559]: time="2025-08-19T08:21:28.152649869Z" level=info msg="connecting to shim 1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63" address="unix:///run/containerd/s/89515da887e8efa6d0a44830ef3dfe57d17c6a8eff211a9a3a0882f270dcb854" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:21:28.186246 systemd[1]: Started cri-containerd-1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63.scope - libcontainer container 1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63. 
Aug 19 08:21:28.198361 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:21:28.328223 containerd[1559]: time="2025-08-19T08:21:28.326971914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-fd6f5fc45-jvmlq,Uid:9795501f-2294-426d-92c9-a87c27d5ef4d,Namespace:calico-system,Attempt:0,} returns sandbox id \"1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63\"" Aug 19 08:21:28.330387 containerd[1559]: time="2025-08-19T08:21:28.330274700Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Aug 19 08:21:28.362380 containerd[1559]: time="2025-08-19T08:21:28.362324899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7459684667-2qvhr,Uid:01eef2fa-38e5-4788-9612-447f1a9d137b,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:21:28.363295 containerd[1559]: time="2025-08-19T08:21:28.363274283Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vbj6m,Uid:6c38262c-06e6-4c32-ba8a-36e0b07e2d95,Namespace:kube-system,Attempt:0,}" Aug 19 08:21:28.364681 containerd[1559]: time="2025-08-19T08:21:28.364645690Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56bb94f46f-lkqqb,Uid:155e2b61-2afd-491c-953f-5f486780c300,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:21:28.625240 containerd[1559]: time="2025-08-19T08:21:28.624902979Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6875c65604070fc2d906cbd4ae4c34a3a04662dcdba2fc783111c8062f647eb6\" id:\"1a19381d48d531e48bbe3e96317ebf3e7ca8ddeea8743dee04a6aff5d0ce2620\" pid:4264 exit_status:1 exited_at:{seconds:1755591688 nanos:624574401}" Aug 19 08:21:28.644211 systemd-networkd[1467]: cali1acefc53901: Link UP Aug 19 08:21:28.645032 systemd-networkd[1467]: cali1acefc53901: Gained carrier Aug 19 08:21:28.658888 containerd[1559]: 2025-08-19 08:21:28.569 [INFO][4281] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7459684667--2qvhr-eth0 calico-apiserver-7459684667- calico-apiserver 01eef2fa-38e5-4788-9612-447f1a9d137b 873 0 2025-08-19 08:21:03 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7459684667 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7459684667-2qvhr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1acefc53901 [] [] }} ContainerID="ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-2qvhr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--2qvhr-" Aug 19 08:21:28.658888 containerd[1559]: 2025-08-19 08:21:28.570 [INFO][4281] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-2qvhr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--2qvhr-eth0" Aug 19 08:21:28.658888 containerd[1559]: 2025-08-19 08:21:28.601 [INFO][4334] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" HandleID="k8s-pod-network.ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" 
Workload="localhost-k8s-calico--apiserver--7459684667--2qvhr-eth0" Aug 19 08:21:28.659145 containerd[1559]: 2025-08-19 08:21:28.601 [INFO][4334] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" HandleID="k8s-pod-network.ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" Workload="localhost-k8s-calico--apiserver--7459684667--2qvhr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c7090), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7459684667-2qvhr", "timestamp":"2025-08-19 08:21:28.601313597 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:21:28.659145 containerd[1559]: 2025-08-19 08:21:28.601 [INFO][4334] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:21:28.659145 containerd[1559]: 2025-08-19 08:21:28.601 [INFO][4334] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:21:28.659145 containerd[1559]: 2025-08-19 08:21:28.601 [INFO][4334] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:21:28.659145 containerd[1559]: 2025-08-19 08:21:28.608 [INFO][4334] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" host="localhost" Aug 19 08:21:28.659145 containerd[1559]: 2025-08-19 08:21:28.614 [INFO][4334] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:21:28.659145 containerd[1559]: 2025-08-19 08:21:28.619 [INFO][4334] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:21:28.659145 containerd[1559]: 2025-08-19 08:21:28.622 [INFO][4334] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:28.659145 containerd[1559]: 2025-08-19 08:21:28.624 [INFO][4334] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:28.659145 containerd[1559]: 2025-08-19 08:21:28.624 [INFO][4334] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" host="localhost" Aug 19 08:21:28.659493 containerd[1559]: 2025-08-19 08:21:28.625 [INFO][4334] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5 Aug 19 08:21:28.659493 containerd[1559]: 2025-08-19 08:21:28.629 [INFO][4334] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" host="localhost" Aug 19 08:21:28.659493 containerd[1559]: 2025-08-19 08:21:28.636 [INFO][4334] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" host="localhost" Aug 19 08:21:28.659493 containerd[1559]: 2025-08-19 08:21:28.636 [INFO][4334] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" host="localhost" Aug 19 08:21:28.659493 containerd[1559]: 2025-08-19 08:21:28.636 
[INFO][4334] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:21:28.659493 containerd[1559]: 2025-08-19 08:21:28.636 [INFO][4334] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" HandleID="k8s-pod-network.ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" Workload="localhost-k8s-calico--apiserver--7459684667--2qvhr-eth0" Aug 19 08:21:28.659708 containerd[1559]: 2025-08-19 08:21:28.639 [INFO][4281] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-2qvhr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--2qvhr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7459684667--2qvhr-eth0", GenerateName:"calico-apiserver-7459684667-", Namespace:"calico-apiserver", SelfLink:"", UID:"01eef2fa-38e5-4788-9612-447f1a9d137b", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 21, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7459684667", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7459684667-2qvhr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1acefc53901", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:28.659790 containerd[1559]: 2025-08-19 08:21:28.639 [INFO][4281] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-2qvhr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--2qvhr-eth0" Aug 19 08:21:28.659790 containerd[1559]: 2025-08-19 08:21:28.639 [INFO][4281] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1acefc53901 ContainerID="ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-2qvhr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--2qvhr-eth0" Aug 19 08:21:28.659790 containerd[1559]: 2025-08-19 08:21:28.645 [INFO][4281] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-2qvhr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--2qvhr-eth0" Aug 19 08:21:28.659882 containerd[1559]: 2025-08-19 08:21:28.645 [INFO][4281] cni-plugin/k8s.go 446: Added Mac, interface name, 
and active container ID to endpoint ContainerID="ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-2qvhr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--2qvhr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7459684667--2qvhr-eth0", GenerateName:"calico-apiserver-7459684667-", Namespace:"calico-apiserver", SelfLink:"", UID:"01eef2fa-38e5-4788-9612-447f1a9d137b", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 21, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7459684667", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5", Pod:"calico-apiserver-7459684667-2qvhr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1acefc53901", MAC:"16:1f:8a:47:24:7f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:28.659968 containerd[1559]: 2025-08-19 08:21:28.653 [INFO][4281] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-2qvhr" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--2qvhr-eth0" Aug 19 08:21:28.764709 systemd-networkd[1467]: calib0ba216f443: Link UP Aug 19 08:21:28.766113 systemd-networkd[1467]: calib0ba216f443: Gained carrier Aug 19 08:21:28.784742 containerd[1559]: 2025-08-19 08:21:28.550 [INFO][4266] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--vbj6m-eth0 coredns-674b8bbfcf- kube-system 6c38262c-06e6-4c32-ba8a-36e0b07e2d95 866 0 2025-08-19 08:20:54 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-vbj6m eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib0ba216f443 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-vbj6m" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--vbj6m-" Aug 19 08:21:28.784742 containerd[1559]: 2025-08-19 08:21:28.550 [INFO][4266] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-vbj6m" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--vbj6m-eth0" Aug 19 08:21:28.784742 containerd[1559]: 2025-08-19 08:21:28.611 [INFO][4314] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" HandleID="k8s-pod-network.ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" Workload="localhost-k8s-coredns--674b8bbfcf--vbj6m-eth0" Aug 19 08:21:28.784962 containerd[1559]: 2025-08-19 08:21:28.611 [INFO][4314] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" HandleID="k8s-pod-network.ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" Workload="localhost-k8s-coredns--674b8bbfcf--vbj6m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fd90), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-vbj6m", "timestamp":"2025-08-19 08:21:28.611155038 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:21:28.784962 containerd[1559]: 2025-08-19 08:21:28.611 [INFO][4314] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:21:28.784962 containerd[1559]: 2025-08-19 08:21:28.636 [INFO][4314] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:21:28.784962 containerd[1559]: 2025-08-19 08:21:28.636 [INFO][4314] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:21:28.784962 containerd[1559]: 2025-08-19 08:21:28.709 [INFO][4314] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" host="localhost" Aug 19 08:21:28.784962 containerd[1559]: 2025-08-19 08:21:28.713 [INFO][4314] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:21:28.784962 containerd[1559]: 2025-08-19 08:21:28.718 [INFO][4314] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:21:28.784962 containerd[1559]: 2025-08-19 08:21:28.720 [INFO][4314] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:28.784962 containerd[1559]: 2025-08-19 08:21:28.748 [INFO][4314] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:28.784962 containerd[1559]: 2025-08-19 08:21:28.748 [INFO][4314] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" host="localhost" Aug 19 08:21:28.785252 containerd[1559]: 2025-08-19 08:21:28.750 [INFO][4314] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6 Aug 19 08:21:28.785252 containerd[1559]: 2025-08-19 08:21:28.753 [INFO][4314] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" host="localhost" Aug 19 08:21:28.785252 containerd[1559]: 2025-08-19 08:21:28.758 [INFO][4314] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" host="localhost" Aug 19 08:21:28.785252 containerd[1559]: 2025-08-19 08:21:28.758 [INFO][4314] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" host="localhost" Aug 19 08:21:28.785252 containerd[1559]: 2025-08-19 08:21:28.758 [INFO][4314] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:21:28.785252 containerd[1559]: 2025-08-19 08:21:28.758 [INFO][4314] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" HandleID="k8s-pod-network.ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" Workload="localhost-k8s-coredns--674b8bbfcf--vbj6m-eth0" Aug 19 08:21:28.785379 containerd[1559]: 2025-08-19 08:21:28.761 [INFO][4266] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-vbj6m" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--vbj6m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--vbj6m-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"6c38262c-06e6-4c32-ba8a-36e0b07e2d95", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 20, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-vbj6m", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib0ba216f443", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:28.785450 containerd[1559]: 2025-08-19 08:21:28.761 [INFO][4266] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-vbj6m" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--vbj6m-eth0" Aug 19 08:21:28.785450 containerd[1559]: 2025-08-19 08:21:28.762 [INFO][4266] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib0ba216f443 
ContainerID="ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-vbj6m" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--vbj6m-eth0" Aug 19 08:21:28.785450 containerd[1559]: 2025-08-19 08:21:28.765 [INFO][4266] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-vbj6m" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--vbj6m-eth0" Aug 19 08:21:28.785525 containerd[1559]: 2025-08-19 08:21:28.766 [INFO][4266] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-vbj6m" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--vbj6m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--vbj6m-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"6c38262c-06e6-4c32-ba8a-36e0b07e2d95", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 20, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6", Pod:"coredns-674b8bbfcf-vbj6m", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib0ba216f443", MAC:"2e:28:56:20:8b:43", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:28.785525 containerd[1559]: 2025-08-19 08:21:28.779 [INFO][4266] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-vbj6m" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--vbj6m-eth0" Aug 19 08:21:28.786096 containerd[1559]: time="2025-08-19T08:21:28.785738925Z" level=info msg="connecting to shim ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5" address="unix:///run/containerd/s/b09388226c18caa93ca0abb24048d9845cbf2af80c60ebbcf160abf985244d15" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:21:28.820828 containerd[1559]: time="2025-08-19T08:21:28.820714581Z" level=info msg="connecting to shim 
ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6" address="unix:///run/containerd/s/55645cb3510d258d5c6d1ee0388a0a4c1a996318070a9b41031aa094b709b7ef" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:21:28.822217 systemd[1]: Started cri-containerd-ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5.scope - libcontainer container ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5. Aug 19 08:21:28.849868 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:21:28.863717 systemd-networkd[1467]: cali875f0d2ce9f: Link UP Aug 19 08:21:28.865237 systemd-networkd[1467]: cali875f0d2ce9f: Gained carrier Aug 19 08:21:28.866237 systemd[1]: Started cri-containerd-ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6.scope - libcontainer container ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6. Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.564 [INFO][4251] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--56bb94f46f--lkqqb-eth0 calico-apiserver-56bb94f46f- calico-apiserver 155e2b61-2afd-491c-953f-5f486780c300 871 0 2025-08-19 08:21:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:56bb94f46f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-56bb94f46f-lkqqb eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali875f0d2ce9f [] [] }} ContainerID="9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" Namespace="calico-apiserver" Pod="calico-apiserver-56bb94f46f-lkqqb" WorkloadEndpoint="localhost-k8s-calico--apiserver--56bb94f46f--lkqqb-" Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.565 [INFO][4251] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" Namespace="calico-apiserver" Pod="calico-apiserver-56bb94f46f-lkqqb" WorkloadEndpoint="localhost-k8s-calico--apiserver--56bb94f46f--lkqqb-eth0" Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.624 [INFO][4327] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" HandleID="k8s-pod-network.9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" Workload="localhost-k8s-calico--apiserver--56bb94f46f--lkqqb-eth0" Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.624 [INFO][4327] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" HandleID="k8s-pod-network.9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" Workload="localhost-k8s-calico--apiserver--56bb94f46f--lkqqb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001b1ba0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-56bb94f46f-lkqqb", "timestamp":"2025-08-19 08:21:28.624561828 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:21:28.889101 
containerd[1559]: 2025-08-19 08:21:28.624 [INFO][4327] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.758 [INFO][4327] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.758 [INFO][4327] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.811 [INFO][4327] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" host="localhost" Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.816 [INFO][4327] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.820 [INFO][4327] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.822 [INFO][4327] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.824 [INFO][4327] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.824 [INFO][4327] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" host="localhost" Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.826 [INFO][4327] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0 Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.830 [INFO][4327] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" host="localhost" Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.837 [INFO][4327] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" host="localhost" Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.837 [INFO][4327] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" host="localhost" Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.837 [INFO][4327] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:21:28.889101 containerd[1559]: 2025-08-19 08:21:28.837 [INFO][4327] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" HandleID="k8s-pod-network.9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" Workload="localhost-k8s-calico--apiserver--56bb94f46f--lkqqb-eth0" Aug 19 08:21:28.890563 containerd[1559]: 2025-08-19 08:21:28.848 [INFO][4251] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" Namespace="calico-apiserver" Pod="calico-apiserver-56bb94f46f-lkqqb" WorkloadEndpoint="localhost-k8s-calico--apiserver--56bb94f46f--lkqqb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--56bb94f46f--lkqqb-eth0", GenerateName:"calico-apiserver-56bb94f46f-", Namespace:"calico-apiserver", SelfLink:"", UID:"155e2b61-2afd-491c-953f-5f486780c300", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 21, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56bb94f46f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-56bb94f46f-lkqqb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali875f0d2ce9f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:28.890563 containerd[1559]: 2025-08-19 08:21:28.850 [INFO][4251] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" Namespace="calico-apiserver" Pod="calico-apiserver-56bb94f46f-lkqqb" WorkloadEndpoint="localhost-k8s-calico--apiserver--56bb94f46f--lkqqb-eth0" Aug 19 08:21:28.890563 containerd[1559]: 2025-08-19 08:21:28.852 [INFO][4251] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali875f0d2ce9f ContainerID="9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" Namespace="calico-apiserver" Pod="calico-apiserver-56bb94f46f-lkqqb" WorkloadEndpoint="localhost-k8s-calico--apiserver--56bb94f46f--lkqqb-eth0" Aug 19 08:21:28.890563 containerd[1559]: 2025-08-19 08:21:28.865 [INFO][4251] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" Namespace="calico-apiserver" Pod="calico-apiserver-56bb94f46f-lkqqb" WorkloadEndpoint="localhost-k8s-calico--apiserver--56bb94f46f--lkqqb-eth0" Aug 19 08:21:28.890563 containerd[1559]: 2025-08-19 08:21:28.869 [INFO][4251] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" Namespace="calico-apiserver" Pod="calico-apiserver-56bb94f46f-lkqqb" WorkloadEndpoint="localhost-k8s-calico--apiserver--56bb94f46f--lkqqb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--56bb94f46f--lkqqb-eth0", GenerateName:"calico-apiserver-56bb94f46f-", Namespace:"calico-apiserver", SelfLink:"", UID:"155e2b61-2afd-491c-953f-5f486780c300", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 21, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56bb94f46f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0", Pod:"calico-apiserver-56bb94f46f-lkqqb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali875f0d2ce9f", MAC:"52:5b:98:fe:c3:53", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:28.890563 containerd[1559]: 2025-08-19 08:21:28.883 [INFO][4251] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" Namespace="calico-apiserver" Pod="calico-apiserver-56bb94f46f-lkqqb" WorkloadEndpoint="localhost-k8s-calico--apiserver--56bb94f46f--lkqqb-eth0" Aug 19 08:21:28.892853 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:21:28.902732 containerd[1559]: time="2025-08-19T08:21:28.902695295Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7459684667-2qvhr,Uid:01eef2fa-38e5-4788-9612-447f1a9d137b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5\"" Aug 19 08:21:28.913641 systemd-networkd[1467]: vxlan.calico: Link UP Aug 19 08:21:28.913649 systemd-networkd[1467]: vxlan.calico: Gained carrier Aug 19 08:21:28.937056 containerd[1559]: time="2025-08-19T08:21:28.936964222Z" level=info msg="connecting to shim 9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0" address="unix:///run/containerd/s/9880e3b7033a278c8518812a6b7c303e6f8890120abd0e6ee5489c9e16d37921" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:21:28.949810 containerd[1559]: time="2025-08-19T08:21:28.949764323Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vbj6m,Uid:6c38262c-06e6-4c32-ba8a-36e0b07e2d95,Namespace:kube-system,Attempt:0,} returns sandbox id \"ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6\"" Aug 19 08:21:28.959319 containerd[1559]: time="2025-08-19T08:21:28.958207927Z" level=info msg="CreateContainer within 
sandbox \"ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 19 08:21:28.970215 systemd[1]: Started cri-containerd-9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0.scope - libcontainer container 9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0. Aug 19 08:21:28.972677 containerd[1559]: time="2025-08-19T08:21:28.972588447Z" level=info msg="Container 5ccb4652e6bba979cd206ce6f803b8cd82673f64af90bb6708f211ff810d6a64: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:21:28.980673 containerd[1559]: time="2025-08-19T08:21:28.980636278Z" level=info msg="CreateContainer within sandbox \"ca163f5334707cfc3372a0667dafafc43ada049924f9db4a1c8879521d7365a6\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5ccb4652e6bba979cd206ce6f803b8cd82673f64af90bb6708f211ff810d6a64\"" Aug 19 08:21:28.981400 containerd[1559]: time="2025-08-19T08:21:28.981371319Z" level=info msg="StartContainer for \"5ccb4652e6bba979cd206ce6f803b8cd82673f64af90bb6708f211ff810d6a64\"" Aug 19 08:21:28.982593 containerd[1559]: time="2025-08-19T08:21:28.982528484Z" level=info msg="connecting to shim 5ccb4652e6bba979cd206ce6f803b8cd82673f64af90bb6708f211ff810d6a64" address="unix:///run/containerd/s/55645cb3510d258d5c6d1ee0388a0a4c1a996318070a9b41031aa094b709b7ef" protocol=ttrpc version=3 Aug 19 08:21:28.985505 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:21:29.002202 systemd[1]: Started cri-containerd-5ccb4652e6bba979cd206ce6f803b8cd82673f64af90bb6708f211ff810d6a64.scope - libcontainer container 5ccb4652e6bba979cd206ce6f803b8cd82673f64af90bb6708f211ff810d6a64. Aug 19 08:21:29.028886 containerd[1559]: time="2025-08-19T08:21:29.028774060Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56bb94f46f-lkqqb,Uid:155e2b61-2afd-491c-953f-5f486780c300,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0\"" Aug 19 08:21:29.045756 containerd[1559]: time="2025-08-19T08:21:29.045722120Z" level=info msg="StartContainer for \"5ccb4652e6bba979cd206ce6f803b8cd82673f64af90bb6708f211ff810d6a64\" returns successfully" Aug 19 08:21:29.364484 kubelet[2711]: I0819 08:21:29.364431 2711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c38c3c29-994f-49d0-9415-10147f16c433" path="/var/lib/kubelet/pods/c38c3c29-994f-49d0-9415-10147f16c433/volumes" Aug 19 08:21:29.380219 systemd-networkd[1467]: calif661ee5af6d: Gained IPv6LL Aug 19 08:21:29.487949 kubelet[2711]: I0819 08:21:29.487867 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-vbj6m" podStartSLOduration=35.487847644 podStartE2EDuration="35.487847644s" podCreationTimestamp="2025-08-19 08:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:21:29.486051881 +0000 UTC m=+42.211837188" watchObservedRunningTime="2025-08-19 08:21:29.487847644 +0000 UTC m=+42.213632961" Aug 19 08:21:29.892215 systemd-networkd[1467]: calib0ba216f443: Gained IPv6LL Aug 19 08:21:30.020287 systemd-networkd[1467]: vxlan.calico: Gained IPv6LL Aug 19 08:21:30.213252 systemd-networkd[1467]: cali1acefc53901: Gained IPv6LL Aug 19 08:21:30.362860 containerd[1559]: time="2025-08-19T08:21:30.362810246Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-2mvkn,Uid:fa0f8f47-19d7-4561-813e-30ce4c8bb154,Namespace:kube-system,Attempt:0,}" Aug 19 08:21:30.465578 systemd-networkd[1467]: cali7cd883a148e: Link UP Aug 19 08:21:30.466550 systemd-networkd[1467]: cali7cd883a148e: Gained carrier Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.404 [INFO][4623] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--2mvkn-eth0 coredns-674b8bbfcf- kube-system fa0f8f47-19d7-4561-813e-30ce4c8bb154 869 0 2025-08-19 08:20:54 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-2mvkn eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali7cd883a148e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mvkn" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--2mvkn-" Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.405 [INFO][4623] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mvkn" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--2mvkn-eth0" Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.431 [INFO][4632] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" HandleID="k8s-pod-network.eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" Workload="localhost-k8s-coredns--674b8bbfcf--2mvkn-eth0" Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.431 [INFO][4632] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" HandleID="k8s-pod-network.eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" Workload="localhost-k8s-coredns--674b8bbfcf--2mvkn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000324140), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-2mvkn", "timestamp":"2025-08-19 08:21:30.431718331 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.432 [INFO][4632] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.432 [INFO][4632] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.432 [INFO][4632] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.437 [INFO][4632] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" host="localhost" Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.441 [INFO][4632] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.445 [INFO][4632] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.446 [INFO][4632] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.448 [INFO][4632] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.448 [INFO][4632] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" host="localhost" Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.449 [INFO][4632] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906 Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.452 [INFO][4632] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" host="localhost" Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.459 [INFO][4632] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" host="localhost" Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.459 [INFO][4632] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" host="localhost" Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.459 [INFO][4632] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:21:30.479060 containerd[1559]: 2025-08-19 08:21:30.459 [INFO][4632] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" HandleID="k8s-pod-network.eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" Workload="localhost-k8s-coredns--674b8bbfcf--2mvkn-eth0" Aug 19 08:21:30.479609 containerd[1559]: 2025-08-19 08:21:30.463 [INFO][4623] cni-plugin/k8s.go 418: Populated endpoint ContainerID="eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mvkn" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--2mvkn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--2mvkn-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"fa0f8f47-19d7-4561-813e-30ce4c8bb154", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 20, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-2mvkn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7cd883a148e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:30.479609 containerd[1559]: 2025-08-19 08:21:30.463 [INFO][4623] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mvkn" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--2mvkn-eth0" Aug 19 08:21:30.479609 containerd[1559]: 2025-08-19 08:21:30.463 [INFO][4623] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7cd883a148e ContainerID="eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mvkn" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--2mvkn-eth0" Aug 19 08:21:30.479609 containerd[1559]: 2025-08-19 08:21:30.466 [INFO][4623] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mvkn" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--2mvkn-eth0" Aug 19 08:21:30.479609 
containerd[1559]: 2025-08-19 08:21:30.467 [INFO][4623] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mvkn" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--2mvkn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--2mvkn-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"fa0f8f47-19d7-4561-813e-30ce4c8bb154", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 20, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906", Pod:"coredns-674b8bbfcf-2mvkn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7cd883a148e", MAC:"3a:7f:14:a7:63:fc", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:30.479609 containerd[1559]: 2025-08-19 08:21:30.474 [INFO][4623] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" Namespace="kube-system" Pod="coredns-674b8bbfcf-2mvkn" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--2mvkn-eth0" Aug 19 08:21:30.501597 containerd[1559]: time="2025-08-19T08:21:30.501121577Z" level=info msg="connecting to shim eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906" address="unix:///run/containerd/s/42873268f9bfdc31390c1d036bb45ff1da8e24d6111950e7009a3e8505186ee2" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:21:30.529202 systemd[1]: Started cri-containerd-eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906.scope - libcontainer container eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906. 
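The IPAM sequence logged above for coredns-674b8bbfcf-2mvkn (ipam.go 110 -> 691 -> 394 -> 511 -> 158 -> 235 -> 1220 -> 1764 -> 1243 -> 1256) amounts to: take the host-wide IPAM lock, confirm this host's affinity to the 192.168.88.128/26 block, claim the next free address from that block, write the claim back, and release the lock - which is how the pod ended up with 192.168.88.133. A minimal Go sketch of that loop follows; the types and names are illustrative only and are not Calico's actual API.

package main

import (
	"fmt"
	"net"
	"sync"
)

type block struct {
	cidr *net.IPNet
	used map[string]string // IP -> handle, e.g. k8s-pod-network.<containerID>
	mu   sync.Mutex        // stands in for the host-wide IPAM lock in the log
}

// next returns the numerically following IP address.
func next(ip net.IP) net.IP {
	out := make(net.IP, len(ip))
	copy(out, ip)
	for i := len(out) - 1; i >= 0; i-- {
		out[i]++
		if out[i] != 0 {
			break
		}
	}
	return out
}

// autoAssign mirrors the logged flow: acquire the lock, walk the affine block,
// claim the first free IP for the handle, then release the lock.
func (b *block) autoAssign(handle string) (net.IP, error) {
	b.mu.Lock()         // "About to acquire host-wide IPAM lock."
	defer b.mu.Unlock() // "Released host-wide IPAM lock."
	for ip := b.cidr.IP.Mask(b.cidr.Mask); b.cidr.Contains(ip); ip = next(ip) {
		if _, taken := b.used[ip.String()]; !taken {
			b.used[ip.String()] = handle // "Writing block in order to claim IPs"
			return ip, nil
		}
	}
	return nil, fmt.Errorf("block %s exhausted", b.cidr)
}

func main() {
	_, cidr, _ := net.ParseCIDR("192.168.88.128/26")
	b := &block{cidr: cidr, used: map[string]string{
		// .128-.132 were claimed earlier in this log.
		"192.168.88.128": "x", "192.168.88.129": "x", "192.168.88.130": "x",
		"192.168.88.131": "x", "192.168.88.132": "x",
	}}
	ip, _ := b.autoAssign("k8s-pod-network.eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906")
	fmt.Println(ip) // 192.168.88.133, matching the address assigned above
}

Each later pod in this log (goldmane, the calico-apiserver pods, csi-node-driver, calico-kube-controllers) repeats the same sequence against the same block and receives 192.168.88.134 through .137 in turn.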
Aug 19 08:21:30.542500 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:21:30.572953 containerd[1559]: time="2025-08-19T08:21:30.572908289Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2mvkn,Uid:fa0f8f47-19d7-4561-813e-30ce4c8bb154,Namespace:kube-system,Attempt:0,} returns sandbox id \"eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906\"" Aug 19 08:21:30.578417 containerd[1559]: time="2025-08-19T08:21:30.578369338Z" level=info msg="CreateContainer within sandbox \"eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 19 08:21:30.589752 containerd[1559]: time="2025-08-19T08:21:30.589710252Z" level=info msg="Container ff10e4d60233c3efec8009aeee6dcd43d5a6a5cb400a4ee2337c0aa3441aa015: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:21:30.599315 containerd[1559]: time="2025-08-19T08:21:30.599265111Z" level=info msg="CreateContainer within sandbox \"eeff0a45df8381b6e6f7eb12ff828a5481043fa4ae57f33e298f9fe659125906\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ff10e4d60233c3efec8009aeee6dcd43d5a6a5cb400a4ee2337c0aa3441aa015\"" Aug 19 08:21:30.599965 containerd[1559]: time="2025-08-19T08:21:30.599747327Z" level=info msg="StartContainer for \"ff10e4d60233c3efec8009aeee6dcd43d5a6a5cb400a4ee2337c0aa3441aa015\"" Aug 19 08:21:30.600650 containerd[1559]: time="2025-08-19T08:21:30.600621789Z" level=info msg="connecting to shim ff10e4d60233c3efec8009aeee6dcd43d5a6a5cb400a4ee2337c0aa3441aa015" address="unix:///run/containerd/s/42873268f9bfdc31390c1d036bb45ff1da8e24d6111950e7009a3e8505186ee2" protocol=ttrpc version=3 Aug 19 08:21:30.634370 systemd[1]: Started cri-containerd-ff10e4d60233c3efec8009aeee6dcd43d5a6a5cb400a4ee2337c0aa3441aa015.scope - libcontainer container ff10e4d60233c3efec8009aeee6dcd43d5a6a5cb400a4ee2337c0aa3441aa015. 
Aug 19 08:21:30.678691 containerd[1559]: time="2025-08-19T08:21:30.678639618Z" level=info msg="StartContainer for \"ff10e4d60233c3efec8009aeee6dcd43d5a6a5cb400a4ee2337c0aa3441aa015\" returns successfully" Aug 19 08:21:30.736787 containerd[1559]: time="2025-08-19T08:21:30.736582164Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:30.737760 containerd[1559]: time="2025-08-19T08:21:30.737714882Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Aug 19 08:21:30.739634 containerd[1559]: time="2025-08-19T08:21:30.739576439Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:30.742283 containerd[1559]: time="2025-08-19T08:21:30.742228971Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:30.743231 containerd[1559]: time="2025-08-19T08:21:30.743198002Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 2.412897794s" Aug 19 08:21:30.743278 containerd[1559]: time="2025-08-19T08:21:30.743233779Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Aug 19 08:21:30.744772 containerd[1559]: time="2025-08-19T08:21:30.744745779Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 08:21:30.749329 containerd[1559]: time="2025-08-19T08:21:30.749261602Z" level=info msg="CreateContainer within sandbox \"1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Aug 19 08:21:30.762754 containerd[1559]: time="2025-08-19T08:21:30.762708944Z" level=info msg="Container 0d9dde69e38f8628e4c873af3b599e45cc4e69c5ced619831623c751f98aa361: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:21:30.776931 containerd[1559]: time="2025-08-19T08:21:30.776869685Z" level=info msg="CreateContainer within sandbox \"1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"0d9dde69e38f8628e4c873af3b599e45cc4e69c5ced619831623c751f98aa361\"" Aug 19 08:21:30.777881 containerd[1559]: time="2025-08-19T08:21:30.777803198Z" level=info msg="StartContainer for \"0d9dde69e38f8628e4c873af3b599e45cc4e69c5ced619831623c751f98aa361\"" Aug 19 08:21:30.778937 containerd[1559]: time="2025-08-19T08:21:30.778903466Z" level=info msg="connecting to shim 0d9dde69e38f8628e4c873af3b599e45cc4e69c5ced619831623c751f98aa361" address="unix:///run/containerd/s/89515da887e8efa6d0a44830ef3dfe57d17c6a8eff211a9a3a0882f270dcb854" protocol=ttrpc version=3 Aug 19 08:21:30.788253 systemd-networkd[1467]: cali875f0d2ce9f: Gained IPv6LL Aug 19 08:21:30.810657 systemd[1]: Started cri-containerd-0d9dde69e38f8628e4c873af3b599e45cc4e69c5ced619831623c751f98aa361.scope - libcontainer container 
0d9dde69e38f8628e4c873af3b599e45cc4e69c5ced619831623c751f98aa361. Aug 19 08:21:30.864521 containerd[1559]: time="2025-08-19T08:21:30.864476606Z" level=info msg="StartContainer for \"0d9dde69e38f8628e4c873af3b599e45cc4e69c5ced619831623c751f98aa361\" returns successfully" Aug 19 08:21:31.362579 containerd[1559]: time="2025-08-19T08:21:31.362521181Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-5qz84,Uid:56a0933c-2685-4cd7-8b89-03d0cd7801cf,Namespace:calico-system,Attempt:0,}" Aug 19 08:21:31.362579 containerd[1559]: time="2025-08-19T08:21:31.362559433Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7459684667-rxhqh,Uid:4a25fc55-9c92-4016-bb5f-0473cc98f094,Namespace:calico-apiserver,Attempt:0,}" Aug 19 08:21:31.363098 containerd[1559]: time="2025-08-19T08:21:31.362902708Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7787566946-5j8pk,Uid:7677a958-0a64-4b43-a17a-755951db9108,Namespace:calico-system,Attempt:0,}" Aug 19 08:21:31.364103 containerd[1559]: time="2025-08-19T08:21:31.363983086Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b4j8r,Uid:074be308-2f59-4eab-ad49-1f332ee9401f,Namespace:calico-system,Attempt:0,}" Aug 19 08:21:31.513579 kubelet[2711]: I0819 08:21:31.513490 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-2mvkn" podStartSLOduration=37.51346757 podStartE2EDuration="37.51346757s" podCreationTimestamp="2025-08-19 08:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 08:21:31.510678631 +0000 UTC m=+44.236463948" watchObservedRunningTime="2025-08-19 08:21:31.51346757 +0000 UTC m=+44.239252888" Aug 19 08:21:31.546767 systemd-networkd[1467]: cali01bbd391799: Link UP Aug 19 08:21:31.549640 systemd-networkd[1467]: cali01bbd391799: Gained carrier Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.435 [INFO][4779] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--768f4c5c69--5qz84-eth0 goldmane-768f4c5c69- calico-system 56a0933c-2685-4cd7-8b89-03d0cd7801cf 874 0 2025-08-19 08:21:05 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:768f4c5c69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-768f4c5c69-5qz84 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali01bbd391799 [] [] }} ContainerID="15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" Namespace="calico-system" Pod="goldmane-768f4c5c69-5qz84" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5qz84-" Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.436 [INFO][4779] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" Namespace="calico-system" Pod="goldmane-768f4c5c69-5qz84" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5qz84-eth0" Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.485 [INFO][4826] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" HandleID="k8s-pod-network.15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" 
Workload="localhost-k8s-goldmane--768f4c5c69--5qz84-eth0" Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.486 [INFO][4826] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" HandleID="k8s-pod-network.15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" Workload="localhost-k8s-goldmane--768f4c5c69--5qz84-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00034c9c0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-768f4c5c69-5qz84", "timestamp":"2025-08-19 08:21:31.485309088 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.486 [INFO][4826] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.486 [INFO][4826] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.486 [INFO][4826] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.493 [INFO][4826] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" host="localhost" Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.504 [INFO][4826] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.509 [INFO][4826] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.513 [INFO][4826] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.517 [INFO][4826] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.517 [INFO][4826] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" host="localhost" Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.519 [INFO][4826] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8 Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.527 [INFO][4826] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" host="localhost" Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.535 [INFO][4826] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" host="localhost" Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.535 [INFO][4826] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" host="localhost" Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.535 [INFO][4826] ipam/ipam_plugin.go 
374: Released host-wide IPAM lock. Aug 19 08:21:31.574907 containerd[1559]: 2025-08-19 08:21:31.535 [INFO][4826] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" HandleID="k8s-pod-network.15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" Workload="localhost-k8s-goldmane--768f4c5c69--5qz84-eth0" Aug 19 08:21:31.576026 containerd[1559]: 2025-08-19 08:21:31.541 [INFO][4779] cni-plugin/k8s.go 418: Populated endpoint ContainerID="15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" Namespace="calico-system" Pod="goldmane-768f4c5c69-5qz84" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5qz84-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--768f4c5c69--5qz84-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"56a0933c-2685-4cd7-8b89-03d0cd7801cf", ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 21, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-768f4c5c69-5qz84", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali01bbd391799", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:31.576026 containerd[1559]: 2025-08-19 08:21:31.541 [INFO][4779] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" Namespace="calico-system" Pod="goldmane-768f4c5c69-5qz84" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5qz84-eth0" Aug 19 08:21:31.576026 containerd[1559]: 2025-08-19 08:21:31.541 [INFO][4779] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali01bbd391799 ContainerID="15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" Namespace="calico-system" Pod="goldmane-768f4c5c69-5qz84" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5qz84-eth0" Aug 19 08:21:31.576026 containerd[1559]: 2025-08-19 08:21:31.556 [INFO][4779] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" Namespace="calico-system" Pod="goldmane-768f4c5c69-5qz84" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5qz84-eth0" Aug 19 08:21:31.576026 containerd[1559]: 2025-08-19 08:21:31.557 [INFO][4779] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" Namespace="calico-system" Pod="goldmane-768f4c5c69-5qz84" 
WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5qz84-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--768f4c5c69--5qz84-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"56a0933c-2685-4cd7-8b89-03d0cd7801cf", ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 21, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8", Pod:"goldmane-768f4c5c69-5qz84", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali01bbd391799", MAC:"3a:78:1a:13:49:5b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:31.576026 containerd[1559]: 2025-08-19 08:21:31.569 [INFO][4779] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" Namespace="calico-system" Pod="goldmane-768f4c5c69-5qz84" WorkloadEndpoint="localhost-k8s-goldmane--768f4c5c69--5qz84-eth0" Aug 19 08:21:31.621405 systemd-networkd[1467]: cali7cd883a148e: Gained IPv6LL Aug 19 08:21:31.629942 containerd[1559]: time="2025-08-19T08:21:31.629881275Z" level=info msg="connecting to shim 15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8" address="unix:///run/containerd/s/d2c25421e0d7997d2b40f0f9fd5b14530063377e4c7c5b05acab7c4ec783fe7d" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:21:31.654055 systemd-networkd[1467]: cali43f3092cc42: Link UP Aug 19 08:21:31.655821 systemd-networkd[1467]: cali43f3092cc42: Gained carrier Aug 19 08:21:31.670245 systemd[1]: Started cri-containerd-15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8.scope - libcontainer container 15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8. 
Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.446 [INFO][4768] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0 calico-apiserver-7459684667- calico-apiserver 4a25fc55-9c92-4016-bb5f-0473cc98f094 875 0 2025-08-19 08:21:03 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7459684667 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7459684667-rxhqh eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali43f3092cc42 [] [] }} ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-rxhqh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--rxhqh-" Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.447 [INFO][4768] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-rxhqh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0" Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.489 [INFO][4840] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" HandleID="k8s-pod-network.bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Workload="localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0" Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.489 [INFO][4840] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" HandleID="k8s-pod-network.bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Workload="localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00021f740), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7459684667-rxhqh", "timestamp":"2025-08-19 08:21:31.489369986 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.489 [INFO][4840] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.537 [INFO][4840] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.538 [INFO][4840] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.609 [INFO][4840] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" host="localhost" Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.615 [INFO][4840] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.620 [INFO][4840] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.625 [INFO][4840] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.629 [INFO][4840] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.630 [INFO][4840] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" host="localhost" Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.632 [INFO][4840] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.636 [INFO][4840] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" host="localhost" Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.644 [INFO][4840] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" host="localhost" Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.644 [INFO][4840] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" host="localhost" Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.644 [INFO][4840] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:21:31.672789 containerd[1559]: 2025-08-19 08:21:31.644 [INFO][4840] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" HandleID="k8s-pod-network.bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Workload="localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0" Aug 19 08:21:31.673534 containerd[1559]: 2025-08-19 08:21:31.648 [INFO][4768] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-rxhqh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0", GenerateName:"calico-apiserver-7459684667-", Namespace:"calico-apiserver", SelfLink:"", UID:"4a25fc55-9c92-4016-bb5f-0473cc98f094", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 21, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7459684667", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7459684667-rxhqh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali43f3092cc42", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:31.673534 containerd[1559]: 2025-08-19 08:21:31.648 [INFO][4768] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-rxhqh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0" Aug 19 08:21:31.673534 containerd[1559]: 2025-08-19 08:21:31.648 [INFO][4768] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali43f3092cc42 ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-rxhqh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0" Aug 19 08:21:31.673534 containerd[1559]: 2025-08-19 08:21:31.656 [INFO][4768] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-rxhqh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0" Aug 19 08:21:31.673534 containerd[1559]: 2025-08-19 08:21:31.657 [INFO][4768] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-rxhqh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0", GenerateName:"calico-apiserver-7459684667-", Namespace:"calico-apiserver", SelfLink:"", UID:"4a25fc55-9c92-4016-bb5f-0473cc98f094", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 21, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7459684667", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea", Pod:"calico-apiserver-7459684667-rxhqh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali43f3092cc42", MAC:"da:4e:39:d9:88:d2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:31.673534 containerd[1559]: 2025-08-19 08:21:31.668 [INFO][4768] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Namespace="calico-apiserver" Pod="calico-apiserver-7459684667-rxhqh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0" Aug 19 08:21:31.689266 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:21:31.704121 containerd[1559]: time="2025-08-19T08:21:31.704063064Z" level=info msg="connecting to shim bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" address="unix:///run/containerd/s/b4a6a35f1e9cda6321abaff59ac444d2820ebb5ac3ba3936828979b5d7dea8d8" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:21:31.730243 systemd[1]: Started cri-containerd-bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea.scope - libcontainer container bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea. 
Aug 19 08:21:31.735025 containerd[1559]: time="2025-08-19T08:21:31.734956853Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-5qz84,Uid:56a0933c-2685-4cd7-8b89-03d0cd7801cf,Namespace:calico-system,Attempt:0,} returns sandbox id \"15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8\"" Aug 19 08:21:31.756169 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:21:31.760948 systemd-networkd[1467]: calia127e7aa30a: Link UP Aug 19 08:21:31.762535 systemd-networkd[1467]: calia127e7aa30a: Gained carrier Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.448 [INFO][4800] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--b4j8r-eth0 csi-node-driver- calico-system 074be308-2f59-4eab-ad49-1f332ee9401f 748 0 2025-08-19 08:21:06 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:8967bcb6f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-b4j8r eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calia127e7aa30a [] [] }} ContainerID="55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" Namespace="calico-system" Pod="csi-node-driver-b4j8r" WorkloadEndpoint="localhost-k8s-csi--node--driver--b4j8r-" Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.448 [INFO][4800] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" Namespace="calico-system" Pod="csi-node-driver-b4j8r" WorkloadEndpoint="localhost-k8s-csi--node--driver--b4j8r-eth0" Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.492 [INFO][4839] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" HandleID="k8s-pod-network.55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" Workload="localhost-k8s-csi--node--driver--b4j8r-eth0" Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.493 [INFO][4839] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" HandleID="k8s-pod-network.55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" Workload="localhost-k8s-csi--node--driver--b4j8r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c7280), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-b4j8r", "timestamp":"2025-08-19 08:21:31.492644157 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.493 [INFO][4839] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.644 [INFO][4839] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.646 [INFO][4839] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.695 [INFO][4839] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" host="localhost" Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.717 [INFO][4839] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.726 [INFO][4839] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.732 [INFO][4839] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.735 [INFO][4839] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.735 [INFO][4839] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" host="localhost" Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.738 [INFO][4839] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802 Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.742 [INFO][4839] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" host="localhost" Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.749 [INFO][4839] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" host="localhost" Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.749 [INFO][4839] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" host="localhost" Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.749 [INFO][4839] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:21:31.791625 containerd[1559]: 2025-08-19 08:21:31.749 [INFO][4839] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" HandleID="k8s-pod-network.55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" Workload="localhost-k8s-csi--node--driver--b4j8r-eth0" Aug 19 08:21:31.792403 containerd[1559]: 2025-08-19 08:21:31.754 [INFO][4800] cni-plugin/k8s.go 418: Populated endpoint ContainerID="55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" Namespace="calico-system" Pod="csi-node-driver-b4j8r" WorkloadEndpoint="localhost-k8s-csi--node--driver--b4j8r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--b4j8r-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"074be308-2f59-4eab-ad49-1f332ee9401f", ResourceVersion:"748", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 21, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-b4j8r", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calia127e7aa30a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:31.792403 containerd[1559]: 2025-08-19 08:21:31.755 [INFO][4800] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" Namespace="calico-system" Pod="csi-node-driver-b4j8r" WorkloadEndpoint="localhost-k8s-csi--node--driver--b4j8r-eth0" Aug 19 08:21:31.792403 containerd[1559]: 2025-08-19 08:21:31.755 [INFO][4800] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia127e7aa30a ContainerID="55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" Namespace="calico-system" Pod="csi-node-driver-b4j8r" WorkloadEndpoint="localhost-k8s-csi--node--driver--b4j8r-eth0" Aug 19 08:21:31.792403 containerd[1559]: 2025-08-19 08:21:31.762 [INFO][4800] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" Namespace="calico-system" Pod="csi-node-driver-b4j8r" WorkloadEndpoint="localhost-k8s-csi--node--driver--b4j8r-eth0" Aug 19 08:21:31.792403 containerd[1559]: 2025-08-19 08:21:31.770 [INFO][4800] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" Namespace="calico-system" Pod="csi-node-driver-b4j8r" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--b4j8r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--b4j8r-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"074be308-2f59-4eab-ad49-1f332ee9401f", ResourceVersion:"748", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 21, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802", Pod:"csi-node-driver-b4j8r", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calia127e7aa30a", MAC:"32:aa:b5:ef:ac:66", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:31.792403 containerd[1559]: 2025-08-19 08:21:31.786 [INFO][4800] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" Namespace="calico-system" Pod="csi-node-driver-b4j8r" WorkloadEndpoint="localhost-k8s-csi--node--driver--b4j8r-eth0" Aug 19 08:21:31.806924 containerd[1559]: time="2025-08-19T08:21:31.806879319Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7459684667-rxhqh,Uid:4a25fc55-9c92-4016-bb5f-0473cc98f094,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea\"" Aug 19 08:21:31.823104 containerd[1559]: time="2025-08-19T08:21:31.822940978Z" level=info msg="connecting to shim 55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802" address="unix:///run/containerd/s/bd0f025f21390bc70fa65912db0294e96e34e4c7b47e77ac42d6b0950a1157fe" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:21:31.856355 systemd[1]: Started cri-containerd-55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802.scope - libcontainer container 55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802. 
Aug 19 08:21:31.882183 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:21:31.898621 systemd-networkd[1467]: cali4b66da806a8: Link UP Aug 19 08:21:31.903155 systemd-networkd[1467]: cali4b66da806a8: Gained carrier Aug 19 08:21:31.919387 containerd[1559]: time="2025-08-19T08:21:31.919331142Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b4j8r,Uid:074be308-2f59-4eab-ad49-1f332ee9401f,Namespace:calico-system,Attempt:0,} returns sandbox id \"55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802\"" Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.436 [INFO][4789] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7787566946--5j8pk-eth0 calico-kube-controllers-7787566946- calico-system 7677a958-0a64-4b43-a17a-755951db9108 861 0 2025-08-19 08:21:06 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7787566946 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-7787566946-5j8pk eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali4b66da806a8 [] [] }} ContainerID="050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" Namespace="calico-system" Pod="calico-kube-controllers-7787566946-5j8pk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7787566946--5j8pk-" Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.437 [INFO][4789] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" Namespace="calico-system" Pod="calico-kube-controllers-7787566946-5j8pk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7787566946--5j8pk-eth0" Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.498 [INFO][4828] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" HandleID="k8s-pod-network.050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" Workload="localhost-k8s-calico--kube--controllers--7787566946--5j8pk-eth0" Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.498 [INFO][4828] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" HandleID="k8s-pod-network.050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" Workload="localhost-k8s-calico--kube--controllers--7787566946--5j8pk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c6f00), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7787566946-5j8pk", "timestamp":"2025-08-19 08:21:31.498255286 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.498 [INFO][4828] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.750 [INFO][4828] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.750 [INFO][4828] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.795 [INFO][4828] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" host="localhost" Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.817 [INFO][4828] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.824 [INFO][4828] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.827 [INFO][4828] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.829 [INFO][4828] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.829 [INFO][4828] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" host="localhost" Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.836 [INFO][4828] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9 Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.844 [INFO][4828] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" host="localhost" Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.864 [INFO][4828] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.137/26] block=192.168.88.128/26 handle="k8s-pod-network.050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" host="localhost" Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.866 [INFO][4828] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.137/26] handle="k8s-pod-network.050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" host="localhost" Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.867 [INFO][4828] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 08:21:31.928168 containerd[1559]: 2025-08-19 08:21:31.867 [INFO][4828] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.137/26] IPv6=[] ContainerID="050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" HandleID="k8s-pod-network.050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" Workload="localhost-k8s-calico--kube--controllers--7787566946--5j8pk-eth0" Aug 19 08:21:31.928750 containerd[1559]: 2025-08-19 08:21:31.886 [INFO][4789] cni-plugin/k8s.go 418: Populated endpoint ContainerID="050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" Namespace="calico-system" Pod="calico-kube-controllers-7787566946-5j8pk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7787566946--5j8pk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7787566946--5j8pk-eth0", GenerateName:"calico-kube-controllers-7787566946-", Namespace:"calico-system", SelfLink:"", UID:"7677a958-0a64-4b43-a17a-755951db9108", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 21, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7787566946", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7787566946-5j8pk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4b66da806a8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:31.928750 containerd[1559]: 2025-08-19 08:21:31.886 [INFO][4789] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.137/32] ContainerID="050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" Namespace="calico-system" Pod="calico-kube-controllers-7787566946-5j8pk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7787566946--5j8pk-eth0" Aug 19 08:21:31.928750 containerd[1559]: 2025-08-19 08:21:31.886 [INFO][4789] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4b66da806a8 ContainerID="050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" Namespace="calico-system" Pod="calico-kube-controllers-7787566946-5j8pk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7787566946--5j8pk-eth0" Aug 19 08:21:31.928750 containerd[1559]: 2025-08-19 08:21:31.909 [INFO][4789] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" Namespace="calico-system" Pod="calico-kube-controllers-7787566946-5j8pk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7787566946--5j8pk-eth0" Aug 19 08:21:31.928750 containerd[1559]: 2025-08-19 08:21:31.914 [INFO][4789] cni-plugin/k8s.go 446: Added Mac, 
interface name, and active container ID to endpoint ContainerID="050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" Namespace="calico-system" Pod="calico-kube-controllers-7787566946-5j8pk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7787566946--5j8pk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7787566946--5j8pk-eth0", GenerateName:"calico-kube-controllers-7787566946-", Namespace:"calico-system", SelfLink:"", UID:"7677a958-0a64-4b43-a17a-755951db9108", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 8, 21, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7787566946", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9", Pod:"calico-kube-controllers-7787566946-5j8pk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4b66da806a8", MAC:"a2:71:42:67:3d:b6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 08:21:31.928750 containerd[1559]: 2025-08-19 08:21:31.923 [INFO][4789] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" Namespace="calico-system" Pod="calico-kube-controllers-7787566946-5j8pk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7787566946--5j8pk-eth0" Aug 19 08:21:31.954376 containerd[1559]: time="2025-08-19T08:21:31.954313251Z" level=info msg="connecting to shim 050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9" address="unix:///run/containerd/s/cedbb5c6eff67355ab6a83b64479b4b1f91c57b2ef587d7d168ddbdc6b8b477d" namespace=k8s.io protocol=ttrpc version=3 Aug 19 08:21:31.984269 systemd[1]: Started cri-containerd-050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9.scope - libcontainer container 050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9. Aug 19 08:21:31.998650 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Aug 19 08:21:32.032382 containerd[1559]: time="2025-08-19T08:21:32.032254736Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7787566946-5j8pk,Uid:7677a958-0a64-4b43-a17a-755951db9108,Namespace:calico-system,Attempt:0,} returns sandbox id \"050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9\"" Aug 19 08:21:32.762826 systemd[1]: Started sshd@8-10.0.0.150:22-10.0.0.1:55376.service - OpenSSH per-connection server daemon (10.0.0.1:55376). 
Aug 19 08:21:32.836187 sshd[5089]: Accepted publickey for core from 10.0.0.1 port 55376 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:21:32.838654 sshd-session[5089]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:21:32.844275 systemd-logind[1540]: New session 9 of user core. Aug 19 08:21:32.854262 systemd[1]: Started session-9.scope - Session 9 of User core. Aug 19 08:21:32.900373 systemd-networkd[1467]: calia127e7aa30a: Gained IPv6LL Aug 19 08:21:33.000335 sshd[5092]: Connection closed by 10.0.0.1 port 55376 Aug 19 08:21:33.000817 sshd-session[5089]: pam_unix(sshd:session): session closed for user core Aug 19 08:21:33.006754 systemd[1]: sshd@8-10.0.0.150:22-10.0.0.1:55376.service: Deactivated successfully. Aug 19 08:21:33.009161 systemd[1]: session-9.scope: Deactivated successfully. Aug 19 08:21:33.010206 systemd-logind[1540]: Session 9 logged out. Waiting for processes to exit. Aug 19 08:21:33.011693 systemd-logind[1540]: Removed session 9. Aug 19 08:21:33.412325 systemd-networkd[1467]: cali4b66da806a8: Gained IPv6LL Aug 19 08:21:33.476407 systemd-networkd[1467]: cali01bbd391799: Gained IPv6LL Aug 19 08:21:33.540372 systemd-networkd[1467]: cali43f3092cc42: Gained IPv6LL Aug 19 08:21:35.268718 containerd[1559]: time="2025-08-19T08:21:35.268648138Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:35.269604 containerd[1559]: time="2025-08-19T08:21:35.269578015Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Aug 19 08:21:35.270952 containerd[1559]: time="2025-08-19T08:21:35.270870181Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:35.273408 containerd[1559]: time="2025-08-19T08:21:35.273353665Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:35.273909 containerd[1559]: time="2025-08-19T08:21:35.273870004Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 4.529065474s" Aug 19 08:21:35.273972 containerd[1559]: time="2025-08-19T08:21:35.273913195Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 19 08:21:35.275127 containerd[1559]: time="2025-08-19T08:21:35.275052494Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 08:21:35.337496 containerd[1559]: time="2025-08-19T08:21:35.337435274Z" level=info msg="CreateContainer within sandbox \"ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 08:21:35.779328 containerd[1559]: time="2025-08-19T08:21:35.779275477Z" level=info msg="Container fd3ec1f26a902cfc732cd2dbb6a127201fa8101bf44bcd53a91ce3c59924dc58: CDI devices from CRI Config.CDIDevices: []" 
Aug 19 08:21:35.807212 containerd[1559]: time="2025-08-19T08:21:35.790415596Z" level=info msg="CreateContainer within sandbox \"ddebd733652dc2535951049b6de67e9a1d9c43cab2f383306ee1759ae9f78eb5\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"fd3ec1f26a902cfc732cd2dbb6a127201fa8101bf44bcd53a91ce3c59924dc58\"" Aug 19 08:21:35.808121 containerd[1559]: time="2025-08-19T08:21:35.807767488Z" level=info msg="StartContainer for \"fd3ec1f26a902cfc732cd2dbb6a127201fa8101bf44bcd53a91ce3c59924dc58\"" Aug 19 08:21:35.809028 containerd[1559]: time="2025-08-19T08:21:35.809001486Z" level=info msg="connecting to shim fd3ec1f26a902cfc732cd2dbb6a127201fa8101bf44bcd53a91ce3c59924dc58" address="unix:///run/containerd/s/b09388226c18caa93ca0abb24048d9845cbf2af80c60ebbcf160abf985244d15" protocol=ttrpc version=3 Aug 19 08:21:35.814172 containerd[1559]: time="2025-08-19T08:21:35.814051518Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:35.814557 containerd[1559]: time="2025-08-19T08:21:35.814508647Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Aug 19 08:21:35.818489 containerd[1559]: time="2025-08-19T08:21:35.818425121Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 543.295501ms" Aug 19 08:21:35.818489 containerd[1559]: time="2025-08-19T08:21:35.818466368Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 19 08:21:35.820971 containerd[1559]: time="2025-08-19T08:21:35.820936166Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Aug 19 08:21:35.828752 containerd[1559]: time="2025-08-19T08:21:35.828238909Z" level=info msg="CreateContainer within sandbox \"9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 08:21:35.836742 containerd[1559]: time="2025-08-19T08:21:35.836667405Z" level=info msg="Container 736fc0ff2efd019971b860acf4c3a5367fe684576510dced52948913f173f793: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:21:35.840224 systemd[1]: Started cri-containerd-fd3ec1f26a902cfc732cd2dbb6a127201fa8101bf44bcd53a91ce3c59924dc58.scope - libcontainer container fd3ec1f26a902cfc732cd2dbb6a127201fa8101bf44bcd53a91ce3c59924dc58. 
Aug 19 08:21:35.846394 containerd[1559]: time="2025-08-19T08:21:35.846355056Z" level=info msg="CreateContainer within sandbox \"9afa46b97722c9d88f6a762371ec36cdde48eaee52c38570e106f58f3aaef0f0\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"736fc0ff2efd019971b860acf4c3a5367fe684576510dced52948913f173f793\"" Aug 19 08:21:35.848162 containerd[1559]: time="2025-08-19T08:21:35.847674784Z" level=info msg="StartContainer for \"736fc0ff2efd019971b860acf4c3a5367fe684576510dced52948913f173f793\"" Aug 19 08:21:35.849618 containerd[1559]: time="2025-08-19T08:21:35.849580863Z" level=info msg="connecting to shim 736fc0ff2efd019971b860acf4c3a5367fe684576510dced52948913f173f793" address="unix:///run/containerd/s/9880e3b7033a278c8518812a6b7c303e6f8890120abd0e6ee5489c9e16d37921" protocol=ttrpc version=3 Aug 19 08:21:35.881214 systemd[1]: Started cri-containerd-736fc0ff2efd019971b860acf4c3a5367fe684576510dced52948913f173f793.scope - libcontainer container 736fc0ff2efd019971b860acf4c3a5367fe684576510dced52948913f173f793. Aug 19 08:21:35.902807 containerd[1559]: time="2025-08-19T08:21:35.902729543Z" level=info msg="StartContainer for \"fd3ec1f26a902cfc732cd2dbb6a127201fa8101bf44bcd53a91ce3c59924dc58\" returns successfully" Aug 19 08:21:35.942675 containerd[1559]: time="2025-08-19T08:21:35.942621751Z" level=info msg="StartContainer for \"736fc0ff2efd019971b860acf4c3a5367fe684576510dced52948913f173f793\" returns successfully" Aug 19 08:21:36.533286 kubelet[2711]: I0819 08:21:36.533195 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-56bb94f46f-lkqqb" podStartSLOduration=25.744431529 podStartE2EDuration="32.533176616s" podCreationTimestamp="2025-08-19 08:21:04 +0000 UTC" firstStartedPulling="2025-08-19 08:21:29.030770491 +0000 UTC m=+41.756555808" lastFinishedPulling="2025-08-19 08:21:35.819515578 +0000 UTC m=+48.545300895" observedRunningTime="2025-08-19 08:21:36.532147123 +0000 UTC m=+49.257932440" watchObservedRunningTime="2025-08-19 08:21:36.533176616 +0000 UTC m=+49.258961933" Aug 19 08:21:36.553031 kubelet[2711]: I0819 08:21:36.552685 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7459684667-2qvhr" podStartSLOduration=27.182288399 podStartE2EDuration="33.552671129s" podCreationTimestamp="2025-08-19 08:21:03 +0000 UTC" firstStartedPulling="2025-08-19 08:21:28.904376133 +0000 UTC m=+41.630161460" lastFinishedPulling="2025-08-19 08:21:35.274758873 +0000 UTC m=+48.000544190" observedRunningTime="2025-08-19 08:21:36.552362609 +0000 UTC m=+49.278147926" watchObservedRunningTime="2025-08-19 08:21:36.552671129 +0000 UTC m=+49.278456436" Aug 19 08:21:37.543829 kubelet[2711]: I0819 08:21:37.543782 2711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 08:21:37.544370 kubelet[2711]: I0819 08:21:37.543788 2711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 08:21:38.017367 systemd[1]: Started sshd@9-10.0.0.150:22-10.0.0.1:55382.service - OpenSSH per-connection server daemon (10.0.0.1:55382). Aug 19 08:21:38.085924 sshd[5205]: Accepted publickey for core from 10.0.0.1 port 55382 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:21:38.087926 sshd-session[5205]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:21:38.093359 systemd-logind[1540]: New session 10 of user core. 
Aug 19 08:21:38.097216 systemd[1]: Started session-10.scope - Session 10 of User core. Aug 19 08:21:38.235204 sshd[5208]: Connection closed by 10.0.0.1 port 55382 Aug 19 08:21:38.235597 sshd-session[5205]: pam_unix(sshd:session): session closed for user core Aug 19 08:21:38.240497 systemd[1]: sshd@9-10.0.0.150:22-10.0.0.1:55382.service: Deactivated successfully. Aug 19 08:21:38.242832 systemd[1]: session-10.scope: Deactivated successfully. Aug 19 08:21:38.243606 systemd-logind[1540]: Session 10 logged out. Waiting for processes to exit. Aug 19 08:21:38.244781 systemd-logind[1540]: Removed session 10. Aug 19 08:21:38.654011 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount325977472.mount: Deactivated successfully. Aug 19 08:21:38.695863 containerd[1559]: time="2025-08-19T08:21:38.695798327Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:38.696637 containerd[1559]: time="2025-08-19T08:21:38.696573912Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Aug 19 08:21:38.697852 containerd[1559]: time="2025-08-19T08:21:38.697803381Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:38.700243 containerd[1559]: time="2025-08-19T08:21:38.700197255Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:38.700796 containerd[1559]: time="2025-08-19T08:21:38.700765111Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 2.879605366s" Aug 19 08:21:38.700796 containerd[1559]: time="2025-08-19T08:21:38.700796279Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Aug 19 08:21:38.701799 containerd[1559]: time="2025-08-19T08:21:38.701739400Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Aug 19 08:21:38.707192 containerd[1559]: time="2025-08-19T08:21:38.707145409Z" level=info msg="CreateContainer within sandbox \"1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Aug 19 08:21:38.716867 containerd[1559]: time="2025-08-19T08:21:38.716809402Z" level=info msg="Container 1ff4104177e0eea9f1e65dc14ec656a448b259409ce112e5a7a5c47a039789e2: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:21:38.725843 containerd[1559]: time="2025-08-19T08:21:38.725783971Z" level=info msg="CreateContainer within sandbox \"1b27dce693818a52e5e0bf41d88933189767a30c778d32c3a846e011e17b4c63\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"1ff4104177e0eea9f1e65dc14ec656a448b259409ce112e5a7a5c47a039789e2\"" Aug 19 08:21:38.726491 containerd[1559]: time="2025-08-19T08:21:38.726387755Z" level=info msg="StartContainer for 
\"1ff4104177e0eea9f1e65dc14ec656a448b259409ce112e5a7a5c47a039789e2\"" Aug 19 08:21:38.727799 containerd[1559]: time="2025-08-19T08:21:38.727765642Z" level=info msg="connecting to shim 1ff4104177e0eea9f1e65dc14ec656a448b259409ce112e5a7a5c47a039789e2" address="unix:///run/containerd/s/89515da887e8efa6d0a44830ef3dfe57d17c6a8eff211a9a3a0882f270dcb854" protocol=ttrpc version=3 Aug 19 08:21:38.803248 systemd[1]: Started cri-containerd-1ff4104177e0eea9f1e65dc14ec656a448b259409ce112e5a7a5c47a039789e2.scope - libcontainer container 1ff4104177e0eea9f1e65dc14ec656a448b259409ce112e5a7a5c47a039789e2. Aug 19 08:21:39.168744 containerd[1559]: time="2025-08-19T08:21:39.168607436Z" level=info msg="StartContainer for \"1ff4104177e0eea9f1e65dc14ec656a448b259409ce112e5a7a5c47a039789e2\" returns successfully" Aug 19 08:21:39.626853 kubelet[2711]: I0819 08:21:39.626780 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-fd6f5fc45-jvmlq" podStartSLOduration=2.254871773 podStartE2EDuration="12.626763616s" podCreationTimestamp="2025-08-19 08:21:27 +0000 UTC" firstStartedPulling="2025-08-19 08:21:28.3296839 +0000 UTC m=+41.055469217" lastFinishedPulling="2025-08-19 08:21:38.701575743 +0000 UTC m=+51.427361060" observedRunningTime="2025-08-19 08:21:39.626274547 +0000 UTC m=+52.352059874" watchObservedRunningTime="2025-08-19 08:21:39.626763616 +0000 UTC m=+52.352548933" Aug 19 08:21:40.993455 kubelet[2711]: I0819 08:21:40.993400 2711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 08:21:41.225964 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1728639533.mount: Deactivated successfully. Aug 19 08:21:41.749673 containerd[1559]: time="2025-08-19T08:21:41.749615661Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:41.750467 containerd[1559]: time="2025-08-19T08:21:41.750432997Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308" Aug 19 08:21:41.751659 containerd[1559]: time="2025-08-19T08:21:41.751629321Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:41.753947 containerd[1559]: time="2025-08-19T08:21:41.753875207Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:41.754384 containerd[1559]: time="2025-08-19T08:21:41.754359176Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 3.052571555s" Aug 19 08:21:41.754428 containerd[1559]: time="2025-08-19T08:21:41.754387749Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Aug 19 08:21:41.755290 containerd[1559]: time="2025-08-19T08:21:41.755251841Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 08:21:41.759548 containerd[1559]: 
time="2025-08-19T08:21:41.759514873Z" level=info msg="CreateContainer within sandbox \"15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Aug 19 08:21:41.771863 containerd[1559]: time="2025-08-19T08:21:41.771825712Z" level=info msg="Container 48dd0603b359c09b0657ea63b6288b1b4cf451990a5ab5799be86e73e4ca65ec: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:21:41.780842 containerd[1559]: time="2025-08-19T08:21:41.780800850Z" level=info msg="CreateContainer within sandbox \"15e5c8a9ea91476869e6ce648b3a0e04455c8b97654f6b963962471da43bbdf8\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"48dd0603b359c09b0657ea63b6288b1b4cf451990a5ab5799be86e73e4ca65ec\"" Aug 19 08:21:41.782798 containerd[1559]: time="2025-08-19T08:21:41.781364367Z" level=info msg="StartContainer for \"48dd0603b359c09b0657ea63b6288b1b4cf451990a5ab5799be86e73e4ca65ec\"" Aug 19 08:21:41.782798 containerd[1559]: time="2025-08-19T08:21:41.782423696Z" level=info msg="connecting to shim 48dd0603b359c09b0657ea63b6288b1b4cf451990a5ab5799be86e73e4ca65ec" address="unix:///run/containerd/s/d2c25421e0d7997d2b40f0f9fd5b14530063377e4c7c5b05acab7c4ec783fe7d" protocol=ttrpc version=3 Aug 19 08:21:41.811257 systemd[1]: Started cri-containerd-48dd0603b359c09b0657ea63b6288b1b4cf451990a5ab5799be86e73e4ca65ec.scope - libcontainer container 48dd0603b359c09b0657ea63b6288b1b4cf451990a5ab5799be86e73e4ca65ec. Aug 19 08:21:41.860750 containerd[1559]: time="2025-08-19T08:21:41.860702242Z" level=info msg="StartContainer for \"48dd0603b359c09b0657ea63b6288b1b4cf451990a5ab5799be86e73e4ca65ec\" returns successfully" Aug 19 08:21:42.360759 containerd[1559]: time="2025-08-19T08:21:42.360696357Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:42.361468 containerd[1559]: time="2025-08-19T08:21:42.361424383Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Aug 19 08:21:42.363127 containerd[1559]: time="2025-08-19T08:21:42.363088717Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 607.79647ms" Aug 19 08:21:42.363127 containerd[1559]: time="2025-08-19T08:21:42.363127430Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 19 08:21:42.364117 containerd[1559]: time="2025-08-19T08:21:42.364061503Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Aug 19 08:21:42.368024 containerd[1559]: time="2025-08-19T08:21:42.367993373Z" level=info msg="CreateContainer within sandbox \"bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 08:21:42.376519 containerd[1559]: time="2025-08-19T08:21:42.376278965Z" level=info msg="Container 9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:21:42.386859 containerd[1559]: time="2025-08-19T08:21:42.386788712Z" level=info msg="CreateContainer within sandbox 
\"bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473\"" Aug 19 08:21:42.387289 containerd[1559]: time="2025-08-19T08:21:42.387254666Z" level=info msg="StartContainer for \"9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473\"" Aug 19 08:21:42.393223 containerd[1559]: time="2025-08-19T08:21:42.393184978Z" level=info msg="connecting to shim 9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473" address="unix:///run/containerd/s/b4a6a35f1e9cda6321abaff59ac444d2820ebb5ac3ba3936828979b5d7dea8d8" protocol=ttrpc version=3 Aug 19 08:21:42.415212 systemd[1]: Started cri-containerd-9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473.scope - libcontainer container 9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473. Aug 19 08:21:42.462905 containerd[1559]: time="2025-08-19T08:21:42.462834601Z" level=info msg="StartContainer for \"9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473\" returns successfully" Aug 19 08:21:42.575385 kubelet[2711]: I0819 08:21:42.575144 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-768f4c5c69-5qz84" podStartSLOduration=27.556934374 podStartE2EDuration="37.57512996s" podCreationTimestamp="2025-08-19 08:21:05 +0000 UTC" firstStartedPulling="2025-08-19 08:21:31.736908148 +0000 UTC m=+44.462693455" lastFinishedPulling="2025-08-19 08:21:41.755103723 +0000 UTC m=+54.480889041" observedRunningTime="2025-08-19 08:21:42.574651702 +0000 UTC m=+55.300437019" watchObservedRunningTime="2025-08-19 08:21:42.57512996 +0000 UTC m=+55.300915277" Aug 19 08:21:42.662839 containerd[1559]: time="2025-08-19T08:21:42.662721921Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48dd0603b359c09b0657ea63b6288b1b4cf451990a5ab5799be86e73e4ca65ec\" id:\"2eec2cd3f0f1a7a70b89215e6a61a0896ce9e3c0a192dad899780c947f597a09\" pid:5367 exit_status:1 exited_at:{seconds:1755591702 nanos:662090886}" Aug 19 08:21:43.307421 systemd[1]: Started sshd@10-10.0.0.150:22-10.0.0.1:56178.service - OpenSSH per-connection server daemon (10.0.0.1:56178). Aug 19 08:21:43.393039 sshd[5386]: Accepted publickey for core from 10.0.0.1 port 56178 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:21:43.394833 sshd-session[5386]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:21:43.399444 systemd-logind[1540]: New session 11 of user core. Aug 19 08:21:43.409214 systemd[1]: Started session-11.scope - Session 11 of User core. Aug 19 08:21:43.570380 sshd[5389]: Connection closed by 10.0.0.1 port 56178 Aug 19 08:21:43.571811 kubelet[2711]: I0819 08:21:43.571743 2711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 08:21:43.572958 sshd-session[5386]: pam_unix(sshd:session): session closed for user core Aug 19 08:21:43.582061 systemd[1]: sshd@10-10.0.0.150:22-10.0.0.1:56178.service: Deactivated successfully. Aug 19 08:21:43.585560 systemd[1]: session-11.scope: Deactivated successfully. Aug 19 08:21:43.588765 systemd-logind[1540]: Session 11 logged out. Waiting for processes to exit. Aug 19 08:21:43.591833 systemd[1]: Started sshd@11-10.0.0.150:22-10.0.0.1:56180.service - OpenSSH per-connection server daemon (10.0.0.1:56180). Aug 19 08:21:43.593970 systemd-logind[1540]: Removed session 11. 
Aug 19 08:21:43.653463 sshd[5412]: Accepted publickey for core from 10.0.0.1 port 56180 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:21:43.655673 sshd-session[5412]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:21:43.661560 systemd-logind[1540]: New session 12 of user core. Aug 19 08:21:43.668307 systemd[1]: Started session-12.scope - Session 12 of User core. Aug 19 08:21:43.688789 containerd[1559]: time="2025-08-19T08:21:43.688744307Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48dd0603b359c09b0657ea63b6288b1b4cf451990a5ab5799be86e73e4ca65ec\" id:\"97674cdb11ac38fef8a384599bafea4956bd0fd86a861b3ed443e0916d5b24cc\" pid:5417 exit_status:1 exited_at:{seconds:1755591703 nanos:688219171}" Aug 19 08:21:43.827597 sshd[5430]: Connection closed by 10.0.0.1 port 56180 Aug 19 08:21:43.829409 sshd-session[5412]: pam_unix(sshd:session): session closed for user core Aug 19 08:21:43.840596 systemd[1]: sshd@11-10.0.0.150:22-10.0.0.1:56180.service: Deactivated successfully. Aug 19 08:21:43.843086 systemd[1]: session-12.scope: Deactivated successfully. Aug 19 08:21:43.844108 systemd-logind[1540]: Session 12 logged out. Waiting for processes to exit. Aug 19 08:21:43.848304 systemd[1]: Started sshd@12-10.0.0.150:22-10.0.0.1:56188.service - OpenSSH per-connection server daemon (10.0.0.1:56188). Aug 19 08:21:43.849581 systemd-logind[1540]: Removed session 12. Aug 19 08:21:43.904533 sshd[5442]: Accepted publickey for core from 10.0.0.1 port 56188 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:21:43.906169 sshd-session[5442]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:21:43.910678 systemd-logind[1540]: New session 13 of user core. Aug 19 08:21:43.922201 systemd[1]: Started session-13.scope - Session 13 of User core. Aug 19 08:21:44.053906 sshd[5445]: Connection closed by 10.0.0.1 port 56188 Aug 19 08:21:44.054378 sshd-session[5442]: pam_unix(sshd:session): session closed for user core Aug 19 08:21:44.059664 systemd[1]: sshd@12-10.0.0.150:22-10.0.0.1:56188.service: Deactivated successfully. Aug 19 08:21:44.062235 systemd[1]: session-13.scope: Deactivated successfully. Aug 19 08:21:44.064562 systemd-logind[1540]: Session 13 logged out. Waiting for processes to exit. Aug 19 08:21:44.065592 systemd-logind[1540]: Removed session 13. 
Aug 19 08:21:44.092263 containerd[1559]: time="2025-08-19T08:21:44.092149071Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:44.093016 containerd[1559]: time="2025-08-19T08:21:44.092975442Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190" Aug 19 08:21:44.094232 containerd[1559]: time="2025-08-19T08:21:44.094197346Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:44.095994 containerd[1559]: time="2025-08-19T08:21:44.095959824Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:44.096535 containerd[1559]: time="2025-08-19T08:21:44.096493615Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 1.732374915s" Aug 19 08:21:44.096535 containerd[1559]: time="2025-08-19T08:21:44.096529803Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Aug 19 08:21:44.097956 containerd[1559]: time="2025-08-19T08:21:44.097762076Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Aug 19 08:21:44.101617 containerd[1559]: time="2025-08-19T08:21:44.101583989Z" level=info msg="CreateContainer within sandbox \"55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Aug 19 08:21:44.118097 containerd[1559]: time="2025-08-19T08:21:44.118048771Z" level=info msg="Container 7f43ff21a4ab6bb66f64bf2923a94512974c9914deba7d00b73212efcec0cd5c: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:21:44.134652 containerd[1559]: time="2025-08-19T08:21:44.134599525Z" level=info msg="CreateContainer within sandbox \"55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"7f43ff21a4ab6bb66f64bf2923a94512974c9914deba7d00b73212efcec0cd5c\"" Aug 19 08:21:44.135145 containerd[1559]: time="2025-08-19T08:21:44.135114371Z" level=info msg="StartContainer for \"7f43ff21a4ab6bb66f64bf2923a94512974c9914deba7d00b73212efcec0cd5c\"" Aug 19 08:21:44.136491 containerd[1559]: time="2025-08-19T08:21:44.136459746Z" level=info msg="connecting to shim 7f43ff21a4ab6bb66f64bf2923a94512974c9914deba7d00b73212efcec0cd5c" address="unix:///run/containerd/s/bd0f025f21390bc70fa65912db0294e96e34e4c7b47e77ac42d6b0950a1157fe" protocol=ttrpc version=3 Aug 19 08:21:44.158206 systemd[1]: Started cri-containerd-7f43ff21a4ab6bb66f64bf2923a94512974c9914deba7d00b73212efcec0cd5c.scope - libcontainer container 7f43ff21a4ab6bb66f64bf2923a94512974c9914deba7d00b73212efcec0cd5c. 
Aug 19 08:21:44.201826 containerd[1559]: time="2025-08-19T08:21:44.201780393Z" level=info msg="StartContainer for \"7f43ff21a4ab6bb66f64bf2923a94512974c9914deba7d00b73212efcec0cd5c\" returns successfully" Aug 19 08:21:44.658517 containerd[1559]: time="2025-08-19T08:21:44.658457758Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48dd0603b359c09b0657ea63b6288b1b4cf451990a5ab5799be86e73e4ca65ec\" id:\"35040982fad8e790281f7d36156140a24d4ee794700fba4a11d20b353144d1ba\" pid:5505 exit_status:1 exited_at:{seconds:1755591704 nanos:658145592}" Aug 19 08:21:47.343061 containerd[1559]: time="2025-08-19T08:21:47.342940154Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:47.343725 containerd[1559]: time="2025-08-19T08:21:47.343660246Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688" Aug 19 08:21:47.344908 containerd[1559]: time="2025-08-19T08:21:47.344875086Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:47.346743 containerd[1559]: time="2025-08-19T08:21:47.346706994Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:47.347197 containerd[1559]: time="2025-08-19T08:21:47.347154914Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 3.249358664s" Aug 19 08:21:47.347197 containerd[1559]: time="2025-08-19T08:21:47.347195401Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\"" Aug 19 08:21:47.348185 containerd[1559]: time="2025-08-19T08:21:47.348139031Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Aug 19 08:21:47.398113 containerd[1559]: time="2025-08-19T08:21:47.397909032Z" level=info msg="CreateContainer within sandbox \"050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Aug 19 08:21:47.502668 containerd[1559]: time="2025-08-19T08:21:47.502601133Z" level=info msg="Container 5250e7ed40969f2196eec0a2e624f1761093cc0b079f4f022d70ace7977c4fba: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:21:47.523095 containerd[1559]: time="2025-08-19T08:21:47.523032235Z" level=info msg="CreateContainer within sandbox \"050950c78285853631fd1a873f25a6c8c5c3aa6a9a41d0a6782b9dd5b49567f9\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"5250e7ed40969f2196eec0a2e624f1761093cc0b079f4f022d70ace7977c4fba\"" Aug 19 08:21:47.523765 containerd[1559]: time="2025-08-19T08:21:47.523494593Z" level=info msg="StartContainer for \"5250e7ed40969f2196eec0a2e624f1761093cc0b079f4f022d70ace7977c4fba\"" Aug 19 08:21:47.524486 containerd[1559]: time="2025-08-19T08:21:47.524460495Z" level=info 
msg="connecting to shim 5250e7ed40969f2196eec0a2e624f1761093cc0b079f4f022d70ace7977c4fba" address="unix:///run/containerd/s/cedbb5c6eff67355ab6a83b64479b4b1f91c57b2ef587d7d168ddbdc6b8b477d" protocol=ttrpc version=3 Aug 19 08:21:47.565302 systemd[1]: Started cri-containerd-5250e7ed40969f2196eec0a2e624f1761093cc0b079f4f022d70ace7977c4fba.scope - libcontainer container 5250e7ed40969f2196eec0a2e624f1761093cc0b079f4f022d70ace7977c4fba. Aug 19 08:21:47.810330 containerd[1559]: time="2025-08-19T08:21:47.810247257Z" level=info msg="StartContainer for \"5250e7ed40969f2196eec0a2e624f1761093cc0b079f4f022d70ace7977c4fba\" returns successfully" Aug 19 08:21:48.599442 kubelet[2711]: I0819 08:21:48.599378 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7459684667-rxhqh" podStartSLOduration=35.043851101 podStartE2EDuration="45.599362036s" podCreationTimestamp="2025-08-19 08:21:03 +0000 UTC" firstStartedPulling="2025-08-19 08:21:31.808362605 +0000 UTC m=+44.534147912" lastFinishedPulling="2025-08-19 08:21:42.36387353 +0000 UTC m=+55.089658847" observedRunningTime="2025-08-19 08:21:42.591538699 +0000 UTC m=+55.317324016" watchObservedRunningTime="2025-08-19 08:21:48.599362036 +0000 UTC m=+61.325147353" Aug 19 08:21:48.600247 kubelet[2711]: I0819 08:21:48.599545 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7787566946-5j8pk" podStartSLOduration=27.285289092 podStartE2EDuration="42.59954016s" podCreationTimestamp="2025-08-19 08:21:06 +0000 UTC" firstStartedPulling="2025-08-19 08:21:32.033686916 +0000 UTC m=+44.759472223" lastFinishedPulling="2025-08-19 08:21:47.347937974 +0000 UTC m=+60.073723291" observedRunningTime="2025-08-19 08:21:48.598601258 +0000 UTC m=+61.324386595" watchObservedRunningTime="2025-08-19 08:21:48.59954016 +0000 UTC m=+61.325325477" Aug 19 08:21:48.632488 containerd[1559]: time="2025-08-19T08:21:48.632447523Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5250e7ed40969f2196eec0a2e624f1761093cc0b079f4f022d70ace7977c4fba\" id:\"435daa568fb6620edcdb4358e6698b9f1927c37b28c94906c34cb6ff5d261d53\" pid:5583 exited_at:{seconds:1755591708 nanos:631858337}" Aug 19 08:21:49.069305 systemd[1]: Started sshd@13-10.0.0.150:22-10.0.0.1:49716.service - OpenSSH per-connection server daemon (10.0.0.1:49716). Aug 19 08:21:49.152317 sshd[5595]: Accepted publickey for core from 10.0.0.1 port 49716 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:21:49.154217 sshd-session[5595]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:21:49.163042 systemd-logind[1540]: New session 14 of user core. Aug 19 08:21:49.176204 systemd[1]: Started session-14.scope - Session 14 of User core. Aug 19 08:21:49.387964 sshd[5598]: Connection closed by 10.0.0.1 port 49716 Aug 19 08:21:49.388240 sshd-session[5595]: pam_unix(sshd:session): session closed for user core Aug 19 08:21:49.392635 systemd[1]: sshd@13-10.0.0.150:22-10.0.0.1:49716.service: Deactivated successfully. Aug 19 08:21:49.394805 systemd[1]: session-14.scope: Deactivated successfully. Aug 19 08:21:49.395943 systemd-logind[1540]: Session 14 logged out. Waiting for processes to exit. Aug 19 08:21:49.397826 systemd-logind[1540]: Removed session 14. 
Aug 19 08:21:49.567049 containerd[1559]: time="2025-08-19T08:21:49.566990819Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:49.567704 containerd[1559]: time="2025-08-19T08:21:49.567647281Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784" Aug 19 08:21:49.568814 containerd[1559]: time="2025-08-19T08:21:49.568771361Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:49.570616 containerd[1559]: time="2025-08-19T08:21:49.570584853Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 08:21:49.571175 containerd[1559]: time="2025-08-19T08:21:49.571135978Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 2.209175058s" Aug 19 08:21:49.571219 containerd[1559]: time="2025-08-19T08:21:49.571178348Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\"" Aug 19 08:21:49.576280 containerd[1559]: time="2025-08-19T08:21:49.576253441Z" level=info msg="CreateContainer within sandbox \"55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Aug 19 08:21:49.585551 containerd[1559]: time="2025-08-19T08:21:49.585506686Z" level=info msg="Container 332d1a328f5df92e604ab01a623671248b7d4336f4bc332db9a69b02d46aec8b: CDI devices from CRI Config.CDIDevices: []" Aug 19 08:21:49.595091 containerd[1559]: time="2025-08-19T08:21:49.595038441Z" level=info msg="CreateContainer within sandbox \"55535b8d197bac9ce59570387059c8b6247628c56c36a953ff9cf9a318ad1802\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"332d1a328f5df92e604ab01a623671248b7d4336f4bc332db9a69b02d46aec8b\"" Aug 19 08:21:49.595489 containerd[1559]: time="2025-08-19T08:21:49.595442440Z" level=info msg="StartContainer for \"332d1a328f5df92e604ab01a623671248b7d4336f4bc332db9a69b02d46aec8b\"" Aug 19 08:21:49.596847 containerd[1559]: time="2025-08-19T08:21:49.596821378Z" level=info msg="connecting to shim 332d1a328f5df92e604ab01a623671248b7d4336f4bc332db9a69b02d46aec8b" address="unix:///run/containerd/s/bd0f025f21390bc70fa65912db0294e96e34e4c7b47e77ac42d6b0950a1157fe" protocol=ttrpc version=3 Aug 19 08:21:49.624237 systemd[1]: Started cri-containerd-332d1a328f5df92e604ab01a623671248b7d4336f4bc332db9a69b02d46aec8b.scope - libcontainer container 332d1a328f5df92e604ab01a623671248b7d4336f4bc332db9a69b02d46aec8b. 
Aug 19 08:21:49.740217 containerd[1559]: time="2025-08-19T08:21:49.740175026Z" level=info msg="StartContainer for \"332d1a328f5df92e604ab01a623671248b7d4336f4bc332db9a69b02d46aec8b\" returns successfully" Aug 19 08:21:50.429745 kubelet[2711]: I0819 08:21:50.429702 2711 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Aug 19 08:21:50.430826 kubelet[2711]: I0819 08:21:50.430798 2711 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Aug 19 08:21:50.609509 kubelet[2711]: I0819 08:21:50.609373 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-b4j8r" podStartSLOduration=26.958956812 podStartE2EDuration="44.609352269s" podCreationTimestamp="2025-08-19 08:21:06 +0000 UTC" firstStartedPulling="2025-08-19 08:21:31.921460622 +0000 UTC m=+44.647245939" lastFinishedPulling="2025-08-19 08:21:49.571856079 +0000 UTC m=+62.297641396" observedRunningTime="2025-08-19 08:21:50.608156495 +0000 UTC m=+63.333941812" watchObservedRunningTime="2025-08-19 08:21:50.609352269 +0000 UTC m=+63.335137586" Aug 19 08:21:51.214983 kubelet[2711]: I0819 08:21:51.214925 2711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 08:21:53.372036 kubelet[2711]: I0819 08:21:53.371983 2711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 08:21:53.460902 containerd[1559]: time="2025-08-19T08:21:53.460837361Z" level=info msg="StopContainer for \"9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473\" with timeout 30 (s)" Aug 19 08:21:53.465946 containerd[1559]: time="2025-08-19T08:21:53.465895238Z" level=info msg="Stop container \"9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473\" with signal terminated" Aug 19 08:21:53.481576 systemd[1]: cri-containerd-9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473.scope: Deactivated successfully. Aug 19 08:21:53.485273 containerd[1559]: time="2025-08-19T08:21:53.485217330Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473\" id:\"9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473\" pid:5330 exit_status:1 exited_at:{seconds:1755591713 nanos:484755702}" Aug 19 08:21:53.485685 containerd[1559]: time="2025-08-19T08:21:53.485645326Z" level=info msg="received exit event container_id:\"9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473\" id:\"9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473\" pid:5330 exit_status:1 exited_at:{seconds:1755591713 nanos:484755702}" Aug 19 08:21:53.509555 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473-rootfs.mount: Deactivated successfully. 
Aug 19 08:21:53.532786 containerd[1559]: time="2025-08-19T08:21:53.532742606Z" level=info msg="StopContainer for \"9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473\" returns successfully" Aug 19 08:21:53.552661 containerd[1559]: time="2025-08-19T08:21:53.552629472Z" level=info msg="StopPodSandbox for \"bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea\"" Aug 19 08:21:53.572322 containerd[1559]: time="2025-08-19T08:21:53.572273651Z" level=info msg="Container to stop \"9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Aug 19 08:21:53.587004 systemd[1]: cri-containerd-bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea.scope: Deactivated successfully. Aug 19 08:21:53.589011 containerd[1559]: time="2025-08-19T08:21:53.588974312Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea\" id:\"bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea\" pid:4966 exit_status:137 exited_at:{seconds:1755591713 nanos:588694455}" Aug 19 08:21:53.617667 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea-rootfs.mount: Deactivated successfully. Aug 19 08:21:53.630861 containerd[1559]: time="2025-08-19T08:21:53.630758957Z" level=info msg="shim disconnected" id=bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea namespace=k8s.io Aug 19 08:21:53.630861 containerd[1559]: time="2025-08-19T08:21:53.630791087Z" level=warning msg="cleaning up after shim disconnected" id=bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea namespace=k8s.io Aug 19 08:21:53.645220 containerd[1559]: time="2025-08-19T08:21:53.630799022Z" level=info msg="cleaning up dead shim" namespace=k8s.io Aug 19 08:21:53.707793 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea-shm.mount: Deactivated successfully. Aug 19 08:21:53.717374 containerd[1559]: time="2025-08-19T08:21:53.717329704Z" level=info msg="received exit event sandbox_id:\"bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea\" exit_status:137 exited_at:{seconds:1755591713 nanos:588694455}" Aug 19 08:21:53.807852 systemd-networkd[1467]: cali43f3092cc42: Link DOWN Aug 19 08:21:53.807861 systemd-networkd[1467]: cali43f3092cc42: Lost carrier Aug 19 08:21:53.946345 containerd[1559]: 2025-08-19 08:21:53.805 [INFO][5731] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Aug 19 08:21:53.946345 containerd[1559]: 2025-08-19 08:21:53.806 [INFO][5731] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" iface="eth0" netns="/var/run/netns/cni-358e2f50-c89b-9165-98f2-ced50ce8613b" Aug 19 08:21:53.946345 containerd[1559]: 2025-08-19 08:21:53.806 [INFO][5731] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" iface="eth0" netns="/var/run/netns/cni-358e2f50-c89b-9165-98f2-ced50ce8613b" Aug 19 08:21:53.946345 containerd[1559]: 2025-08-19 08:21:53.819 [INFO][5731] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" after=12.70798ms iface="eth0" netns="/var/run/netns/cni-358e2f50-c89b-9165-98f2-ced50ce8613b" Aug 19 08:21:53.946345 containerd[1559]: 2025-08-19 08:21:53.819 [INFO][5731] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Aug 19 08:21:53.946345 containerd[1559]: 2025-08-19 08:21:53.819 [INFO][5731] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Aug 19 08:21:53.946345 containerd[1559]: 2025-08-19 08:21:53.851 [INFO][5745] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" HandleID="k8s-pod-network.bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Workload="localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0" Aug 19 08:21:53.946345 containerd[1559]: 2025-08-19 08:21:53.851 [INFO][5745] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 08:21:53.946345 containerd[1559]: 2025-08-19 08:21:53.851 [INFO][5745] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 08:21:53.946345 containerd[1559]: 2025-08-19 08:21:53.933 [INFO][5745] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" HandleID="k8s-pod-network.bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Workload="localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0" Aug 19 08:21:53.946345 containerd[1559]: 2025-08-19 08:21:53.933 [INFO][5745] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" HandleID="k8s-pod-network.bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Workload="localhost-k8s-calico--apiserver--7459684667--rxhqh-eth0" Aug 19 08:21:53.946345 containerd[1559]: 2025-08-19 08:21:53.935 [INFO][5745] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 08:21:53.946345 containerd[1559]: 2025-08-19 08:21:53.942 [INFO][5731] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea" Aug 19 08:21:53.948379 containerd[1559]: time="2025-08-19T08:21:53.948264292Z" level=info msg="TearDown network for sandbox \"bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea\" successfully" Aug 19 08:21:53.948379 containerd[1559]: time="2025-08-19T08:21:53.948298115Z" level=info msg="StopPodSandbox for \"bad5e4424808cc806c9922e9a084d10d989bd1311b9289a31c13a83c4c69fdea\" returns successfully" Aug 19 08:21:53.950624 systemd[1]: run-netns-cni\x2d358e2f50\x2dc89b\x2d9165\x2d98f2\x2dced50ce8613b.mount: Deactivated successfully. 
Aug 19 08:21:53.989771 kubelet[2711]: I0819 08:21:53.989721 2711 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9jnj\" (UniqueName: \"kubernetes.io/projected/4a25fc55-9c92-4016-bb5f-0473cc98f094-kube-api-access-s9jnj\") pod \"4a25fc55-9c92-4016-bb5f-0473cc98f094\" (UID: \"4a25fc55-9c92-4016-bb5f-0473cc98f094\") " Aug 19 08:21:53.989771 kubelet[2711]: I0819 08:21:53.989777 2711 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4a25fc55-9c92-4016-bb5f-0473cc98f094-calico-apiserver-certs\") pod \"4a25fc55-9c92-4016-bb5f-0473cc98f094\" (UID: \"4a25fc55-9c92-4016-bb5f-0473cc98f094\") " Aug 19 08:21:53.995409 kubelet[2711]: I0819 08:21:53.995331 2711 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a25fc55-9c92-4016-bb5f-0473cc98f094-kube-api-access-s9jnj" (OuterVolumeSpecName: "kube-api-access-s9jnj") pod "4a25fc55-9c92-4016-bb5f-0473cc98f094" (UID: "4a25fc55-9c92-4016-bb5f-0473cc98f094"). InnerVolumeSpecName "kube-api-access-s9jnj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Aug 19 08:21:53.995409 kubelet[2711]: I0819 08:21:53.995341 2711 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a25fc55-9c92-4016-bb5f-0473cc98f094-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "4a25fc55-9c92-4016-bb5f-0473cc98f094" (UID: "4a25fc55-9c92-4016-bb5f-0473cc98f094"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Aug 19 08:21:53.997110 systemd[1]: var-lib-kubelet-pods-4a25fc55\x2d9c92\x2d4016\x2dbb5f\x2d0473cc98f094-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ds9jnj.mount: Deactivated successfully. Aug 19 08:21:53.997244 systemd[1]: var-lib-kubelet-pods-4a25fc55\x2d9c92\x2d4016\x2dbb5f\x2d0473cc98f094-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Aug 19 08:21:54.091115 kubelet[2711]: I0819 08:21:54.091025 2711 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s9jnj\" (UniqueName: \"kubernetes.io/projected/4a25fc55-9c92-4016-bb5f-0473cc98f094-kube-api-access-s9jnj\") on node \"localhost\" DevicePath \"\"" Aug 19 08:21:54.091115 kubelet[2711]: I0819 08:21:54.091053 2711 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4a25fc55-9c92-4016-bb5f-0473cc98f094-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\"" Aug 19 08:21:54.237485 containerd[1559]: time="2025-08-19T08:21:54.237410556Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48dd0603b359c09b0657ea63b6288b1b4cf451990a5ab5799be86e73e4ca65ec\" id:\"2b6bbc21181ceedc414e21e39adc2e40eb2af13833e65ff7eaee3d1701d0e99d\" pid:5767 exited_at:{seconds:1755591714 nanos:237086693}" Aug 19 08:21:54.407221 systemd[1]: Started sshd@14-10.0.0.150:22-10.0.0.1:49724.service - OpenSSH per-connection server daemon (10.0.0.1:49724). Aug 19 08:21:54.482550 sshd[5779]: Accepted publickey for core from 10.0.0.1 port 49724 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:21:54.484170 sshd-session[5779]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:21:54.488603 systemd-logind[1540]: New session 15 of user core. Aug 19 08:21:54.495210 systemd[1]: Started session-15.scope - Session 15 of User core. 
Aug 19 08:21:54.610966 kubelet[2711]: I0819 08:21:54.610927 2711 scope.go:117] "RemoveContainer" containerID="9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473" Aug 19 08:21:54.614761 containerd[1559]: time="2025-08-19T08:21:54.614709613Z" level=info msg="RemoveContainer for \"9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473\"" Aug 19 08:21:54.617740 systemd[1]: Removed slice kubepods-besteffort-pod4a25fc55_9c92_4016_bb5f_0473cc98f094.slice - libcontainer container kubepods-besteffort-pod4a25fc55_9c92_4016_bb5f_0473cc98f094.slice. Aug 19 08:21:54.620250 containerd[1559]: time="2025-08-19T08:21:54.620216676Z" level=info msg="RemoveContainer for \"9e46cf73fed606f26bae4fceed396ef7eb8873642305eb1edbacca9185092473\" returns successfully" Aug 19 08:21:54.698826 sshd[5784]: Connection closed by 10.0.0.1 port 49724 Aug 19 08:21:54.699262 sshd-session[5779]: pam_unix(sshd:session): session closed for user core Aug 19 08:21:54.704233 systemd[1]: sshd@14-10.0.0.150:22-10.0.0.1:49724.service: Deactivated successfully. Aug 19 08:21:54.706380 systemd[1]: session-15.scope: Deactivated successfully. Aug 19 08:21:54.707193 systemd-logind[1540]: Session 15 logged out. Waiting for processes to exit. Aug 19 08:21:54.708585 systemd-logind[1540]: Removed session 15. Aug 19 08:21:55.365881 kubelet[2711]: I0819 08:21:55.365830 2711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a25fc55-9c92-4016-bb5f-0473cc98f094" path="/var/lib/kubelet/pods/4a25fc55-9c92-4016-bb5f-0473cc98f094/volumes" Aug 19 08:21:58.556388 containerd[1559]: time="2025-08-19T08:21:58.556341527Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6875c65604070fc2d906cbd4ae4c34a3a04662dcdba2fc783111c8062f647eb6\" id:\"558107b27a4286c680ef44b55dab2d84a25d4f938a5b7bcbad66e4b5599268d4\" pid:5814 exited_at:{seconds:1755591718 nanos:556032605}" Aug 19 08:21:59.718743 systemd[1]: Started sshd@15-10.0.0.150:22-10.0.0.1:54610.service - OpenSSH per-connection server daemon (10.0.0.1:54610). Aug 19 08:21:59.765755 sshd[5830]: Accepted publickey for core from 10.0.0.1 port 54610 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:21:59.767427 sshd-session[5830]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:21:59.771369 systemd-logind[1540]: New session 16 of user core. Aug 19 08:21:59.782208 systemd[1]: Started session-16.scope - Session 16 of User core. Aug 19 08:21:59.895364 sshd[5833]: Connection closed by 10.0.0.1 port 54610 Aug 19 08:21:59.895701 sshd-session[5830]: pam_unix(sshd:session): session closed for user core Aug 19 08:21:59.900300 systemd[1]: sshd@15-10.0.0.150:22-10.0.0.1:54610.service: Deactivated successfully. Aug 19 08:21:59.902402 systemd[1]: session-16.scope: Deactivated successfully. Aug 19 08:21:59.903278 systemd-logind[1540]: Session 16 logged out. Waiting for processes to exit. Aug 19 08:21:59.904343 systemd-logind[1540]: Removed session 16. Aug 19 08:22:04.914403 systemd[1]: Started sshd@16-10.0.0.150:22-10.0.0.1:54620.service - OpenSSH per-connection server daemon (10.0.0.1:54620). Aug 19 08:22:04.972650 sshd[5849]: Accepted publickey for core from 10.0.0.1 port 54620 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:22:04.974334 sshd-session[5849]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:22:04.979372 systemd-logind[1540]: New session 17 of user core. 
Aug 19 08:22:04.984259 systemd[1]: Started session-17.scope - Session 17 of User core. Aug 19 08:22:05.098865 sshd[5852]: Connection closed by 10.0.0.1 port 54620 Aug 19 08:22:05.099244 sshd-session[5849]: pam_unix(sshd:session): session closed for user core Aug 19 08:22:05.111956 systemd[1]: sshd@16-10.0.0.150:22-10.0.0.1:54620.service: Deactivated successfully. Aug 19 08:22:05.114150 systemd[1]: session-17.scope: Deactivated successfully. Aug 19 08:22:05.114947 systemd-logind[1540]: Session 17 logged out. Waiting for processes to exit. Aug 19 08:22:05.117925 systemd[1]: Started sshd@17-10.0.0.150:22-10.0.0.1:54622.service - OpenSSH per-connection server daemon (10.0.0.1:54622). Aug 19 08:22:05.118535 systemd-logind[1540]: Removed session 17. Aug 19 08:22:05.177242 sshd[5865]: Accepted publickey for core from 10.0.0.1 port 54622 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:22:05.178663 sshd-session[5865]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:22:05.183163 systemd-logind[1540]: New session 18 of user core. Aug 19 08:22:05.190219 systemd[1]: Started session-18.scope - Session 18 of User core. Aug 19 08:22:05.412889 sshd[5868]: Connection closed by 10.0.0.1 port 54622 Aug 19 08:22:05.413887 sshd-session[5865]: pam_unix(sshd:session): session closed for user core Aug 19 08:22:05.422982 systemd[1]: sshd@17-10.0.0.150:22-10.0.0.1:54622.service: Deactivated successfully. Aug 19 08:22:05.424999 systemd[1]: session-18.scope: Deactivated successfully. Aug 19 08:22:05.425787 systemd-logind[1540]: Session 18 logged out. Waiting for processes to exit. Aug 19 08:22:05.428784 systemd[1]: Started sshd@18-10.0.0.150:22-10.0.0.1:54636.service - OpenSSH per-connection server daemon (10.0.0.1:54636). Aug 19 08:22:05.430036 systemd-logind[1540]: Removed session 18. Aug 19 08:22:05.496939 sshd[5880]: Accepted publickey for core from 10.0.0.1 port 54636 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:22:05.498268 sshd-session[5880]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:22:05.503120 systemd-logind[1540]: New session 19 of user core. Aug 19 08:22:05.517195 systemd[1]: Started session-19.scope - Session 19 of User core. Aug 19 08:22:05.998533 sshd[5883]: Connection closed by 10.0.0.1 port 54636 Aug 19 08:22:05.999664 sshd-session[5880]: pam_unix(sshd:session): session closed for user core Aug 19 08:22:06.011533 systemd[1]: sshd@18-10.0.0.150:22-10.0.0.1:54636.service: Deactivated successfully. Aug 19 08:22:06.014778 systemd[1]: session-19.scope: Deactivated successfully. Aug 19 08:22:06.017566 systemd-logind[1540]: Session 19 logged out. Waiting for processes to exit. Aug 19 08:22:06.021330 systemd[1]: Started sshd@19-10.0.0.150:22-10.0.0.1:54638.service - OpenSSH per-connection server daemon (10.0.0.1:54638). Aug 19 08:22:06.023020 systemd-logind[1540]: Removed session 19. Aug 19 08:22:06.075704 sshd[5903]: Accepted publickey for core from 10.0.0.1 port 54638 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ Aug 19 08:22:06.077012 sshd-session[5903]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 08:22:06.081615 systemd-logind[1540]: New session 20 of user core. Aug 19 08:22:06.092207 systemd[1]: Started session-20.scope - Session 20 of User core. 
Aug 19 08:22:06.415875 sshd[5906]: Connection closed by 10.0.0.1 port 54638
Aug 19 08:22:06.418785 sshd-session[5903]: pam_unix(sshd:session): session closed for user core
Aug 19 08:22:06.428444 systemd[1]: sshd@19-10.0.0.150:22-10.0.0.1:54638.service: Deactivated successfully.
Aug 19 08:22:06.430745 systemd[1]: session-20.scope: Deactivated successfully.
Aug 19 08:22:06.431660 systemd-logind[1540]: Session 20 logged out. Waiting for processes to exit.
Aug 19 08:22:06.435283 systemd[1]: Started sshd@20-10.0.0.150:22-10.0.0.1:54650.service - OpenSSH per-connection server daemon (10.0.0.1:54650).
Aug 19 08:22:06.436158 systemd-logind[1540]: Removed session 20.
Aug 19 08:22:06.500032 sshd[5918]: Accepted publickey for core from 10.0.0.1 port 54650 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ
Aug 19 08:22:06.501361 sshd-session[5918]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 08:22:06.505716 systemd-logind[1540]: New session 21 of user core.
Aug 19 08:22:06.517231 systemd[1]: Started session-21.scope - Session 21 of User core.
Aug 19 08:22:06.685188 sshd[5921]: Connection closed by 10.0.0.1 port 54650
Aug 19 08:22:06.685608 sshd-session[5918]: pam_unix(sshd:session): session closed for user core
Aug 19 08:22:06.694247 systemd-logind[1540]: Session 21 logged out. Waiting for processes to exit.
Aug 19 08:22:06.695973 systemd[1]: sshd@20-10.0.0.150:22-10.0.0.1:54650.service: Deactivated successfully.
Aug 19 08:22:06.700932 systemd[1]: session-21.scope: Deactivated successfully.
Aug 19 08:22:06.705685 systemd-logind[1540]: Removed session 21.
Aug 19 08:22:08.197627 containerd[1559]: time="2025-08-19T08:22:08.197576791Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5250e7ed40969f2196eec0a2e624f1761093cc0b079f4f022d70ace7977c4fba\" id:\"004abc30009babf66b48b91befb4585cb49edfbee19b902be11f2457765f7817\" pid:5945 exited_at:{seconds:1755591728 nanos:197297629}"
Aug 19 08:22:11.701996 systemd[1]: Started sshd@21-10.0.0.150:22-10.0.0.1:47632.service - OpenSSH per-connection server daemon (10.0.0.1:47632).
Aug 19 08:22:11.767972 sshd[5964]: Accepted publickey for core from 10.0.0.1 port 47632 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ
Aug 19 08:22:11.769485 sshd-session[5964]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 08:22:11.773577 systemd-logind[1540]: New session 22 of user core.
Aug 19 08:22:11.781266 systemd[1]: Started session-22.scope - Session 22 of User core.
Aug 19 08:22:12.066137 sshd[5967]: Connection closed by 10.0.0.1 port 47632
Aug 19 08:22:12.066389 sshd-session[5964]: pam_unix(sshd:session): session closed for user core
Aug 19 08:22:12.071182 systemd[1]: sshd@21-10.0.0.150:22-10.0.0.1:47632.service: Deactivated successfully.
Aug 19 08:22:12.073372 systemd[1]: session-22.scope: Deactivated successfully.
Aug 19 08:22:12.074197 systemd-logind[1540]: Session 22 logged out. Waiting for processes to exit.
Aug 19 08:22:12.075570 systemd-logind[1540]: Removed session 22.
Aug 19 08:22:14.650377 containerd[1559]: time="2025-08-19T08:22:14.650309460Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48dd0603b359c09b0657ea63b6288b1b4cf451990a5ab5799be86e73e4ca65ec\" id:\"c857975b0e270be3dd7f99a2624607645e33637c279b51db57317cdfa4550516\" pid:5993 exited_at:{seconds:1755591734 nanos:650023265}"
Aug 19 08:22:17.091361 systemd[1]: Started sshd@22-10.0.0.150:22-10.0.0.1:47640.service - OpenSSH per-connection server daemon (10.0.0.1:47640).
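Sessions 15 through 22 above all follow the same lifecycle: a socket-activated per-connection sshd@<n>-<local>:22-<peer>:<port>.service starts, sshd accepts the public key, pam_unix opens the session, systemd-logind registers "New session N" and a session-N.scope is started; on disconnect the scope and the per-connection service are deactivated and the session is removed. The following hedged sketch, which assumes the journal is piped in as plain text in the short-precise timestamp format used here, pairs the "New session" and "Removed session" lines to report each session's duration:

// session_durations.go - illustrative sketch, not an existing tool.
// Reads journal text on stdin and pairs systemd-logind session open/close lines.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"time"
)

func main() {
	// Timestamp prefix used by these journal lines, e.g. "Aug 19 08:21:59.771369".
	const stamp = "Jan 2 15:04:05.000000"
	opened := regexp.MustCompile(`^(\w+ +\d+ [0-9:.]+) .*systemd-logind\[\d+\]: New session (\d+) of user`)
	closed := regexp.MustCompile(`^(\w+ +\d+ [0-9:.]+) .*systemd-logind\[\d+\]: Removed session (\d+)\.`)

	starts := map[string]time.Time{} // session id -> time it was registered

	sc := bufio.NewScanner(os.Stdin)
	for sc.Scan() {
		line := sc.Text()
		if m := opened.FindStringSubmatch(line); m != nil {
			if t, err := time.Parse(stamp, m[1]); err == nil {
				starts[m[2]] = t
			}
		} else if m := closed.FindStringSubmatch(line); m != nil {
			if begin, ok := starts[m[2]]; ok {
				if end, err := time.Parse(stamp, m[1]); err == nil {
					fmt.Printf("session %s lasted %s\n", m[2], end.Sub(begin))
				}
				delete(starts, m[2])
			}
		}
	}
}

It could be fed with something like journalctl -u systemd-logind -o short-precise | go run session_durations.go; the file name and the exact regular expressions are illustrative assumptions, not part of systemd or OpenSSH.
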
Aug 19 08:22:17.148868 sshd[6008]: Accepted publickey for core from 10.0.0.1 port 47640 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ
Aug 19 08:22:17.150259 sshd-session[6008]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 08:22:17.154430 systemd-logind[1540]: New session 23 of user core.
Aug 19 08:22:17.163244 systemd[1]: Started session-23.scope - Session 23 of User core.
Aug 19 08:22:17.273781 sshd[6011]: Connection closed by 10.0.0.1 port 47640
Aug 19 08:22:17.274145 sshd-session[6008]: pam_unix(sshd:session): session closed for user core
Aug 19 08:22:17.278292 systemd[1]: sshd@22-10.0.0.150:22-10.0.0.1:47640.service: Deactivated successfully.
Aug 19 08:22:17.280556 systemd[1]: session-23.scope: Deactivated successfully.
Aug 19 08:22:17.282050 systemd-logind[1540]: Session 23 logged out. Waiting for processes to exit.
Aug 19 08:22:17.283493 systemd-logind[1540]: Removed session 23.
Aug 19 08:22:18.634746 containerd[1559]: time="2025-08-19T08:22:18.634693395Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5250e7ed40969f2196eec0a2e624f1761093cc0b079f4f022d70ace7977c4fba\" id:\"4aae29923f0857c3fe89232cacd28472789fb983232e636c91cc5d7ce086476b\" pid:6036 exited_at:{seconds:1755591738 nanos:634501801}"
Aug 19 08:22:22.286978 systemd[1]: Started sshd@23-10.0.0.150:22-10.0.0.1:39114.service - OpenSSH per-connection server daemon (10.0.0.1:39114).
Aug 19 08:22:22.353966 sshd[6050]: Accepted publickey for core from 10.0.0.1 port 39114 ssh2: RSA SHA256:uZ8V7j8LCmTM3KSaAXgS8PVqC8G+A4ZV+k7lCn4cemQ
Aug 19 08:22:22.355698 sshd-session[6050]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 08:22:22.359869 systemd-logind[1540]: New session 24 of user core.
Aug 19 08:22:22.369203 systemd[1]: Started session-24.scope - Session 24 of User core.
Aug 19 08:22:22.478599 sshd[6053]: Connection closed by 10.0.0.1 port 39114
Aug 19 08:22:22.478972 sshd-session[6050]: pam_unix(sshd:session): session closed for user core
Aug 19 08:22:22.483207 systemd[1]: sshd@23-10.0.0.150:22-10.0.0.1:39114.service: Deactivated successfully.
Aug 19 08:22:22.485497 systemd[1]: session-24.scope: Deactivated successfully.
Aug 19 08:22:22.486395 systemd-logind[1540]: Session 24 logged out. Waiting for processes to exit.
Aug 19 08:22:22.487722 systemd-logind[1540]: Removed session 24.
Aug 19 08:22:23.370645 kubelet[2711]: E0819 08:22:23.370364 2711 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
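The closing kubelet error from dns.go ("Nameserver limits exceeded") typically means that the resolv.conf kubelet consults lists more nameservers than the resolver limit of three; the extras are dropped and only the first three (here 1.1.1.1, 1.0.0.1 and 8.8.8.8) are applied. A minimal sketch of the same kind of check, assuming the node's resolver configuration is at /etc/resolv.conf (kubelet's --resolv-conf flag may point elsewhere):

// resolv_check.go - illustrative sketch, not kubelet code.
// Counts nameserver entries and reports whether the three-server limit
// behind the "Nameserver limits exceeded" message would be hit.
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

const maxNameservers = 3 // resolver limit referenced by the log message

func main() {
	// Assumed path; kubelet may be pointed at a different file.
	f, err := os.Open("/etc/resolv.conf")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()

	var servers []string
	sc := bufio.NewScanner(f)
	for sc.Scan() {
		fields := strings.Fields(sc.Text())
		if len(fields) >= 2 && fields[0] == "nameserver" {
			servers = append(servers, fields[1])
		}
	}
	if len(servers) > maxNameservers {
		fmt.Printf("%d nameservers configured; only the first %d would be applied: %s\n",
			len(servers), maxNameservers, strings.Join(servers[:maxNameservers], " "))
	} else {
		fmt.Printf("%d nameservers configured; within the limit\n", len(servers))
	}
}
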