Sep 12 06:00:35.807671 kernel: Linux version 6.12.46-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 04:02:32 -00 2025 Sep 12 06:00:35.807701 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=d36684c42387dba16669740eb40ca6a094be0dfb03f64a303630b6ac6cfe48d3 Sep 12 06:00:35.807715 kernel: BIOS-provided physical RAM map: Sep 12 06:00:35.807723 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Sep 12 06:00:35.807731 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable Sep 12 06:00:35.807739 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Sep 12 06:00:35.807749 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable Sep 12 06:00:35.807758 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Sep 12 06:00:35.807766 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable Sep 12 06:00:35.807777 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Sep 12 06:00:35.807786 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000009bd3efff] usable Sep 12 06:00:35.807794 kernel: BIOS-e820: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved Sep 12 06:00:35.807803 kernel: BIOS-e820: [mem 0x000000009be00000-0x000000009c8ecfff] usable Sep 12 06:00:35.807811 kernel: BIOS-e820: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved Sep 12 06:00:35.807822 kernel: BIOS-e820: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data Sep 12 06:00:35.807833 kernel: BIOS-e820: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS Sep 12 06:00:35.807842 kernel: BIOS-e820: [mem 0x000000009cbff000-0x000000009ce90fff] usable Sep 12 06:00:35.807851 kernel: BIOS-e820: [mem 0x000000009ce91000-0x000000009ce94fff] reserved Sep 12 06:00:35.807860 kernel: BIOS-e820: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS Sep 12 06:00:35.807869 kernel: BIOS-e820: [mem 0x000000009ce97000-0x000000009cedbfff] usable Sep 12 06:00:35.807878 kernel: BIOS-e820: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved Sep 12 06:00:35.807887 kernel: BIOS-e820: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS Sep 12 06:00:35.807896 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Sep 12 06:00:35.807905 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Sep 12 06:00:35.807914 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Sep 12 06:00:35.807925 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Sep 12 06:00:35.807934 kernel: NX (Execute Disable) protection: active Sep 12 06:00:35.807943 kernel: APIC: Static calls initialized Sep 12 06:00:35.807952 kernel: e820: update [mem 0x9b320018-0x9b329c57] usable ==> usable Sep 12 06:00:35.807961 kernel: e820: update [mem 0x9b2e3018-0x9b31fe57] usable ==> usable Sep 12 06:00:35.807970 kernel: extended physical RAM map: Sep 12 06:00:35.807979 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Sep 12 06:00:35.807988 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable Sep 12 06:00:35.807998 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Sep 12 06:00:35.808007 kernel: reserve 
setup_data: [mem 0x0000000000808000-0x000000000080afff] usable Sep 12 06:00:35.808016 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Sep 12 06:00:35.808028 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable Sep 12 06:00:35.808037 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Sep 12 06:00:35.808046 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000009b2e3017] usable Sep 12 06:00:35.808055 kernel: reserve setup_data: [mem 0x000000009b2e3018-0x000000009b31fe57] usable Sep 12 06:00:35.808068 kernel: reserve setup_data: [mem 0x000000009b31fe58-0x000000009b320017] usable Sep 12 06:00:35.808078 kernel: reserve setup_data: [mem 0x000000009b320018-0x000000009b329c57] usable Sep 12 06:00:35.808089 kernel: reserve setup_data: [mem 0x000000009b329c58-0x000000009bd3efff] usable Sep 12 06:00:35.808099 kernel: reserve setup_data: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved Sep 12 06:00:35.808108 kernel: reserve setup_data: [mem 0x000000009be00000-0x000000009c8ecfff] usable Sep 12 06:00:35.808118 kernel: reserve setup_data: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved Sep 12 06:00:35.808127 kernel: reserve setup_data: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data Sep 12 06:00:35.808137 kernel: reserve setup_data: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS Sep 12 06:00:35.808147 kernel: reserve setup_data: [mem 0x000000009cbff000-0x000000009ce90fff] usable Sep 12 06:00:35.808156 kernel: reserve setup_data: [mem 0x000000009ce91000-0x000000009ce94fff] reserved Sep 12 06:00:35.808166 kernel: reserve setup_data: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS Sep 12 06:00:35.808175 kernel: reserve setup_data: [mem 0x000000009ce97000-0x000000009cedbfff] usable Sep 12 06:00:35.808187 kernel: reserve setup_data: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved Sep 12 06:00:35.808197 kernel: reserve setup_data: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS Sep 12 06:00:35.808206 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved Sep 12 06:00:35.808215 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved Sep 12 06:00:35.808224 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Sep 12 06:00:35.808233 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Sep 12 06:00:35.808242 kernel: efi: EFI v2.7 by EDK II Sep 12 06:00:35.808252 kernel: efi: SMBIOS=0x9c988000 ACPI=0x9cb7e000 ACPI 2.0=0x9cb7e014 MEMATTR=0x9b9e4198 RNG=0x9cb73018 Sep 12 06:00:35.808262 kernel: random: crng init done Sep 12 06:00:35.808271 kernel: efi: Remove mem151: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map Sep 12 06:00:35.808278 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved Sep 12 06:00:35.808298 kernel: secureboot: Secure boot disabled Sep 12 06:00:35.808306 kernel: SMBIOS 2.8 present. 
Sep 12 06:00:35.808313 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022 Sep 12 06:00:35.808320 kernel: DMI: Memory slots populated: 1/1 Sep 12 06:00:35.808327 kernel: Hypervisor detected: KVM Sep 12 06:00:35.808334 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Sep 12 06:00:35.808343 kernel: kvm-clock: using sched offset of 3579424872 cycles Sep 12 06:00:35.808353 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Sep 12 06:00:35.808362 kernel: tsc: Detected 2794.748 MHz processor Sep 12 06:00:35.808370 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 12 06:00:35.808377 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 12 06:00:35.808388 kernel: last_pfn = 0x9cedc max_arch_pfn = 0x400000000 Sep 12 06:00:35.808398 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs Sep 12 06:00:35.808407 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 12 06:00:35.808417 kernel: Using GB pages for direct mapping Sep 12 06:00:35.808457 kernel: ACPI: Early table checksum verification disabled Sep 12 06:00:35.808465 kernel: ACPI: RSDP 0x000000009CB7E014 000024 (v02 BOCHS ) Sep 12 06:00:35.808472 kernel: ACPI: XSDT 0x000000009CB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013) Sep 12 06:00:35.808481 kernel: ACPI: FACP 0x000000009CB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Sep 12 06:00:35.808491 kernel: ACPI: DSDT 0x000000009CB7A000 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 12 06:00:35.808505 kernel: ACPI: FACS 0x000000009CBDD000 000040 Sep 12 06:00:35.808515 kernel: ACPI: APIC 0x000000009CB78000 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 12 06:00:35.808525 kernel: ACPI: HPET 0x000000009CB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 12 06:00:35.808535 kernel: ACPI: MCFG 0x000000009CB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 12 06:00:35.808545 kernel: ACPI: WAET 0x000000009CB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 12 06:00:35.808555 kernel: ACPI: BGRT 0x000000009CB74000 000038 (v01 INTEL EDK2 00000002 01000013) Sep 12 06:00:35.808565 kernel: ACPI: Reserving FACP table memory at [mem 0x9cb79000-0x9cb790f3] Sep 12 06:00:35.808575 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cb7a000-0x9cb7c1b9] Sep 12 06:00:35.808585 kernel: ACPI: Reserving FACS table memory at [mem 0x9cbdd000-0x9cbdd03f] Sep 12 06:00:35.808598 kernel: ACPI: Reserving APIC table memory at [mem 0x9cb78000-0x9cb7808f] Sep 12 06:00:35.808608 kernel: ACPI: Reserving HPET table memory at [mem 0x9cb77000-0x9cb77037] Sep 12 06:00:35.808618 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cb76000-0x9cb7603b] Sep 12 06:00:35.808627 kernel: ACPI: Reserving WAET table memory at [mem 0x9cb75000-0x9cb75027] Sep 12 06:00:35.808637 kernel: ACPI: Reserving BGRT table memory at [mem 0x9cb74000-0x9cb74037] Sep 12 06:00:35.808647 kernel: No NUMA configuration found Sep 12 06:00:35.808656 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cedbfff] Sep 12 06:00:35.808666 kernel: NODE_DATA(0) allocated [mem 0x9ce36dc0-0x9ce3dfff] Sep 12 06:00:35.808676 kernel: Zone ranges: Sep 12 06:00:35.808689 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 12 06:00:35.808699 kernel: DMA32 [mem 0x0000000001000000-0x000000009cedbfff] Sep 12 06:00:35.808708 kernel: Normal empty Sep 12 06:00:35.808719 kernel: Device empty Sep 12 06:00:35.808728 kernel: Movable zone start for each node Sep 12 06:00:35.808739 
kernel: Early memory node ranges Sep 12 06:00:35.808749 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Sep 12 06:00:35.808759 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff] Sep 12 06:00:35.808769 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff] Sep 12 06:00:35.808779 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff] Sep 12 06:00:35.808793 kernel: node 0: [mem 0x0000000000900000-0x000000009bd3efff] Sep 12 06:00:35.808802 kernel: node 0: [mem 0x000000009be00000-0x000000009c8ecfff] Sep 12 06:00:35.808811 kernel: node 0: [mem 0x000000009cbff000-0x000000009ce90fff] Sep 12 06:00:35.808820 kernel: node 0: [mem 0x000000009ce97000-0x000000009cedbfff] Sep 12 06:00:35.808829 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cedbfff] Sep 12 06:00:35.808839 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 12 06:00:35.808849 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Sep 12 06:00:35.808870 kernel: On node 0, zone DMA: 8 pages in unavailable ranges Sep 12 06:00:35.808880 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 12 06:00:35.808890 kernel: On node 0, zone DMA: 239 pages in unavailable ranges Sep 12 06:00:35.808899 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges Sep 12 06:00:35.808906 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges Sep 12 06:00:35.808916 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges Sep 12 06:00:35.808924 kernel: On node 0, zone DMA32: 12580 pages in unavailable ranges Sep 12 06:00:35.808932 kernel: ACPI: PM-Timer IO Port: 0x608 Sep 12 06:00:35.808939 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Sep 12 06:00:35.808947 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Sep 12 06:00:35.808957 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Sep 12 06:00:35.808964 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Sep 12 06:00:35.808972 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Sep 12 06:00:35.808980 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Sep 12 06:00:35.808987 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Sep 12 06:00:35.808995 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 12 06:00:35.809003 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Sep 12 06:00:35.809010 kernel: TSC deadline timer available Sep 12 06:00:35.809018 kernel: CPU topo: Max. logical packages: 1 Sep 12 06:00:35.809027 kernel: CPU topo: Max. logical dies: 1 Sep 12 06:00:35.809035 kernel: CPU topo: Max. dies per package: 1 Sep 12 06:00:35.809042 kernel: CPU topo: Max. threads per core: 1 Sep 12 06:00:35.809050 kernel: CPU topo: Num. cores per package: 4 Sep 12 06:00:35.809058 kernel: CPU topo: Num. 
threads per package: 4 Sep 12 06:00:35.809065 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs Sep 12 06:00:35.809073 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Sep 12 06:00:35.809081 kernel: kvm-guest: KVM setup pv remote TLB flush Sep 12 06:00:35.809088 kernel: kvm-guest: setup PV sched yield Sep 12 06:00:35.809096 kernel: [mem 0x9d000000-0xdfffffff] available for PCI devices Sep 12 06:00:35.809106 kernel: Booting paravirtualized kernel on KVM Sep 12 06:00:35.809114 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 12 06:00:35.809122 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1 Sep 12 06:00:35.809130 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288 Sep 12 06:00:35.809137 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152 Sep 12 06:00:35.809145 kernel: pcpu-alloc: [0] 0 1 2 3 Sep 12 06:00:35.809152 kernel: kvm-guest: PV spinlocks enabled Sep 12 06:00:35.809160 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Sep 12 06:00:35.809171 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=d36684c42387dba16669740eb40ca6a094be0dfb03f64a303630b6ac6cfe48d3 Sep 12 06:00:35.809180 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 12 06:00:35.809187 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Sep 12 06:00:35.809195 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 12 06:00:35.809203 kernel: Fallback order for Node 0: 0 Sep 12 06:00:35.809211 kernel: Built 1 zonelists, mobility grouping on. Total pages: 641450 Sep 12 06:00:35.809218 kernel: Policy zone: DMA32 Sep 12 06:00:35.809226 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 12 06:00:35.809234 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Sep 12 06:00:35.809243 kernel: ftrace: allocating 40123 entries in 157 pages Sep 12 06:00:35.809251 kernel: ftrace: allocated 157 pages with 5 groups Sep 12 06:00:35.809259 kernel: Dynamic Preempt: voluntary Sep 12 06:00:35.809267 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 12 06:00:35.809275 kernel: rcu: RCU event tracing is enabled. Sep 12 06:00:35.809284 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Sep 12 06:00:35.809302 kernel: Trampoline variant of Tasks RCU enabled. Sep 12 06:00:35.809310 kernel: Rude variant of Tasks RCU enabled. Sep 12 06:00:35.809317 kernel: Tracing variant of Tasks RCU enabled. Sep 12 06:00:35.809327 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 12 06:00:35.809335 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Sep 12 06:00:35.809343 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 12 06:00:35.809351 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 12 06:00:35.809362 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. 
Sep 12 06:00:35.809369 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16 Sep 12 06:00:35.809377 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 12 06:00:35.809385 kernel: Console: colour dummy device 80x25 Sep 12 06:00:35.809393 kernel: printk: legacy console [ttyS0] enabled Sep 12 06:00:35.809402 kernel: ACPI: Core revision 20240827 Sep 12 06:00:35.809410 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns Sep 12 06:00:35.809418 kernel: APIC: Switch to symmetric I/O mode setup Sep 12 06:00:35.809534 kernel: x2apic enabled Sep 12 06:00:35.809542 kernel: APIC: Switched APIC routing to: physical x2apic Sep 12 06:00:35.809550 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask() Sep 12 06:00:35.809558 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself() Sep 12 06:00:35.809565 kernel: kvm-guest: setup PV IPIs Sep 12 06:00:35.809573 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Sep 12 06:00:35.809584 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns Sep 12 06:00:35.809592 kernel: Calibrating delay loop (skipped) preset value.. 5589.49 BogoMIPS (lpj=2794748) Sep 12 06:00:35.809600 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Sep 12 06:00:35.809608 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 Sep 12 06:00:35.809615 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 Sep 12 06:00:35.809623 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 12 06:00:35.809631 kernel: Spectre V2 : Mitigation: Retpolines Sep 12 06:00:35.809639 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Sep 12 06:00:35.809652 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls Sep 12 06:00:35.809662 kernel: active return thunk: retbleed_return_thunk Sep 12 06:00:35.809673 kernel: RETBleed: Mitigation: untrained return thunk Sep 12 06:00:35.809683 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 12 06:00:35.809693 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 12 06:00:35.809704 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied! Sep 12 06:00:35.809715 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options. Sep 12 06:00:35.809724 kernel: active return thunk: srso_return_thunk Sep 12 06:00:35.809732 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode Sep 12 06:00:35.809743 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 12 06:00:35.809751 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 12 06:00:35.809758 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 12 06:00:35.809766 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Sep 12 06:00:35.809774 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Sep 12 06:00:35.809782 kernel: Freeing SMP alternatives memory: 32K Sep 12 06:00:35.809789 kernel: pid_max: default: 32768 minimum: 301 Sep 12 06:00:35.809797 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 12 06:00:35.809805 kernel: landlock: Up and running. 
Sep 12 06:00:35.809814 kernel: SELinux: Initializing. Sep 12 06:00:35.809822 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 12 06:00:35.809830 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 12 06:00:35.809838 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0) Sep 12 06:00:35.809846 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. Sep 12 06:00:35.809853 kernel: ... version: 0 Sep 12 06:00:35.809861 kernel: ... bit width: 48 Sep 12 06:00:35.809869 kernel: ... generic registers: 6 Sep 12 06:00:35.809876 kernel: ... value mask: 0000ffffffffffff Sep 12 06:00:35.809886 kernel: ... max period: 00007fffffffffff Sep 12 06:00:35.809894 kernel: ... fixed-purpose events: 0 Sep 12 06:00:35.809901 kernel: ... event mask: 000000000000003f Sep 12 06:00:35.809909 kernel: signal: max sigframe size: 1776 Sep 12 06:00:35.809917 kernel: rcu: Hierarchical SRCU implementation. Sep 12 06:00:35.809925 kernel: rcu: Max phase no-delay instances is 400. Sep 12 06:00:35.809933 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Sep 12 06:00:35.809940 kernel: smp: Bringing up secondary CPUs ... Sep 12 06:00:35.809948 kernel: smpboot: x86: Booting SMP configuration: Sep 12 06:00:35.809958 kernel: .... node #0, CPUs: #1 #2 #3 Sep 12 06:00:35.809966 kernel: smp: Brought up 1 node, 4 CPUs Sep 12 06:00:35.809973 kernel: smpboot: Total of 4 processors activated (22357.98 BogoMIPS) Sep 12 06:00:35.809981 kernel: Memory: 2422672K/2565800K available (14336K kernel code, 2432K rwdata, 9988K rodata, 54092K init, 2872K bss, 137200K reserved, 0K cma-reserved) Sep 12 06:00:35.809989 kernel: devtmpfs: initialized Sep 12 06:00:35.809997 kernel: x86/mm: Memory block size: 128MB Sep 12 06:00:35.810005 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes) Sep 12 06:00:35.810013 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes) Sep 12 06:00:35.810021 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes) Sep 12 06:00:35.810030 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cb7f000-0x9cbfefff] (524288 bytes) Sep 12 06:00:35.810038 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9ce95000-0x9ce96fff] (8192 bytes) Sep 12 06:00:35.810046 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cf60000-0x9cffffff] (655360 bytes) Sep 12 06:00:35.810054 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 12 06:00:35.810062 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Sep 12 06:00:35.810070 kernel: pinctrl core: initialized pinctrl subsystem Sep 12 06:00:35.810078 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 12 06:00:35.810085 kernel: audit: initializing netlink subsys (disabled) Sep 12 06:00:35.810093 kernel: audit: type=2000 audit(1757656833.736:1): state=initialized audit_enabled=0 res=1 Sep 12 06:00:35.810103 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 12 06:00:35.810111 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 12 06:00:35.810118 kernel: cpuidle: using governor menu Sep 12 06:00:35.810126 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 12 06:00:35.810134 kernel: dca service started, version 1.12.1 Sep 12 06:00:35.810142 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 
[bus 00-ff] Sep 12 06:00:35.810149 kernel: PCI: Using configuration type 1 for base access Sep 12 06:00:35.810157 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Sep 12 06:00:35.810167 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 12 06:00:35.810175 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 12 06:00:35.810183 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 12 06:00:35.810191 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 12 06:00:35.810199 kernel: ACPI: Added _OSI(Module Device) Sep 12 06:00:35.810206 kernel: ACPI: Added _OSI(Processor Device) Sep 12 06:00:35.810214 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 12 06:00:35.810221 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 12 06:00:35.810229 kernel: ACPI: Interpreter enabled Sep 12 06:00:35.810237 kernel: ACPI: PM: (supports S0 S3 S5) Sep 12 06:00:35.810246 kernel: ACPI: Using IOAPIC for interrupt routing Sep 12 06:00:35.810254 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 12 06:00:35.810262 kernel: PCI: Using E820 reservations for host bridge windows Sep 12 06:00:35.810270 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Sep 12 06:00:35.810278 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Sep 12 06:00:35.810488 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 12 06:00:35.810610 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Sep 12 06:00:35.810748 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Sep 12 06:00:35.810761 kernel: PCI host bridge to bus 0000:00 Sep 12 06:00:35.810881 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 12 06:00:35.810987 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Sep 12 06:00:35.811091 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 12 06:00:35.811194 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xdfffffff window] Sep 12 06:00:35.811306 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window] Sep 12 06:00:35.811416 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x3807ffffffff window] Sep 12 06:00:35.811551 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Sep 12 06:00:35.811689 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Sep 12 06:00:35.811832 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint Sep 12 06:00:35.811950 kernel: pci 0000:00:01.0: BAR 0 [mem 0xc0000000-0xc0ffffff pref] Sep 12 06:00:35.812063 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc1044000-0xc1044fff] Sep 12 06:00:35.812177 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref] Sep 12 06:00:35.812305 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 12 06:00:35.812472 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint Sep 12 06:00:35.812644 kernel: pci 0000:00:02.0: BAR 0 [io 0x6100-0x611f] Sep 12 06:00:35.812790 kernel: pci 0000:00:02.0: BAR 1 [mem 0xc1043000-0xc1043fff] Sep 12 06:00:35.812930 kernel: pci 0000:00:02.0: BAR 4 [mem 0x380000000000-0x380000003fff 64bit pref] Sep 12 06:00:35.813087 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint Sep 12 06:00:35.813252 
kernel: pci 0000:00:03.0: BAR 0 [io 0x6000-0x607f] Sep 12 06:00:35.813414 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc1042000-0xc1042fff] Sep 12 06:00:35.813587 kernel: pci 0000:00:03.0: BAR 4 [mem 0x380000004000-0x380000007fff 64bit pref] Sep 12 06:00:35.813754 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint Sep 12 06:00:35.813896 kernel: pci 0000:00:04.0: BAR 0 [io 0x60e0-0x60ff] Sep 12 06:00:35.814030 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc1041000-0xc1041fff] Sep 12 06:00:35.814166 kernel: pci 0000:00:04.0: BAR 4 [mem 0x380000008000-0x38000000bfff 64bit pref] Sep 12 06:00:35.814322 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref] Sep 12 06:00:35.814500 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Sep 12 06:00:35.814640 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Sep 12 06:00:35.814787 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Sep 12 06:00:35.814921 kernel: pci 0000:00:1f.2: BAR 4 [io 0x60c0-0x60df] Sep 12 06:00:35.815054 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xc1040000-0xc1040fff] Sep 12 06:00:35.815197 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Sep 12 06:00:35.815347 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6080-0x60bf] Sep 12 06:00:35.815362 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Sep 12 06:00:35.815373 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Sep 12 06:00:35.815383 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 12 06:00:35.815394 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Sep 12 06:00:35.815404 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Sep 12 06:00:35.815415 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Sep 12 06:00:35.815444 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Sep 12 06:00:35.815458 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Sep 12 06:00:35.815468 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Sep 12 06:00:35.815479 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Sep 12 06:00:35.815489 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Sep 12 06:00:35.815500 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Sep 12 06:00:35.815510 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Sep 12 06:00:35.815521 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Sep 12 06:00:35.815531 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Sep 12 06:00:35.815542 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Sep 12 06:00:35.815555 kernel: iommu: Default domain type: Translated Sep 12 06:00:35.815566 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 12 06:00:35.815576 kernel: efivars: Registered efivars operations Sep 12 06:00:35.815587 kernel: PCI: Using ACPI for IRQ routing Sep 12 06:00:35.815597 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 12 06:00:35.815608 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Sep 12 06:00:35.815618 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff] Sep 12 06:00:35.815629 kernel: e820: reserve RAM buffer [mem 0x9b2e3018-0x9bffffff] Sep 12 06:00:35.815639 kernel: e820: reserve RAM buffer [mem 0x9b320018-0x9bffffff] Sep 12 06:00:35.815652 kernel: e820: reserve RAM buffer [mem 0x9bd3f000-0x9bffffff] Sep 12 06:00:35.815662 kernel: e820: reserve RAM buffer [mem 
0x9c8ed000-0x9fffffff] Sep 12 06:00:35.815673 kernel: e820: reserve RAM buffer [mem 0x9ce91000-0x9fffffff] Sep 12 06:00:35.815683 kernel: e820: reserve RAM buffer [mem 0x9cedc000-0x9fffffff] Sep 12 06:00:35.815825 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Sep 12 06:00:35.815959 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Sep 12 06:00:35.816091 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 12 06:00:35.816104 kernel: vgaarb: loaded Sep 12 06:00:35.816119 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 Sep 12 06:00:35.816130 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter Sep 12 06:00:35.816140 kernel: clocksource: Switched to clocksource kvm-clock Sep 12 06:00:35.816151 kernel: VFS: Disk quotas dquot_6.6.0 Sep 12 06:00:35.816162 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 12 06:00:35.816173 kernel: pnp: PnP ACPI init Sep 12 06:00:35.816346 kernel: system 00:05: [mem 0xe0000000-0xefffffff window] has been reserved Sep 12 06:00:35.816366 kernel: pnp: PnP ACPI: found 6 devices Sep 12 06:00:35.816380 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 12 06:00:35.816391 kernel: NET: Registered PF_INET protocol family Sep 12 06:00:35.816402 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 12 06:00:35.816413 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Sep 12 06:00:35.816437 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 12 06:00:35.816449 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 12 06:00:35.816460 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Sep 12 06:00:35.816471 kernel: TCP: Hash tables configured (established 32768 bind 32768) Sep 12 06:00:35.816482 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 12 06:00:35.816496 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 12 06:00:35.816506 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 12 06:00:35.816517 kernel: NET: Registered PF_XDP protocol family Sep 12 06:00:35.816659 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]: can't claim; no compatible bridge window Sep 12 06:00:35.816795 kernel: pci 0000:00:04.0: ROM [mem 0x9d000000-0x9d03ffff pref]: assigned Sep 12 06:00:35.816975 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 12 06:00:35.817129 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Sep 12 06:00:35.817261 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 12 06:00:35.817408 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xdfffffff window] Sep 12 06:00:35.817560 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Sep 12 06:00:35.817693 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x3807ffffffff window] Sep 12 06:00:35.817711 kernel: PCI: CLS 0 bytes, default 64 Sep 12 06:00:35.817723 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns Sep 12 06:00:35.817735 kernel: Initialise system trusted keyrings Sep 12 06:00:35.817751 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Sep 12 06:00:35.817763 kernel: Key type asymmetric registered Sep 12 06:00:35.817774 kernel: Asymmetric key parser 'x509' registered Sep 12 06:00:35.817786 kernel: Block layer SCSI 
generic (bsg) driver version 0.4 loaded (major 250) Sep 12 06:00:35.817798 kernel: io scheduler mq-deadline registered Sep 12 06:00:35.817810 kernel: io scheduler kyber registered Sep 12 06:00:35.817822 kernel: io scheduler bfq registered Sep 12 06:00:35.817833 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 12 06:00:35.817849 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Sep 12 06:00:35.817861 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Sep 12 06:00:35.817872 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Sep 12 06:00:35.817884 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 12 06:00:35.817896 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 12 06:00:35.817908 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Sep 12 06:00:35.817919 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 12 06:00:35.817931 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 12 06:00:35.818090 kernel: rtc_cmos 00:04: RTC can wake from S4 Sep 12 06:00:35.818110 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 12 06:00:35.818244 kernel: rtc_cmos 00:04: registered as rtc0 Sep 12 06:00:35.818410 kernel: rtc_cmos 00:04: setting system clock to 2025-09-12T06:00:35 UTC (1757656835) Sep 12 06:00:35.818624 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram Sep 12 06:00:35.818641 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Sep 12 06:00:35.818653 kernel: efifb: probing for efifb Sep 12 06:00:35.818663 kernel: efifb: framebuffer at 0xc0000000, using 4000k, total 4000k Sep 12 06:00:35.818674 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Sep 12 06:00:35.818689 kernel: efifb: scrolling: redraw Sep 12 06:00:35.818699 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Sep 12 06:00:35.818710 kernel: Console: switching to colour frame buffer device 160x50 Sep 12 06:00:35.818721 kernel: fb0: EFI VGA frame buffer device Sep 12 06:00:35.818736 kernel: pstore: Using crash dump compression: deflate Sep 12 06:00:35.818747 kernel: pstore: Registered efi_pstore as persistent store backend Sep 12 06:00:35.818758 kernel: NET: Registered PF_INET6 protocol family Sep 12 06:00:35.818778 kernel: Segment Routing with IPv6 Sep 12 06:00:35.818797 kernel: In-situ OAM (IOAM) with IPv6 Sep 12 06:00:35.818813 kernel: NET: Registered PF_PACKET protocol family Sep 12 06:00:35.818823 kernel: Key type dns_resolver registered Sep 12 06:00:35.818833 kernel: IPI shorthand broadcast: enabled Sep 12 06:00:35.818844 kernel: sched_clock: Marking stable (2784002785, 152745064)->(2951091180, -14343331) Sep 12 06:00:35.818857 kernel: registered taskstats version 1 Sep 12 06:00:35.818868 kernel: Loading compiled-in X.509 certificates Sep 12 06:00:35.818878 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.46-flatcar: c974434132f0296e0aaf9b1358c8dc50eba5c8b9' Sep 12 06:00:35.818888 kernel: Demotion targets for Node 0: null Sep 12 06:00:35.818899 kernel: Key type .fscrypt registered Sep 12 06:00:35.818912 kernel: Key type fscrypt-provisioning registered Sep 12 06:00:35.818922 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 12 06:00:35.818933 kernel: ima: Allocated hash algorithm: sha1 Sep 12 06:00:35.818943 kernel: ima: No architecture policies found Sep 12 06:00:35.818954 kernel: clk: Disabling unused clocks Sep 12 06:00:35.818964 kernel: Warning: unable to open an initial console. 
Sep 12 06:00:35.818976 kernel: Freeing unused kernel image (initmem) memory: 54092K Sep 12 06:00:35.818987 kernel: Write protecting the kernel read-only data: 24576k Sep 12 06:00:35.818997 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K Sep 12 06:00:35.819010 kernel: Run /init as init process Sep 12 06:00:35.819021 kernel: with arguments: Sep 12 06:00:35.819032 kernel: /init Sep 12 06:00:35.819043 kernel: with environment: Sep 12 06:00:35.819053 kernel: HOME=/ Sep 12 06:00:35.819064 kernel: TERM=linux Sep 12 06:00:35.819075 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 12 06:00:35.819087 systemd[1]: Successfully made /usr/ read-only. Sep 12 06:00:35.819106 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 06:00:35.819119 systemd[1]: Detected virtualization kvm. Sep 12 06:00:35.819131 systemd[1]: Detected architecture x86-64. Sep 12 06:00:35.819143 systemd[1]: Running in initrd. Sep 12 06:00:35.819155 systemd[1]: No hostname configured, using default hostname. Sep 12 06:00:35.819167 systemd[1]: Hostname set to . Sep 12 06:00:35.819179 systemd[1]: Initializing machine ID from VM UUID. Sep 12 06:00:35.819191 systemd[1]: Queued start job for default target initrd.target. Sep 12 06:00:35.819205 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 06:00:35.819252 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 06:00:35.819266 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 12 06:00:35.819278 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 06:00:35.819302 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 12 06:00:35.819316 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 12 06:00:35.819329 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 12 06:00:35.819345 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 12 06:00:35.819357 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 06:00:35.819370 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 06:00:35.819382 systemd[1]: Reached target paths.target - Path Units. Sep 12 06:00:35.819394 systemd[1]: Reached target slices.target - Slice Units. Sep 12 06:00:35.819406 systemd[1]: Reached target swap.target - Swaps. Sep 12 06:00:35.819418 systemd[1]: Reached target timers.target - Timer Units. Sep 12 06:00:35.819458 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 06:00:35.819473 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 06:00:35.819486 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 12 06:00:35.819498 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 12 06:00:35.819511 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Sep 12 06:00:35.819523 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 06:00:35.819535 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 06:00:35.819548 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 06:00:35.819560 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 12 06:00:35.819573 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 06:00:35.819588 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 12 06:00:35.819601 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 12 06:00:35.819613 systemd[1]: Starting systemd-fsck-usr.service... Sep 12 06:00:35.819625 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 06:00:35.819636 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 06:00:35.819648 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 06:00:35.819660 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 12 06:00:35.819677 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 06:00:35.819692 systemd[1]: Finished systemd-fsck-usr.service. Sep 12 06:00:35.819740 systemd-journald[220]: Collecting audit messages is disabled. Sep 12 06:00:35.819774 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 06:00:35.819787 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 06:00:35.819800 systemd-journald[220]: Journal started Sep 12 06:00:35.819828 systemd-journald[220]: Runtime Journal (/run/log/journal/c7c84de63c7e4683ab72be894039b1eb) is 6M, max 48.4M, 42.4M free. Sep 12 06:00:35.818905 systemd-modules-load[222]: Inserted module 'overlay' Sep 12 06:00:35.824442 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 06:00:35.825236 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 06:00:35.830750 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 12 06:00:35.832068 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 06:00:35.836466 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 06:00:35.850471 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 12 06:00:35.852452 kernel: Bridge firewalling registered Sep 12 06:00:35.852394 systemd-modules-load[222]: Inserted module 'br_netfilter' Sep 12 06:00:35.853861 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 06:00:35.857910 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 06:00:35.861599 systemd-tmpfiles[241]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 12 06:00:35.864911 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 06:00:35.868268 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 06:00:35.872044 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Sep 12 06:00:35.873303 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 06:00:35.877300 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 12 06:00:35.879059 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 06:00:35.907000 dracut-cmdline[261]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=d36684c42387dba16669740eb40ca6a094be0dfb03f64a303630b6ac6cfe48d3 Sep 12 06:00:35.927319 systemd-resolved[262]: Positive Trust Anchors: Sep 12 06:00:35.927334 systemd-resolved[262]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 06:00:35.927364 systemd-resolved[262]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 06:00:35.930415 systemd-resolved[262]: Defaulting to hostname 'linux'. Sep 12 06:00:35.931507 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 06:00:35.936476 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 06:00:36.018464 kernel: SCSI subsystem initialized Sep 12 06:00:36.027456 kernel: Loading iSCSI transport class v2.0-870. Sep 12 06:00:36.038467 kernel: iscsi: registered transport (tcp) Sep 12 06:00:36.059739 kernel: iscsi: registered transport (qla4xxx) Sep 12 06:00:36.059829 kernel: QLogic iSCSI HBA Driver Sep 12 06:00:36.082135 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 06:00:36.112447 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 06:00:36.115189 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 06:00:36.172186 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 12 06:00:36.174325 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 12 06:00:36.229458 kernel: raid6: avx2x4 gen() 21394 MB/s Sep 12 06:00:36.246446 kernel: raid6: avx2x2 gen() 27993 MB/s Sep 12 06:00:36.263505 kernel: raid6: avx2x1 gen() 25979 MB/s Sep 12 06:00:36.263532 kernel: raid6: using algorithm avx2x2 gen() 27993 MB/s Sep 12 06:00:36.281506 kernel: raid6: .... xor() 19993 MB/s, rmw enabled Sep 12 06:00:36.281522 kernel: raid6: using avx2x2 recovery algorithm Sep 12 06:00:36.301448 kernel: xor: automatically using best checksumming function avx Sep 12 06:00:36.460464 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 12 06:00:36.469036 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 12 06:00:36.471799 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 06:00:36.513832 systemd-udevd[472]: Using default interface naming scheme 'v255'. 
Sep 12 06:00:36.519395 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 06:00:36.523457 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 12 06:00:36.557451 dracut-pre-trigger[482]: rd.md=0: removing MD RAID activation Sep 12 06:00:36.585315 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 06:00:36.588791 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 06:00:36.765998 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 06:00:36.778557 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 12 06:00:36.816442 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Sep 12 06:00:36.821441 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Sep 12 06:00:36.823459 kernel: cryptd: max_cpu_qlen set to 1000 Sep 12 06:00:36.826992 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 12 06:00:36.827013 kernel: GPT:9289727 != 19775487 Sep 12 06:00:36.827028 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 12 06:00:36.827038 kernel: GPT:9289727 != 19775487 Sep 12 06:00:36.828437 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 12 06:00:36.828459 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 06:00:36.841443 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Sep 12 06:00:36.842462 kernel: AES CTR mode by8 optimization enabled Sep 12 06:00:36.862442 kernel: libata version 3.00 loaded. Sep 12 06:00:36.869944 kernel: ahci 0000:00:1f.2: version 3.0 Sep 12 06:00:36.870545 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 12 06:00:36.876418 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 06:00:36.876649 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 06:00:36.885068 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 12 06:00:36.885235 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 12 06:00:36.885385 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 12 06:00:36.879864 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 06:00:36.890273 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 06:00:36.897769 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
Sep 12 06:00:36.904463 kernel: scsi host0: ahci Sep 12 06:00:36.904709 kernel: scsi host1: ahci Sep 12 06:00:36.908667 kernel: scsi host2: ahci Sep 12 06:00:36.908875 kernel: scsi host3: ahci Sep 12 06:00:36.909019 kernel: scsi host4: ahci Sep 12 06:00:36.916480 kernel: scsi host5: ahci Sep 12 06:00:36.916686 kernel: ata1: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040100 irq 34 lpm-pol 1 Sep 12 06:00:36.916705 kernel: ata2: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040180 irq 34 lpm-pol 1 Sep 12 06:00:36.916715 kernel: ata3: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040200 irq 34 lpm-pol 1 Sep 12 06:00:36.917493 kernel: ata4: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040280 irq 34 lpm-pol 1 Sep 12 06:00:36.918540 kernel: ata5: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040300 irq 34 lpm-pol 1 Sep 12 06:00:36.918558 kernel: ata6: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040380 irq 34 lpm-pol 1 Sep 12 06:00:36.924267 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 12 06:00:36.940301 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 12 06:00:36.958179 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 12 06:00:36.966621 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 12 06:00:36.967859 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 12 06:00:36.971678 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 12 06:00:36.973213 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 06:00:36.973272 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 06:00:36.976450 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 06:00:36.997917 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 06:00:36.999171 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 06:00:37.017923 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 06:00:37.228103 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 12 06:00:37.228133 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 12 06:00:37.228144 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Sep 12 06:00:37.228155 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 12 06:00:37.228445 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 12 06:00:37.229450 kernel: ata3.00: LPM support broken, forcing max_power Sep 12 06:00:37.229465 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Sep 12 06:00:37.230024 kernel: ata3.00: applying bridge limits Sep 12 06:00:37.231456 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 12 06:00:37.231470 kernel: ata3.00: LPM support broken, forcing max_power Sep 12 06:00:37.231860 kernel: ata3.00: configured for UDMA/100 Sep 12 06:00:37.234458 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Sep 12 06:00:37.285457 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Sep 12 06:00:37.285827 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 12 06:00:37.286972 disk-uuid[636]: Primary Header is updated. Sep 12 06:00:37.286972 disk-uuid[636]: Secondary Entries is updated. 
Sep 12 06:00:37.286972 disk-uuid[636]: Secondary Header is updated. Sep 12 06:00:37.290443 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 06:00:37.303447 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Sep 12 06:00:37.647839 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 06:00:37.649595 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 06:00:37.651196 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 06:00:37.652406 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 06:00:37.655629 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 06:00:37.680367 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 12 06:00:38.297449 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 06:00:38.297631 disk-uuid[642]: The operation has completed successfully. Sep 12 06:00:38.328967 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 06:00:38.329089 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 06:00:38.362974 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 06:00:38.384523 sh[670]: Success Sep 12 06:00:38.402352 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 12 06:00:38.402397 kernel: device-mapper: uevent: version 1.0.3 Sep 12 06:00:38.402409 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 12 06:00:38.411529 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Sep 12 06:00:38.440751 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 06:00:38.443860 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 12 06:00:38.459418 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 12 06:00:38.466454 kernel: BTRFS: device fsid 29ae74b1-0ab1-4a84-96e7-98d98e1ec77f devid 1 transid 35 /dev/mapper/usr (253:0) scanned by mount (682) Sep 12 06:00:38.468588 kernel: BTRFS info (device dm-0): first mount of filesystem 29ae74b1-0ab1-4a84-96e7-98d98e1ec77f Sep 12 06:00:38.468604 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 12 06:00:38.473453 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 06:00:38.473472 kernel: BTRFS info (device dm-0): enabling free space tree Sep 12 06:00:38.474628 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 06:00:38.476750 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 12 06:00:38.478868 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 12 06:00:38.481494 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 12 06:00:38.484017 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Sep 12 06:00:38.514759 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (715) Sep 12 06:00:38.517194 kernel: BTRFS info (device vda6): first mount of filesystem 88e8cff7-d302-45f0-bf99-3731957f99ae Sep 12 06:00:38.517225 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 06:00:38.520462 kernel: BTRFS info (device vda6): turning on async discard Sep 12 06:00:38.520520 kernel: BTRFS info (device vda6): enabling free space tree Sep 12 06:00:38.525441 kernel: BTRFS info (device vda6): last unmount of filesystem 88e8cff7-d302-45f0-bf99-3731957f99ae Sep 12 06:00:38.527106 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 12 06:00:38.530190 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 06:00:38.654719 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 06:00:38.659346 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 06:00:38.820771 ignition[760]: Ignition 2.22.0 Sep 12 06:00:38.820784 ignition[760]: Stage: fetch-offline Sep 12 06:00:38.820839 ignition[760]: no configs at "/usr/lib/ignition/base.d" Sep 12 06:00:38.820849 ignition[760]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 06:00:38.820956 ignition[760]: parsed url from cmdline: "" Sep 12 06:00:38.820960 ignition[760]: no config URL provided Sep 12 06:00:38.820967 ignition[760]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 06:00:38.820976 ignition[760]: no config at "/usr/lib/ignition/user.ign" Sep 12 06:00:38.821000 ignition[760]: op(1): [started] loading QEMU firmware config module Sep 12 06:00:38.821005 ignition[760]: op(1): executing: "modprobe" "qemu_fw_cfg" Sep 12 06:00:38.831085 ignition[760]: op(1): [finished] loading QEMU firmware config module Sep 12 06:00:38.834835 systemd-networkd[851]: lo: Link UP Sep 12 06:00:38.834845 systemd-networkd[851]: lo: Gained carrier Sep 12 06:00:38.836327 systemd-networkd[851]: Enumeration completed Sep 12 06:00:38.836513 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 06:00:38.837131 systemd[1]: Reached target network.target - Network. Sep 12 06:00:38.837834 systemd-networkd[851]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 06:00:38.837838 systemd-networkd[851]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 06:00:38.838398 systemd-networkd[851]: eth0: Link UP Sep 12 06:00:38.841112 systemd-networkd[851]: eth0: Gained carrier Sep 12 06:00:38.841135 systemd-networkd[851]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 06:00:38.882876 ignition[760]: parsing config with SHA512: 0090cf5a0d6025baf4ac42360b15380f71fb5ce96f779d02bdfa29dd20230f98fb27b45d1dce9f3346970205ed5bcd9bd1d9f51c3a54abf08e5faab620fb7a8e Sep 12 06:00:38.885480 systemd-networkd[851]: eth0: DHCPv4 address 10.0.0.132/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 12 06:00:38.892838 unknown[760]: fetched base config from "system" Sep 12 06:00:38.893755 unknown[760]: fetched user config from "qemu" Sep 12 06:00:38.894144 ignition[760]: fetch-offline: fetch-offline passed Sep 12 06:00:38.894225 ignition[760]: Ignition finished successfully Sep 12 06:00:38.897934 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
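The fetch-offline stage above finds no config URL on the command line, loads the qemu_fw_cfg module and then reports a user config fetched from "qemu". As a hedged sketch (the fw_cfg key opt/com.coreos/config and the spec version are assumptions, not taken from this log), a config can be supplied to the guest like this:

    # Guest config: a minimal Ignition file adding an SSH key for "core"
    # (spec version and key are illustrative placeholders).
    cat > config.ign <<'EOF'
    { "ignition": { "version": "3.4.0" },
      "passwd": { "users": [ { "name": "core",
          "sshAuthorizedKeys": [ "ssh-ed25519 AAAA... placeholder" ] } ] } }
    EOF
    # Host side: expose the file to the guest through QEMU's fw_cfg interface.
    qemu-system-x86_64 ... \
        -fw_cfg name=opt/com.coreos/config,file=./config.ign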
Sep 12 06:00:38.898575 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 12 06:00:38.915641 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 12 06:00:38.992579 ignition[864]: Ignition 2.22.0 Sep 12 06:00:38.992593 ignition[864]: Stage: kargs Sep 12 06:00:38.992745 ignition[864]: no configs at "/usr/lib/ignition/base.d" Sep 12 06:00:38.992756 ignition[864]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 06:00:38.993459 ignition[864]: kargs: kargs passed Sep 12 06:00:38.993501 ignition[864]: Ignition finished successfully Sep 12 06:00:39.001558 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 06:00:39.005079 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 12 06:00:39.051907 ignition[872]: Ignition 2.22.0 Sep 12 06:00:39.051927 ignition[872]: Stage: disks Sep 12 06:00:39.052072 ignition[872]: no configs at "/usr/lib/ignition/base.d" Sep 12 06:00:39.052082 ignition[872]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 06:00:39.053134 ignition[872]: disks: disks passed Sep 12 06:00:39.053185 ignition[872]: Ignition finished successfully Sep 12 06:00:39.059397 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 12 06:00:39.061571 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 12 06:00:39.062000 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 06:00:39.062320 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 06:00:39.062808 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 06:00:39.063117 systemd[1]: Reached target basic.target - Basic System. Sep 12 06:00:39.064566 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 12 06:00:39.095657 systemd-fsck[884]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 12 06:00:39.104103 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 06:00:39.107567 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 06:00:39.342451 kernel: EXT4-fs (vda9): mounted filesystem 2b8062f9-897a-46cb-bde4-2b62ba4cc712 r/w with ordered data mode. Quota mode: none. Sep 12 06:00:39.342846 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 06:00:39.343785 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 06:00:39.345980 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 06:00:39.348362 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 06:00:39.349547 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 12 06:00:39.349594 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 06:00:39.349616 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 06:00:39.363354 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 12 06:00:39.365234 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
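systemd-fsck-root and sysroot.mount above check and then mount the ext4 ROOT filesystem by label; done by hand, the steps look roughly like this (mount point as used in the initrd):

    # Non-interactive check of the ROOT filesystem, then mount it read/write.
    e2fsck -p /dev/disk/by-label/ROOT
    mount /dev/disk/by-label/ROOT /sysroot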
Sep 12 06:00:39.369977 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (892) Sep 12 06:00:39.370008 kernel: BTRFS info (device vda6): first mount of filesystem 88e8cff7-d302-45f0-bf99-3731957f99ae Sep 12 06:00:39.370914 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 06:00:39.374903 kernel: BTRFS info (device vda6): turning on async discard Sep 12 06:00:39.374925 kernel: BTRFS info (device vda6): enabling free space tree Sep 12 06:00:39.377717 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 06:00:39.445193 initrd-setup-root[916]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 06:00:39.449331 initrd-setup-root[923]: cut: /sysroot/etc/group: No such file or directory Sep 12 06:00:39.453788 initrd-setup-root[930]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 06:00:39.457547 initrd-setup-root[937]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 06:00:39.563135 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 06:00:39.565517 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 12 06:00:39.566797 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 06:00:39.590223 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 12 06:00:39.591534 kernel: BTRFS info (device vda6): last unmount of filesystem 88e8cff7-d302-45f0-bf99-3731957f99ae Sep 12 06:00:39.607857 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 12 06:00:39.692805 ignition[1005]: INFO : Ignition 2.22.0 Sep 12 06:00:39.692805 ignition[1005]: INFO : Stage: mount Sep 12 06:00:39.694640 ignition[1005]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 06:00:39.694640 ignition[1005]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 06:00:39.698677 ignition[1005]: INFO : mount: mount passed Sep 12 06:00:39.699510 ignition[1005]: INFO : Ignition finished successfully Sep 12 06:00:39.703627 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 06:00:39.706742 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 06:00:39.730308 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 06:00:39.760448 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1018) Sep 12 06:00:39.760475 kernel: BTRFS info (device vda6): first mount of filesystem 88e8cff7-d302-45f0-bf99-3731957f99ae Sep 12 06:00:39.761829 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 06:00:39.765449 kernel: BTRFS info (device vda6): turning on async discard Sep 12 06:00:39.765467 kernel: BTRFS info (device vda6): enabling free space tree Sep 12 06:00:39.767074 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
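The OEM partition (/dev/vda6, btrfs, label OEM) is mounted at /sysroot/oem in much the same way; a small sketch for inspecting and mounting it by label:

    # Show the btrfs filesystem backing /dev/vda6, then mount it by label.
    btrfs filesystem show /dev/vda6
    mount /dev/disk/by-label/OEM /sysroot/oem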
Sep 12 06:00:39.829123 ignition[1035]: INFO : Ignition 2.22.0 Sep 12 06:00:39.829123 ignition[1035]: INFO : Stage: files Sep 12 06:00:39.831270 ignition[1035]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 06:00:39.831270 ignition[1035]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 06:00:39.831270 ignition[1035]: DEBUG : files: compiled without relabeling support, skipping Sep 12 06:00:39.834716 ignition[1035]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 12 06:00:39.834716 ignition[1035]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 12 06:00:39.837874 ignition[1035]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 12 06:00:39.839667 ignition[1035]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 12 06:00:39.841673 unknown[1035]: wrote ssh authorized keys file for user: core Sep 12 06:00:39.842914 ignition[1035]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 12 06:00:39.846115 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 12 06:00:39.847960 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Sep 12 06:00:39.895817 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 12 06:00:40.385264 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 12 06:00:40.385264 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 12 06:00:40.389110 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 12 06:00:40.389110 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 12 06:00:40.389110 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 12 06:00:40.389110 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 06:00:40.389110 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 06:00:40.389110 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 06:00:40.389110 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 06:00:40.403580 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 06:00:40.405501 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 06:00:40.405501 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 06:00:40.409785 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(9): 
[finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 06:00:40.412279 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 06:00:40.412279 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Sep 12 06:00:40.667581 systemd-networkd[851]: eth0: Gained IPv6LL Sep 12 06:00:40.712900 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 12 06:00:41.443193 ignition[1035]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 06:00:41.443193 ignition[1035]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 12 06:00:41.446966 ignition[1035]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 06:00:41.679725 ignition[1035]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 06:00:41.679725 ignition[1035]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 12 06:00:41.679725 ignition[1035]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 12 06:00:41.679725 ignition[1035]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 12 06:00:41.686412 ignition[1035]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 12 06:00:41.686412 ignition[1035]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 12 06:00:41.686412 ignition[1035]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Sep 12 06:00:41.703955 ignition[1035]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Sep 12 06:00:41.712189 ignition[1035]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Sep 12 06:00:41.713780 ignition[1035]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Sep 12 06:00:41.713780 ignition[1035]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Sep 12 06:00:41.713780 ignition[1035]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Sep 12 06:00:41.713780 ignition[1035]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 12 06:00:41.713780 ignition[1035]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 12 06:00:41.713780 ignition[1035]: INFO : files: files passed Sep 12 06:00:41.713780 ignition[1035]: INFO : Ignition finished successfully Sep 12 06:00:41.722390 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 12 06:00:41.725639 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 06:00:41.727946 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
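In the files stage above, Ignition enables prepare-helm.service and disables coreos-metadata.service inside the new root by writing presets and removing enablement symlinks. A rough equivalent from a shell, operating on the mounted root rather than the running system (a sketch, not what Ignition literally executes):

    # Enable/disable units in the future root filesystem.
    systemctl --root=/sysroot enable prepare-helm.service
    systemctl --root=/sysroot disable coreos-metadata.service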
Sep 12 06:00:41.744793 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 06:00:41.744912 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 12 06:00:41.748605 initrd-setup-root-after-ignition[1064]: grep: /sysroot/oem/oem-release: No such file or directory Sep 12 06:00:41.751418 initrd-setup-root-after-ignition[1066]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 06:00:41.751418 initrd-setup-root-after-ignition[1066]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 06:00:41.793528 initrd-setup-root-after-ignition[1070]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 06:00:41.795311 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 06:00:41.798414 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 06:00:41.801316 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 12 06:00:41.878239 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 06:00:41.878367 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 12 06:00:41.878897 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 12 06:00:41.881678 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 12 06:00:41.883715 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 12 06:00:41.884515 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 12 06:00:41.925165 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 06:00:41.927044 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 06:00:41.947172 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 06:00:41.959410 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 06:00:41.959788 systemd[1]: Stopped target timers.target - Timer Units. Sep 12 06:00:41.961901 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 06:00:41.962044 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 06:00:41.965301 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 06:00:41.965946 systemd[1]: Stopped target basic.target - Basic System. Sep 12 06:00:41.966275 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 06:00:41.966761 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 06:00:41.967075 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 12 06:00:41.967408 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 12 06:00:41.967888 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 06:00:42.041239 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 06:00:42.054607 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 06:00:42.055266 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 12 06:00:42.057245 systemd[1]: Stopped target swap.target - Swaps. Sep 12 06:00:42.057711 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 12 06:00:42.057903 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. 
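The grep errors above show that neither /sysroot/etc/flatcar/enabled-sysext.conf nor /sysroot/usr/share/flatcar/enabled-sysext.conf exists, so no optional extensions are requested at this point. If one were wanted, the file presumably lists extension names, one per line; an illustrative (assumed) sketch:

    # Assumed format: one extension name per line (placeholder name).
    cat > /sysroot/etc/flatcar/enabled-sysext.conf <<'EOF'
    <extension-name>
    EOF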
Sep 12 06:00:42.060937 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 06:00:42.061305 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 06:00:42.061813 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 12 06:00:42.067020 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 06:00:42.067936 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 12 06:00:42.068144 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 12 06:00:42.071396 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 06:00:42.071616 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 06:00:42.072026 systemd[1]: Stopped target paths.target - Path Units. Sep 12 06:00:42.072270 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 12 06:00:42.081574 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 06:00:42.084537 systemd[1]: Stopped target slices.target - Slice Units. Sep 12 06:00:42.085173 systemd[1]: Stopped target sockets.target - Socket Units. Sep 12 06:00:42.086844 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 06:00:42.086949 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 06:00:42.089250 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 12 06:00:42.089337 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 06:00:42.090581 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 12 06:00:42.090695 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 06:00:42.092439 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 06:00:42.092547 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 12 06:00:42.094018 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 06:00:42.103106 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 06:00:42.104886 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 12 06:00:42.105024 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 06:00:42.105760 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 12 06:00:42.105862 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 06:00:42.115032 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 12 06:00:42.119632 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 12 06:00:42.145653 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 12 06:00:42.151591 ignition[1091]: INFO : Ignition 2.22.0 Sep 12 06:00:42.151591 ignition[1091]: INFO : Stage: umount Sep 12 06:00:42.151591 ignition[1091]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 06:00:42.151591 ignition[1091]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 06:00:42.156555 ignition[1091]: INFO : umount: umount passed Sep 12 06:00:42.156555 ignition[1091]: INFO : Ignition finished successfully Sep 12 06:00:42.155840 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 06:00:42.155973 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 06:00:42.159073 systemd[1]: Stopped target network.target - Network. 
Sep 12 06:00:42.159767 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 12 06:00:42.159833 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 06:00:42.160098 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 12 06:00:42.160151 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 06:00:42.160755 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 06:00:42.160807 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 12 06:00:42.161062 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 06:00:42.161101 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 06:00:42.161559 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 12 06:00:42.162043 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 06:00:42.174132 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 12 06:00:42.174245 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 06:00:42.179104 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 12 06:00:42.179765 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 12 06:00:42.179842 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 06:00:42.186456 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 12 06:00:42.186709 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 12 06:00:42.186821 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 06:00:42.191523 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 12 06:00:42.192075 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 12 06:00:42.194754 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 12 06:00:42.194809 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 12 06:00:42.196384 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 06:00:42.199545 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 06:00:42.200532 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 06:00:42.202917 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 06:00:42.202967 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 12 06:00:42.205821 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 06:00:42.205869 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 12 06:00:42.208945 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 06:00:42.212214 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 12 06:00:42.226648 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 06:00:42.226784 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 06:00:42.228380 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 12 06:00:42.228561 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 06:00:42.230618 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 12 06:00:42.230676 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. 
Sep 12 06:00:42.233003 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 12 06:00:42.233042 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 06:00:42.233299 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 12 06:00:42.233351 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 06:00:42.234161 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 12 06:00:42.234208 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 06:00:42.240479 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 06:00:42.240531 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 06:00:42.245215 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 12 06:00:42.245566 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 12 06:00:42.245623 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 06:00:42.250323 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 12 06:00:42.250367 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 06:00:42.253626 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 12 06:00:42.253672 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 06:00:42.256910 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 06:00:42.256956 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 06:00:42.257444 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 06:00:42.257484 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 06:00:42.268191 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 06:00:42.268309 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 12 06:00:42.690765 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 06:00:42.690911 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 06:00:42.691881 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 06:00:42.693762 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 06:00:42.693813 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 06:00:42.694888 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 06:00:42.723891 systemd[1]: Switching root. Sep 12 06:00:42.763370 systemd-journald[220]: Journal stopped Sep 12 06:00:44.149444 systemd-journald[220]: Received SIGTERM from PID 1 (systemd). 
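After the switch root the initrd journal is stopped, but its entries stay part of this boot's journal and can be read from the running system, e.g.:

    # Replay this boot's log with microsecond timestamps, as seen above.
    journalctl -b -o short-precise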
Sep 12 06:00:44.149536 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 06:00:44.149554 kernel: SELinux: policy capability open_perms=1 Sep 12 06:00:44.149569 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 06:00:44.149590 kernel: SELinux: policy capability always_check_network=0 Sep 12 06:00:44.149605 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 06:00:44.149620 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 06:00:44.149634 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 06:00:44.149652 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 06:00:44.149666 kernel: SELinux: policy capability userspace_initial_context=0 Sep 12 06:00:44.149680 kernel: audit: type=1403 audit(1757656843.218:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 06:00:44.149696 systemd[1]: Successfully loaded SELinux policy in 67.820ms. Sep 12 06:00:44.149746 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 8.555ms. Sep 12 06:00:44.149763 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 06:00:44.149780 systemd[1]: Detected virtualization kvm. Sep 12 06:00:44.149801 systemd[1]: Detected architecture x86-64. Sep 12 06:00:44.149816 systemd[1]: Detected first boot. Sep 12 06:00:44.149834 systemd[1]: Initializing machine ID from VM UUID. Sep 12 06:00:44.149849 zram_generator::config[1137]: No configuration found. Sep 12 06:00:44.149866 kernel: Guest personality initialized and is inactive Sep 12 06:00:44.149898 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 12 06:00:44.149912 kernel: Initialized host personality Sep 12 06:00:44.149926 kernel: NET: Registered PF_VSOCK protocol family Sep 12 06:00:44.149941 systemd[1]: Populated /etc with preset unit settings. Sep 12 06:00:44.149957 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 12 06:00:44.149976 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 06:00:44.149991 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 12 06:00:44.150006 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 06:00:44.150022 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 06:00:44.150037 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 06:00:44.150063 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 06:00:44.150079 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 12 06:00:44.150094 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 12 06:00:44.150117 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 06:00:44.150135 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 12 06:00:44.150150 systemd[1]: Created slice user.slice - User and Session Slice. Sep 12 06:00:44.150165 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
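The SELinux policy load, the systemd 256.8 feature string and the first-boot machine-ID initialization above can all be checked later from a shell; a small sketch:

    # Compile-time feature flags, matching the long +PAM +AUDIT ... string.
    systemctl --version
    # The machine ID that was initialized from the VM UUID on first boot.
    cat /etc/machine-id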
Sep 12 06:00:44.150181 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 06:00:44.150197 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 12 06:00:44.150212 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 12 06:00:44.150227 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 12 06:00:44.150258 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 06:00:44.150274 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 12 06:00:44.150289 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 06:00:44.150305 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 06:00:44.150320 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 12 06:00:44.150335 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 12 06:00:44.150351 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 12 06:00:44.150367 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 12 06:00:44.150382 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 06:00:44.150398 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 06:00:44.150416 systemd[1]: Reached target slices.target - Slice Units. Sep 12 06:00:44.150466 systemd[1]: Reached target swap.target - Swaps. Sep 12 06:00:44.150482 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 12 06:00:44.150497 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 12 06:00:44.150513 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 12 06:00:44.150528 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 06:00:44.150544 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 06:00:44.150559 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 06:00:44.150575 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 12 06:00:44.150595 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 12 06:00:44.150611 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 12 06:00:44.150627 systemd[1]: Mounting media.mount - External Media Directory... Sep 12 06:00:44.150643 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 06:00:44.150658 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 12 06:00:44.150674 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 12 06:00:44.150689 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 12 06:00:44.150705 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 12 06:00:44.150724 systemd[1]: Reached target machines.target - Containers. Sep 12 06:00:44.150740 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
Sep 12 06:00:44.150755 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 06:00:44.150770 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 06:00:44.150785 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 12 06:00:44.150800 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 06:00:44.150818 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 06:00:44.150833 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 06:00:44.150849 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 12 06:00:44.150869 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 06:00:44.150884 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 12 06:00:44.150900 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 12 06:00:44.150916 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 12 06:00:44.150931 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 12 06:00:44.150947 systemd[1]: Stopped systemd-fsck-usr.service. Sep 12 06:00:44.150963 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 06:00:44.150980 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 06:00:44.150999 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 06:00:44.151014 kernel: loop: module loaded Sep 12 06:00:44.151028 kernel: fuse: init (API version 7.41) Sep 12 06:00:44.151060 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 06:00:44.151077 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 12 06:00:44.151093 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 12 06:00:44.151109 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 06:00:44.151128 systemd[1]: verity-setup.service: Deactivated successfully. Sep 12 06:00:44.151144 systemd[1]: Stopped verity-setup.service. Sep 12 06:00:44.151163 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 06:00:44.151180 kernel: ACPI: bus type drm_connector registered Sep 12 06:00:44.151198 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 12 06:00:44.151214 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 12 06:00:44.151265 systemd-journald[1212]: Collecting audit messages is disabled. Sep 12 06:00:44.151297 systemd[1]: Mounted media.mount - External Media Directory. Sep 12 06:00:44.151314 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 12 06:00:44.151331 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. 
Sep 12 06:00:44.151348 systemd-journald[1212]: Journal started Sep 12 06:00:44.151382 systemd-journald[1212]: Runtime Journal (/run/log/journal/c7c84de63c7e4683ab72be894039b1eb) is 6M, max 48.4M, 42.4M free. Sep 12 06:00:43.862192 systemd[1]: Queued start job for default target multi-user.target. Sep 12 06:00:43.880568 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 12 06:00:43.881026 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 12 06:00:44.154872 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 06:00:44.155645 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 12 06:00:44.156908 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 06:00:44.158361 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 06:00:44.159940 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 12 06:00:44.160169 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 12 06:00:44.161723 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 06:00:44.161933 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 06:00:44.163418 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 06:00:44.163643 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 06:00:44.164957 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 06:00:44.165178 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 06:00:44.166661 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 12 06:00:44.166866 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 12 06:00:44.168309 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 06:00:44.168538 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 06:00:44.169917 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 06:00:44.171351 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 06:00:44.172887 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 12 06:00:44.174587 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 12 06:00:44.190383 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 06:00:44.193134 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 12 06:00:44.196056 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 12 06:00:44.197273 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 12 06:00:44.197366 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 06:00:44.199752 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 12 06:00:44.204390 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 12 06:00:44.206737 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 06:00:44.210095 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 12 06:00:44.213497 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... 
Sep 12 06:00:44.215440 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 06:00:44.218525 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 12 06:00:44.219699 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 06:00:44.223620 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 06:00:44.226630 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 12 06:00:44.231259 systemd-journald[1212]: Time spent on flushing to /var/log/journal/c7c84de63c7e4683ab72be894039b1eb is 30.680ms for 1070 entries. Sep 12 06:00:44.231259 systemd-journald[1212]: System Journal (/var/log/journal/c7c84de63c7e4683ab72be894039b1eb) is 8M, max 195.6M, 187.6M free. Sep 12 06:00:44.282493 systemd-journald[1212]: Received client request to flush runtime journal. Sep 12 06:00:44.282555 kernel: loop0: detected capacity change from 0 to 128016 Sep 12 06:00:44.230631 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 06:00:44.235146 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 12 06:00:44.236584 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 12 06:00:44.246121 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 12 06:00:44.247690 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 12 06:00:44.254216 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 12 06:00:44.258160 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 06:00:44.286077 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 12 06:00:44.294542 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 06:00:44.300700 systemd-tmpfiles[1257]: ACLs are not supported, ignoring. Sep 12 06:00:44.300743 systemd-tmpfiles[1257]: ACLs are not supported, ignoring. Sep 12 06:00:44.308450 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 12 06:00:44.308956 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 06:00:44.313556 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 12 06:00:44.334450 kernel: loop1: detected capacity change from 0 to 110984 Sep 12 06:00:44.341187 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 12 06:00:44.353787 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 12 06:00:44.357500 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 06:00:44.361520 kernel: loop2: detected capacity change from 0 to 224512 Sep 12 06:00:44.385034 systemd-tmpfiles[1278]: ACLs are not supported, ignoring. Sep 12 06:00:44.385501 systemd-tmpfiles[1278]: ACLs are not supported, ignoring. Sep 12 06:00:44.390554 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
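systemd-journal-flush.service above moves the runtime journal from /run/log/journal to persistent storage; the same can be observed and triggered by hand:

    # Current journal disk usage, then an explicit flush to /var/log/journal.
    journalctl --disk-usage
    journalctl --flush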
Sep 12 06:00:44.401448 kernel: loop3: detected capacity change from 0 to 128016 Sep 12 06:00:44.410480 kernel: loop4: detected capacity change from 0 to 110984 Sep 12 06:00:44.421458 kernel: loop5: detected capacity change from 0 to 224512 Sep 12 06:00:44.427295 (sd-merge)[1282]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Sep 12 06:00:44.428155 (sd-merge)[1282]: Merged extensions into '/usr'. Sep 12 06:00:44.462614 systemd[1]: Reload requested from client PID 1256 ('systemd-sysext') (unit systemd-sysext.service)... Sep 12 06:00:44.462629 systemd[1]: Reloading... Sep 12 06:00:44.539501 zram_generator::config[1309]: No configuration found. Sep 12 06:00:44.732442 ldconfig[1251]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 12 06:00:44.756691 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 12 06:00:44.757016 systemd[1]: Reloading finished in 293 ms. Sep 12 06:00:44.785573 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 12 06:00:44.787203 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 12 06:00:44.805757 systemd[1]: Starting ensure-sysext.service... Sep 12 06:00:44.807610 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 06:00:44.818189 systemd[1]: Reload requested from client PID 1345 ('systemctl') (unit ensure-sysext.service)... Sep 12 06:00:44.818207 systemd[1]: Reloading... Sep 12 06:00:44.829208 systemd-tmpfiles[1346]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 12 06:00:44.829244 systemd-tmpfiles[1346]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 12 06:00:44.829553 systemd-tmpfiles[1346]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 12 06:00:44.829816 systemd-tmpfiles[1346]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 12 06:00:44.830716 systemd-tmpfiles[1346]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 12 06:00:44.830977 systemd-tmpfiles[1346]: ACLs are not supported, ignoring. Sep 12 06:00:44.831123 systemd-tmpfiles[1346]: ACLs are not supported, ignoring. Sep 12 06:00:44.835272 systemd-tmpfiles[1346]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 06:00:44.835282 systemd-tmpfiles[1346]: Skipping /boot Sep 12 06:00:44.846141 systemd-tmpfiles[1346]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 06:00:44.846153 systemd-tmpfiles[1346]: Skipping /boot Sep 12 06:00:44.926467 zram_generator::config[1376]: No configuration found. Sep 12 06:00:45.087239 systemd[1]: Reloading finished in 268 ms. Sep 12 06:00:45.109150 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 12 06:00:45.134994 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 06:00:45.144164 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 06:00:45.146479 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 12 06:00:45.148988 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 12 06:00:45.168855 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
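(sd-merge) above overlays the containerd-flatcar, docker-flatcar and kubernetes system extensions onto /usr before the reload. The merge state can be listed and re-evaluated with systemd-sysext; a quick sketch:

    # List discovered extension images and show what is currently merged.
    systemd-sysext list
    systemd-sysext status
    # Re-evaluate the overlay after adding or removing an extension image.
    systemd-sysext refresh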
Sep 12 06:00:45.171841 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 06:00:45.175356 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 12 06:00:45.182666 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 06:00:45.182858 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 06:00:45.188838 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 06:00:45.192843 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 06:00:45.195768 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 06:00:45.196966 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 06:00:45.197124 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 06:00:45.197396 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 06:00:45.200000 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 06:00:45.200277 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 06:00:45.209688 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 12 06:00:45.211723 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 06:00:45.211942 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 06:00:45.217197 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 12 06:00:45.219250 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 06:00:45.219544 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 06:00:45.227394 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 06:00:45.227864 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 06:00:45.229614 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 06:00:45.233744 augenrules[1446]: No rules Sep 12 06:00:45.233013 systemd-udevd[1416]: Using default interface naming scheme 'v255'. Sep 12 06:00:45.233342 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 06:00:45.235728 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 06:00:45.240610 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 06:00:45.241766 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 06:00:45.241885 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 06:00:45.252008 systemd[1]: Starting systemd-update-done.service - Update is Completed... 
Sep 12 06:00:45.255368 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 12 06:00:45.256577 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 06:00:45.258383 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 06:00:45.258681 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 06:00:45.260879 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 12 06:00:45.263311 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 06:00:45.263547 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 06:00:45.265163 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 06:00:45.265366 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 06:00:45.266816 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 06:00:45.267034 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 06:00:45.268822 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 06:00:45.269048 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 06:00:45.270847 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 06:00:45.273966 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 06:00:45.292508 systemd[1]: Finished ensure-sysext.service. Sep 12 06:00:45.304246 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 06:00:45.305351 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 06:00:45.305440 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 06:00:45.308562 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 12 06:00:45.309832 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 06:00:45.324531 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 12 06:00:45.347895 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 12 06:00:45.393683 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 12 06:00:45.396285 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 06:00:45.405454 kernel: mousedev: PS/2 mouse device common for all mice Sep 12 06:00:45.414453 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 12 06:00:45.418474 kernel: ACPI: button: Power Button [PWRF] Sep 12 06:00:45.423718 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 06:00:45.440458 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Sep 12 06:00:45.440755 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 12 06:00:45.440911 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 12 06:00:45.507345 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Sep 12 06:00:45.520994 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 06:00:45.521273 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 06:00:45.531339 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 06:00:45.810056 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 12 06:00:45.810434 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 06:00:45.835314 systemd-networkd[1491]: lo: Link UP Sep 12 06:00:45.835328 systemd-networkd[1491]: lo: Gained carrier Sep 12 06:00:45.837501 kernel: kvm_amd: TSC scaling supported Sep 12 06:00:45.837548 kernel: kvm_amd: Nested Virtualization enabled Sep 12 06:00:45.837572 kernel: kvm_amd: Nested Paging enabled Sep 12 06:00:45.837591 kernel: kvm_amd: LBR virtualization supported Sep 12 06:00:45.837342 systemd-networkd[1491]: Enumeration completed Sep 12 06:00:45.837415 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 06:00:45.839536 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported Sep 12 06:00:45.839578 kernel: kvm_amd: Virtual GIF supported Sep 12 06:00:45.837735 systemd-networkd[1491]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 06:00:45.837746 systemd-networkd[1491]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 06:00:45.840798 systemd-networkd[1491]: eth0: Link UP Sep 12 06:00:45.841014 systemd-networkd[1491]: eth0: Gained carrier Sep 12 06:00:45.841040 systemd-networkd[1491]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 06:00:45.842831 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 12 06:00:45.844482 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 06:00:45.853932 systemd-networkd[1491]: eth0: DHCPv4 address 10.0.0.132/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 12 06:00:45.855677 systemd-timesyncd[1492]: Network configuration changed, trying to establish connection. Sep 12 06:00:46.909735 systemd-timesyncd[1492]: Contacted time server 10.0.0.1:123 (10.0.0.1). Sep 12 06:00:46.909852 systemd-timesyncd[1492]: Initial clock synchronization to Fri 2025-09-12 06:00:46.909479 UTC. Sep 12 06:00:46.910733 systemd-resolved[1415]: Positive Trust Anchors: Sep 12 06:00:46.910772 systemd-resolved[1415]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 06:00:46.910829 systemd-resolved[1415]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 06:00:46.917702 systemd-resolved[1415]: Defaulting to hostname 'linux'. Sep 12 06:00:46.922907 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 06:00:46.924648 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 12 06:00:46.926233 systemd[1]: Reached target network.target - Network. Sep 12 06:00:46.927227 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 06:00:46.928399 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 06:00:46.929787 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 06:00:46.931077 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 06:00:46.932318 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 12 06:00:46.933595 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 06:00:46.934728 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 06:00:46.935949 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 06:00:46.937247 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 06:00:46.937282 systemd[1]: Reached target paths.target - Path Units. Sep 12 06:00:46.938184 systemd[1]: Reached target timers.target - Timer Units. Sep 12 06:00:46.940215 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 06:00:46.943869 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 12 06:00:46.947837 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 12 06:00:46.949232 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 12 06:00:46.950576 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 12 06:00:46.953205 kernel: EDAC MC: Ver: 3.0.0 Sep 12 06:00:46.959593 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 06:00:46.962137 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 12 06:00:46.964858 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 12 06:00:46.966710 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 06:00:46.975014 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 06:00:46.976167 systemd[1]: Reached target basic.target - Basic System. Sep 12 06:00:46.977322 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 06:00:46.977354 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 06:00:46.978735 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 06:00:46.981708 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 12 06:00:46.984215 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 12 06:00:46.987185 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 06:00:46.994581 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 06:00:46.995824 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 06:00:46.997275 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... 
Sep 12 06:00:46.997449 jq[1548]: false Sep 12 06:00:46.999535 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 06:00:47.004210 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 06:00:47.008963 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 06:00:47.012577 oslogin_cache_refresh[1550]: Refreshing passwd entry cache Sep 12 06:00:47.014524 google_oslogin_nss_cache[1550]: oslogin_cache_refresh[1550]: Refreshing passwd entry cache Sep 12 06:00:47.013320 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 12 06:00:47.016519 extend-filesystems[1549]: Found /dev/vda6 Sep 12 06:00:47.018804 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 06:00:47.021128 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 12 06:00:47.021692 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 12 06:00:47.022891 extend-filesystems[1549]: Found /dev/vda9 Sep 12 06:00:47.022636 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 06:00:47.024727 google_oslogin_nss_cache[1550]: oslogin_cache_refresh[1550]: Failure getting users, quitting Sep 12 06:00:47.024721 oslogin_cache_refresh[1550]: Failure getting users, quitting Sep 12 06:00:47.024935 google_oslogin_nss_cache[1550]: oslogin_cache_refresh[1550]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 06:00:47.024935 google_oslogin_nss_cache[1550]: oslogin_cache_refresh[1550]: Refreshing group entry cache Sep 12 06:00:47.024744 oslogin_cache_refresh[1550]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 06:00:47.024800 oslogin_cache_refresh[1550]: Refreshing group entry cache Sep 12 06:00:47.028409 extend-filesystems[1549]: Checking size of /dev/vda9 Sep 12 06:00:47.029882 oslogin_cache_refresh[1550]: Failure getting groups, quitting Sep 12 06:00:47.030150 google_oslogin_nss_cache[1550]: oslogin_cache_refresh[1550]: Failure getting groups, quitting Sep 12 06:00:47.030150 google_oslogin_nss_cache[1550]: oslogin_cache_refresh[1550]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 06:00:47.029892 oslogin_cache_refresh[1550]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 06:00:47.031706 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 06:00:47.038745 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 12 06:00:47.041589 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 06:00:47.043393 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 06:00:47.043726 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 12 06:00:47.043949 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 12 06:00:47.045779 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 06:00:47.046015 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 06:00:47.049466 jq[1567]: true Sep 12 06:00:47.052555 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. 
Sep 12 06:00:47.052977 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 06:00:47.054159 update_engine[1562]: I20250912 06:00:47.054056 1562 main.cc:92] Flatcar Update Engine starting Sep 12 06:00:47.060714 extend-filesystems[1549]: Resized partition /dev/vda9 Sep 12 06:00:47.071149 jq[1576]: true Sep 12 06:00:47.084567 (ntainerd)[1587]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 06:00:47.182915 extend-filesystems[1607]: resize2fs 1.47.3 (8-Jul-2025) Sep 12 06:00:47.200934 tar[1574]: linux-amd64/LICENSE Sep 12 06:00:47.201232 tar[1574]: linux-amd64/helm Sep 12 06:00:47.203740 systemd-logind[1559]: Watching system buttons on /dev/input/event2 (Power Button) Sep 12 06:00:47.203766 systemd-logind[1559]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 12 06:00:47.204264 systemd-logind[1559]: New seat seat0. Sep 12 06:00:47.210306 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 06:00:47.217765 dbus-daemon[1546]: [system] SELinux support is enabled Sep 12 06:00:47.218174 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 06:00:47.221580 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 06:00:47.221729 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 12 06:00:47.223034 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 06:00:47.223053 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 06:00:47.224492 update_engine[1562]: I20250912 06:00:47.224428 1562 update_check_scheduler.cc:74] Next update check in 7m0s Sep 12 06:00:47.225426 systemd[1]: Started update-engine.service - Update Engine. Sep 12 06:00:47.228051 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 06:00:47.443644 locksmithd[1609]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 06:00:47.454132 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Sep 12 06:00:47.491644 sshd_keygen[1573]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 06:00:47.515232 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 06:00:47.518218 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 06:00:47.542456 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 06:00:47.542980 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 06:00:47.545995 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 06:00:47.561123 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Sep 12 06:00:47.584154 extend-filesystems[1607]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 12 06:00:47.584154 extend-filesystems[1607]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 12 06:00:47.584154 extend-filesystems[1607]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Sep 12 06:00:47.635498 extend-filesystems[1549]: Resized filesystem in /dev/vda9 Sep 12 06:00:47.591571 systemd[1]: extend-filesystems.service: Deactivated successfully. 
Sep 12 06:00:47.592051 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 06:00:47.650025 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 06:00:47.653019 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 06:00:47.655352 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 12 06:00:47.658340 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 06:00:47.679702 bash[1606]: Updated "/home/core/.ssh/authorized_keys" Sep 12 06:00:47.681385 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 06:00:47.683895 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 12 06:00:47.738799 tar[1574]: linux-amd64/README.md Sep 12 06:00:47.775060 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 06:00:47.800929 containerd[1587]: time="2025-09-12T06:00:47Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 12 06:00:47.801881 containerd[1587]: time="2025-09-12T06:00:47.801820854Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 12 06:00:47.814724 containerd[1587]: time="2025-09-12T06:00:47.814649803Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.312µs" Sep 12 06:00:47.814724 containerd[1587]: time="2025-09-12T06:00:47.814704896Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 12 06:00:47.814724 containerd[1587]: time="2025-09-12T06:00:47.814729883Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 12 06:00:47.815000 containerd[1587]: time="2025-09-12T06:00:47.814966797Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 12 06:00:47.815000 containerd[1587]: time="2025-09-12T06:00:47.814988808Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 12 06:00:47.815044 containerd[1587]: time="2025-09-12T06:00:47.815017642Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 06:00:47.815135 containerd[1587]: time="2025-09-12T06:00:47.815090209Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 06:00:47.815135 containerd[1587]: time="2025-09-12T06:00:47.815125825Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 06:00:47.815483 containerd[1587]: time="2025-09-12T06:00:47.815447679Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 06:00:47.815483 containerd[1587]: time="2025-09-12T06:00:47.815466965Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 06:00:47.815483 containerd[1587]: time="2025-09-12T06:00:47.815477475Z" level=info msg="skip loading plugin" error="devmapper not 
configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 06:00:47.815553 containerd[1587]: time="2025-09-12T06:00:47.815485350Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 12 06:00:47.815614 containerd[1587]: time="2025-09-12T06:00:47.815589735Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 12 06:00:47.815876 containerd[1587]: time="2025-09-12T06:00:47.815841498Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 06:00:47.815900 containerd[1587]: time="2025-09-12T06:00:47.815879970Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 06:00:47.815900 containerd[1587]: time="2025-09-12T06:00:47.815891451Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 12 06:00:47.815948 containerd[1587]: time="2025-09-12T06:00:47.815926988Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 12 06:00:47.816186 containerd[1587]: time="2025-09-12T06:00:47.816161688Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 12 06:00:47.816256 containerd[1587]: time="2025-09-12T06:00:47.816235176Z" level=info msg="metadata content store policy set" policy=shared Sep 12 06:00:47.823264 containerd[1587]: time="2025-09-12T06:00:47.823226347Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 12 06:00:47.823308 containerd[1587]: time="2025-09-12T06:00:47.823276270Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 12 06:00:47.823308 containerd[1587]: time="2025-09-12T06:00:47.823291830Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 12 06:00:47.823361 containerd[1587]: time="2025-09-12T06:00:47.823317688Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 12 06:00:47.823361 containerd[1587]: time="2025-09-12T06:00:47.823338888Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 12 06:00:47.823361 containerd[1587]: time="2025-09-12T06:00:47.823350259Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 12 06:00:47.823414 containerd[1587]: time="2025-09-12T06:00:47.823362652Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 12 06:00:47.823414 containerd[1587]: time="2025-09-12T06:00:47.823377170Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 12 06:00:47.823414 containerd[1587]: time="2025-09-12T06:00:47.823388481Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 12 06:00:47.823414 containerd[1587]: time="2025-09-12T06:00:47.823399662Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 12 06:00:47.823414 containerd[1587]: time="2025-09-12T06:00:47.823407707Z" level=info msg="loading 
plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 12 06:00:47.823504 containerd[1587]: time="2025-09-12T06:00:47.823419459Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 12 06:00:47.823575 containerd[1587]: time="2025-09-12T06:00:47.823542389Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 12 06:00:47.823575 containerd[1587]: time="2025-09-12T06:00:47.823567607Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 12 06:00:47.823615 containerd[1587]: time="2025-09-12T06:00:47.823582865Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 12 06:00:47.823615 containerd[1587]: time="2025-09-12T06:00:47.823597703Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 12 06:00:47.823615 containerd[1587]: time="2025-09-12T06:00:47.823607441Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 12 06:00:47.823696 containerd[1587]: time="2025-09-12T06:00:47.823618522Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 12 06:00:47.823696 containerd[1587]: time="2025-09-12T06:00:47.823629493Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 12 06:00:47.823696 containerd[1587]: time="2025-09-12T06:00:47.823639742Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 12 06:00:47.823696 containerd[1587]: time="2025-09-12T06:00:47.823650452Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 12 06:00:47.823696 containerd[1587]: time="2025-09-12T06:00:47.823682111Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 12 06:00:47.823696 containerd[1587]: time="2025-09-12T06:00:47.823692781Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 12 06:00:47.823816 containerd[1587]: time="2025-09-12T06:00:47.823763234Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 12 06:00:47.823816 containerd[1587]: time="2025-09-12T06:00:47.823778142Z" level=info msg="Start snapshots syncer" Sep 12 06:00:47.823816 containerd[1587]: time="2025-09-12T06:00:47.823804942Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 12 06:00:47.824161 containerd[1587]: time="2025-09-12T06:00:47.824073455Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 12 06:00:47.824317 containerd[1587]: time="2025-09-12T06:00:47.824171860Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 12 06:00:47.826129 containerd[1587]: time="2025-09-12T06:00:47.826085188Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 12 06:00:47.826241 containerd[1587]: time="2025-09-12T06:00:47.826207978Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 06:00:47.826241 containerd[1587]: time="2025-09-12T06:00:47.826231953Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 06:00:47.826295 containerd[1587]: time="2025-09-12T06:00:47.826248044Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 06:00:47.826295 containerd[1587]: time="2025-09-12T06:00:47.826258082Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 06:00:47.826295 containerd[1587]: time="2025-09-12T06:00:47.826268762Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 06:00:47.826295 containerd[1587]: time="2025-09-12T06:00:47.826278360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 06:00:47.826366 containerd[1587]: time="2025-09-12T06:00:47.826310461Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 06:00:47.826366 containerd[1587]: time="2025-09-12T06:00:47.826334045Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 06:00:47.826366 containerd[1587]: 
time="2025-09-12T06:00:47.826343753Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 06:00:47.826366 containerd[1587]: time="2025-09-12T06:00:47.826352920Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 06:00:47.826444 containerd[1587]: time="2025-09-12T06:00:47.826382536Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 06:00:47.826444 containerd[1587]: time="2025-09-12T06:00:47.826396943Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 06:00:47.826444 containerd[1587]: time="2025-09-12T06:00:47.826405038Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 06:00:47.826444 containerd[1587]: time="2025-09-12T06:00:47.826413724Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 06:00:47.826444 containerd[1587]: time="2025-09-12T06:00:47.826421439Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 06:00:47.826444 containerd[1587]: time="2025-09-12T06:00:47.826442789Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 06:00:47.826562 containerd[1587]: time="2025-09-12T06:00:47.826457286Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 06:00:47.826562 containerd[1587]: time="2025-09-12T06:00:47.826479468Z" level=info msg="runtime interface created" Sep 12 06:00:47.826562 containerd[1587]: time="2025-09-12T06:00:47.826485038Z" level=info msg="created NRI interface" Sep 12 06:00:47.826562 containerd[1587]: time="2025-09-12T06:00:47.826492412Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 12 06:00:47.826562 containerd[1587]: time="2025-09-12T06:00:47.826502901Z" level=info msg="Connect containerd service" Sep 12 06:00:47.826562 containerd[1587]: time="2025-09-12T06:00:47.826529481Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 06:00:47.827390 containerd[1587]: time="2025-09-12T06:00:47.827351152Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 06:00:48.059992 containerd[1587]: time="2025-09-12T06:00:48.059829302Z" level=info msg="Start subscribing containerd event" Sep 12 06:00:48.060116 containerd[1587]: time="2025-09-12T06:00:48.059973983Z" level=info msg="Start recovering state" Sep 12 06:00:48.060138 containerd[1587]: time="2025-09-12T06:00:48.060087456Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 06:00:48.060208 containerd[1587]: time="2025-09-12T06:00:48.060184588Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Sep 12 06:00:48.060269 containerd[1587]: time="2025-09-12T06:00:48.060186752Z" level=info msg="Start event monitor" Sep 12 06:00:48.060293 containerd[1587]: time="2025-09-12T06:00:48.060286319Z" level=info msg="Start cni network conf syncer for default" Sep 12 06:00:48.060313 containerd[1587]: time="2025-09-12T06:00:48.060296798Z" level=info msg="Start streaming server" Sep 12 06:00:48.060332 containerd[1587]: time="2025-09-12T06:00:48.060312528Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 06:00:48.060332 containerd[1587]: time="2025-09-12T06:00:48.060319491Z" level=info msg="runtime interface starting up..." Sep 12 06:00:48.060332 containerd[1587]: time="2025-09-12T06:00:48.060324571Z" level=info msg="starting plugins..." Sep 12 06:00:48.060456 containerd[1587]: time="2025-09-12T06:00:48.060339208Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 06:00:48.060545 containerd[1587]: time="2025-09-12T06:00:48.060528513Z" level=info msg="containerd successfully booted in 0.260201s" Sep 12 06:00:48.060687 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 06:00:48.504441 systemd-networkd[1491]: eth0: Gained IPv6LL Sep 12 06:00:48.601981 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 06:00:48.603861 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 06:00:48.606733 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Sep 12 06:00:48.609321 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 06:00:48.611753 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 06:00:48.639956 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 06:00:48.651574 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 12 06:00:48.651894 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Sep 12 06:00:48.653566 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 06:00:49.573056 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 06:00:49.575028 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 06:00:49.576964 systemd[1]: Startup finished in 2.845s (kernel) + 7.573s (initrd) + 5.371s (userspace) = 15.789s. Sep 12 06:00:49.594687 (kubelet)[1680]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 06:00:50.021409 kubelet[1680]: E0912 06:00:50.021280 1680 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 06:00:50.025357 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 06:00:50.025545 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 06:00:50.025898 systemd[1]: kubelet.service: Consumed 1.215s CPU time, 265.4M memory peak. Sep 12 06:00:50.261755 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 06:00:50.263051 systemd[1]: Started sshd@0-10.0.0.132:22-10.0.0.1:54212.service - OpenSSH per-connection server daemon (10.0.0.1:54212). 
Sep 12 06:00:50.363851 sshd[1693]: Accepted publickey for core from 10.0.0.1 port 54212 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:00:50.366230 sshd-session[1693]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:00:50.418900 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 06:00:50.420085 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 06:00:50.427823 systemd-logind[1559]: New session 1 of user core. Sep 12 06:00:50.454602 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 06:00:50.458248 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 12 06:00:50.486457 (systemd)[1698]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 06:00:50.490555 systemd-logind[1559]: New session c1 of user core. Sep 12 06:00:50.677388 systemd[1698]: Queued start job for default target default.target. Sep 12 06:00:50.696658 systemd[1698]: Created slice app.slice - User Application Slice. Sep 12 06:00:50.696694 systemd[1698]: Reached target paths.target - Paths. Sep 12 06:00:50.696738 systemd[1698]: Reached target timers.target - Timers. Sep 12 06:00:50.701385 systemd[1698]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 06:00:50.715178 systemd[1698]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 06:00:50.715364 systemd[1698]: Reached target sockets.target - Sockets. Sep 12 06:00:50.715423 systemd[1698]: Reached target basic.target - Basic System. Sep 12 06:00:50.715483 systemd[1698]: Reached target default.target - Main User Target. Sep 12 06:00:50.715536 systemd[1698]: Startup finished in 215ms. Sep 12 06:00:50.715934 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 06:00:50.717565 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 06:00:50.786487 systemd[1]: Started sshd@1-10.0.0.132:22-10.0.0.1:54222.service - OpenSSH per-connection server daemon (10.0.0.1:54222). Sep 12 06:00:50.861630 sshd[1709]: Accepted publickey for core from 10.0.0.1 port 54222 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:00:50.863641 sshd-session[1709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:00:50.868892 systemd-logind[1559]: New session 2 of user core. Sep 12 06:00:50.882298 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 12 06:00:50.937400 sshd[1712]: Connection closed by 10.0.0.1 port 54222 Sep 12 06:00:50.937794 sshd-session[1709]: pam_unix(sshd:session): session closed for user core Sep 12 06:00:50.954854 systemd[1]: sshd@1-10.0.0.132:22-10.0.0.1:54222.service: Deactivated successfully. Sep 12 06:00:50.957048 systemd[1]: session-2.scope: Deactivated successfully. Sep 12 06:00:50.957861 systemd-logind[1559]: Session 2 logged out. Waiting for processes to exit. Sep 12 06:00:50.961362 systemd[1]: Started sshd@2-10.0.0.132:22-10.0.0.1:54224.service - OpenSSH per-connection server daemon (10.0.0.1:54224). Sep 12 06:00:50.962092 systemd-logind[1559]: Removed session 2. Sep 12 06:00:51.021908 sshd[1718]: Accepted publickey for core from 10.0.0.1 port 54224 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:00:51.023692 sshd-session[1718]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:00:51.028876 systemd-logind[1559]: New session 3 of user core. 
Sep 12 06:00:51.043304 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 06:00:51.093410 sshd[1721]: Connection closed by 10.0.0.1 port 54224 Sep 12 06:00:51.093950 sshd-session[1718]: pam_unix(sshd:session): session closed for user core Sep 12 06:00:51.108914 systemd[1]: sshd@2-10.0.0.132:22-10.0.0.1:54224.service: Deactivated successfully. Sep 12 06:00:51.111259 systemd[1]: session-3.scope: Deactivated successfully. Sep 12 06:00:51.112209 systemd-logind[1559]: Session 3 logged out. Waiting for processes to exit. Sep 12 06:00:51.115540 systemd[1]: Started sshd@3-10.0.0.132:22-10.0.0.1:54230.service - OpenSSH per-connection server daemon (10.0.0.1:54230). Sep 12 06:00:51.116413 systemd-logind[1559]: Removed session 3. Sep 12 06:00:51.169776 sshd[1727]: Accepted publickey for core from 10.0.0.1 port 54230 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:00:51.171690 sshd-session[1727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:00:51.178149 systemd-logind[1559]: New session 4 of user core. Sep 12 06:00:51.188261 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 06:00:51.242872 sshd[1731]: Connection closed by 10.0.0.1 port 54230 Sep 12 06:00:51.242902 sshd-session[1727]: pam_unix(sshd:session): session closed for user core Sep 12 06:00:51.259998 systemd[1]: sshd@3-10.0.0.132:22-10.0.0.1:54230.service: Deactivated successfully. Sep 12 06:00:51.261826 systemd[1]: session-4.scope: Deactivated successfully. Sep 12 06:00:51.262824 systemd-logind[1559]: Session 4 logged out. Waiting for processes to exit. Sep 12 06:00:51.265353 systemd[1]: Started sshd@4-10.0.0.132:22-10.0.0.1:54244.service - OpenSSH per-connection server daemon (10.0.0.1:54244). Sep 12 06:00:51.266192 systemd-logind[1559]: Removed session 4. Sep 12 06:00:51.324964 sshd[1737]: Accepted publickey for core from 10.0.0.1 port 54244 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:00:51.326741 sshd-session[1737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:00:51.331407 systemd-logind[1559]: New session 5 of user core. Sep 12 06:00:51.345256 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 12 06:00:51.405855 sudo[1741]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 06:00:51.406227 sudo[1741]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 06:00:51.430611 sudo[1741]: pam_unix(sudo:session): session closed for user root Sep 12 06:00:51.432471 sshd[1740]: Connection closed by 10.0.0.1 port 54244 Sep 12 06:00:51.432845 sshd-session[1737]: pam_unix(sshd:session): session closed for user core Sep 12 06:00:51.443773 systemd[1]: sshd@4-10.0.0.132:22-10.0.0.1:54244.service: Deactivated successfully. Sep 12 06:00:51.445661 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 06:00:51.446458 systemd-logind[1559]: Session 5 logged out. Waiting for processes to exit. Sep 12 06:00:51.449447 systemd[1]: Started sshd@5-10.0.0.132:22-10.0.0.1:54258.service - OpenSSH per-connection server daemon (10.0.0.1:54258). Sep 12 06:00:51.450308 systemd-logind[1559]: Removed session 5. 
Sep 12 06:00:51.513290 sshd[1747]: Accepted publickey for core from 10.0.0.1 port 54258 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:00:51.514521 sshd-session[1747]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:00:51.520054 systemd-logind[1559]: New session 6 of user core. Sep 12 06:00:51.530283 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 12 06:00:51.584872 sudo[1753]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 12 06:00:51.585215 sudo[1753]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 06:00:51.592133 sudo[1753]: pam_unix(sudo:session): session closed for user root Sep 12 06:00:51.599250 sudo[1752]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 12 06:00:51.599570 sudo[1752]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 06:00:51.610719 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 06:00:51.671018 augenrules[1775]: No rules Sep 12 06:00:51.672783 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 06:00:51.673047 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 06:00:51.674448 sudo[1752]: pam_unix(sudo:session): session closed for user root Sep 12 06:00:51.676264 sshd[1751]: Connection closed by 10.0.0.1 port 54258 Sep 12 06:00:51.676662 sshd-session[1747]: pam_unix(sshd:session): session closed for user core Sep 12 06:00:51.685436 systemd[1]: sshd@5-10.0.0.132:22-10.0.0.1:54258.service: Deactivated successfully. Sep 12 06:00:51.687305 systemd[1]: session-6.scope: Deactivated successfully. Sep 12 06:00:51.688160 systemd-logind[1559]: Session 6 logged out. Waiting for processes to exit. Sep 12 06:00:51.690785 systemd[1]: Started sshd@6-10.0.0.132:22-10.0.0.1:54260.service - OpenSSH per-connection server daemon (10.0.0.1:54260). Sep 12 06:00:51.691605 systemd-logind[1559]: Removed session 6. Sep 12 06:00:51.760149 sshd[1784]: Accepted publickey for core from 10.0.0.1 port 54260 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:00:51.761783 sshd-session[1784]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:00:51.766398 systemd-logind[1559]: New session 7 of user core. Sep 12 06:00:51.776268 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 12 06:00:51.828205 sudo[1788]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 06:00:51.828492 sudo[1788]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 06:00:52.630003 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Sep 12 06:00:52.650479 (dockerd)[1808]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 06:00:53.254308 dockerd[1808]: time="2025-09-12T06:00:53.254226040Z" level=info msg="Starting up" Sep 12 06:00:53.255254 dockerd[1808]: time="2025-09-12T06:00:53.255230423Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 12 06:00:53.278813 dockerd[1808]: time="2025-09-12T06:00:53.278744972Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 12 06:00:53.939950 dockerd[1808]: time="2025-09-12T06:00:53.939888802Z" level=info msg="Loading containers: start." Sep 12 06:00:53.952125 kernel: Initializing XFRM netlink socket Sep 12 06:00:54.218054 systemd-networkd[1491]: docker0: Link UP Sep 12 06:00:54.222859 dockerd[1808]: time="2025-09-12T06:00:54.222817875Z" level=info msg="Loading containers: done." Sep 12 06:00:54.314906 dockerd[1808]: time="2025-09-12T06:00:54.314830442Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 06:00:54.315311 dockerd[1808]: time="2025-09-12T06:00:54.314969102Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 12 06:00:54.315311 dockerd[1808]: time="2025-09-12T06:00:54.315077204Z" level=info msg="Initializing buildkit" Sep 12 06:00:54.346420 dockerd[1808]: time="2025-09-12T06:00:54.346356685Z" level=info msg="Completed buildkit initialization" Sep 12 06:00:54.352791 dockerd[1808]: time="2025-09-12T06:00:54.352759492Z" level=info msg="Daemon has completed initialization" Sep 12 06:00:54.352906 dockerd[1808]: time="2025-09-12T06:00:54.352844812Z" level=info msg="API listen on /run/docker.sock" Sep 12 06:00:54.353003 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 12 06:00:55.286745 containerd[1587]: time="2025-09-12T06:00:55.286677872Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\"" Sep 12 06:00:56.087412 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2422848487.mount: Deactivated successfully. 
Sep 12 06:00:57.432283 containerd[1587]: time="2025-09-12T06:00:57.432194933Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:00:57.432985 containerd[1587]: time="2025-09-12T06:00:57.432949428Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.9: active requests=0, bytes read=28837916" Sep 12 06:00:57.434466 containerd[1587]: time="2025-09-12T06:00:57.434396852Z" level=info msg="ImageCreate event name:\"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:00:57.437664 containerd[1587]: time="2025-09-12T06:00:57.437613354Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:00:57.439253 containerd[1587]: time="2025-09-12T06:00:57.439212784Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.9\" with image id \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\", size \"28834515\" in 2.15246517s" Sep 12 06:00:57.439307 containerd[1587]: time="2025-09-12T06:00:57.439262828Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\" returns image reference \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\"" Sep 12 06:00:57.440582 containerd[1587]: time="2025-09-12T06:00:57.440313398Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\"" Sep 12 06:00:59.253882 containerd[1587]: time="2025-09-12T06:00:59.253808158Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:00:59.254638 containerd[1587]: time="2025-09-12T06:00:59.254563715Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.9: active requests=0, bytes read=24787027" Sep 12 06:00:59.258525 containerd[1587]: time="2025-09-12T06:00:59.258491781Z" level=info msg="ImageCreate event name:\"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:00:59.260834 containerd[1587]: time="2025-09-12T06:00:59.260810680Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:00:59.262097 containerd[1587]: time="2025-09-12T06:00:59.262058660Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.9\" with image id \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\", size \"26421706\" in 1.821712452s" Sep 12 06:00:59.262182 containerd[1587]: time="2025-09-12T06:00:59.262121218Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\" returns image reference \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\"" Sep 12 06:00:59.262663 
containerd[1587]: time="2025-09-12T06:00:59.262632677Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\"" Sep 12 06:01:00.275956 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 06:01:00.277527 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 06:01:00.524581 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 06:01:00.529472 (kubelet)[2098]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 06:01:00.664988 containerd[1587]: time="2025-09-12T06:01:00.664928322Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:00.666054 containerd[1587]: time="2025-09-12T06:01:00.665698296Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.9: active requests=0, bytes read=19176289" Sep 12 06:01:00.666811 containerd[1587]: time="2025-09-12T06:01:00.666783661Z" level=info msg="ImageCreate event name:\"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:00.669318 containerd[1587]: time="2025-09-12T06:01:00.669276306Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:00.670311 containerd[1587]: time="2025-09-12T06:01:00.670249641Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.9\" with image id \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\", size \"20810986\" in 1.407589533s" Sep 12 06:01:00.670311 containerd[1587]: time="2025-09-12T06:01:00.670283004Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\" returns image reference \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\"" Sep 12 06:01:00.670800 containerd[1587]: time="2025-09-12T06:01:00.670777351Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\"" Sep 12 06:01:00.686958 kubelet[2098]: E0912 06:01:00.686907 2098 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 06:01:00.693360 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 06:01:00.693560 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 06:01:00.693946 systemd[1]: kubelet.service: Consumed 236ms CPU time, 111.9M memory peak. Sep 12 06:01:01.885612 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1773786106.mount: Deactivated successfully. 
Sep 12 06:01:03.018626 containerd[1587]: time="2025-09-12T06:01:03.018549181Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:03.020115 containerd[1587]: time="2025-09-12T06:01:03.020048212Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.9: active requests=0, bytes read=30924206" Sep 12 06:01:03.021760 containerd[1587]: time="2025-09-12T06:01:03.021714738Z" level=info msg="ImageCreate event name:\"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:03.024208 containerd[1587]: time="2025-09-12T06:01:03.024169171Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:03.024877 containerd[1587]: time="2025-09-12T06:01:03.024826914Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.9\" with image id \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\", repo tag \"registry.k8s.io/kube-proxy:v1.32.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\", size \"30923225\" in 2.3540162s" Sep 12 06:01:03.024877 containerd[1587]: time="2025-09-12T06:01:03.024863713Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\" returns image reference \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\"" Sep 12 06:01:03.025492 containerd[1587]: time="2025-09-12T06:01:03.025412512Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 12 06:01:03.504162 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2327533924.mount: Deactivated successfully. 
Sep 12 06:01:04.845350 containerd[1587]: time="2025-09-12T06:01:04.845281316Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:04.846526 containerd[1587]: time="2025-09-12T06:01:04.846480946Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Sep 12 06:01:04.847975 containerd[1587]: time="2025-09-12T06:01:04.847922038Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:04.851871 containerd[1587]: time="2025-09-12T06:01:04.851819397Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:04.852688 containerd[1587]: time="2025-09-12T06:01:04.852644304Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.827207857s" Sep 12 06:01:04.852688 containerd[1587]: time="2025-09-12T06:01:04.852675903Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 12 06:01:04.853280 containerd[1587]: time="2025-09-12T06:01:04.853239911Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 12 06:01:05.362963 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount950793554.mount: Deactivated successfully. 
Sep 12 06:01:05.973247 containerd[1587]: time="2025-09-12T06:01:05.973168126Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 06:01:06.036059 containerd[1587]: time="2025-09-12T06:01:06.035982045Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 12 06:01:06.128270 containerd[1587]: time="2025-09-12T06:01:06.128180580Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 06:01:06.149249 containerd[1587]: time="2025-09-12T06:01:06.149173209Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 06:01:06.150028 containerd[1587]: time="2025-09-12T06:01:06.149981064Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.296691279s" Sep 12 06:01:06.150077 containerd[1587]: time="2025-09-12T06:01:06.150030607Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 12 06:01:06.150637 containerd[1587]: time="2025-09-12T06:01:06.150608761Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Sep 12 06:01:09.215719 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2784952499.mount: Deactivated successfully. Sep 12 06:01:10.832069 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 12 06:01:10.833913 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 06:01:11.023442 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 06:01:11.028050 (kubelet)[2231]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 06:01:11.081834 kubelet[2231]: E0912 06:01:11.081774 2231 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 06:01:11.085934 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 06:01:11.086174 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 06:01:11.086541 systemd[1]: kubelet.service: Consumed 225ms CPU time, 108.8M memory peak. 
Sep 12 06:01:12.687127 containerd[1587]: time="2025-09-12T06:01:12.687058768Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:12.687853 containerd[1587]: time="2025-09-12T06:01:12.687794718Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682056" Sep 12 06:01:12.689034 containerd[1587]: time="2025-09-12T06:01:12.688998926Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:12.691968 containerd[1587]: time="2025-09-12T06:01:12.691933520Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:12.693095 containerd[1587]: time="2025-09-12T06:01:12.693042970Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 6.542407078s" Sep 12 06:01:12.693095 containerd[1587]: time="2025-09-12T06:01:12.693074580Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Sep 12 06:01:15.430431 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 06:01:15.430654 systemd[1]: kubelet.service: Consumed 225ms CPU time, 108.8M memory peak. Sep 12 06:01:15.433376 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 06:01:15.460565 systemd[1]: Reload requested from client PID 2271 ('systemctl') (unit session-7.scope)... Sep 12 06:01:15.460579 systemd[1]: Reloading... Sep 12 06:01:15.632383 zram_generator::config[2313]: No configuration found. Sep 12 06:01:16.022345 systemd[1]: Reloading finished in 561 ms. Sep 12 06:01:16.096807 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 12 06:01:16.096914 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 12 06:01:16.097267 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 06:01:16.097311 systemd[1]: kubelet.service: Consumed 153ms CPU time, 98.3M memory peak. Sep 12 06:01:16.098927 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 06:01:16.289908 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 06:01:16.310656 (kubelet)[2362]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 06:01:16.428436 kubelet[2362]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 06:01:16.428436 kubelet[2362]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 06:01:16.428436 kubelet[2362]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 06:01:16.428949 kubelet[2362]: I0912 06:01:16.428519 2362 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 06:01:16.740323 kubelet[2362]: I0912 06:01:16.740280 2362 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 12 06:01:16.740323 kubelet[2362]: I0912 06:01:16.740309 2362 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 06:01:16.740603 kubelet[2362]: I0912 06:01:16.740584 2362 server.go:954] "Client rotation is on, will bootstrap in background" Sep 12 06:01:16.768500 kubelet[2362]: E0912 06:01:16.768445 2362 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.132:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.132:6443: connect: connection refused" logger="UnhandledError" Sep 12 06:01:16.771266 kubelet[2362]: I0912 06:01:16.771178 2362 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 06:01:16.777655 kubelet[2362]: I0912 06:01:16.777623 2362 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 06:01:16.782678 kubelet[2362]: I0912 06:01:16.782647 2362 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 12 06:01:16.782985 kubelet[2362]: I0912 06:01:16.782949 2362 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 06:01:16.783198 kubelet[2362]: I0912 06:01:16.782975 2362 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 06:01:16.783906 kubelet[2362]: I0912 06:01:16.783881 
2362 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 06:01:16.783906 kubelet[2362]: I0912 06:01:16.783895 2362 container_manager_linux.go:304] "Creating device plugin manager" Sep 12 06:01:16.784070 kubelet[2362]: I0912 06:01:16.784049 2362 state_mem.go:36] "Initialized new in-memory state store" Sep 12 06:01:16.788661 kubelet[2362]: I0912 06:01:16.788636 2362 kubelet.go:446] "Attempting to sync node with API server" Sep 12 06:01:16.788701 kubelet[2362]: I0912 06:01:16.788671 2362 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 06:01:16.788701 kubelet[2362]: I0912 06:01:16.788696 2362 kubelet.go:352] "Adding apiserver pod source" Sep 12 06:01:16.788740 kubelet[2362]: I0912 06:01:16.788709 2362 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 06:01:16.791957 kubelet[2362]: W0912 06:01:16.791761 2362 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.132:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.132:6443: connect: connection refused Sep 12 06:01:16.791957 kubelet[2362]: W0912 06:01:16.791887 2362 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.132:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.132:6443: connect: connection refused Sep 12 06:01:16.792252 kubelet[2362]: E0912 06:01:16.792020 2362 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.132:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.132:6443: connect: connection refused" logger="UnhandledError" Sep 12 06:01:16.792252 kubelet[2362]: E0912 06:01:16.792190 2362 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.132:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.132:6443: connect: connection refused" logger="UnhandledError" Sep 12 06:01:16.793360 kubelet[2362]: I0912 06:01:16.793339 2362 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 06:01:16.793733 kubelet[2362]: I0912 06:01:16.793708 2362 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 06:01:16.793784 kubelet[2362]: W0912 06:01:16.793771 2362 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
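[editor's note] The long nodeConfig blob logged by container_manager_linux.go above is easier to read once the HardEvictionThresholds entries are pulled out on their own. The values below are copied verbatim from that log line (0.15 restated as 15%, and so on); the dict is just a restatement for readability, not kubelet code:

    hard_eviction = {
        "imagefs.available":  "15%",
        "imagefs.inodesFree": "5%",
        "memory.available":   "100Mi",
        "nodefs.available":   "10%",
        "nodefs.inodesFree":  "5%",
    }
    for signal, threshold in hard_eviction.items():
        print(f"evict pods when {signal} < {threshold}")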
Sep 12 06:01:16.796246 kubelet[2362]: I0912 06:01:16.796222 2362 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 06:01:16.796294 kubelet[2362]: I0912 06:01:16.796264 2362 server.go:1287] "Started kubelet" Sep 12 06:01:16.801124 kubelet[2362]: I0912 06:01:16.799056 2362 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 06:01:16.801124 kubelet[2362]: I0912 06:01:16.800423 2362 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 06:01:16.801643 kubelet[2362]: I0912 06:01:16.801628 2362 server.go:479] "Adding debug handlers to kubelet server" Sep 12 06:01:16.878622 kubelet[2362]: I0912 06:01:16.878579 2362 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 06:01:16.878968 kubelet[2362]: I0912 06:01:16.878952 2362 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 06:01:16.879430 kubelet[2362]: E0912 06:01:16.879405 2362 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 06:01:16.880198 kubelet[2362]: I0912 06:01:16.880157 2362 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 06:01:16.880289 kubelet[2362]: I0912 06:01:16.880241 2362 reconciler.go:26] "Reconciler: start to sync state" Sep 12 06:01:16.881496 kubelet[2362]: I0912 06:01:16.880831 2362 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 06:01:16.881496 kubelet[2362]: I0912 06:01:16.881092 2362 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 06:01:16.885360 kubelet[2362]: W0912 06:01:16.885303 2362 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.132:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.132:6443: connect: connection refused Sep 12 06:01:16.885414 kubelet[2362]: E0912 06:01:16.885374 2362 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.132:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.132:6443: connect: connection refused" logger="UnhandledError" Sep 12 06:01:16.885616 kubelet[2362]: E0912 06:01:16.885588 2362 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.132:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.132:6443: connect: connection refused" interval="200ms" Sep 12 06:01:16.886640 kubelet[2362]: I0912 06:01:16.886609 2362 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 06:01:16.889037 kubelet[2362]: E0912 06:01:16.885146 2362 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.132:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.132:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1864739df27650c4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-12 06:01:16.796235972 +0000 UTC m=+0.477468625,LastTimestamp:2025-09-12 06:01:16.796235972 +0000 UTC m=+0.477468625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Sep 12 06:01:16.891115 kubelet[2362]: I0912 06:01:16.889740 2362 factory.go:221] Registration of the containerd container factory successfully Sep 12 06:01:16.891115 kubelet[2362]: I0912 06:01:16.889751 2362 factory.go:221] Registration of the systemd container factory successfully Sep 12 06:01:16.892298 kubelet[2362]: E0912 06:01:16.892270 2362 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 06:01:16.898975 kubelet[2362]: I0912 06:01:16.898818 2362 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 06:01:16.900158 kubelet[2362]: I0912 06:01:16.900141 2362 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 06:01:16.900198 kubelet[2362]: I0912 06:01:16.900174 2362 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 12 06:01:16.900198 kubelet[2362]: I0912 06:01:16.900193 2362 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 12 06:01:16.900252 kubelet[2362]: I0912 06:01:16.900203 2362 kubelet.go:2382] "Starting kubelet main sync loop" Sep 12 06:01:16.900252 kubelet[2362]: E0912 06:01:16.900242 2362 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 06:01:16.900333 kubelet[2362]: I0912 06:01:16.900321 2362 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 06:01:16.900380 kubelet[2362]: I0912 06:01:16.900371 2362 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 06:01:16.900430 kubelet[2362]: I0912 06:01:16.900422 2362 state_mem.go:36] "Initialized new in-memory state store" Sep 12 06:01:16.904580 kubelet[2362]: W0912 06:01:16.904548 2362 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.132:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.132:6443: connect: connection refused Sep 12 06:01:16.904655 kubelet[2362]: E0912 06:01:16.904588 2362 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.132:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.132:6443: connect: connection refused" logger="UnhandledError" Sep 12 06:01:16.980707 kubelet[2362]: E0912 06:01:16.980638 2362 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 06:01:17.000983 kubelet[2362]: E0912 06:01:17.000885 2362 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 12 06:01:17.081398 kubelet[2362]: E0912 06:01:17.081333 2362 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 06:01:17.086973 kubelet[2362]: E0912 06:01:17.086948 2362 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://10.0.0.132:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.132:6443: connect: connection refused" interval="400ms" Sep 12 06:01:17.182501 kubelet[2362]: E0912 06:01:17.182451 2362 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 06:01:17.201689 kubelet[2362]: E0912 06:01:17.201627 2362 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 12 06:01:17.283624 kubelet[2362]: E0912 06:01:17.283458 2362 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 06:01:17.383702 kubelet[2362]: E0912 06:01:17.383615 2362 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 06:01:17.436187 kubelet[2362]: I0912 06:01:17.436136 2362 policy_none.go:49] "None policy: Start" Sep 12 06:01:17.436187 kubelet[2362]: I0912 06:01:17.436173 2362 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 06:01:17.436187 kubelet[2362]: I0912 06:01:17.436190 2362 state_mem.go:35] "Initializing new in-memory state store" Sep 12 06:01:17.447670 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 12 06:01:17.462588 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 06:01:17.466009 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 06:01:17.484567 kubelet[2362]: E0912 06:01:17.484522 2362 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 06:01:17.485075 kubelet[2362]: I0912 06:01:17.485047 2362 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 06:01:17.485338 kubelet[2362]: I0912 06:01:17.485315 2362 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 06:01:17.485550 kubelet[2362]: I0912 06:01:17.485333 2362 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 06:01:17.485685 kubelet[2362]: I0912 06:01:17.485662 2362 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 06:01:17.486790 kubelet[2362]: E0912 06:01:17.486764 2362 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 12 06:01:17.486846 kubelet[2362]: E0912 06:01:17.486824 2362 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 12 06:01:17.487403 kubelet[2362]: E0912 06:01:17.487363 2362 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.132:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.132:6443: connect: connection refused" interval="800ms" Sep 12 06:01:17.587197 kubelet[2362]: I0912 06:01:17.587141 2362 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 06:01:17.588363 kubelet[2362]: E0912 06:01:17.588328 2362 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.132:6443/api/v1/nodes\": dial tcp 10.0.0.132:6443: connect: connection refused" node="localhost" Sep 12 06:01:17.610969 systemd[1]: Created slice kubepods-burstable-pod1403266a9792debaa127cd8df7a81c3c.slice - libcontainer container kubepods-burstable-pod1403266a9792debaa127cd8df7a81c3c.slice. Sep 12 06:01:17.630083 kubelet[2362]: E0912 06:01:17.630038 2362 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 06:01:17.632491 systemd[1]: Created slice kubepods-burstable-pod72a30db4fc25e4da65a3b99eba43be94.slice - libcontainer container kubepods-burstable-pod72a30db4fc25e4da65a3b99eba43be94.slice. Sep 12 06:01:17.634089 kubelet[2362]: E0912 06:01:17.634067 2362 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 06:01:17.653826 systemd[1]: Created slice kubepods-burstable-pod7a2027d3a44e192188446be93e818156.slice - libcontainer container kubepods-burstable-pod7a2027d3a44e192188446be93e818156.slice. 
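[editor's note] The "Failed to ensure lease exists, will retry" errors above show the retry interval doubling while the API server stays unreachable: 200ms first, then 400ms, 800ms, and 1.6s further down. A short Python sketch of that progression; the 7s ceiling is an assumption about the kubelet lease controller's backoff cap, not something visible in this log:

    interval_ms = 200   # first interval="200ms" in the log
    for attempt in range(1, 6):
        print(f"attempt {attempt}: retry lease in {interval_ms} ms")
        interval_ms = min(interval_ms * 2, 7000)   # assumed 7s cap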
Sep 12 06:01:17.655677 kubelet[2362]: E0912 06:01:17.655640 2362 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 06:01:17.687029 kubelet[2362]: I0912 06:01:17.686980 2362 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 06:01:17.687029 kubelet[2362]: I0912 06:01:17.687015 2362 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 06:01:17.687029 kubelet[2362]: I0912 06:01:17.687041 2362 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 06:01:17.687243 kubelet[2362]: I0912 06:01:17.687058 2362 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7a2027d3a44e192188446be93e818156-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"7a2027d3a44e192188446be93e818156\") " pod="kube-system/kube-apiserver-localhost" Sep 12 06:01:17.687243 kubelet[2362]: I0912 06:01:17.687168 2362 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7a2027d3a44e192188446be93e818156-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"7a2027d3a44e192188446be93e818156\") " pod="kube-system/kube-apiserver-localhost" Sep 12 06:01:17.687243 kubelet[2362]: I0912 06:01:17.687213 2362 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 06:01:17.687312 kubelet[2362]: I0912 06:01:17.687242 2362 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 06:01:17.687312 kubelet[2362]: I0912 06:01:17.687262 2362 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/72a30db4fc25e4da65a3b99eba43be94-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"72a30db4fc25e4da65a3b99eba43be94\") " pod="kube-system/kube-scheduler-localhost" Sep 12 06:01:17.687312 kubelet[2362]: I0912 06:01:17.687276 2362 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7a2027d3a44e192188446be93e818156-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"7a2027d3a44e192188446be93e818156\") " pod="kube-system/kube-apiserver-localhost" Sep 12 06:01:17.711584 kubelet[2362]: W0912 06:01:17.711536 2362 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.132:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.132:6443: connect: connection refused Sep 12 06:01:17.711682 kubelet[2362]: E0912 06:01:17.711607 2362 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.132:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.132:6443: connect: connection refused" logger="UnhandledError" Sep 12 06:01:17.789729 kubelet[2362]: I0912 06:01:17.789679 2362 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 06:01:17.790109 kubelet[2362]: E0912 06:01:17.790074 2362 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.132:6443/api/v1/nodes\": dial tcp 10.0.0.132:6443: connect: connection refused" node="localhost" Sep 12 06:01:17.931543 containerd[1587]: time="2025-09-12T06:01:17.931400160Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:1403266a9792debaa127cd8df7a81c3c,Namespace:kube-system,Attempt:0,}" Sep 12 06:01:17.934742 containerd[1587]: time="2025-09-12T06:01:17.934685872Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:72a30db4fc25e4da65a3b99eba43be94,Namespace:kube-system,Attempt:0,}" Sep 12 06:01:17.957563 containerd[1587]: time="2025-09-12T06:01:17.957503453Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:7a2027d3a44e192188446be93e818156,Namespace:kube-system,Attempt:0,}" Sep 12 06:01:17.960092 kubelet[2362]: W0912 06:01:17.960013 2362 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.132:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.132:6443: connect: connection refused Sep 12 06:01:17.960197 kubelet[2362]: E0912 06:01:17.960096 2362 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.132:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.132:6443: connect: connection refused" logger="UnhandledError" Sep 12 06:01:17.970227 containerd[1587]: time="2025-09-12T06:01:17.970135272Z" level=info msg="connecting to shim 5bc3d093587d736a0ad7cb3c194a7c02bd738d4455a9c4e1bfb128d2a9afaf14" address="unix:///run/containerd/s/2607da506df48ab456a43c1a0e328b39cc026a6dc90c3312bef2d17c55ef0dd2" namespace=k8s.io protocol=ttrpc version=3 Sep 12 06:01:17.977859 containerd[1587]: time="2025-09-12T06:01:17.977806248Z" level=info msg="connecting to shim b6a878d212916405e21c4068f2f9ce9f1ace79327ea9070323f6cc5b5f9e624c" address="unix:///run/containerd/s/92014b576b0141472f93a22713e52d26a63cacbbd5037fea795622f656c3a36f" namespace=k8s.io protocol=ttrpc version=3 Sep 12 06:01:17.995626 containerd[1587]: time="2025-09-12T06:01:17.995571734Z" 
level=info msg="connecting to shim cd3aa4aed0192a8828b1a7057ecc984eeb1643eee6316e77fe4c750096785024" address="unix:///run/containerd/s/06d7c87595cb8b6bfa67886b97cc1aff47e30824920f9ed4dae816670fbed604" namespace=k8s.io protocol=ttrpc version=3 Sep 12 06:01:18.013255 systemd[1]: Started cri-containerd-b6a878d212916405e21c4068f2f9ce9f1ace79327ea9070323f6cc5b5f9e624c.scope - libcontainer container b6a878d212916405e21c4068f2f9ce9f1ace79327ea9070323f6cc5b5f9e624c. Sep 12 06:01:18.016486 systemd[1]: Started cri-containerd-5bc3d093587d736a0ad7cb3c194a7c02bd738d4455a9c4e1bfb128d2a9afaf14.scope - libcontainer container 5bc3d093587d736a0ad7cb3c194a7c02bd738d4455a9c4e1bfb128d2a9afaf14. Sep 12 06:01:18.024783 systemd[1]: Started cri-containerd-cd3aa4aed0192a8828b1a7057ecc984eeb1643eee6316e77fe4c750096785024.scope - libcontainer container cd3aa4aed0192a8828b1a7057ecc984eeb1643eee6316e77fe4c750096785024. Sep 12 06:01:18.192518 kubelet[2362]: I0912 06:01:18.192174 2362 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 06:01:18.193070 kubelet[2362]: E0912 06:01:18.193042 2362 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.132:6443/api/v1/nodes\": dial tcp 10.0.0.132:6443: connect: connection refused" node="localhost" Sep 12 06:01:18.210498 containerd[1587]: time="2025-09-12T06:01:18.210452462Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:72a30db4fc25e4da65a3b99eba43be94,Namespace:kube-system,Attempt:0,} returns sandbox id \"b6a878d212916405e21c4068f2f9ce9f1ace79327ea9070323f6cc5b5f9e624c\"" Sep 12 06:01:18.215526 containerd[1587]: time="2025-09-12T06:01:18.215469892Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:1403266a9792debaa127cd8df7a81c3c,Namespace:kube-system,Attempt:0,} returns sandbox id \"5bc3d093587d736a0ad7cb3c194a7c02bd738d4455a9c4e1bfb128d2a9afaf14\"" Sep 12 06:01:18.220353 containerd[1587]: time="2025-09-12T06:01:18.220303276Z" level=info msg="CreateContainer within sandbox \"b6a878d212916405e21c4068f2f9ce9f1ace79327ea9070323f6cc5b5f9e624c\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 06:01:18.222429 containerd[1587]: time="2025-09-12T06:01:18.222401842Z" level=info msg="CreateContainer within sandbox \"5bc3d093587d736a0ad7cb3c194a7c02bd738d4455a9c4e1bfb128d2a9afaf14\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 06:01:18.226641 containerd[1587]: time="2025-09-12T06:01:18.226603972Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:7a2027d3a44e192188446be93e818156,Namespace:kube-system,Attempt:0,} returns sandbox id \"cd3aa4aed0192a8828b1a7057ecc984eeb1643eee6316e77fe4c750096785024\"" Sep 12 06:01:18.228395 containerd[1587]: time="2025-09-12T06:01:18.228357561Z" level=info msg="CreateContainer within sandbox \"cd3aa4aed0192a8828b1a7057ecc984eeb1643eee6316e77fe4c750096785024\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 06:01:18.235093 containerd[1587]: time="2025-09-12T06:01:18.235059159Z" level=info msg="Container 511d0c4415c93bd631e1009bfae458c40c3a0a1e0d69ba655e9a19ee218a9716: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:01:18.239209 containerd[1587]: time="2025-09-12T06:01:18.238576084Z" level=info msg="Container eabc11eaf28027f95f4007dd8fb26530d864ae1eef2c833c176c85ba9c865510: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:01:18.242790 containerd[1587]: 
time="2025-09-12T06:01:18.242752246Z" level=info msg="Container ca6130b20e58b2cd089c562cc710d7c15fa71fe326d5b71b0788b56039c6a95b: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:01:18.247068 containerd[1587]: time="2025-09-12T06:01:18.247024177Z" level=info msg="CreateContainer within sandbox \"b6a878d212916405e21c4068f2f9ce9f1ace79327ea9070323f6cc5b5f9e624c\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"511d0c4415c93bd631e1009bfae458c40c3a0a1e0d69ba655e9a19ee218a9716\"" Sep 12 06:01:18.247536 containerd[1587]: time="2025-09-12T06:01:18.247510570Z" level=info msg="StartContainer for \"511d0c4415c93bd631e1009bfae458c40c3a0a1e0d69ba655e9a19ee218a9716\"" Sep 12 06:01:18.248531 containerd[1587]: time="2025-09-12T06:01:18.248498132Z" level=info msg="connecting to shim 511d0c4415c93bd631e1009bfae458c40c3a0a1e0d69ba655e9a19ee218a9716" address="unix:///run/containerd/s/92014b576b0141472f93a22713e52d26a63cacbbd5037fea795622f656c3a36f" protocol=ttrpc version=3 Sep 12 06:01:18.250370 containerd[1587]: time="2025-09-12T06:01:18.250336730Z" level=info msg="CreateContainer within sandbox \"5bc3d093587d736a0ad7cb3c194a7c02bd738d4455a9c4e1bfb128d2a9afaf14\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"eabc11eaf28027f95f4007dd8fb26530d864ae1eef2c833c176c85ba9c865510\"" Sep 12 06:01:18.250692 containerd[1587]: time="2025-09-12T06:01:18.250662901Z" level=info msg="StartContainer for \"eabc11eaf28027f95f4007dd8fb26530d864ae1eef2c833c176c85ba9c865510\"" Sep 12 06:01:18.251551 containerd[1587]: time="2025-09-12T06:01:18.251523736Z" level=info msg="connecting to shim eabc11eaf28027f95f4007dd8fb26530d864ae1eef2c833c176c85ba9c865510" address="unix:///run/containerd/s/2607da506df48ab456a43c1a0e328b39cc026a6dc90c3312bef2d17c55ef0dd2" protocol=ttrpc version=3 Sep 12 06:01:18.253370 containerd[1587]: time="2025-09-12T06:01:18.253337637Z" level=info msg="CreateContainer within sandbox \"cd3aa4aed0192a8828b1a7057ecc984eeb1643eee6316e77fe4c750096785024\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"ca6130b20e58b2cd089c562cc710d7c15fa71fe326d5b71b0788b56039c6a95b\"" Sep 12 06:01:18.253934 containerd[1587]: time="2025-09-12T06:01:18.253911794Z" level=info msg="StartContainer for \"ca6130b20e58b2cd089c562cc710d7c15fa71fe326d5b71b0788b56039c6a95b\"" Sep 12 06:01:18.255080 containerd[1587]: time="2025-09-12T06:01:18.255057363Z" level=info msg="connecting to shim ca6130b20e58b2cd089c562cc710d7c15fa71fe326d5b71b0788b56039c6a95b" address="unix:///run/containerd/s/06d7c87595cb8b6bfa67886b97cc1aff47e30824920f9ed4dae816670fbed604" protocol=ttrpc version=3 Sep 12 06:01:18.268582 kubelet[2362]: W0912 06:01:18.268529 2362 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.132:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.132:6443: connect: connection refused Sep 12 06:01:18.268731 kubelet[2362]: E0912 06:01:18.268631 2362 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.132:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.132:6443: connect: connection refused" logger="UnhandledError" Sep 12 06:01:18.269434 kubelet[2362]: W0912 06:01:18.269383 2362 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://10.0.0.132:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.132:6443: connect: connection refused Sep 12 06:01:18.269491 kubelet[2362]: E0912 06:01:18.269444 2362 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.132:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.132:6443: connect: connection refused" logger="UnhandledError" Sep 12 06:01:18.271271 systemd[1]: Started cri-containerd-eabc11eaf28027f95f4007dd8fb26530d864ae1eef2c833c176c85ba9c865510.scope - libcontainer container eabc11eaf28027f95f4007dd8fb26530d864ae1eef2c833c176c85ba9c865510. Sep 12 06:01:18.275890 systemd[1]: Started cri-containerd-511d0c4415c93bd631e1009bfae458c40c3a0a1e0d69ba655e9a19ee218a9716.scope - libcontainer container 511d0c4415c93bd631e1009bfae458c40c3a0a1e0d69ba655e9a19ee218a9716. Sep 12 06:01:18.290469 kubelet[2362]: E0912 06:01:18.288321 2362 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.132:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.132:6443: connect: connection refused" interval="1.6s" Sep 12 06:01:18.317734 systemd[1]: Started cri-containerd-ca6130b20e58b2cd089c562cc710d7c15fa71fe326d5b71b0788b56039c6a95b.scope - libcontainer container ca6130b20e58b2cd089c562cc710d7c15fa71fe326d5b71b0788b56039c6a95b. Sep 12 06:01:18.368274 containerd[1587]: time="2025-09-12T06:01:18.368182072Z" level=info msg="StartContainer for \"511d0c4415c93bd631e1009bfae458c40c3a0a1e0d69ba655e9a19ee218a9716\" returns successfully" Sep 12 06:01:18.387579 containerd[1587]: time="2025-09-12T06:01:18.387503657Z" level=info msg="StartContainer for \"ca6130b20e58b2cd089c562cc710d7c15fa71fe326d5b71b0788b56039c6a95b\" returns successfully" Sep 12 06:01:18.398181 containerd[1587]: time="2025-09-12T06:01:18.398144201Z" level=info msg="StartContainer for \"eabc11eaf28027f95f4007dd8fb26530d864ae1eef2c833c176c85ba9c865510\" returns successfully" Sep 12 06:01:18.913393 kubelet[2362]: E0912 06:01:18.913223 2362 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 06:01:18.916252 kubelet[2362]: E0912 06:01:18.916069 2362 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 06:01:18.918977 kubelet[2362]: E0912 06:01:18.918963 2362 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 06:01:18.997172 kubelet[2362]: I0912 06:01:18.995360 2362 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 06:01:19.892000 kubelet[2362]: E0912 06:01:19.891963 2362 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 12 06:01:19.920836 kubelet[2362]: E0912 06:01:19.920794 2362 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 06:01:19.921168 kubelet[2362]: E0912 06:01:19.920889 2362 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 
06:01:19.962072 kubelet[2362]: I0912 06:01:19.962015 2362 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 12 06:01:19.962072 kubelet[2362]: E0912 06:01:19.962059 2362 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Sep 12 06:01:20.119397 kubelet[2362]: E0912 06:01:20.119335 2362 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 06:01:20.219974 kubelet[2362]: E0912 06:01:20.219844 2362 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 06:01:20.320993 kubelet[2362]: E0912 06:01:20.320937 2362 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 06:01:20.380679 kubelet[2362]: I0912 06:01:20.380612 2362 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 06:01:20.387033 kubelet[2362]: E0912 06:01:20.386979 2362 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 12 06:01:20.387033 kubelet[2362]: I0912 06:01:20.387019 2362 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 06:01:20.388506 kubelet[2362]: E0912 06:01:20.388475 2362 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 12 06:01:20.388506 kubelet[2362]: I0912 06:01:20.388497 2362 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 06:01:20.389703 kubelet[2362]: E0912 06:01:20.389669 2362 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 12 06:01:20.790489 kubelet[2362]: I0912 06:01:20.790446 2362 apiserver.go:52] "Watching apiserver" Sep 12 06:01:20.880472 kubelet[2362]: I0912 06:01:20.880428 2362 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 06:01:20.921898 kubelet[2362]: I0912 06:01:20.921865 2362 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 06:01:21.737288 systemd[1]: Reload requested from client PID 2638 ('systemctl') (unit session-7.scope)... Sep 12 06:01:21.737304 systemd[1]: Reloading... Sep 12 06:01:21.811138 zram_generator::config[2681]: No configuration found. Sep 12 06:01:22.076319 systemd[1]: Reloading finished in 338 ms. Sep 12 06:01:22.112800 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 06:01:22.140315 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 06:01:22.140593 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 06:01:22.140642 systemd[1]: kubelet.service: Consumed 1.125s CPU time, 133M memory peak. Sep 12 06:01:22.142352 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 06:01:22.324081 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
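[editor's note] The "no PriorityClass with name system-node-critical was found" failures above are transient: the API server creates its built-in priority classes shortly after it comes up, and the kubelet then succeeds in creating the mirror pods on a later sync (later in this log the error changes to "already exists"). For reference, the two built-ins and their well-known values (standard Kubernetes defaults, not read from this log):

    BUILTIN_PRIORITY_CLASSES = {
        "system-node-critical":    2000001000,  # requested by the static control-plane pods here
        "system-cluster-critical": 2000000000,
    }
    for name, value in BUILTIN_PRIORITY_CLASSES.items():
        print(f"{name}: {value}")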
Sep 12 06:01:22.336565 (kubelet)[2726]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 06:01:22.380347 kubelet[2726]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 06:01:22.380347 kubelet[2726]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 06:01:22.380347 kubelet[2726]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 06:01:22.380724 kubelet[2726]: I0912 06:01:22.380471 2726 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 06:01:22.387056 kubelet[2726]: I0912 06:01:22.387023 2726 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 12 06:01:22.387056 kubelet[2726]: I0912 06:01:22.387042 2726 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 06:01:22.387287 kubelet[2726]: I0912 06:01:22.387263 2726 server.go:954] "Client rotation is on, will bootstrap in background" Sep 12 06:01:22.388420 kubelet[2726]: I0912 06:01:22.388395 2726 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 12 06:01:22.390551 kubelet[2726]: I0912 06:01:22.390510 2726 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 06:01:22.396122 kubelet[2726]: I0912 06:01:22.394427 2726 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 06:01:22.399947 kubelet[2726]: I0912 06:01:22.399924 2726 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 06:01:22.400234 kubelet[2726]: I0912 06:01:22.400194 2726 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 06:01:22.400400 kubelet[2726]: I0912 06:01:22.400226 2726 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 06:01:22.400500 kubelet[2726]: I0912 06:01:22.400406 2726 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 06:01:22.400500 kubelet[2726]: I0912 06:01:22.400415 2726 container_manager_linux.go:304] "Creating device plugin manager" Sep 12 06:01:22.400500 kubelet[2726]: I0912 06:01:22.400466 2726 state_mem.go:36] "Initialized new in-memory state store" Sep 12 06:01:22.400680 kubelet[2726]: I0912 06:01:22.400665 2726 kubelet.go:446] "Attempting to sync node with API server" Sep 12 06:01:22.400705 kubelet[2726]: I0912 06:01:22.400683 2726 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 06:01:22.400734 kubelet[2726]: I0912 06:01:22.400709 2726 kubelet.go:352] "Adding apiserver pod source" Sep 12 06:01:22.400734 kubelet[2726]: I0912 06:01:22.400719 2726 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 06:01:22.401850 kubelet[2726]: I0912 06:01:22.401829 2726 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 06:01:22.402357 kubelet[2726]: I0912 06:01:22.402318 2726 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 06:01:22.402914 kubelet[2726]: I0912 06:01:22.402835 2726 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 06:01:22.402914 kubelet[2726]: I0912 06:01:22.402869 2726 server.go:1287] "Started kubelet" Sep 12 06:01:22.403210 kubelet[2726]: I0912 06:01:22.403148 2726 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 06:01:22.403330 kubelet[2726]: I0912 06:01:22.403279 2726 ratelimit.go:55] "Setting 
rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 06:01:22.403610 kubelet[2726]: I0912 06:01:22.403572 2726 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 06:01:22.404937 kubelet[2726]: I0912 06:01:22.404720 2726 server.go:479] "Adding debug handlers to kubelet server" Sep 12 06:01:22.405490 kubelet[2726]: I0912 06:01:22.405475 2726 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 06:01:22.414539 kubelet[2726]: I0912 06:01:22.414518 2726 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 06:01:22.414722 kubelet[2726]: I0912 06:01:22.414700 2726 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 06:01:22.416618 kubelet[2726]: I0912 06:01:22.416551 2726 reconciler.go:26] "Reconciler: start to sync state" Sep 12 06:01:22.416618 kubelet[2726]: I0912 06:01:22.416572 2726 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 06:01:22.416897 kubelet[2726]: I0912 06:01:22.416884 2726 factory.go:221] Registration of the systemd container factory successfully Sep 12 06:01:22.417066 kubelet[2726]: I0912 06:01:22.417049 2726 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 06:01:22.418967 kubelet[2726]: E0912 06:01:22.418939 2726 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 06:01:22.420726 kubelet[2726]: I0912 06:01:22.420601 2726 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 06:01:22.421881 kubelet[2726]: I0912 06:01:22.421853 2726 factory.go:221] Registration of the containerd container factory successfully Sep 12 06:01:22.422685 kubelet[2726]: I0912 06:01:22.422648 2726 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 06:01:22.422743 kubelet[2726]: I0912 06:01:22.422690 2726 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 12 06:01:22.422743 kubelet[2726]: I0912 06:01:22.422714 2726 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
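[editor's note] cAdvisor's container-factory registration above fails for CRI-O but succeeds for containerd simply because only the containerd runtime is present on this host: /var/run/crio/crio.sock does not exist. A tiny probe of the two socket paths; the containerd path used here is the conventional default (/run/containerd/containerd.sock), which this log never prints directly, so treat it as an assumption:

    import os

    for path in ("/run/containerd/containerd.sock", "/var/run/crio/crio.sock"):
        state = "present" if os.path.exists(path) else "no such file or directory"
        print(f"{path}: {state}")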
Sep 12 06:01:22.422743 kubelet[2726]: I0912 06:01:22.422735 2726 kubelet.go:2382] "Starting kubelet main sync loop" Sep 12 06:01:22.422875 kubelet[2726]: E0912 06:01:22.422803 2726 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 06:01:22.455617 kubelet[2726]: I0912 06:01:22.455572 2726 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 06:01:22.455617 kubelet[2726]: I0912 06:01:22.455596 2726 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 06:01:22.455617 kubelet[2726]: I0912 06:01:22.455613 2726 state_mem.go:36] "Initialized new in-memory state store" Sep 12 06:01:22.455808 kubelet[2726]: I0912 06:01:22.455761 2726 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 06:01:22.455808 kubelet[2726]: I0912 06:01:22.455770 2726 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 06:01:22.455808 kubelet[2726]: I0912 06:01:22.455796 2726 policy_none.go:49] "None policy: Start" Sep 12 06:01:22.455808 kubelet[2726]: I0912 06:01:22.455805 2726 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 06:01:22.455911 kubelet[2726]: I0912 06:01:22.455814 2726 state_mem.go:35] "Initializing new in-memory state store" Sep 12 06:01:22.455931 kubelet[2726]: I0912 06:01:22.455913 2726 state_mem.go:75] "Updated machine memory state" Sep 12 06:01:22.459635 kubelet[2726]: I0912 06:01:22.459604 2726 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 06:01:22.459859 kubelet[2726]: I0912 06:01:22.459779 2726 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 06:01:22.459859 kubelet[2726]: I0912 06:01:22.459799 2726 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 06:01:22.460005 kubelet[2726]: I0912 06:01:22.459951 2726 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 06:01:22.461300 kubelet[2726]: E0912 06:01:22.461093 2726 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 12 06:01:22.524081 kubelet[2726]: I0912 06:01:22.524043 2726 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 06:01:22.524081 kubelet[2726]: I0912 06:01:22.524065 2726 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 06:01:22.524315 kubelet[2726]: I0912 06:01:22.524290 2726 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 06:01:22.529471 kubelet[2726]: E0912 06:01:22.529446 2726 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 12 06:01:22.563652 kubelet[2726]: I0912 06:01:22.563604 2726 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 06:01:22.569139 kubelet[2726]: I0912 06:01:22.569089 2726 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 12 06:01:22.569236 kubelet[2726]: I0912 06:01:22.569186 2726 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 12 06:01:22.618276 kubelet[2726]: I0912 06:01:22.618163 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7a2027d3a44e192188446be93e818156-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"7a2027d3a44e192188446be93e818156\") " pod="kube-system/kube-apiserver-localhost" Sep 12 06:01:22.618276 kubelet[2726]: I0912 06:01:22.618195 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7a2027d3a44e192188446be93e818156-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"7a2027d3a44e192188446be93e818156\") " pod="kube-system/kube-apiserver-localhost" Sep 12 06:01:22.618276 kubelet[2726]: I0912 06:01:22.618226 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 06:01:22.618276 kubelet[2726]: I0912 06:01:22.618246 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 06:01:22.618276 kubelet[2726]: I0912 06:01:22.618261 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 06:01:22.618496 kubelet[2726]: I0912 06:01:22.618275 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7a2027d3a44e192188446be93e818156-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"7a2027d3a44e192188446be93e818156\") " 
pod="kube-system/kube-apiserver-localhost" Sep 12 06:01:22.618496 kubelet[2726]: I0912 06:01:22.618288 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 06:01:22.618496 kubelet[2726]: I0912 06:01:22.618302 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 06:01:22.618496 kubelet[2726]: I0912 06:01:22.618349 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/72a30db4fc25e4da65a3b99eba43be94-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"72a30db4fc25e4da65a3b99eba43be94\") " pod="kube-system/kube-scheduler-localhost" Sep 12 06:01:23.401193 kubelet[2726]: I0912 06:01:23.401153 2726 apiserver.go:52] "Watching apiserver" Sep 12 06:01:23.417366 kubelet[2726]: I0912 06:01:23.417295 2726 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 06:01:23.440436 kubelet[2726]: I0912 06:01:23.440406 2726 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 06:01:23.636784 kubelet[2726]: E0912 06:01:23.636730 2726 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 12 06:01:23.656246 kubelet[2726]: I0912 06:01:23.655778 2726 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.6557489570000001 podStartE2EDuration="1.655748957s" podCreationTimestamp="2025-09-12 06:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 06:01:23.647674704 +0000 UTC m=+1.306447207" watchObservedRunningTime="2025-09-12 06:01:23.655748957 +0000 UTC m=+1.314521460" Sep 12 06:01:23.664671 kubelet[2726]: I0912 06:01:23.664414 2726 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.664396378 podStartE2EDuration="1.664396378s" podCreationTimestamp="2025-09-12 06:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 06:01:23.664312657 +0000 UTC m=+1.323085170" watchObservedRunningTime="2025-09-12 06:01:23.664396378 +0000 UTC m=+1.323168881" Sep 12 06:01:23.664671 kubelet[2726]: I0912 06:01:23.664537 2726 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=3.6645321490000002 podStartE2EDuration="3.664532149s" podCreationTimestamp="2025-09-12 06:01:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 06:01:23.656302008 +0000 UTC m=+1.315074511" watchObservedRunningTime="2025-09-12 06:01:23.664532149 +0000 UTC 
m=+1.323304652" Sep 12 06:01:27.694410 kubelet[2726]: I0912 06:01:27.694376 2726 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 06:01:27.694885 containerd[1587]: time="2025-09-12T06:01:27.694731455Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 12 06:01:27.695210 kubelet[2726]: I0912 06:01:27.695005 2726 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 06:01:28.373501 systemd[1]: Created slice kubepods-besteffort-podce95b102_40ab_4ab6_a97e_e1f5cb7a5b90.slice - libcontainer container kubepods-besteffort-podce95b102_40ab_4ab6_a97e_e1f5cb7a5b90.slice. Sep 12 06:01:28.452083 kubelet[2726]: I0912 06:01:28.452036 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ce95b102-40ab-4ab6-a97e-e1f5cb7a5b90-xtables-lock\") pod \"kube-proxy-m6ssp\" (UID: \"ce95b102-40ab-4ab6-a97e-e1f5cb7a5b90\") " pod="kube-system/kube-proxy-m6ssp" Sep 12 06:01:28.452245 kubelet[2726]: I0912 06:01:28.452094 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/ce95b102-40ab-4ab6-a97e-e1f5cb7a5b90-kube-proxy\") pod \"kube-proxy-m6ssp\" (UID: \"ce95b102-40ab-4ab6-a97e-e1f5cb7a5b90\") " pod="kube-system/kube-proxy-m6ssp" Sep 12 06:01:28.452245 kubelet[2726]: I0912 06:01:28.452152 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce95b102-40ab-4ab6-a97e-e1f5cb7a5b90-lib-modules\") pod \"kube-proxy-m6ssp\" (UID: \"ce95b102-40ab-4ab6-a97e-e1f5cb7a5b90\") " pod="kube-system/kube-proxy-m6ssp" Sep 12 06:01:28.452245 kubelet[2726]: I0912 06:01:28.452178 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts7jn\" (UniqueName: \"kubernetes.io/projected/ce95b102-40ab-4ab6-a97e-e1f5cb7a5b90-kube-api-access-ts7jn\") pod \"kube-proxy-m6ssp\" (UID: \"ce95b102-40ab-4ab6-a97e-e1f5cb7a5b90\") " pod="kube-system/kube-proxy-m6ssp" Sep 12 06:01:28.684439 containerd[1587]: time="2025-09-12T06:01:28.684293813Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-m6ssp,Uid:ce95b102-40ab-4ab6-a97e-e1f5cb7a5b90,Namespace:kube-system,Attempt:0,}" Sep 12 06:01:28.695691 systemd[1]: Created slice kubepods-besteffort-podfc66469b_e068_44a5_9880_ff3c88b45a9e.slice - libcontainer container kubepods-besteffort-podfc66469b_e068_44a5_9880_ff3c88b45a9e.slice. Sep 12 06:01:28.903159 containerd[1587]: time="2025-09-12T06:01:28.902662904Z" level=info msg="connecting to shim 38eb66e473e5353f6957f46e5d107c59bd42662941275166aee98a00aa238e3d" address="unix:///run/containerd/s/b0ee002f702621e60450cb99ff42cdc414f5bb8cdf7780014b48c4514bea469d" namespace=k8s.io protocol=ttrpc version=3 Sep 12 06:01:28.941248 systemd[1]: Started cri-containerd-38eb66e473e5353f6957f46e5d107c59bd42662941275166aee98a00aa238e3d.scope - libcontainer container 38eb66e473e5353f6957f46e5d107c59bd42662941275166aee98a00aa238e3d. 
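The kubelet entries above (the pod CIDR update, kube-proxy volume attachment, and the startup-latency records) follow klog's structured key="value" style wrapped in a journal prefix. As a reading aid only, here is a minimal Python sketch of pulling those fields out of a single-line entry; the regexes and the parse() helper are illustrative assumptions, not anything shipped with the kubelet, and they ignore multi-line messages and escaped quotes.

```python
import re

# Illustrative helper (not part of the log): split one journal line carrying a
# klog-formatted kubelet message into its structured fields.
KLOG_LINE = re.compile(
    r'^(?P<stamp>\w{3} \d{2} \d{2}:\d{2}:\d{2}\.\d+) '  # journal timestamp
    r'(?P<unit>[\w.-]+)\[(?P<pid>\d+)\]: '              # e.g. kubelet[2726]
    r'(?P<severity>[IWEF])\d{4} [\d:.]+\s+\d+ '         # klog header, e.g. I0912 06:01:27.694376 2726
    r'(?P<source>[\w._-]+\.go:\d+)\] '                  # source file:line
    r'"(?P<msg>[^"]*)"'                                 # quoted message
    r'(?P<rest>.*)$'                                    # trailing key="value" pairs
)
KV = re.compile(r'(\w+)="([^"]*)"')

def parse(line: str) -> dict:
    m = KLOG_LINE.match(line)
    if not m:
        return {}
    fields = m.groupdict()
    fields["kv"] = dict(KV.findall(m.group("rest")))
    return fields

example = ('Sep 12 06:01:27.694410 kubelet[2726]: I0912 06:01:27.694376 2726 '
           'kuberuntime_manager.go:1702] "Updating runtime config through cri '
           'with podcidr" CIDR="192.168.0.0/24"')
print(parse(example)["kv"])  # {'CIDR': '192.168.0.0/24'}
```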
Sep 12 06:01:28.978211 containerd[1587]: time="2025-09-12T06:01:28.978158514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-m6ssp,Uid:ce95b102-40ab-4ab6-a97e-e1f5cb7a5b90,Namespace:kube-system,Attempt:0,} returns sandbox id \"38eb66e473e5353f6957f46e5d107c59bd42662941275166aee98a00aa238e3d\"" Sep 12 06:01:28.981492 containerd[1587]: time="2025-09-12T06:01:28.981461562Z" level=info msg="CreateContainer within sandbox \"38eb66e473e5353f6957f46e5d107c59bd42662941275166aee98a00aa238e3d\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 06:01:28.990240 kubelet[2726]: I0912 06:01:28.990190 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsntr\" (UniqueName: \"kubernetes.io/projected/fc66469b-e068-44a5-9880-ff3c88b45a9e-kube-api-access-tsntr\") pod \"tigera-operator-755d956888-s4bnt\" (UID: \"fc66469b-e068-44a5-9880-ff3c88b45a9e\") " pod="tigera-operator/tigera-operator-755d956888-s4bnt" Sep 12 06:01:28.990240 kubelet[2726]: I0912 06:01:28.990238 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fc66469b-e068-44a5-9880-ff3c88b45a9e-var-lib-calico\") pod \"tigera-operator-755d956888-s4bnt\" (UID: \"fc66469b-e068-44a5-9880-ff3c88b45a9e\") " pod="tigera-operator/tigera-operator-755d956888-s4bnt" Sep 12 06:01:28.997769 containerd[1587]: time="2025-09-12T06:01:28.997135507Z" level=info msg="Container b1e2d1c38b0e7c2bd27c2a2be162b13f677d083d997636a00ffd22ae4a4ee169: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:01:29.006993 containerd[1587]: time="2025-09-12T06:01:29.006923640Z" level=info msg="CreateContainer within sandbox \"38eb66e473e5353f6957f46e5d107c59bd42662941275166aee98a00aa238e3d\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"b1e2d1c38b0e7c2bd27c2a2be162b13f677d083d997636a00ffd22ae4a4ee169\"" Sep 12 06:01:29.009119 containerd[1587]: time="2025-09-12T06:01:29.007567446Z" level=info msg="StartContainer for \"b1e2d1c38b0e7c2bd27c2a2be162b13f677d083d997636a00ffd22ae4a4ee169\"" Sep 12 06:01:29.009119 containerd[1587]: time="2025-09-12T06:01:29.008897880Z" level=info msg="connecting to shim b1e2d1c38b0e7c2bd27c2a2be162b13f677d083d997636a00ffd22ae4a4ee169" address="unix:///run/containerd/s/b0ee002f702621e60450cb99ff42cdc414f5bb8cdf7780014b48c4514bea469d" protocol=ttrpc version=3 Sep 12 06:01:29.043261 systemd[1]: Started cri-containerd-b1e2d1c38b0e7c2bd27c2a2be162b13f677d083d997636a00ffd22ae4a4ee169.scope - libcontainer container b1e2d1c38b0e7c2bd27c2a2be162b13f677d083d997636a00ffd22ae4a4ee169. 
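Both "connecting to shim" messages above, for the kube-proxy-m6ssp sandbox (38eb66e4…) and for its kube-proxy container (b1e2d1c3…), name the same unix:///run/containerd/s/b0ee002f… socket, which suggests the container is served by the sandbox's shim. A purely illustrative Python sketch for grouping such messages by socket address; the SHIM regex and the shims_by_socket() helper are assumptions, not containerd tooling.

```python
import re
from collections import defaultdict

# Group containerd "connecting to shim <id>" messages by shim socket address.
# Illustrative only; operates on a captured log string such as this excerpt.
SHIM = re.compile(
    r'connecting to shim (?P<id>[0-9a-f]{64})" '
    r'address="(?P<addr>unix://[^"]+)"'
)

def shims_by_socket(log_text: str) -> dict:
    grouped = defaultdict(list)
    for m in SHIM.finditer(log_text):
        grouped[m.group("addr")].append(m.group("id"))  # socket -> sandbox/container ids
    return dict(grouped)
```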
Sep 12 06:01:29.088304 containerd[1587]: time="2025-09-12T06:01:29.088246140Z" level=info msg="StartContainer for \"b1e2d1c38b0e7c2bd27c2a2be162b13f677d083d997636a00ffd22ae4a4ee169\" returns successfully" Sep 12 06:01:29.300324 containerd[1587]: time="2025-09-12T06:01:29.300212066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-s4bnt,Uid:fc66469b-e068-44a5-9880-ff3c88b45a9e,Namespace:tigera-operator,Attempt:0,}" Sep 12 06:01:29.320838 containerd[1587]: time="2025-09-12T06:01:29.320796616Z" level=info msg="connecting to shim 3e1327bcfee94a8495907803c99beef85428a10c5bc83e2f88abd8e040d632c4" address="unix:///run/containerd/s/b51ab111e1fb2d470b7f34069012fcc251d3dbbd43515defcf64132a6a8ec44e" namespace=k8s.io protocol=ttrpc version=3 Sep 12 06:01:29.347241 systemd[1]: Started cri-containerd-3e1327bcfee94a8495907803c99beef85428a10c5bc83e2f88abd8e040d632c4.scope - libcontainer container 3e1327bcfee94a8495907803c99beef85428a10c5bc83e2f88abd8e040d632c4. Sep 12 06:01:29.396080 containerd[1587]: time="2025-09-12T06:01:29.396012514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-s4bnt,Uid:fc66469b-e068-44a5-9880-ff3c88b45a9e,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3e1327bcfee94a8495907803c99beef85428a10c5bc83e2f88abd8e040d632c4\"" Sep 12 06:01:29.398248 containerd[1587]: time="2025-09-12T06:01:29.398203127Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 06:01:29.566644 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1752338999.mount: Deactivated successfully. Sep 12 06:01:31.345935 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4024303136.mount: Deactivated successfully. Sep 12 06:01:31.720825 containerd[1587]: time="2025-09-12T06:01:31.720693303Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:31.721797 containerd[1587]: time="2025-09-12T06:01:31.721775300Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 12 06:01:31.723169 containerd[1587]: time="2025-09-12T06:01:31.723114547Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:31.725076 containerd[1587]: time="2025-09-12T06:01:31.725049004Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:31.725604 containerd[1587]: time="2025-09-12T06:01:31.725580545Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 2.32733653s" Sep 12 06:01:31.725659 containerd[1587]: time="2025-09-12T06:01:31.725608728Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 12 06:01:31.727469 containerd[1587]: time="2025-09-12T06:01:31.727435943Z" level=info msg="CreateContainer within sandbox \"3e1327bcfee94a8495907803c99beef85428a10c5bc83e2f88abd8e040d632c4\" for container 
&ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 06:01:31.736034 containerd[1587]: time="2025-09-12T06:01:31.735987791Z" level=info msg="Container ebe4cc687a32231002b44108f02d4c336163e7493113aec7296cef3d9bb8358a: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:01:31.742562 containerd[1587]: time="2025-09-12T06:01:31.742518407Z" level=info msg="CreateContainer within sandbox \"3e1327bcfee94a8495907803c99beef85428a10c5bc83e2f88abd8e040d632c4\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"ebe4cc687a32231002b44108f02d4c336163e7493113aec7296cef3d9bb8358a\"" Sep 12 06:01:31.743041 containerd[1587]: time="2025-09-12T06:01:31.743019931Z" level=info msg="StartContainer for \"ebe4cc687a32231002b44108f02d4c336163e7493113aec7296cef3d9bb8358a\"" Sep 12 06:01:31.743795 containerd[1587]: time="2025-09-12T06:01:31.743772111Z" level=info msg="connecting to shim ebe4cc687a32231002b44108f02d4c336163e7493113aec7296cef3d9bb8358a" address="unix:///run/containerd/s/b51ab111e1fb2d470b7f34069012fcc251d3dbbd43515defcf64132a6a8ec44e" protocol=ttrpc version=3 Sep 12 06:01:31.802244 systemd[1]: Started cri-containerd-ebe4cc687a32231002b44108f02d4c336163e7493113aec7296cef3d9bb8358a.scope - libcontainer container ebe4cc687a32231002b44108f02d4c336163e7493113aec7296cef3d9bb8358a. Sep 12 06:01:31.832648 containerd[1587]: time="2025-09-12T06:01:31.832606924Z" level=info msg="StartContainer for \"ebe4cc687a32231002b44108f02d4c336163e7493113aec7296cef3d9bb8358a\" returns successfully" Sep 12 06:01:32.467598 kubelet[2726]: I0912 06:01:32.467510 2726 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-m6ssp" podStartSLOduration=4.467457475 podStartE2EDuration="4.467457475s" podCreationTimestamp="2025-09-12 06:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 06:01:29.649932032 +0000 UTC m=+7.308704535" watchObservedRunningTime="2025-09-12 06:01:32.467457475 +0000 UTC m=+10.126229988" Sep 12 06:01:32.468095 kubelet[2726]: I0912 06:01:32.467676 2726 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-s4bnt" podStartSLOduration=2.138982206 podStartE2EDuration="4.467667844s" podCreationTimestamp="2025-09-12 06:01:28 +0000 UTC" firstStartedPulling="2025-09-12 06:01:29.39769061 +0000 UTC m=+7.056463113" lastFinishedPulling="2025-09-12 06:01:31.726376247 +0000 UTC m=+9.385148751" observedRunningTime="2025-09-12 06:01:32.466926556 +0000 UTC m=+10.125699049" watchObservedRunningTime="2025-09-12 06:01:32.467667844 +0000 UTC m=+10.126440367" Sep 12 06:01:32.788335 update_engine[1562]: I20250912 06:01:32.788175 1562 update_attempter.cc:509] Updating boot flags... Sep 12 06:01:38.115909 sudo[1788]: pam_unix(sudo:session): session closed for user root Sep 12 06:01:38.118226 sshd-session[1784]: pam_unix(sshd:session): session closed for user core Sep 12 06:01:38.118597 sshd[1787]: Connection closed by 10.0.0.1 port 54260 Sep 12 06:01:38.123812 systemd[1]: sshd@6-10.0.0.132:22-10.0.0.1:54260.service: Deactivated successfully. Sep 12 06:01:38.127545 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 06:01:38.127877 systemd[1]: session-7.scope: Consumed 5.491s CPU time, 229.2M memory peak. Sep 12 06:01:38.134420 systemd-logind[1559]: Session 7 logged out. Waiting for processes to exit. Sep 12 06:01:38.137150 systemd-logind[1559]: Removed session 7. 
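One consistency check possible on entries like these: containerd reports the tigera-operator image pull above as taking 2.32733653s, and the kubelet's podStartSLOduration record brackets the same pull with firstStartedPulling/lastFinishedPulling timestamps. A hedged Python sketch of the arithmetic follows; the secs() helper is an illustrative assumption, introduced only to handle the nanosecond fractions in the log.

```python
# Compare the kubelet's pull window with containerd's reported pull time.
# secs() is a tiny illustrative helper: convert "HH:MM:SS.fraction" to seconds.
def secs(hms: str) -> float:
    h, m, s = hms.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

first_started = secs("06:01:29.39769061")    # firstStartedPulling (from the log above)
last_finished = secs("06:01:31.726376247")   # lastFinishedPulling
print(round(last_finished - first_started, 3))  # ~2.329 s, within a few ms of containerd's 2.32733653s
```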
Sep 12 06:01:40.530471 systemd[1]: Created slice kubepods-besteffort-pod076226dd_ca7f_444f_9c27_27b92c2aa42d.slice - libcontainer container kubepods-besteffort-pod076226dd_ca7f_444f_9c27_27b92c2aa42d.slice. Sep 12 06:01:40.661275 kubelet[2726]: I0912 06:01:40.661197 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/076226dd-ca7f-444f-9c27-27b92c2aa42d-typha-certs\") pod \"calico-typha-66cbd86995-tsfr7\" (UID: \"076226dd-ca7f-444f-9c27-27b92c2aa42d\") " pod="calico-system/calico-typha-66cbd86995-tsfr7" Sep 12 06:01:40.661275 kubelet[2726]: I0912 06:01:40.661257 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/076226dd-ca7f-444f-9c27-27b92c2aa42d-tigera-ca-bundle\") pod \"calico-typha-66cbd86995-tsfr7\" (UID: \"076226dd-ca7f-444f-9c27-27b92c2aa42d\") " pod="calico-system/calico-typha-66cbd86995-tsfr7" Sep 12 06:01:40.661275 kubelet[2726]: I0912 06:01:40.661282 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppmzg\" (UniqueName: \"kubernetes.io/projected/076226dd-ca7f-444f-9c27-27b92c2aa42d-kube-api-access-ppmzg\") pod \"calico-typha-66cbd86995-tsfr7\" (UID: \"076226dd-ca7f-444f-9c27-27b92c2aa42d\") " pod="calico-system/calico-typha-66cbd86995-tsfr7" Sep 12 06:01:40.837325 containerd[1587]: time="2025-09-12T06:01:40.837278843Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-66cbd86995-tsfr7,Uid:076226dd-ca7f-444f-9c27-27b92c2aa42d,Namespace:calico-system,Attempt:0,}" Sep 12 06:01:41.260509 systemd[1]: Created slice kubepods-besteffort-pod3f247a0e_585f_441b_bc8d_791ee43ea5d4.slice - libcontainer container kubepods-besteffort-pod3f247a0e_585f_441b_bc8d_791ee43ea5d4.slice. 
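The slice names systemd reports here follow the pattern the kubelet's systemd cgroup driver uses for BestEffort pods: kubepods-besteffort-pod<UID>.slice, with the dashes of the pod UID turned into underscores (compare the calico-typha UID 076226dd-ca7f-444f-9c27-27b92c2aa42d with the slice created above). A minimal sketch of that mapping; besteffort_slice() is an illustrative name, not a real API.

```python
# Map a pod UID to the BestEffort slice name pattern seen in the log above.
def besteffort_slice(pod_uid: str) -> str:
    return f"kubepods-besteffort-pod{pod_uid.replace('-', '_')}.slice"

assert besteffort_slice("076226dd-ca7f-444f-9c27-27b92c2aa42d") == \
    "kubepods-besteffort-pod076226dd_ca7f_444f_9c27_27b92c2aa42d.slice"
```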
Sep 12 06:01:41.265007 kubelet[2726]: I0912 06:01:41.264667 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/3f247a0e-585f-441b-bc8d-791ee43ea5d4-cni-net-dir\") pod \"calico-node-mnjns\" (UID: \"3f247a0e-585f-441b-bc8d-791ee43ea5d4\") " pod="calico-system/calico-node-mnjns" Sep 12 06:01:41.265007 kubelet[2726]: I0912 06:01:41.264707 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/3f247a0e-585f-441b-bc8d-791ee43ea5d4-flexvol-driver-host\") pod \"calico-node-mnjns\" (UID: \"3f247a0e-585f-441b-bc8d-791ee43ea5d4\") " pod="calico-system/calico-node-mnjns" Sep 12 06:01:41.265007 kubelet[2726]: I0912 06:01:41.264843 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/3f247a0e-585f-441b-bc8d-791ee43ea5d4-cni-bin-dir\") pod \"calico-node-mnjns\" (UID: \"3f247a0e-585f-441b-bc8d-791ee43ea5d4\") " pod="calico-system/calico-node-mnjns" Sep 12 06:01:41.265511 kubelet[2726]: I0912 06:01:41.264863 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f247a0e-585f-441b-bc8d-791ee43ea5d4-lib-modules\") pod \"calico-node-mnjns\" (UID: \"3f247a0e-585f-441b-bc8d-791ee43ea5d4\") " pod="calico-system/calico-node-mnjns" Sep 12 06:01:41.265559 kubelet[2726]: I0912 06:01:41.265531 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/3f247a0e-585f-441b-bc8d-791ee43ea5d4-var-run-calico\") pod \"calico-node-mnjns\" (UID: \"3f247a0e-585f-441b-bc8d-791ee43ea5d4\") " pod="calico-system/calico-node-mnjns" Sep 12 06:01:41.265559 kubelet[2726]: I0912 06:01:41.265551 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3f247a0e-585f-441b-bc8d-791ee43ea5d4-xtables-lock\") pod \"calico-node-mnjns\" (UID: \"3f247a0e-585f-441b-bc8d-791ee43ea5d4\") " pod="calico-system/calico-node-mnjns" Sep 12 06:01:41.265619 kubelet[2726]: I0912 06:01:41.265609 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67tg8\" (UniqueName: \"kubernetes.io/projected/3f247a0e-585f-441b-bc8d-791ee43ea5d4-kube-api-access-67tg8\") pod \"calico-node-mnjns\" (UID: \"3f247a0e-585f-441b-bc8d-791ee43ea5d4\") " pod="calico-system/calico-node-mnjns" Sep 12 06:01:41.265648 kubelet[2726]: I0912 06:01:41.265627 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/3f247a0e-585f-441b-bc8d-791ee43ea5d4-policysync\") pod \"calico-node-mnjns\" (UID: \"3f247a0e-585f-441b-bc8d-791ee43ea5d4\") " pod="calico-system/calico-node-mnjns" Sep 12 06:01:41.265648 kubelet[2726]: I0912 06:01:41.265641 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f247a0e-585f-441b-bc8d-791ee43ea5d4-tigera-ca-bundle\") pod \"calico-node-mnjns\" (UID: \"3f247a0e-585f-441b-bc8d-791ee43ea5d4\") " pod="calico-system/calico-node-mnjns" Sep 12 06:01:41.265741 kubelet[2726]: I0912 06:01:41.265700 2726 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3f247a0e-585f-441b-bc8d-791ee43ea5d4-var-lib-calico\") pod \"calico-node-mnjns\" (UID: \"3f247a0e-585f-441b-bc8d-791ee43ea5d4\") " pod="calico-system/calico-node-mnjns" Sep 12 06:01:41.265802 kubelet[2726]: I0912 06:01:41.265753 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/3f247a0e-585f-441b-bc8d-791ee43ea5d4-cni-log-dir\") pod \"calico-node-mnjns\" (UID: \"3f247a0e-585f-441b-bc8d-791ee43ea5d4\") " pod="calico-system/calico-node-mnjns" Sep 12 06:01:41.265802 kubelet[2726]: I0912 06:01:41.265776 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/3f247a0e-585f-441b-bc8d-791ee43ea5d4-node-certs\") pod \"calico-node-mnjns\" (UID: \"3f247a0e-585f-441b-bc8d-791ee43ea5d4\") " pod="calico-system/calico-node-mnjns" Sep 12 06:01:41.283799 containerd[1587]: time="2025-09-12T06:01:41.283717627Z" level=info msg="connecting to shim cc6e22f4bcb68c9d26097e485a1f86a29022b05764b5694a02bc74e61dbc772e" address="unix:///run/containerd/s/0d5d5bc2c57a6fcddb4e8a656a4a7123aba253514d487884c0604cd4ec27dd9f" namespace=k8s.io protocol=ttrpc version=3 Sep 12 06:01:41.337278 systemd[1]: Started cri-containerd-cc6e22f4bcb68c9d26097e485a1f86a29022b05764b5694a02bc74e61dbc772e.scope - libcontainer container cc6e22f4bcb68c9d26097e485a1f86a29022b05764b5694a02bc74e61dbc772e. Sep 12 06:01:41.370144 kubelet[2726]: E0912 06:01:41.369993 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.370144 kubelet[2726]: W0912 06:01:41.370018 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.370144 kubelet[2726]: E0912 06:01:41.370155 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.372123 kubelet[2726]: E0912 06:01:41.372022 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.372123 kubelet[2726]: W0912 06:01:41.372038 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.372123 kubelet[2726]: E0912 06:01:41.372049 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:41.379560 kubelet[2726]: E0912 06:01:41.379504 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-clj9k" podUID="9181799d-61c8-4e39-8795-bc27b5674755" Sep 12 06:01:41.382469 kubelet[2726]: E0912 06:01:41.382289 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.382469 kubelet[2726]: W0912 06:01:41.382327 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.382469 kubelet[2726]: E0912 06:01:41.382361 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.387915 kubelet[2726]: E0912 06:01:41.387882 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.388409 kubelet[2726]: W0912 06:01:41.388330 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.388409 kubelet[2726]: E0912 06:01:41.388366 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.426384 containerd[1587]: time="2025-09-12T06:01:41.426336132Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-66cbd86995-tsfr7,Uid:076226dd-ca7f-444f-9c27-27b92c2aa42d,Namespace:calico-system,Attempt:0,} returns sandbox id \"cc6e22f4bcb68c9d26097e485a1f86a29022b05764b5694a02bc74e61dbc772e\"" Sep 12 06:01:41.428063 containerd[1587]: time="2025-09-12T06:01:41.428031855Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 06:01:41.466432 kubelet[2726]: E0912 06:01:41.466173 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.466432 kubelet[2726]: W0912 06:01:41.466197 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.466432 kubelet[2726]: E0912 06:01:41.466219 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:41.466432 kubelet[2726]: E0912 06:01:41.466447 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.466432 kubelet[2726]: W0912 06:01:41.466455 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.466747 kubelet[2726]: E0912 06:01:41.466464 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.467686 kubelet[2726]: E0912 06:01:41.467003 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.467686 kubelet[2726]: W0912 06:01:41.467020 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.467686 kubelet[2726]: E0912 06:01:41.467029 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.467686 kubelet[2726]: E0912 06:01:41.467632 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.467686 kubelet[2726]: W0912 06:01:41.467640 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.467686 kubelet[2726]: E0912 06:01:41.467650 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.468051 kubelet[2726]: E0912 06:01:41.467944 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.468051 kubelet[2726]: W0912 06:01:41.467953 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.468051 kubelet[2726]: E0912 06:01:41.467961 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.468615 kubelet[2726]: E0912 06:01:41.468145 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.468615 kubelet[2726]: W0912 06:01:41.468154 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.468615 kubelet[2726]: E0912 06:01:41.468165 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:41.468615 kubelet[2726]: E0912 06:01:41.468343 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.468615 kubelet[2726]: W0912 06:01:41.468350 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.468615 kubelet[2726]: E0912 06:01:41.468360 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.468615 kubelet[2726]: E0912 06:01:41.468559 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.468615 kubelet[2726]: W0912 06:01:41.468566 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.468615 kubelet[2726]: E0912 06:01:41.468575 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.469084 kubelet[2726]: E0912 06:01:41.468787 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.469084 kubelet[2726]: W0912 06:01:41.468794 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.469084 kubelet[2726]: E0912 06:01:41.468802 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.469084 kubelet[2726]: E0912 06:01:41.468994 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.469084 kubelet[2726]: W0912 06:01:41.469001 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.469084 kubelet[2726]: E0912 06:01:41.469010 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.469384 kubelet[2726]: E0912 06:01:41.469223 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.469384 kubelet[2726]: W0912 06:01:41.469231 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.469384 kubelet[2726]: E0912 06:01:41.469240 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:41.469485 kubelet[2726]: E0912 06:01:41.469424 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.469485 kubelet[2726]: W0912 06:01:41.469432 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.469485 kubelet[2726]: E0912 06:01:41.469440 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.469658 kubelet[2726]: E0912 06:01:41.469636 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.469658 kubelet[2726]: W0912 06:01:41.469647 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.469658 kubelet[2726]: E0912 06:01:41.469656 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.469833 kubelet[2726]: E0912 06:01:41.469816 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.469833 kubelet[2726]: W0912 06:01:41.469825 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.469833 kubelet[2726]: E0912 06:01:41.469832 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.469995 kubelet[2726]: E0912 06:01:41.469978 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.469995 kubelet[2726]: W0912 06:01:41.469988 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.469995 kubelet[2726]: E0912 06:01:41.469995 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.471085 kubelet[2726]: E0912 06:01:41.470175 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.471085 kubelet[2726]: W0912 06:01:41.470187 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.471085 kubelet[2726]: E0912 06:01:41.470194 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:41.471085 kubelet[2726]: E0912 06:01:41.470437 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.471085 kubelet[2726]: W0912 06:01:41.470445 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.471085 kubelet[2726]: E0912 06:01:41.470452 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.471085 kubelet[2726]: E0912 06:01:41.470662 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.471085 kubelet[2726]: W0912 06:01:41.470693 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.471085 kubelet[2726]: E0912 06:01:41.470702 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.471085 kubelet[2726]: E0912 06:01:41.470908 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.471485 kubelet[2726]: W0912 06:01:41.470921 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.471485 kubelet[2726]: E0912 06:01:41.470930 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.471485 kubelet[2726]: E0912 06:01:41.471179 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.471485 kubelet[2726]: W0912 06:01:41.471187 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.471485 kubelet[2726]: E0912 06:01:41.471223 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.471622 kubelet[2726]: E0912 06:01:41.471603 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.471622 kubelet[2726]: W0912 06:01:41.471621 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.471682 kubelet[2726]: E0912 06:01:41.471631 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:41.471682 kubelet[2726]: I0912 06:01:41.471674 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9181799d-61c8-4e39-8795-bc27b5674755-socket-dir\") pod \"csi-node-driver-clj9k\" (UID: \"9181799d-61c8-4e39-8795-bc27b5674755\") " pod="calico-system/csi-node-driver-clj9k" Sep 12 06:01:41.471960 kubelet[2726]: E0912 06:01:41.471931 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.471960 kubelet[2726]: W0912 06:01:41.471947 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.472054 kubelet[2726]: E0912 06:01:41.471967 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.472054 kubelet[2726]: I0912 06:01:41.471981 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/9181799d-61c8-4e39-8795-bc27b5674755-varrun\") pod \"csi-node-driver-clj9k\" (UID: \"9181799d-61c8-4e39-8795-bc27b5674755\") " pod="calico-system/csi-node-driver-clj9k" Sep 12 06:01:41.473286 kubelet[2726]: E0912 06:01:41.473259 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.473286 kubelet[2726]: W0912 06:01:41.473276 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.473373 kubelet[2726]: E0912 06:01:41.473294 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.473373 kubelet[2726]: I0912 06:01:41.473341 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9181799d-61c8-4e39-8795-bc27b5674755-registration-dir\") pod \"csi-node-driver-clj9k\" (UID: \"9181799d-61c8-4e39-8795-bc27b5674755\") " pod="calico-system/csi-node-driver-clj9k" Sep 12 06:01:41.473664 kubelet[2726]: E0912 06:01:41.473595 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.473664 kubelet[2726]: W0912 06:01:41.473608 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.473750 kubelet[2726]: E0912 06:01:41.473671 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:41.473750 kubelet[2726]: I0912 06:01:41.473694 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwf9s\" (UniqueName: \"kubernetes.io/projected/9181799d-61c8-4e39-8795-bc27b5674755-kube-api-access-gwf9s\") pod \"csi-node-driver-clj9k\" (UID: \"9181799d-61c8-4e39-8795-bc27b5674755\") " pod="calico-system/csi-node-driver-clj9k" Sep 12 06:01:41.474066 kubelet[2726]: E0912 06:01:41.474039 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.474066 kubelet[2726]: W0912 06:01:41.474055 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.474189 kubelet[2726]: E0912 06:01:41.474129 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.474445 kubelet[2726]: E0912 06:01:41.474419 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.474445 kubelet[2726]: W0912 06:01:41.474433 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.475140 kubelet[2726]: E0912 06:01:41.474538 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.475140 kubelet[2726]: E0912 06:01:41.474852 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.475140 kubelet[2726]: W0912 06:01:41.474862 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.475304 kubelet[2726]: E0912 06:01:41.475278 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.475645 kubelet[2726]: E0912 06:01:41.475619 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.475845 kubelet[2726]: W0912 06:01:41.475818 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.476012 kubelet[2726]: E0912 06:01:41.475949 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:41.476213 kubelet[2726]: I0912 06:01:41.476070 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9181799d-61c8-4e39-8795-bc27b5674755-kubelet-dir\") pod \"csi-node-driver-clj9k\" (UID: \"9181799d-61c8-4e39-8795-bc27b5674755\") " pod="calico-system/csi-node-driver-clj9k" Sep 12 06:01:41.477126 kubelet[2726]: E0912 06:01:41.476742 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.477126 kubelet[2726]: W0912 06:01:41.476766 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.477126 kubelet[2726]: E0912 06:01:41.476955 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.477648 kubelet[2726]: E0912 06:01:41.477623 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.477648 kubelet[2726]: W0912 06:01:41.477638 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.477648 kubelet[2726]: E0912 06:01:41.477647 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.478327 kubelet[2726]: E0912 06:01:41.478299 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.478327 kubelet[2726]: W0912 06:01:41.478313 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.478938 kubelet[2726]: E0912 06:01:41.478904 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.478938 kubelet[2726]: W0912 06:01:41.478918 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.478938 kubelet[2726]: E0912 06:01:41.478928 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.479185 kubelet[2726]: E0912 06:01:41.479047 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:41.481137 kubelet[2726]: E0912 06:01:41.480398 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.481137 kubelet[2726]: W0912 06:01:41.480522 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.481137 kubelet[2726]: E0912 06:01:41.480533 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.481137 kubelet[2726]: E0912 06:01:41.480886 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.481137 kubelet[2726]: W0912 06:01:41.480894 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.481137 kubelet[2726]: E0912 06:01:41.480902 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.481137 kubelet[2726]: E0912 06:01:41.481144 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.481415 kubelet[2726]: W0912 06:01:41.481152 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.481415 kubelet[2726]: E0912 06:01:41.481161 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.567152 containerd[1587]: time="2025-09-12T06:01:41.566135070Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mnjns,Uid:3f247a0e-585f-441b-bc8d-791ee43ea5d4,Namespace:calico-system,Attempt:0,}" Sep 12 06:01:41.581967 kubelet[2726]: E0912 06:01:41.581921 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.581967 kubelet[2726]: W0912 06:01:41.581944 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.581967 kubelet[2726]: E0912 06:01:41.581963 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:41.582222 kubelet[2726]: E0912 06:01:41.582192 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.582222 kubelet[2726]: W0912 06:01:41.582204 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.582222 kubelet[2726]: E0912 06:01:41.582217 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.582515 kubelet[2726]: E0912 06:01:41.582483 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.582515 kubelet[2726]: W0912 06:01:41.582511 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.582611 kubelet[2726]: E0912 06:01:41.582551 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.582878 kubelet[2726]: E0912 06:01:41.582839 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.582878 kubelet[2726]: W0912 06:01:41.582850 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.582878 kubelet[2726]: E0912 06:01:41.582866 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.583125 kubelet[2726]: E0912 06:01:41.583076 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.583125 kubelet[2726]: W0912 06:01:41.583097 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.583230 kubelet[2726]: E0912 06:01:41.583143 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.583412 kubelet[2726]: E0912 06:01:41.583381 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.583412 kubelet[2726]: W0912 06:01:41.583395 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.583487 kubelet[2726]: E0912 06:01:41.583446 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:41.583614 kubelet[2726]: E0912 06:01:41.583584 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.583614 kubelet[2726]: W0912 06:01:41.583600 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.583798 kubelet[2726]: E0912 06:01:41.583686 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.583823 kubelet[2726]: E0912 06:01:41.583814 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.583846 kubelet[2726]: W0912 06:01:41.583823 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.583912 kubelet[2726]: E0912 06:01:41.583891 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.584059 kubelet[2726]: E0912 06:01:41.584043 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.584059 kubelet[2726]: W0912 06:01:41.584053 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.584155 kubelet[2726]: E0912 06:01:41.584123 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.584290 kubelet[2726]: E0912 06:01:41.584274 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.584290 kubelet[2726]: W0912 06:01:41.584284 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.584345 kubelet[2726]: E0912 06:01:41.584334 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.584485 kubelet[2726]: E0912 06:01:41.584458 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.584485 kubelet[2726]: W0912 06:01:41.584473 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.584485 kubelet[2726]: E0912 06:01:41.584486 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:41.584770 kubelet[2726]: E0912 06:01:41.584731 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.584770 kubelet[2726]: W0912 06:01:41.584763 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.584847 kubelet[2726]: E0912 06:01:41.584789 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.585169 kubelet[2726]: E0912 06:01:41.585150 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.585169 kubelet[2726]: W0912 06:01:41.585165 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.585359 kubelet[2726]: E0912 06:01:41.585284 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.585416 kubelet[2726]: E0912 06:01:41.585403 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.585416 kubelet[2726]: W0912 06:01:41.585414 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.585477 kubelet[2726]: E0912 06:01:41.585452 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.585670 kubelet[2726]: E0912 06:01:41.585646 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.585670 kubelet[2726]: W0912 06:01:41.585658 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.585775 kubelet[2726]: E0912 06:01:41.585694 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.585882 kubelet[2726]: E0912 06:01:41.585862 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.585882 kubelet[2726]: W0912 06:01:41.585872 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.585952 kubelet[2726]: E0912 06:01:41.585900 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:41.586057 kubelet[2726]: E0912 06:01:41.586038 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.586057 kubelet[2726]: W0912 06:01:41.586047 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.586150 kubelet[2726]: E0912 06:01:41.586079 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.586256 kubelet[2726]: E0912 06:01:41.586237 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.586256 kubelet[2726]: W0912 06:01:41.586247 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.586339 kubelet[2726]: E0912 06:01:41.586261 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.586587 kubelet[2726]: E0912 06:01:41.586560 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.586587 kubelet[2726]: W0912 06:01:41.586580 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.586659 kubelet[2726]: E0912 06:01:41.586605 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.586903 kubelet[2726]: E0912 06:01:41.586880 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.586903 kubelet[2726]: W0912 06:01:41.586893 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.586903 kubelet[2726]: E0912 06:01:41.586907 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.587108 kubelet[2726]: E0912 06:01:41.587084 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.587108 kubelet[2726]: W0912 06:01:41.587094 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.587170 kubelet[2726]: E0912 06:01:41.587123 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:41.587396 kubelet[2726]: E0912 06:01:41.587379 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.587396 kubelet[2726]: W0912 06:01:41.587393 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.587524 kubelet[2726]: E0912 06:01:41.587505 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.587787 kubelet[2726]: E0912 06:01:41.587759 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.587824 kubelet[2726]: W0912 06:01:41.587775 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.587936 kubelet[2726]: E0912 06:01:41.587915 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.588236 kubelet[2726]: E0912 06:01:41.588217 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.588236 kubelet[2726]: W0912 06:01:41.588232 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.588296 kubelet[2726]: E0912 06:01:41.588253 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.589674 kubelet[2726]: E0912 06:01:41.589650 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.589674 kubelet[2726]: W0912 06:01:41.589664 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.589674 kubelet[2726]: E0912 06:01:41.589674 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:41.592549 containerd[1587]: time="2025-09-12T06:01:41.592487908Z" level=info msg="connecting to shim 364dc3a5b72c204067121974bf6828ff2355797ac440ffc47401e64a5c4946b1" address="unix:///run/containerd/s/1e4a4d983a8f95987eaa771e9c86dd4677a09ca626e8dbece8fc999a3ae0d910" namespace=k8s.io protocol=ttrpc version=3 Sep 12 06:01:41.598571 kubelet[2726]: E0912 06:01:41.598523 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:41.598571 kubelet[2726]: W0912 06:01:41.598545 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:41.598571 kubelet[2726]: E0912 06:01:41.598561 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:41.624327 systemd[1]: Started cri-containerd-364dc3a5b72c204067121974bf6828ff2355797ac440ffc47401e64a5c4946b1.scope - libcontainer container 364dc3a5b72c204067121974bf6828ff2355797ac440ffc47401e64a5c4946b1. Sep 12 06:01:41.660960 containerd[1587]: time="2025-09-12T06:01:41.660904341Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mnjns,Uid:3f247a0e-585f-441b-bc8d-791ee43ea5d4,Namespace:calico-system,Attempt:0,} returns sandbox id \"364dc3a5b72c204067121974bf6828ff2355797ac440ffc47401e64a5c4946b1\"" Sep 12 06:01:43.423582 kubelet[2726]: E0912 06:01:43.423495 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-clj9k" podUID="9181799d-61c8-4e39-8795-bc27b5674755" Sep 12 06:01:44.503019 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2738754939.mount: Deactivated successfully. 
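The repeated kubelet triplet above comes from the FlexVolume dynamic plugin prober: for every vendor~driver directory under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ the kubelet executes the driver binary with the argument "init" and decodes its stdout as JSON. The nodeagent~uds/uds binary is not installed yet, so the call fails with the standard library's "executable file not found in $PATH" error, the captured output is empty, and decoding the empty string produces encoding/json's "unexpected end of JSON input", after which the directory is skipped. A minimal Go sketch of the two failure modes (an illustration only, not the kubelet's own driver-call code, which goes through k8s.io/utils/exec):

    package main

    import (
        "encoding/json"
        "errors"
        "fmt"
        "os/exec"
    )

    func main() {
        // Failure 1: a FlexVolume driver is expected to answer "init" with a JSON
        // status object on stdout; with no driver installed the output is empty,
        // and unmarshalling "" fails exactly as in the kubelet messages above.
        var status map[string]interface{}
        if err := json.Unmarshal([]byte(""), &status); err != nil {
            fmt.Println("unmarshal:", err) // unexpected end of JSON input
        }

        // Failure 2: resolving a binary that is not present yields exec.ErrNotFound,
        // whose message is "executable file not found in $PATH". "uds" is the driver
        // name from the log and is assumed to be absent on this machine.
        if _, err := exec.LookPath("uds"); err != nil {
            fmt.Println("lookup:", err, "is-not-found:", errors.Is(err, exec.ErrNotFound))
        }
    }

The prober only logs these errors and skips the directory, and probing is repeated, which is why the same three lines recur in bursts until a valid driver appears at that path.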
Sep 12 06:01:45.423680 kubelet[2726]: E0912 06:01:45.423622 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-clj9k" podUID="9181799d-61c8-4e39-8795-bc27b5674755" Sep 12 06:01:47.426242 kubelet[2726]: E0912 06:01:47.426182 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-clj9k" podUID="9181799d-61c8-4e39-8795-bc27b5674755" Sep 12 06:01:48.550949 containerd[1587]: time="2025-09-12T06:01:48.550884675Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:48.551604 containerd[1587]: time="2025-09-12T06:01:48.551563774Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 12 06:01:48.552791 containerd[1587]: time="2025-09-12T06:01:48.552757884Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:48.554616 containerd[1587]: time="2025-09-12T06:01:48.554576170Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:48.555222 containerd[1587]: time="2025-09-12T06:01:48.555192460Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 7.127131141s" Sep 12 06:01:48.555222 containerd[1587]: time="2025-09-12T06:01:48.555222928Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 12 06:01:48.556276 containerd[1587]: time="2025-09-12T06:01:48.556054114Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 06:01:48.566387 containerd[1587]: time="2025-09-12T06:01:48.566348900Z" level=info msg="CreateContainer within sandbox \"cc6e22f4bcb68c9d26097e485a1f86a29022b05764b5694a02bc74e61dbc772e\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 06:01:48.575225 containerd[1587]: time="2025-09-12T06:01:48.575178284Z" level=info msg="Container bf9eaa2a5db2d95303449b07ce2fb299926a5fe4f321b7991939f4452f9e02f9: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:01:48.584371 containerd[1587]: time="2025-09-12T06:01:48.584325156Z" level=info msg="CreateContainer within sandbox \"cc6e22f4bcb68c9d26097e485a1f86a29022b05764b5694a02bc74e61dbc772e\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"bf9eaa2a5db2d95303449b07ce2fb299926a5fe4f321b7991939f4452f9e02f9\"" Sep 12 06:01:48.584856 containerd[1587]: time="2025-09-12T06:01:48.584827061Z" level=info msg="StartContainer for 
\"bf9eaa2a5db2d95303449b07ce2fb299926a5fe4f321b7991939f4452f9e02f9\"" Sep 12 06:01:48.585892 containerd[1587]: time="2025-09-12T06:01:48.585865328Z" level=info msg="connecting to shim bf9eaa2a5db2d95303449b07ce2fb299926a5fe4f321b7991939f4452f9e02f9" address="unix:///run/containerd/s/0d5d5bc2c57a6fcddb4e8a656a4a7123aba253514d487884c0604cd4ec27dd9f" protocol=ttrpc version=3 Sep 12 06:01:48.610406 systemd[1]: Started cri-containerd-bf9eaa2a5db2d95303449b07ce2fb299926a5fe4f321b7991939f4452f9e02f9.scope - libcontainer container bf9eaa2a5db2d95303449b07ce2fb299926a5fe4f321b7991939f4452f9e02f9. Sep 12 06:01:48.665834 containerd[1587]: time="2025-09-12T06:01:48.665782410Z" level=info msg="StartContainer for \"bf9eaa2a5db2d95303449b07ce2fb299926a5fe4f321b7991939f4452f9e02f9\" returns successfully" Sep 12 06:01:49.423836 kubelet[2726]: E0912 06:01:49.423778 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-clj9k" podUID="9181799d-61c8-4e39-8795-bc27b5674755" Sep 12 06:01:49.508128 kubelet[2726]: I0912 06:01:49.508003 2726 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-66cbd86995-tsfr7" podStartSLOduration=2.379796462 podStartE2EDuration="9.507976279s" podCreationTimestamp="2025-09-12 06:01:40 +0000 UTC" firstStartedPulling="2025-09-12 06:01:41.427729804 +0000 UTC m=+19.086502307" lastFinishedPulling="2025-09-12 06:01:48.555909621 +0000 UTC m=+26.214682124" observedRunningTime="2025-09-12 06:01:49.50773247 +0000 UTC m=+27.166504983" watchObservedRunningTime="2025-09-12 06:01:49.507976279 +0000 UTC m=+27.166748782" Sep 12 06:01:49.519067 kubelet[2726]: E0912 06:01:49.519013 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.519067 kubelet[2726]: W0912 06:01:49.519042 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.519067 kubelet[2726]: E0912 06:01:49.519068 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.520095 kubelet[2726]: E0912 06:01:49.520035 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.520095 kubelet[2726]: W0912 06:01:49.520060 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.520355 kubelet[2726]: E0912 06:01:49.520275 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:49.520841 kubelet[2726]: E0912 06:01:49.520780 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.520841 kubelet[2726]: W0912 06:01:49.520839 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.520922 kubelet[2726]: E0912 06:01:49.520858 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.521360 kubelet[2726]: E0912 06:01:49.521332 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.521503 kubelet[2726]: W0912 06:01:49.521436 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.521503 kubelet[2726]: E0912 06:01:49.521454 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.521886 kubelet[2726]: E0912 06:01:49.521858 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.521886 kubelet[2726]: W0912 06:01:49.521870 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.521886 kubelet[2726]: E0912 06:01:49.521879 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.522302 kubelet[2726]: E0912 06:01:49.522039 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.522302 kubelet[2726]: W0912 06:01:49.522046 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.522302 kubelet[2726]: E0912 06:01:49.522054 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.522302 kubelet[2726]: E0912 06:01:49.522262 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.522302 kubelet[2726]: W0912 06:01:49.522286 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.522302 kubelet[2726]: E0912 06:01:49.522294 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:49.522654 kubelet[2726]: E0912 06:01:49.522494 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.522654 kubelet[2726]: W0912 06:01:49.522524 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.522654 kubelet[2726]: E0912 06:01:49.522534 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.522828 kubelet[2726]: E0912 06:01:49.522795 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.522828 kubelet[2726]: W0912 06:01:49.522804 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.522828 kubelet[2726]: E0912 06:01:49.522812 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.523041 kubelet[2726]: E0912 06:01:49.523009 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.523041 kubelet[2726]: W0912 06:01:49.523023 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.523041 kubelet[2726]: E0912 06:01:49.523032 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.523242 kubelet[2726]: E0912 06:01:49.523224 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.523242 kubelet[2726]: W0912 06:01:49.523240 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.523311 kubelet[2726]: E0912 06:01:49.523249 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.523484 kubelet[2726]: E0912 06:01:49.523458 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.523484 kubelet[2726]: W0912 06:01:49.523476 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.523594 kubelet[2726]: E0912 06:01:49.523492 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:49.523811 kubelet[2726]: E0912 06:01:49.523784 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.523811 kubelet[2726]: W0912 06:01:49.523801 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.523908 kubelet[2726]: E0912 06:01:49.523815 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.524017 kubelet[2726]: E0912 06:01:49.523999 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.524017 kubelet[2726]: W0912 06:01:49.524012 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.524140 kubelet[2726]: E0912 06:01:49.524022 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.524358 kubelet[2726]: E0912 06:01:49.524338 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.524358 kubelet[2726]: W0912 06:01:49.524353 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.524437 kubelet[2726]: E0912 06:01:49.524366 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.538447 kubelet[2726]: E0912 06:01:49.538394 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.538447 kubelet[2726]: W0912 06:01:49.538415 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.538447 kubelet[2726]: E0912 06:01:49.538433 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.539025 kubelet[2726]: E0912 06:01:49.539007 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.539025 kubelet[2726]: W0912 06:01:49.539026 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.539142 kubelet[2726]: E0912 06:01:49.539041 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:49.539294 kubelet[2726]: E0912 06:01:49.539276 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.539294 kubelet[2726]: W0912 06:01:49.539287 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.539373 kubelet[2726]: E0912 06:01:49.539301 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.539478 kubelet[2726]: E0912 06:01:49.539453 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.539478 kubelet[2726]: W0912 06:01:49.539472 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.539561 kubelet[2726]: E0912 06:01:49.539495 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.539790 kubelet[2726]: E0912 06:01:49.539765 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.539790 kubelet[2726]: W0912 06:01:49.539778 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.539790 kubelet[2726]: E0912 06:01:49.539790 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.540046 kubelet[2726]: E0912 06:01:49.540013 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.540046 kubelet[2726]: W0912 06:01:49.540036 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.540246 kubelet[2726]: E0912 06:01:49.540068 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.540337 kubelet[2726]: E0912 06:01:49.540318 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.540337 kubelet[2726]: W0912 06:01:49.540333 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.540392 kubelet[2726]: E0912 06:01:49.540353 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:49.540644 kubelet[2726]: E0912 06:01:49.540601 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.540644 kubelet[2726]: W0912 06:01:49.540632 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.540692 kubelet[2726]: E0912 06:01:49.540665 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.540846 kubelet[2726]: E0912 06:01:49.540828 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.540846 kubelet[2726]: W0912 06:01:49.540842 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.540921 kubelet[2726]: E0912 06:01:49.540868 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.541039 kubelet[2726]: E0912 06:01:49.541023 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.541039 kubelet[2726]: W0912 06:01:49.541035 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.541090 kubelet[2726]: E0912 06:01:49.541064 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.541266 kubelet[2726]: E0912 06:01:49.541249 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.541266 kubelet[2726]: W0912 06:01:49.541262 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.541322 kubelet[2726]: E0912 06:01:49.541278 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.541497 kubelet[2726]: E0912 06:01:49.541481 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.541497 kubelet[2726]: W0912 06:01:49.541494 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.541554 kubelet[2726]: E0912 06:01:49.541509 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:49.541762 kubelet[2726]: E0912 06:01:49.541736 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.541762 kubelet[2726]: W0912 06:01:49.541752 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.541812 kubelet[2726]: E0912 06:01:49.541768 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.541960 kubelet[2726]: E0912 06:01:49.541944 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.541960 kubelet[2726]: W0912 06:01:49.541958 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.542015 kubelet[2726]: E0912 06:01:49.541969 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.542172 kubelet[2726]: E0912 06:01:49.542161 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.542172 kubelet[2726]: W0912 06:01:49.542170 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.542223 kubelet[2726]: E0912 06:01:49.542183 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.542400 kubelet[2726]: E0912 06:01:49.542388 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.542400 kubelet[2726]: W0912 06:01:49.542397 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.542444 kubelet[2726]: E0912 06:01:49.542409 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:49.542662 kubelet[2726]: E0912 06:01:49.542643 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.542662 kubelet[2726]: W0912 06:01:49.542657 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.542716 kubelet[2726]: E0912 06:01:49.542668 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:49.543263 kubelet[2726]: E0912 06:01:49.543245 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:49.543263 kubelet[2726]: W0912 06:01:49.543261 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:49.543324 kubelet[2726]: E0912 06:01:49.543273 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.531252 kubelet[2726]: E0912 06:01:50.531213 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.531252 kubelet[2726]: W0912 06:01:50.531239 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.531699 kubelet[2726]: E0912 06:01:50.531263 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.531699 kubelet[2726]: E0912 06:01:50.531496 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.531699 kubelet[2726]: W0912 06:01:50.531504 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.531699 kubelet[2726]: E0912 06:01:50.531513 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.531826 kubelet[2726]: E0912 06:01:50.531724 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.531826 kubelet[2726]: W0912 06:01:50.531733 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.531826 kubelet[2726]: E0912 06:01:50.531741 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.532019 kubelet[2726]: E0912 06:01:50.532004 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.532019 kubelet[2726]: W0912 06:01:50.532015 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.532074 kubelet[2726]: E0912 06:01:50.532025 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:50.532408 kubelet[2726]: E0912 06:01:50.532281 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.532408 kubelet[2726]: W0912 06:01:50.532294 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.532408 kubelet[2726]: E0912 06:01:50.532305 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.532520 kubelet[2726]: E0912 06:01:50.532504 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.532520 kubelet[2726]: W0912 06:01:50.532514 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.532562 kubelet[2726]: E0912 06:01:50.532523 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.532717 kubelet[2726]: E0912 06:01:50.532691 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.532717 kubelet[2726]: W0912 06:01:50.532698 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.532717 kubelet[2726]: E0912 06:01:50.532705 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.532853 kubelet[2726]: E0912 06:01:50.532837 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.532853 kubelet[2726]: W0912 06:01:50.532847 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.532853 kubelet[2726]: E0912 06:01:50.532854 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.533012 kubelet[2726]: E0912 06:01:50.532998 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.533012 kubelet[2726]: W0912 06:01:50.533009 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.533067 kubelet[2726]: E0912 06:01:50.533017 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:50.533179 kubelet[2726]: E0912 06:01:50.533165 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.533179 kubelet[2726]: W0912 06:01:50.533175 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.533235 kubelet[2726]: E0912 06:01:50.533183 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.533338 kubelet[2726]: E0912 06:01:50.533324 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.533338 kubelet[2726]: W0912 06:01:50.533333 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.533411 kubelet[2726]: E0912 06:01:50.533341 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.533608 kubelet[2726]: E0912 06:01:50.533563 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.533608 kubelet[2726]: W0912 06:01:50.533596 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.533608 kubelet[2726]: E0912 06:01:50.533622 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.533851 kubelet[2726]: E0912 06:01:50.533842 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.533851 kubelet[2726]: W0912 06:01:50.533850 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.533900 kubelet[2726]: E0912 06:01:50.533858 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.534115 kubelet[2726]: E0912 06:01:50.534076 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.534115 kubelet[2726]: W0912 06:01:50.534090 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.534115 kubelet[2726]: E0912 06:01:50.534113 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:50.534328 kubelet[2726]: E0912 06:01:50.534304 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.534328 kubelet[2726]: W0912 06:01:50.534322 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.534328 kubelet[2726]: E0912 06:01:50.534330 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.545973 kubelet[2726]: E0912 06:01:50.545935 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.545973 kubelet[2726]: W0912 06:01:50.545962 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.546231 kubelet[2726]: E0912 06:01:50.545983 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.546231 kubelet[2726]: E0912 06:01:50.546221 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.546231 kubelet[2726]: W0912 06:01:50.546229 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.546330 kubelet[2726]: E0912 06:01:50.546241 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.546416 kubelet[2726]: E0912 06:01:50.546389 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.546416 kubelet[2726]: W0912 06:01:50.546399 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.546416 kubelet[2726]: E0912 06:01:50.546407 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.546659 kubelet[2726]: E0912 06:01:50.546639 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.546659 kubelet[2726]: W0912 06:01:50.546652 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.546734 kubelet[2726]: E0912 06:01:50.546669 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:50.546905 kubelet[2726]: E0912 06:01:50.546881 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.546905 kubelet[2726]: W0912 06:01:50.546895 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.546905 kubelet[2726]: E0912 06:01:50.546913 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.547085 kubelet[2726]: E0912 06:01:50.547070 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.547085 kubelet[2726]: W0912 06:01:50.547083 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.547161 kubelet[2726]: E0912 06:01:50.547120 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.547452 kubelet[2726]: E0912 06:01:50.547431 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.547452 kubelet[2726]: W0912 06:01:50.547450 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.547507 kubelet[2726]: E0912 06:01:50.547468 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.547800 kubelet[2726]: E0912 06:01:50.547775 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.547838 kubelet[2726]: W0912 06:01:50.547811 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.547923 kubelet[2726]: E0912 06:01:50.547897 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.548097 kubelet[2726]: E0912 06:01:50.548075 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.548097 kubelet[2726]: W0912 06:01:50.548089 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.548247 kubelet[2726]: E0912 06:01:50.548231 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:50.548433 kubelet[2726]: E0912 06:01:50.548414 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.548433 kubelet[2726]: W0912 06:01:50.548426 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.548540 kubelet[2726]: E0912 06:01:50.548511 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.548703 kubelet[2726]: E0912 06:01:50.548692 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.548703 kubelet[2726]: W0912 06:01:50.548702 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.548755 kubelet[2726]: E0912 06:01:50.548729 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.548984 kubelet[2726]: E0912 06:01:50.548967 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.548984 kubelet[2726]: W0912 06:01:50.548980 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.549042 kubelet[2726]: E0912 06:01:50.548994 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.549282 kubelet[2726]: E0912 06:01:50.549264 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.549282 kubelet[2726]: W0912 06:01:50.549276 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.549337 kubelet[2726]: E0912 06:01:50.549291 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.549460 kubelet[2726]: E0912 06:01:50.549440 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.549518 kubelet[2726]: W0912 06:01:50.549452 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.549546 kubelet[2726]: E0912 06:01:50.549527 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:50.549770 kubelet[2726]: E0912 06:01:50.549754 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.549770 kubelet[2726]: W0912 06:01:50.549765 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.549822 kubelet[2726]: E0912 06:01:50.549778 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.550301 kubelet[2726]: E0912 06:01:50.550268 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.550301 kubelet[2726]: W0912 06:01:50.550285 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.550365 kubelet[2726]: E0912 06:01:50.550302 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.550515 kubelet[2726]: E0912 06:01:50.550497 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.550515 kubelet[2726]: W0912 06:01:50.550511 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.550568 kubelet[2726]: E0912 06:01:50.550545 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 06:01:50.550777 kubelet[2726]: E0912 06:01:50.550760 2726 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 06:01:50.550777 kubelet[2726]: W0912 06:01:50.550771 2726 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 06:01:50.550831 kubelet[2726]: E0912 06:01:50.550780 2726 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 06:01:50.589451 containerd[1587]: time="2025-09-12T06:01:50.589368277Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:50.590273 containerd[1587]: time="2025-09-12T06:01:50.590205504Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 12 06:01:50.591712 containerd[1587]: time="2025-09-12T06:01:50.591665092Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:50.594065 containerd[1587]: time="2025-09-12T06:01:50.594002653Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:50.594787 containerd[1587]: time="2025-09-12T06:01:50.594736345Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 2.038647676s" Sep 12 06:01:50.594787 containerd[1587]: time="2025-09-12T06:01:50.594770850Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 12 06:01:50.596922 containerd[1587]: time="2025-09-12T06:01:50.596880262Z" level=info msg="CreateContainer within sandbox \"364dc3a5b72c204067121974bf6828ff2355797ac440ffc47401e64a5c4946b1\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 06:01:50.606180 containerd[1587]: time="2025-09-12T06:01:50.606133856Z" level=info msg="Container 388ced07c893094d80356cbeb1ee6a8489da2a25c4458d5fa5e63041c8c2992f: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:01:50.616343 containerd[1587]: time="2025-09-12T06:01:50.616295389Z" level=info msg="CreateContainer within sandbox \"364dc3a5b72c204067121974bf6828ff2355797ac440ffc47401e64a5c4946b1\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"388ced07c893094d80356cbeb1ee6a8489da2a25c4458d5fa5e63041c8c2992f\"" Sep 12 06:01:50.616862 containerd[1587]: time="2025-09-12T06:01:50.616810810Z" level=info msg="StartContainer for \"388ced07c893094d80356cbeb1ee6a8489da2a25c4458d5fa5e63041c8c2992f\"" Sep 12 06:01:50.618220 containerd[1587]: time="2025-09-12T06:01:50.618197431Z" level=info msg="connecting to shim 388ced07c893094d80356cbeb1ee6a8489da2a25c4458d5fa5e63041c8c2992f" address="unix:///run/containerd/s/1e4a4d983a8f95987eaa771e9c86dd4677a09ca626e8dbece8fc999a3ae0d910" protocol=ttrpc version=3 Sep 12 06:01:50.655282 systemd[1]: Started cri-containerd-388ced07c893094d80356cbeb1ee6a8489da2a25c4458d5fa5e63041c8c2992f.scope - libcontainer container 388ced07c893094d80356cbeb1ee6a8489da2a25c4458d5fa5e63041c8c2992f. 
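The burst of driver-call failures above is kubelet's FlexVolume prober: it walks /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, finds the nodeagent~uds directory, and runs its driver binary with the init argument, expecting a JSON status object on stdout. The uds binary is not installed yet, so each probe produces empty output and the unmarshal fails; installing it is precisely the job of the flexvol-driver init container created from the pod2daemon-flexvol image just above. A minimal sketch of a driver that satisfies the init contract, in Go and with an illustrative capability set only (this is not the real nodeagent~uds driver):

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the JSON object kubelet expects back from a FlexVolume
// driver call; field names follow the documented FlexVolume contract.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) < 2 {
		fmt.Println(`{"status":"Failure","message":"no command given"}`)
		os.Exit(1)
	}
	switch os.Args[1] {
	case "init":
		// An empty stdout here is exactly what produces the
		// "unexpected end of JSON input" errors in the log above.
		out, _ := json.Marshal(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false}, // illustrative capability set
		})
		fmt.Println(string(out))
	default:
		fmt.Println(`{"status":"Not supported"}`)
	}
}

Once a binary like this exists at the probed path, the init call returns parseable JSON and the repeated warnings stop.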
Sep 12 06:01:50.700499 containerd[1587]: time="2025-09-12T06:01:50.700453033Z" level=info msg="StartContainer for \"388ced07c893094d80356cbeb1ee6a8489da2a25c4458d5fa5e63041c8c2992f\" returns successfully" Sep 12 06:01:50.710323 systemd[1]: cri-containerd-388ced07c893094d80356cbeb1ee6a8489da2a25c4458d5fa5e63041c8c2992f.scope: Deactivated successfully. Sep 12 06:01:50.713301 containerd[1587]: time="2025-09-12T06:01:50.713258275Z" level=info msg="received exit event container_id:\"388ced07c893094d80356cbeb1ee6a8489da2a25c4458d5fa5e63041c8c2992f\" id:\"388ced07c893094d80356cbeb1ee6a8489da2a25c4458d5fa5e63041c8c2992f\" pid:3468 exited_at:{seconds:1757656910 nanos:712805212}" Sep 12 06:01:50.713400 containerd[1587]: time="2025-09-12T06:01:50.713337865Z" level=info msg="TaskExit event in podsandbox handler container_id:\"388ced07c893094d80356cbeb1ee6a8489da2a25c4458d5fa5e63041c8c2992f\" id:\"388ced07c893094d80356cbeb1ee6a8489da2a25c4458d5fa5e63041c8c2992f\" pid:3468 exited_at:{seconds:1757656910 nanos:712805212}" Sep 12 06:01:50.740872 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-388ced07c893094d80356cbeb1ee6a8489da2a25c4458d5fa5e63041c8c2992f-rootfs.mount: Deactivated successfully. Sep 12 06:01:51.423375 kubelet[2726]: E0912 06:01:51.423333 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-clj9k" podUID="9181799d-61c8-4e39-8795-bc27b5674755" Sep 12 06:01:51.504799 containerd[1587]: time="2025-09-12T06:01:51.504756752Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 06:01:53.423595 kubelet[2726]: E0912 06:01:53.423532 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-clj9k" podUID="9181799d-61c8-4e39-8795-bc27b5674755" Sep 12 06:01:55.423122 kubelet[2726]: E0912 06:01:55.423042 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-clj9k" podUID="9181799d-61c8-4e39-8795-bc27b5674755" Sep 12 06:01:57.106602 containerd[1587]: time="2025-09-12T06:01:57.106501880Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:57.107788 containerd[1587]: time="2025-09-12T06:01:57.107728336Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 12 06:01:57.109268 containerd[1587]: time="2025-09-12T06:01:57.109186898Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:57.111314 containerd[1587]: time="2025-09-12T06:01:57.111273301Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:01:57.112126 containerd[1587]: time="2025-09-12T06:01:57.112060310Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 5.607241291s" Sep 12 06:01:57.112166 containerd[1587]: time="2025-09-12T06:01:57.112132947Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 12 06:01:57.115255 containerd[1587]: time="2025-09-12T06:01:57.115189975Z" level=info msg="CreateContainer within sandbox \"364dc3a5b72c204067121974bf6828ff2355797ac440ffc47401e64a5c4946b1\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 06:01:57.127998 containerd[1587]: time="2025-09-12T06:01:57.127944565Z" level=info msg="Container 0b1e39f6a253821146987d2364d00281e7da953fff68e46650bee6bd66b0ceea: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:01:57.141303 containerd[1587]: time="2025-09-12T06:01:57.141168689Z" level=info msg="CreateContainer within sandbox \"364dc3a5b72c204067121974bf6828ff2355797ac440ffc47401e64a5c4946b1\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"0b1e39f6a253821146987d2364d00281e7da953fff68e46650bee6bd66b0ceea\"" Sep 12 06:01:57.141884 containerd[1587]: time="2025-09-12T06:01:57.141833178Z" level=info msg="StartContainer for \"0b1e39f6a253821146987d2364d00281e7da953fff68e46650bee6bd66b0ceea\"" Sep 12 06:01:57.143675 containerd[1587]: time="2025-09-12T06:01:57.143615139Z" level=info msg="connecting to shim 0b1e39f6a253821146987d2364d00281e7da953fff68e46650bee6bd66b0ceea" address="unix:///run/containerd/s/1e4a4d983a8f95987eaa771e9c86dd4677a09ca626e8dbece8fc999a3ae0d910" protocol=ttrpc version=3 Sep 12 06:01:57.168263 systemd[1]: Started cri-containerd-0b1e39f6a253821146987d2364d00281e7da953fff68e46650bee6bd66b0ceea.scope - libcontainer container 0b1e39f6a253821146987d2364d00281e7da953fff68e46650bee6bd66b0ceea. Sep 12 06:01:57.355725 containerd[1587]: time="2025-09-12T06:01:57.355548984Z" level=info msg="StartContainer for \"0b1e39f6a253821146987d2364d00281e7da953fff68e46650bee6bd66b0ceea\" returns successfully" Sep 12 06:01:57.423887 kubelet[2726]: E0912 06:01:57.423714 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-clj9k" podUID="9181799d-61c8-4e39-8795-bc27b5674755" Sep 12 06:01:58.314928 systemd[1]: cri-containerd-0b1e39f6a253821146987d2364d00281e7da953fff68e46650bee6bd66b0ceea.scope: Deactivated successfully. Sep 12 06:01:58.315300 systemd[1]: cri-containerd-0b1e39f6a253821146987d2364d00281e7da953fff68e46650bee6bd66b0ceea.scope: Consumed 632ms CPU time, 180.3M memory peak, 8K read from disk, 171.3M written to disk. 
Sep 12 06:01:58.316185 containerd[1587]: time="2025-09-12T06:01:58.316132974Z" level=info msg="received exit event container_id:\"0b1e39f6a253821146987d2364d00281e7da953fff68e46650bee6bd66b0ceea\" id:\"0b1e39f6a253821146987d2364d00281e7da953fff68e46650bee6bd66b0ceea\" pid:3527 exited_at:{seconds:1757656918 nanos:315731369}" Sep 12 06:01:58.316649 containerd[1587]: time="2025-09-12T06:01:58.316342348Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0b1e39f6a253821146987d2364d00281e7da953fff68e46650bee6bd66b0ceea\" id:\"0b1e39f6a253821146987d2364d00281e7da953fff68e46650bee6bd66b0ceea\" pid:3527 exited_at:{seconds:1757656918 nanos:315731369}" Sep 12 06:01:58.340169 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0b1e39f6a253821146987d2364d00281e7da953fff68e46650bee6bd66b0ceea-rootfs.mount: Deactivated successfully. Sep 12 06:01:58.354509 kubelet[2726]: I0912 06:01:58.354476 2726 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 12 06:01:58.432616 systemd[1]: Created slice kubepods-burstable-pod9d395707_c6f1_4aac_b0b4_5b583852339d.slice - libcontainer container kubepods-burstable-pod9d395707_c6f1_4aac_b0b4_5b583852339d.slice. Sep 12 06:01:58.492921 systemd[1]: Created slice kubepods-besteffort-podc152de20_23a4_4564_a6ad_1bc57e7a8f64.slice - libcontainer container kubepods-besteffort-podc152de20_23a4_4564_a6ad_1bc57e7a8f64.slice. Sep 12 06:01:58.493129 kubelet[2726]: I0912 06:01:58.493032 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b-whisker-ca-bundle\") pod \"whisker-78c5cc786c-7ts5g\" (UID: \"bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b\") " pod="calico-system/whisker-78c5cc786c-7ts5g" Sep 12 06:01:58.493129 kubelet[2726]: I0912 06:01:58.493067 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c0e49e1-71a2-4844-908b-3dd7ba7b4800-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-mms5t\" (UID: \"9c0e49e1-71a2-4844-908b-3dd7ba7b4800\") " pod="calico-system/goldmane-54d579b49d-mms5t" Sep 12 06:01:58.493129 kubelet[2726]: I0912 06:01:58.493088 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfqdd\" (UniqueName: \"kubernetes.io/projected/9c0e49e1-71a2-4844-908b-3dd7ba7b4800-kube-api-access-pfqdd\") pod \"goldmane-54d579b49d-mms5t\" (UID: \"9c0e49e1-71a2-4844-908b-3dd7ba7b4800\") " pod="calico-system/goldmane-54d579b49d-mms5t" Sep 12 06:01:58.493129 kubelet[2726]: I0912 06:01:58.493123 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcbkd\" (UniqueName: \"kubernetes.io/projected/624a0d3b-047c-441f-8ae5-8082f3ac4f71-kube-api-access-mcbkd\") pod \"calico-kube-controllers-84b98fd8-d6rmw\" (UID: \"624a0d3b-047c-441f-8ae5-8082f3ac4f71\") " pod="calico-system/calico-kube-controllers-84b98fd8-d6rmw" Sep 12 06:01:58.493549 kubelet[2726]: I0912 06:01:58.493142 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/9c0e49e1-71a2-4844-908b-3dd7ba7b4800-goldmane-key-pair\") pod \"goldmane-54d579b49d-mms5t\" (UID: \"9c0e49e1-71a2-4844-908b-3dd7ba7b4800\") " pod="calico-system/goldmane-54d579b49d-mms5t" Sep 12 06:01:58.493549 kubelet[2726]: I0912 
06:01:58.493162 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2db5b58d-c27a-4a2a-bb15-98610be69135-config-volume\") pod \"coredns-668d6bf9bc-qvxnp\" (UID: \"2db5b58d-c27a-4a2a-bb15-98610be69135\") " pod="kube-system/coredns-668d6bf9bc-qvxnp" Sep 12 06:01:58.493549 kubelet[2726]: I0912 06:01:58.493182 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7vcs\" (UniqueName: \"kubernetes.io/projected/9d395707-c6f1-4aac-b0b4-5b583852339d-kube-api-access-z7vcs\") pod \"coredns-668d6bf9bc-xt455\" (UID: \"9d395707-c6f1-4aac-b0b4-5b583852339d\") " pod="kube-system/coredns-668d6bf9bc-xt455" Sep 12 06:01:58.493549 kubelet[2726]: I0912 06:01:58.493202 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95tzd\" (UniqueName: \"kubernetes.io/projected/c152de20-23a4-4564-a6ad-1bc57e7a8f64-kube-api-access-95tzd\") pod \"calico-apiserver-6bf4c55664-d9qhl\" (UID: \"c152de20-23a4-4564-a6ad-1bc57e7a8f64\") " pod="calico-apiserver/calico-apiserver-6bf4c55664-d9qhl" Sep 12 06:01:58.493549 kubelet[2726]: I0912 06:01:58.493229 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5nqr\" (UniqueName: \"kubernetes.io/projected/2db5b58d-c27a-4a2a-bb15-98610be69135-kube-api-access-s5nqr\") pod \"coredns-668d6bf9bc-qvxnp\" (UID: \"2db5b58d-c27a-4a2a-bb15-98610be69135\") " pod="kube-system/coredns-668d6bf9bc-qvxnp" Sep 12 06:01:58.493694 kubelet[2726]: I0912 06:01:58.493244 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx84w\" (UniqueName: \"kubernetes.io/projected/bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b-kube-api-access-vx84w\") pod \"whisker-78c5cc786c-7ts5g\" (UID: \"bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b\") " pod="calico-system/whisker-78c5cc786c-7ts5g" Sep 12 06:01:58.493694 kubelet[2726]: I0912 06:01:58.493261 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c0e49e1-71a2-4844-908b-3dd7ba7b4800-config\") pod \"goldmane-54d579b49d-mms5t\" (UID: \"9c0e49e1-71a2-4844-908b-3dd7ba7b4800\") " pod="calico-system/goldmane-54d579b49d-mms5t" Sep 12 06:01:58.493694 kubelet[2726]: I0912 06:01:58.493278 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d395707-c6f1-4aac-b0b4-5b583852339d-config-volume\") pod \"coredns-668d6bf9bc-xt455\" (UID: \"9d395707-c6f1-4aac-b0b4-5b583852339d\") " pod="kube-system/coredns-668d6bf9bc-xt455" Sep 12 06:01:58.493694 kubelet[2726]: I0912 06:01:58.493368 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/cec02704-586d-4027-acf1-4feef1605e56-calico-apiserver-certs\") pod \"calico-apiserver-6bf4c55664-xmj8h\" (UID: \"cec02704-586d-4027-acf1-4feef1605e56\") " pod="calico-apiserver/calico-apiserver-6bf4c55664-xmj8h" Sep 12 06:01:58.493694 kubelet[2726]: I0912 06:01:58.493402 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c152de20-23a4-4564-a6ad-1bc57e7a8f64-calico-apiserver-certs\") pod 
\"calico-apiserver-6bf4c55664-d9qhl\" (UID: \"c152de20-23a4-4564-a6ad-1bc57e7a8f64\") " pod="calico-apiserver/calico-apiserver-6bf4c55664-d9qhl" Sep 12 06:01:58.493835 kubelet[2726]: I0912 06:01:58.493430 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvrw5\" (UniqueName: \"kubernetes.io/projected/cec02704-586d-4027-acf1-4feef1605e56-kube-api-access-zvrw5\") pod \"calico-apiserver-6bf4c55664-xmj8h\" (UID: \"cec02704-586d-4027-acf1-4feef1605e56\") " pod="calico-apiserver/calico-apiserver-6bf4c55664-xmj8h" Sep 12 06:01:58.493835 kubelet[2726]: I0912 06:01:58.493450 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/624a0d3b-047c-441f-8ae5-8082f3ac4f71-tigera-ca-bundle\") pod \"calico-kube-controllers-84b98fd8-d6rmw\" (UID: \"624a0d3b-047c-441f-8ae5-8082f3ac4f71\") " pod="calico-system/calico-kube-controllers-84b98fd8-d6rmw" Sep 12 06:01:58.493835 kubelet[2726]: I0912 06:01:58.493467 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b-whisker-backend-key-pair\") pod \"whisker-78c5cc786c-7ts5g\" (UID: \"bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b\") " pod="calico-system/whisker-78c5cc786c-7ts5g" Sep 12 06:01:58.499467 systemd[1]: Created slice kubepods-burstable-pod2db5b58d_c27a_4a2a_bb15_98610be69135.slice - libcontainer container kubepods-burstable-pod2db5b58d_c27a_4a2a_bb15_98610be69135.slice. Sep 12 06:01:58.506226 systemd[1]: Created slice kubepods-besteffort-podbfd8e5e4_5df3_4c10_b6ba_3d2571948e7b.slice - libcontainer container kubepods-besteffort-podbfd8e5e4_5df3_4c10_b6ba_3d2571948e7b.slice. Sep 12 06:01:58.517061 systemd[1]: Created slice kubepods-besteffort-pod624a0d3b_047c_441f_8ae5_8082f3ac4f71.slice - libcontainer container kubepods-besteffort-pod624a0d3b_047c_441f_8ae5_8082f3ac4f71.slice. Sep 12 06:01:58.525313 systemd[1]: Created slice kubepods-besteffort-pod9c0e49e1_71a2_4844_908b_3dd7ba7b4800.slice - libcontainer container kubepods-besteffort-pod9c0e49e1_71a2_4844_908b_3dd7ba7b4800.slice. Sep 12 06:01:58.529336 containerd[1587]: time="2025-09-12T06:01:58.529277791Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 06:01:58.532559 systemd[1]: Created slice kubepods-besteffort-podcec02704_586d_4027_acf1_4feef1605e56.slice - libcontainer container kubepods-besteffort-podcec02704_586d_4027_acf1_4feef1605e56.slice. 
Sep 12 06:01:58.736169 containerd[1587]: time="2025-09-12T06:01:58.736072140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-xt455,Uid:9d395707-c6f1-4aac-b0b4-5b583852339d,Namespace:kube-system,Attempt:0,}" Sep 12 06:01:58.797147 containerd[1587]: time="2025-09-12T06:01:58.797084898Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bf4c55664-d9qhl,Uid:c152de20-23a4-4564-a6ad-1bc57e7a8f64,Namespace:calico-apiserver,Attempt:0,}" Sep 12 06:01:58.802917 containerd[1587]: time="2025-09-12T06:01:58.802699832Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-qvxnp,Uid:2db5b58d-c27a-4a2a-bb15-98610be69135,Namespace:kube-system,Attempt:0,}" Sep 12 06:01:58.809849 containerd[1587]: time="2025-09-12T06:01:58.809803657Z" level=error msg="Failed to destroy network for sandbox \"5667904da146c6322925bc49d5ef14746a94fe89f473777f8d8a823900371dce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.814912 containerd[1587]: time="2025-09-12T06:01:58.814798817Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78c5cc786c-7ts5g,Uid:bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b,Namespace:calico-system,Attempt:0,}" Sep 12 06:01:58.823884 containerd[1587]: time="2025-09-12T06:01:58.823704589Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84b98fd8-d6rmw,Uid:624a0d3b-047c-441f-8ae5-8082f3ac4f71,Namespace:calico-system,Attempt:0,}" Sep 12 06:01:58.830377 containerd[1587]: time="2025-09-12T06:01:58.830193207Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-mms5t,Uid:9c0e49e1-71a2-4844-908b-3dd7ba7b4800,Namespace:calico-system,Attempt:0,}" Sep 12 06:01:58.837767 containerd[1587]: time="2025-09-12T06:01:58.837432566Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-xt455,Uid:9d395707-c6f1-4aac-b0b4-5b583852339d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5667904da146c6322925bc49d5ef14746a94fe89f473777f8d8a823900371dce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.838413 containerd[1587]: time="2025-09-12T06:01:58.838366302Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bf4c55664-xmj8h,Uid:cec02704-586d-4027-acf1-4feef1605e56,Namespace:calico-apiserver,Attempt:0,}" Sep 12 06:01:58.849904 kubelet[2726]: E0912 06:01:58.849534 2726 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5667904da146c6322925bc49d5ef14746a94fe89f473777f8d8a823900371dce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.849904 kubelet[2726]: E0912 06:01:58.849625 2726 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5667904da146c6322925bc49d5ef14746a94fe89f473777f8d8a823900371dce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-xt455" Sep 12 06:01:58.849904 kubelet[2726]: E0912 06:01:58.849649 2726 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5667904da146c6322925bc49d5ef14746a94fe89f473777f8d8a823900371dce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-xt455" Sep 12 06:01:58.851592 kubelet[2726]: E0912 06:01:58.849928 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-xt455_kube-system(9d395707-c6f1-4aac-b0b4-5b583852339d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-xt455_kube-system(9d395707-c6f1-4aac-b0b4-5b583852339d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5667904da146c6322925bc49d5ef14746a94fe89f473777f8d8a823900371dce\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-xt455" podUID="9d395707-c6f1-4aac-b0b4-5b583852339d" Sep 12 06:01:58.905148 containerd[1587]: time="2025-09-12T06:01:58.905048446Z" level=error msg="Failed to destroy network for sandbox \"fdfa6fcb54c4726a499ca452f357b47f66813d3965bfb9e8cba4a297e331c6fb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.907586 containerd[1587]: time="2025-09-12T06:01:58.907556832Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bf4c55664-d9qhl,Uid:c152de20-23a4-4564-a6ad-1bc57e7a8f64,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fdfa6fcb54c4726a499ca452f357b47f66813d3965bfb9e8cba4a297e331c6fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.908000 kubelet[2726]: E0912 06:01:58.907957 2726 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fdfa6fcb54c4726a499ca452f357b47f66813d3965bfb9e8cba4a297e331c6fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.908079 kubelet[2726]: E0912 06:01:58.908024 2726 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fdfa6fcb54c4726a499ca452f357b47f66813d3965bfb9e8cba4a297e331c6fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6bf4c55664-d9qhl" Sep 12 06:01:58.908079 kubelet[2726]: E0912 06:01:58.908049 2726 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"fdfa6fcb54c4726a499ca452f357b47f66813d3965bfb9e8cba4a297e331c6fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6bf4c55664-d9qhl" Sep 12 06:01:58.908184 kubelet[2726]: E0912 06:01:58.908093 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6bf4c55664-d9qhl_calico-apiserver(c152de20-23a4-4564-a6ad-1bc57e7a8f64)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6bf4c55664-d9qhl_calico-apiserver(c152de20-23a4-4564-a6ad-1bc57e7a8f64)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fdfa6fcb54c4726a499ca452f357b47f66813d3965bfb9e8cba4a297e331c6fb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6bf4c55664-d9qhl" podUID="c152de20-23a4-4564-a6ad-1bc57e7a8f64" Sep 12 06:01:58.914688 containerd[1587]: time="2025-09-12T06:01:58.914578091Z" level=error msg="Failed to destroy network for sandbox \"6f6a999dded04d9e725a5268182796d2b38c8dad6e48bdd32bd2734300be0932\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.917038 containerd[1587]: time="2025-09-12T06:01:58.916953977Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-qvxnp,Uid:2db5b58d-c27a-4a2a-bb15-98610be69135,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f6a999dded04d9e725a5268182796d2b38c8dad6e48bdd32bd2734300be0932\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.917283 kubelet[2726]: E0912 06:01:58.917233 2726 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f6a999dded04d9e725a5268182796d2b38c8dad6e48bdd32bd2734300be0932\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.917334 kubelet[2726]: E0912 06:01:58.917311 2726 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f6a999dded04d9e725a5268182796d2b38c8dad6e48bdd32bd2734300be0932\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-qvxnp" Sep 12 06:01:58.917361 kubelet[2726]: E0912 06:01:58.917336 2726 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f6a999dded04d9e725a5268182796d2b38c8dad6e48bdd32bd2734300be0932\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-qvxnp" Sep 12 
06:01:58.917437 kubelet[2726]: E0912 06:01:58.917381 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-qvxnp_kube-system(2db5b58d-c27a-4a2a-bb15-98610be69135)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-qvxnp_kube-system(2db5b58d-c27a-4a2a-bb15-98610be69135)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6f6a999dded04d9e725a5268182796d2b38c8dad6e48bdd32bd2734300be0932\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-qvxnp" podUID="2db5b58d-c27a-4a2a-bb15-98610be69135" Sep 12 06:01:58.921298 containerd[1587]: time="2025-09-12T06:01:58.921209377Z" level=error msg="Failed to destroy network for sandbox \"b190b158dc2dbad07d41315fb5ae605454de31390dd4a1f48293d7fff987ee97\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.923250 containerd[1587]: time="2025-09-12T06:01:58.923198216Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78c5cc786c-7ts5g,Uid:bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b190b158dc2dbad07d41315fb5ae605454de31390dd4a1f48293d7fff987ee97\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.923523 kubelet[2726]: E0912 06:01:58.923474 2726 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b190b158dc2dbad07d41315fb5ae605454de31390dd4a1f48293d7fff987ee97\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.923683 kubelet[2726]: E0912 06:01:58.923541 2726 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b190b158dc2dbad07d41315fb5ae605454de31390dd4a1f48293d7fff987ee97\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-78c5cc786c-7ts5g" Sep 12 06:01:58.923683 kubelet[2726]: E0912 06:01:58.923598 2726 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b190b158dc2dbad07d41315fb5ae605454de31390dd4a1f48293d7fff987ee97\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-78c5cc786c-7ts5g" Sep 12 06:01:58.923683 kubelet[2726]: E0912 06:01:58.923652 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-78c5cc786c-7ts5g_calico-system(bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"whisker-78c5cc786c-7ts5g_calico-system(bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b190b158dc2dbad07d41315fb5ae605454de31390dd4a1f48293d7fff987ee97\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-78c5cc786c-7ts5g" podUID="bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b" Sep 12 06:01:58.927437 containerd[1587]: time="2025-09-12T06:01:58.927300227Z" level=error msg="Failed to destroy network for sandbox \"93f5e566c4874c52efa5a3ad1e89004fed4f5df9e849421acb0580dbe8d0319a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.928699 containerd[1587]: time="2025-09-12T06:01:58.928668980Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84b98fd8-d6rmw,Uid:624a0d3b-047c-441f-8ae5-8082f3ac4f71,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"93f5e566c4874c52efa5a3ad1e89004fed4f5df9e849421acb0580dbe8d0319a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.930632 kubelet[2726]: E0912 06:01:58.930280 2726 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93f5e566c4874c52efa5a3ad1e89004fed4f5df9e849421acb0580dbe8d0319a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.930632 kubelet[2726]: E0912 06:01:58.930376 2726 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93f5e566c4874c52efa5a3ad1e89004fed4f5df9e849421acb0580dbe8d0319a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-84b98fd8-d6rmw" Sep 12 06:01:58.930632 kubelet[2726]: E0912 06:01:58.930404 2726 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93f5e566c4874c52efa5a3ad1e89004fed4f5df9e849421acb0580dbe8d0319a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-84b98fd8-d6rmw" Sep 12 06:01:58.930766 kubelet[2726]: E0912 06:01:58.930445 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-84b98fd8-d6rmw_calico-system(624a0d3b-047c-441f-8ae5-8082f3ac4f71)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-84b98fd8-d6rmw_calico-system(624a0d3b-047c-441f-8ae5-8082f3ac4f71)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"93f5e566c4874c52efa5a3ad1e89004fed4f5df9e849421acb0580dbe8d0319a\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-84b98fd8-d6rmw" podUID="624a0d3b-047c-441f-8ae5-8082f3ac4f71" Sep 12 06:01:58.942788 containerd[1587]: time="2025-09-12T06:01:58.942738509Z" level=error msg="Failed to destroy network for sandbox \"91201388e69de830c87a8579dc8be59608565ca00d812f8f91674477fefe61bd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.947356 containerd[1587]: time="2025-09-12T06:01:58.947312638Z" level=error msg="Failed to destroy network for sandbox \"f184ac52e63a03f764afca55f660a83813b317791a658cff185354c0f3d6ed36\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.950167 containerd[1587]: time="2025-09-12T06:01:58.950096341Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-mms5t,Uid:9c0e49e1-71a2-4844-908b-3dd7ba7b4800,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"91201388e69de830c87a8579dc8be59608565ca00d812f8f91674477fefe61bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.950500 kubelet[2726]: E0912 06:01:58.950426 2726 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"91201388e69de830c87a8579dc8be59608565ca00d812f8f91674477fefe61bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.950500 kubelet[2726]: E0912 06:01:58.950495 2726 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"91201388e69de830c87a8579dc8be59608565ca00d812f8f91674477fefe61bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-mms5t" Sep 12 06:01:58.950634 kubelet[2726]: E0912 06:01:58.950520 2726 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"91201388e69de830c87a8579dc8be59608565ca00d812f8f91674477fefe61bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-mms5t" Sep 12 06:01:58.950634 kubelet[2726]: E0912 06:01:58.950570 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-mms5t_calico-system(9c0e49e1-71a2-4844-908b-3dd7ba7b4800)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-mms5t_calico-system(9c0e49e1-71a2-4844-908b-3dd7ba7b4800)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"91201388e69de830c87a8579dc8be59608565ca00d812f8f91674477fefe61bd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-mms5t" podUID="9c0e49e1-71a2-4844-908b-3dd7ba7b4800" Sep 12 06:01:58.951299 containerd[1587]: time="2025-09-12T06:01:58.951238518Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bf4c55664-xmj8h,Uid:cec02704-586d-4027-acf1-4feef1605e56,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f184ac52e63a03f764afca55f660a83813b317791a658cff185354c0f3d6ed36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.951563 kubelet[2726]: E0912 06:01:58.951516 2726 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f184ac52e63a03f764afca55f660a83813b317791a658cff185354c0f3d6ed36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:58.951625 kubelet[2726]: E0912 06:01:58.951603 2726 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f184ac52e63a03f764afca55f660a83813b317791a658cff185354c0f3d6ed36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6bf4c55664-xmj8h" Sep 12 06:01:58.951659 kubelet[2726]: E0912 06:01:58.951628 2726 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f184ac52e63a03f764afca55f660a83813b317791a658cff185354c0f3d6ed36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6bf4c55664-xmj8h" Sep 12 06:01:58.951710 kubelet[2726]: E0912 06:01:58.951675 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6bf4c55664-xmj8h_calico-apiserver(cec02704-586d-4027-acf1-4feef1605e56)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6bf4c55664-xmj8h_calico-apiserver(cec02704-586d-4027-acf1-4feef1605e56)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f184ac52e63a03f764afca55f660a83813b317791a658cff185354c0f3d6ed36\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6bf4c55664-xmj8h" podUID="cec02704-586d-4027-acf1-4feef1605e56" Sep 12 06:01:59.441530 systemd[1]: Created slice kubepods-besteffort-pod9181799d_61c8_4e39_8795_bc27b5674755.slice - libcontainer container kubepods-besteffort-pod9181799d_61c8_4e39_8795_bc27b5674755.slice. 
Sep 12 06:01:59.444233 containerd[1587]: time="2025-09-12T06:01:59.444169676Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-clj9k,Uid:9181799d-61c8-4e39-8795-bc27b5674755,Namespace:calico-system,Attempt:0,}" Sep 12 06:01:59.834810 containerd[1587]: time="2025-09-12T06:01:59.834758807Z" level=error msg="Failed to destroy network for sandbox \"399b3a3570f11fe6da575018bd87997276bf400e2e7413845b2c3c3a4b285e5c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:59.837404 systemd[1]: run-netns-cni\x2d6fed0fa7\x2da0f2\x2da754\x2df643\x2de60205777b84.mount: Deactivated successfully. Sep 12 06:01:59.963313 containerd[1587]: time="2025-09-12T06:01:59.963245420Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-clj9k,Uid:9181799d-61c8-4e39-8795-bc27b5674755,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"399b3a3570f11fe6da575018bd87997276bf400e2e7413845b2c3c3a4b285e5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:59.963557 kubelet[2726]: E0912 06:01:59.963495 2726 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"399b3a3570f11fe6da575018bd87997276bf400e2e7413845b2c3c3a4b285e5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:01:59.963917 kubelet[2726]: E0912 06:01:59.963589 2726 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"399b3a3570f11fe6da575018bd87997276bf400e2e7413845b2c3c3a4b285e5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-clj9k" Sep 12 06:01:59.963917 kubelet[2726]: E0912 06:01:59.963613 2726 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"399b3a3570f11fe6da575018bd87997276bf400e2e7413845b2c3c3a4b285e5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-clj9k" Sep 12 06:01:59.963917 kubelet[2726]: E0912 06:01:59.963662 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-clj9k_calico-system(9181799d-61c8-4e39-8795-bc27b5674755)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-clj9k_calico-system(9181799d-61c8-4e39-8795-bc27b5674755)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"399b3a3570f11fe6da575018bd87997276bf400e2e7413845b2c3c3a4b285e5c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-clj9k" podUID="9181799d-61c8-4e39-8795-bc27b5674755" 
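Every RunPodSandbox failure in this stretch has the same root cause, spelled out in the error text itself: the Calico CNI plugin resolves its node name by reading /var/lib/calico/nodename, and that file only exists once the calico/node container is running with /var/lib/calico/ mounted. Until then none of the pod-network pods (the coredns replicas, both calico-apiserver pods, whisker, goldmane, the kube-controllers and csi-node-driver) can get a sandbox. A sketch of that lookup, assuming only what the error message states; the function name is illustrative:

package main

import (
	"fmt"
	"os"
	"strings"
)

// nodenameFile is the path named in the CNI errors above; calico/node writes
// it once it is running and has /var/lib/calico/ mounted.
const nodenameFile = "/var/lib/calico/nodename"

// nodeName is an illustrative stand-in for the lookup the plugin performs.
func nodeName() (string, error) {
	b, err := os.ReadFile(nodenameFile)
	if err != nil {
		// This is the failure mode behind every RunPodSandbox error above.
		return "", fmt.Errorf("%w: check that the calico/node container is running and has mounted /var/lib/calico/", err)
	}
	return strings.TrimSpace(string(b)), nil
}

func main() {
	name, err := nodeName()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("node name:", name)
}

The sandbox retries keep failing with this error until calico-node itself comes up; its StartContainer appears a few entries further down.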
Sep 12 06:02:05.511379 systemd[1]: Started sshd@7-10.0.0.132:22-10.0.0.1:53896.service - OpenSSH per-connection server daemon (10.0.0.1:53896). Sep 12 06:02:05.581210 sshd[3835]: Accepted publickey for core from 10.0.0.1 port 53896 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:05.582893 sshd-session[3835]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:05.587528 systemd-logind[1559]: New session 8 of user core. Sep 12 06:02:05.598261 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 12 06:02:05.718213 sshd[3838]: Connection closed by 10.0.0.1 port 53896 Sep 12 06:02:05.718603 sshd-session[3835]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:05.722870 systemd[1]: sshd@7-10.0.0.132:22-10.0.0.1:53896.service: Deactivated successfully. Sep 12 06:02:05.725032 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 06:02:05.725884 systemd-logind[1559]: Session 8 logged out. Waiting for processes to exit. Sep 12 06:02:05.727608 systemd-logind[1559]: Removed session 8. Sep 12 06:02:09.424208 containerd[1587]: time="2025-09-12T06:02:09.424138498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-xt455,Uid:9d395707-c6f1-4aac-b0b4-5b583852339d,Namespace:kube-system,Attempt:0,}" Sep 12 06:02:10.482651 containerd[1587]: time="2025-09-12T06:02:10.482580594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78c5cc786c-7ts5g,Uid:bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b,Namespace:calico-system,Attempt:0,}" Sep 12 06:02:10.492441 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2327258650.mount: Deactivated successfully. Sep 12 06:02:10.661541 containerd[1587]: time="2025-09-12T06:02:10.661461893Z" level=error msg="Failed to destroy network for sandbox \"e39e5ee856c49819f2566a32f56e0cced3f100c7b266f7e36c25bfa7717ff307\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:02:10.663957 systemd[1]: run-netns-cni\x2d7a5358d1\x2defa9\x2da697\x2d2625\x2d2148269d9144.mount: Deactivated successfully. Sep 12 06:02:10.733215 systemd[1]: Started sshd@8-10.0.0.132:22-10.0.0.1:55220.service - OpenSSH per-connection server daemon (10.0.0.1:55220). 
Sep 12 06:02:10.818165 containerd[1587]: time="2025-09-12T06:02:10.818109810Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-xt455,Uid:9d395707-c6f1-4aac-b0b4-5b583852339d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e39e5ee856c49819f2566a32f56e0cced3f100c7b266f7e36c25bfa7717ff307\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:02:10.818649 kubelet[2726]: E0912 06:02:10.818569 2726 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e39e5ee856c49819f2566a32f56e0cced3f100c7b266f7e36c25bfa7717ff307\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:02:10.818649 kubelet[2726]: E0912 06:02:10.818655 2726 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e39e5ee856c49819f2566a32f56e0cced3f100c7b266f7e36c25bfa7717ff307\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-xt455" Sep 12 06:02:10.819220 kubelet[2726]: E0912 06:02:10.818678 2726 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e39e5ee856c49819f2566a32f56e0cced3f100c7b266f7e36c25bfa7717ff307\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-xt455" Sep 12 06:02:10.819220 kubelet[2726]: E0912 06:02:10.818730 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-xt455_kube-system(9d395707-c6f1-4aac-b0b4-5b583852339d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-xt455_kube-system(9d395707-c6f1-4aac-b0b4-5b583852339d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e39e5ee856c49819f2566a32f56e0cced3f100c7b266f7e36c25bfa7717ff307\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-xt455" podUID="9d395707-c6f1-4aac-b0b4-5b583852339d" Sep 12 06:02:10.842157 sshd[3890]: Accepted publickey for core from 10.0.0.1 port 55220 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:10.843867 sshd-session[3890]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:10.848840 systemd-logind[1559]: New session 9 of user core. Sep 12 06:02:10.856238 systemd[1]: Started session-9.scope - Session 9 of User core. 
Sep 12 06:02:10.867121 containerd[1587]: time="2025-09-12T06:02:10.867065172Z" level=error msg="Failed to destroy network for sandbox \"865c129f3b57efda1bcf4f4551cbf474fd8e36dece760da1780188e280d6f2bb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:02:10.868575 containerd[1587]: time="2025-09-12T06:02:10.868519242Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:10.869759 systemd[1]: run-netns-cni\x2d2fb81115\x2d3e95\x2d4bcf\x2d4e2e\x2de953dd98800a.mount: Deactivated successfully. Sep 12 06:02:10.872470 containerd[1587]: time="2025-09-12T06:02:10.872427159Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 12 06:02:10.873317 containerd[1587]: time="2025-09-12T06:02:10.873275932Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78c5cc786c-7ts5g,Uid:bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"865c129f3b57efda1bcf4f4551cbf474fd8e36dece760da1780188e280d6f2bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:02:10.873721 kubelet[2726]: E0912 06:02:10.873645 2726 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"865c129f3b57efda1bcf4f4551cbf474fd8e36dece760da1780188e280d6f2bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 06:02:10.873805 kubelet[2726]: E0912 06:02:10.873750 2726 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"865c129f3b57efda1bcf4f4551cbf474fd8e36dece760da1780188e280d6f2bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-78c5cc786c-7ts5g" Sep 12 06:02:10.873805 kubelet[2726]: E0912 06:02:10.873779 2726 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"865c129f3b57efda1bcf4f4551cbf474fd8e36dece760da1780188e280d6f2bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-78c5cc786c-7ts5g" Sep 12 06:02:10.873861 kubelet[2726]: E0912 06:02:10.873833 2726 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-78c5cc786c-7ts5g_calico-system(bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-78c5cc786c-7ts5g_calico-system(bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"865c129f3b57efda1bcf4f4551cbf474fd8e36dece760da1780188e280d6f2bb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-78c5cc786c-7ts5g" podUID="bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b" Sep 12 06:02:10.874706 containerd[1587]: time="2025-09-12T06:02:10.874658578Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:10.876812 containerd[1587]: time="2025-09-12T06:02:10.876775773Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:10.877237 containerd[1587]: time="2025-09-12T06:02:10.877201372Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 12.347883304s" Sep 12 06:02:10.877285 containerd[1587]: time="2025-09-12T06:02:10.877244653Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 12 06:02:10.886670 containerd[1587]: time="2025-09-12T06:02:10.886633090Z" level=info msg="CreateContainer within sandbox \"364dc3a5b72c204067121974bf6828ff2355797ac440ffc47401e64a5c4946b1\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 06:02:10.898439 containerd[1587]: time="2025-09-12T06:02:10.898399630Z" level=info msg="Container 65734a1f8b995aec253a64a77afcd58977e3f943d892cbcbec32b70a91e658d5: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:02:10.922130 containerd[1587]: time="2025-09-12T06:02:10.922065432Z" level=info msg="CreateContainer within sandbox \"364dc3a5b72c204067121974bf6828ff2355797ac440ffc47401e64a5c4946b1\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"65734a1f8b995aec253a64a77afcd58977e3f943d892cbcbec32b70a91e658d5\"" Sep 12 06:02:10.922843 containerd[1587]: time="2025-09-12T06:02:10.922797315Z" level=info msg="StartContainer for \"65734a1f8b995aec253a64a77afcd58977e3f943d892cbcbec32b70a91e658d5\"" Sep 12 06:02:10.924647 containerd[1587]: time="2025-09-12T06:02:10.924608406Z" level=info msg="connecting to shim 65734a1f8b995aec253a64a77afcd58977e3f943d892cbcbec32b70a91e658d5" address="unix:///run/containerd/s/1e4a4d983a8f95987eaa771e9c86dd4677a09ca626e8dbece8fc999a3ae0d910" protocol=ttrpc version=3 Sep 12 06:02:10.954229 systemd[1]: Started cri-containerd-65734a1f8b995aec253a64a77afcd58977e3f943d892cbcbec32b70a91e658d5.scope - libcontainer container 65734a1f8b995aec253a64a77afcd58977e3f943d892cbcbec32b70a91e658d5. Sep 12 06:02:11.201539 containerd[1587]: time="2025-09-12T06:02:11.201483557Z" level=info msg="StartContainer for \"65734a1f8b995aec253a64a77afcd58977e3f943d892cbcbec32b70a91e658d5\" returns successfully" Sep 12 06:02:11.212714 sshd[3923]: Connection closed by 10.0.0.1 port 55220 Sep 12 06:02:11.214322 sshd-session[3890]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:11.219287 systemd-logind[1559]: Session 9 logged out. Waiting for processes to exit. Sep 12 06:02:11.219632 systemd[1]: sshd@8-10.0.0.132:22-10.0.0.1:55220.service: Deactivated successfully. 
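Note: the pull-and-start sequence logged above (PullImage completing after ~12.3s, CreateContainer within the node sandbox, StartContainer, and "connecting to shim ... protocol=ttrpc version=3") is containerd launching calico-node. A rough sketch of the same sequence through containerd's 1.x Go client in the k8s.io namespace used by the CRI plugin; the container and snapshot IDs are demo values and error handling is minimal:

package main

import (
	"context"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/cio"
	"github.com/containerd/containerd/namespaces"
	"github.com/containerd/containerd/oci"
)

func main() {
	// The CRI plugin keeps Kubernetes containers in the k8s.io namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Pull the image named in the log (repo digest omitted here).
	img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/node:v3.30.3", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}

	// Create a container, then a task; starting the task is what spawns the
	// per-container shim behind the "connecting to shim" line.
	c, err := client.NewContainer(ctx, "calico-node-demo",
		containerd.WithImage(img),
		containerd.WithNewSnapshot("calico-node-demo-snapshot", img),
		containerd.WithNewSpec(oci.WithImageConfig(img)),
	)
	if err != nil {
		log.Fatal(err)
	}
	defer c.Delete(ctx, containerd.WithSnapshotCleanup)

	task, err := c.NewTask(ctx, cio.NewCreator(cio.WithStdio))
	if err != nil {
		log.Fatal(err)
	}
	defer task.Delete(ctx)

	if err := task.Start(ctx); err != nil {
		log.Fatal(err)
	}
	log.Printf("started %s as pid %d", c.ID(), task.Pid())
}

The unix:///run/containerd/s/... address in the log is the shim's ttrpc socket; clients like the one above never dial it directly, containerd does that on their behalf.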
Sep 12 06:02:11.222136 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 06:02:11.223630 systemd-logind[1559]: Removed session 9. Sep 12 06:02:11.224332 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 06:02:11.224876 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 12 06:02:11.423870 containerd[1587]: time="2025-09-12T06:02:11.423803120Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-qvxnp,Uid:2db5b58d-c27a-4a2a-bb15-98610be69135,Namespace:kube-system,Attempt:0,}" Sep 12 06:02:11.576197 kubelet[2726]: I0912 06:02:11.575673 2726 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-mnjns" podStartSLOduration=1.359716814 podStartE2EDuration="30.575647309s" podCreationTimestamp="2025-09-12 06:01:41 +0000 UTC" firstStartedPulling="2025-09-12 06:01:41.6625507 +0000 UTC m=+19.321323193" lastFinishedPulling="2025-09-12 06:02:10.878481195 +0000 UTC m=+48.537253688" observedRunningTime="2025-09-12 06:02:11.574645258 +0000 UTC m=+49.233417761" watchObservedRunningTime="2025-09-12 06:02:11.575647309 +0000 UTC m=+49.234419802" Sep 12 06:02:11.578355 kubelet[2726]: I0912 06:02:11.578298 2726 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b-whisker-ca-bundle\") pod \"bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b\" (UID: \"bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b\") " Sep 12 06:02:11.578355 kubelet[2726]: I0912 06:02:11.578339 2726 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx84w\" (UniqueName: \"kubernetes.io/projected/bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b-kube-api-access-vx84w\") pod \"bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b\" (UID: \"bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b\") " Sep 12 06:02:11.578866 kubelet[2726]: I0912 06:02:11.578379 2726 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b-whisker-backend-key-pair\") pod \"bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b\" (UID: \"bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b\") " Sep 12 06:02:11.580063 kubelet[2726]: I0912 06:02:11.580024 2726 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b" (UID: "bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 12 06:02:11.585534 kubelet[2726]: I0912 06:02:11.585502 2726 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b" (UID: "bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 06:02:11.585598 systemd[1]: var-lib-kubelet-pods-bfd8e5e4\x2d5df3\x2d4c10\x2db6ba\x2d3d2571948e7b-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
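Note: the pod_startup_latency_tracker line for calico-node-mnjns encodes a simple relationship: podStartSLOduration is the end-to-end startup time minus the time spent pulling images (lastFinishedPulling - firstStartedPulling). Recomputing it from the timestamps in that line as a sanity check; this is a sketch with values copied from the log, not the kubelet's own bookkeeping:

package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values from the "Observed pod startup duration" line above.
	firstStartedPulling := mustParse("2025-09-12 06:01:41.6625507 +0000 UTC")
	lastFinishedPulling := mustParse("2025-09-12 06:02:10.878481195 +0000 UTC")
	podStartE2E, _ := time.ParseDuration("30.575647309s")

	imagePull := lastFinishedPulling.Sub(firstStartedPulling)
	slo := podStartE2E - imagePull

	fmt.Println("image pull time:    ", imagePull) // 29.215930495s
	fmt.Println("podStartSLOduration:", slo)       // 1.359716814s, matching the log
}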
Sep 12 06:02:11.591084 kubelet[2726]: I0912 06:02:11.591024 2726 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b-kube-api-access-vx84w" (OuterVolumeSpecName: "kube-api-access-vx84w") pod "bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b" (UID: "bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b"). InnerVolumeSpecName "kube-api-access-vx84w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 06:02:11.591282 systemd[1]: var-lib-kubelet-pods-bfd8e5e4\x2d5df3\x2d4c10\x2db6ba\x2d3d2571948e7b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dvx84w.mount: Deactivated successfully. Sep 12 06:02:11.600334 systemd-networkd[1491]: cali8bb4b5d711b: Link UP Sep 12 06:02:11.600686 systemd-networkd[1491]: cali8bb4b5d711b: Gained carrier Sep 12 06:02:11.617591 containerd[1587]: 2025-09-12 06:02:11.454 [INFO][3992] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 06:02:11.617591 containerd[1587]: 2025-09-12 06:02:11.474 [INFO][3992] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--qvxnp-eth0 coredns-668d6bf9bc- kube-system 2db5b58d-c27a-4a2a-bb15-98610be69135 810 0 2025-09-12 06:01:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-qvxnp eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8bb4b5d711b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-qvxnp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qvxnp-" Sep 12 06:02:11.617591 containerd[1587]: 2025-09-12 06:02:11.474 [INFO][3992] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-qvxnp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qvxnp-eth0" Sep 12 06:02:11.617591 containerd[1587]: 2025-09-12 06:02:11.540 [INFO][4007] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" HandleID="k8s-pod-network.c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" Workload="localhost-k8s-coredns--668d6bf9bc--qvxnp-eth0" Sep 12 06:02:11.618028 containerd[1587]: 2025-09-12 06:02:11.541 [INFO][4007] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" HandleID="k8s-pod-network.c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" Workload="localhost-k8s-coredns--668d6bf9bc--qvxnp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000b8890), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-qvxnp", "timestamp":"2025-09-12 06:02:11.540921748 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 06:02:11.618028 containerd[1587]: 2025-09-12 06:02:11.541 [INFO][4007] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 06:02:11.618028 containerd[1587]: 2025-09-12 06:02:11.541 [INFO][4007] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 06:02:11.618028 containerd[1587]: 2025-09-12 06:02:11.541 [INFO][4007] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 06:02:11.618028 containerd[1587]: 2025-09-12 06:02:11.550 [INFO][4007] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" host="localhost" Sep 12 06:02:11.618028 containerd[1587]: 2025-09-12 06:02:11.556 [INFO][4007] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 06:02:11.618028 containerd[1587]: 2025-09-12 06:02:11.561 [INFO][4007] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 06:02:11.618028 containerd[1587]: 2025-09-12 06:02:11.564 [INFO][4007] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:11.618028 containerd[1587]: 2025-09-12 06:02:11.567 [INFO][4007] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:11.618028 containerd[1587]: 2025-09-12 06:02:11.568 [INFO][4007] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" host="localhost" Sep 12 06:02:11.618292 containerd[1587]: 2025-09-12 06:02:11.569 [INFO][4007] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c Sep 12 06:02:11.618292 containerd[1587]: 2025-09-12 06:02:11.576 [INFO][4007] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" host="localhost" Sep 12 06:02:11.618292 containerd[1587]: 2025-09-12 06:02:11.583 [INFO][4007] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" host="localhost" Sep 12 06:02:11.618292 containerd[1587]: 2025-09-12 06:02:11.583 [INFO][4007] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" host="localhost" Sep 12 06:02:11.618292 containerd[1587]: 2025-09-12 06:02:11.583 [INFO][4007] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
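Note: the ipam/ipam.go lines above are Calico's per-host block allocation at work: the host already holds an affinity for 192.168.88.128/26, the block is loaded, and the lowest free slot is claimed for the new handle, which is why coredns-668d6bf9bc-qvxnp ends up with 192.168.88.129. A deliberately simplified Go sketch of that "assign from an affine block" step; Calico's real allocation record also tracks handles, attributes and affinities in the datastore:

package main

import (
	"fmt"
	"net/netip"
)

// block is a toy stand-in for a Calico IPAM allocation block: a /26 whose
// addresses are claimed by ordinal (0..63) on behalf of a handle ID.
type block struct {
	cidr      netip.Prefix
	allocated map[int]string // ordinal -> handle
}

// assign claims the lowest free ordinal, mirroring the
// "Attempting to assign 1 addresses from block" step in the log.
func (b *block) assign(handle string) (netip.Addr, bool) {
	size := 1 << (32 - b.cidr.Bits()) // 64 addresses in a /26
	addr := b.cidr.Addr()
	for ord := 0; ord < size; ord++ {
		if _, taken := b.allocated[ord]; !taken {
			b.allocated[ord] = handle
			return addr, true
		}
		addr = addr.Next()
	}
	return netip.Addr{}, false // block exhausted; Calico would look for another block
}

func main() {
	b := &block{cidr: netip.MustParsePrefix("192.168.88.128/26"), allocated: map[int]string{}}
	// Assume ordinal 0 (.128) is already taken on this host (for example by the
	// node's own tunnel address), so the first pod receives .129 as in the log.
	b.allocated[0] = "node"
	ip, _ := b.assign("k8s-pod-network.c37b2bcad4e4...") // handle prefix as logged
	fmt.Println("assigned", ip)                          // 192.168.88.129
}

The "host-wide IPAM lock" bracketing each allocation serializes these assignments per node, which is why the pods that follow receive .130, .131 and .132 in strict sequence.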
Sep 12 06:02:11.618292 containerd[1587]: 2025-09-12 06:02:11.583 [INFO][4007] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" HandleID="k8s-pod-network.c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" Workload="localhost-k8s-coredns--668d6bf9bc--qvxnp-eth0" Sep 12 06:02:11.618412 containerd[1587]: 2025-09-12 06:02:11.590 [INFO][3992] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-qvxnp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qvxnp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--qvxnp-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"2db5b58d-c27a-4a2a-bb15-98610be69135", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 1, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-qvxnp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8bb4b5d711b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:11.618500 containerd[1587]: 2025-09-12 06:02:11.590 [INFO][3992] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-qvxnp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qvxnp-eth0" Sep 12 06:02:11.618500 containerd[1587]: 2025-09-12 06:02:11.592 [INFO][3992] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8bb4b5d711b ContainerID="c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-qvxnp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qvxnp-eth0" Sep 12 06:02:11.618500 containerd[1587]: 2025-09-12 06:02:11.601 [INFO][3992] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-qvxnp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qvxnp-eth0" Sep 12 06:02:11.618572 
containerd[1587]: 2025-09-12 06:02:11.603 [INFO][3992] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-qvxnp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qvxnp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--qvxnp-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"2db5b58d-c27a-4a2a-bb15-98610be69135", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 1, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c", Pod:"coredns-668d6bf9bc-qvxnp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8bb4b5d711b", MAC:"02:85:c0:1b:11:59", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:11.618572 containerd[1587]: 2025-09-12 06:02:11.613 [INFO][3992] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-qvxnp" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--qvxnp-eth0" Sep 12 06:02:11.680165 kubelet[2726]: I0912 06:02:11.679319 2726 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 12 06:02:11.680165 kubelet[2726]: I0912 06:02:11.679349 2726 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 12 06:02:11.680165 kubelet[2726]: I0912 06:02:11.679357 2726 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vx84w\" (UniqueName: \"kubernetes.io/projected/bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b-kube-api-access-vx84w\") on node \"localhost\" DevicePath \"\"" Sep 12 06:02:11.741230 containerd[1587]: time="2025-09-12T06:02:11.741171189Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"65734a1f8b995aec253a64a77afcd58977e3f943d892cbcbec32b70a91e658d5\" id:\"04a14bc1a35ef4c212bbfae4496e87341b3c05e2c4d5a44df0e74845d160f7bb\" pid:4048 exit_status:1 exited_at:{seconds:1757656931 nanos:740580610}" Sep 12 06:02:11.754945 containerd[1587]: time="2025-09-12T06:02:11.754427466Z" level=info msg="connecting to shim c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c" address="unix:///run/containerd/s/87845996e0dce9f0ff2a43879867d5c22e4a9005ecf14da063f6f729643d46e8" namespace=k8s.io protocol=ttrpc version=3 Sep 12 06:02:11.782244 systemd[1]: Started cri-containerd-c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c.scope - libcontainer container c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c. Sep 12 06:02:11.795657 systemd-resolved[1415]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 06:02:11.878645 containerd[1587]: time="2025-09-12T06:02:11.878595297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-qvxnp,Uid:2db5b58d-c27a-4a2a-bb15-98610be69135,Namespace:kube-system,Attempt:0,} returns sandbox id \"c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c\"" Sep 12 06:02:11.885520 containerd[1587]: time="2025-09-12T06:02:11.885469071Z" level=info msg="CreateContainer within sandbox \"c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 06:02:11.902131 containerd[1587]: time="2025-09-12T06:02:11.902040141Z" level=info msg="Container 5b246d4e412fb9b2930eb7042c49b7511ea74ebbb37c007fcd94c07b69c4964a: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:02:11.909513 containerd[1587]: time="2025-09-12T06:02:11.909453317Z" level=info msg="CreateContainer within sandbox \"c37b2bcad4e40cd3ae08761d6a57b181804e4db9d8bf4e320cda2c9a96c43f5c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5b246d4e412fb9b2930eb7042c49b7511ea74ebbb37c007fcd94c07b69c4964a\"" Sep 12 06:02:11.910035 containerd[1587]: time="2025-09-12T06:02:11.910009502Z" level=info msg="StartContainer for \"5b246d4e412fb9b2930eb7042c49b7511ea74ebbb37c007fcd94c07b69c4964a\"" Sep 12 06:02:11.910924 containerd[1587]: time="2025-09-12T06:02:11.910888441Z" level=info msg="connecting to shim 5b246d4e412fb9b2930eb7042c49b7511ea74ebbb37c007fcd94c07b69c4964a" address="unix:///run/containerd/s/87845996e0dce9f0ff2a43879867d5c22e4a9005ecf14da063f6f729643d46e8" protocol=ttrpc version=3 Sep 12 06:02:11.937258 systemd[1]: Started cri-containerd-5b246d4e412fb9b2930eb7042c49b7511ea74ebbb37c007fcd94c07b69c4964a.scope - libcontainer container 5b246d4e412fb9b2930eb7042c49b7511ea74ebbb37c007fcd94c07b69c4964a. Sep 12 06:02:11.979958 containerd[1587]: time="2025-09-12T06:02:11.979920888Z" level=info msg="StartContainer for \"5b246d4e412fb9b2930eb7042c49b7511ea74ebbb37c007fcd94c07b69c4964a\" returns successfully" Sep 12 06:02:12.424320 containerd[1587]: time="2025-09-12T06:02:12.424269527Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bf4c55664-d9qhl,Uid:c152de20-23a4-4564-a6ad-1bc57e7a8f64,Namespace:calico-apiserver,Attempt:0,}" Sep 12 06:02:12.431163 systemd[1]: Removed slice kubepods-besteffort-podbfd8e5e4_5df3_4c10_b6ba_3d2571948e7b.slice - libcontainer container kubepods-besteffort-podbfd8e5e4_5df3_4c10_b6ba_3d2571948e7b.slice. 
Sep 12 06:02:12.516843 systemd-networkd[1491]: cali0e190e0b0ed: Link UP Sep 12 06:02:12.517252 systemd-networkd[1491]: cali0e190e0b0ed: Gained carrier Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.446 [INFO][4139] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.455 [INFO][4139] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6bf4c55664--d9qhl-eth0 calico-apiserver-6bf4c55664- calico-apiserver c152de20-23a4-4564-a6ad-1bc57e7a8f64 808 0 2025-09-12 06:01:38 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6bf4c55664 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6bf4c55664-d9qhl eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali0e190e0b0ed [] [] }} ContainerID="67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-d9qhl" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--d9qhl-" Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.455 [INFO][4139] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-d9qhl" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--d9qhl-eth0" Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.487 [INFO][4153] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" HandleID="k8s-pod-network.67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" Workload="localhost-k8s-calico--apiserver--6bf4c55664--d9qhl-eth0" Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.487 [INFO][4153] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" HandleID="k8s-pod-network.67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" Workload="localhost-k8s-calico--apiserver--6bf4c55664--d9qhl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002de0b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6bf4c55664-d9qhl", "timestamp":"2025-09-12 06:02:12.48713103 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.487 [INFO][4153] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.487 [INFO][4153] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.487 [INFO][4153] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.493 [INFO][4153] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" host="localhost" Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.496 [INFO][4153] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.499 [INFO][4153] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.501 [INFO][4153] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.502 [INFO][4153] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.502 [INFO][4153] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" host="localhost" Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.503 [INFO][4153] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31 Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.506 [INFO][4153] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" host="localhost" Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.512 [INFO][4153] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" host="localhost" Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.512 [INFO][4153] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" host="localhost" Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.512 [INFO][4153] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 06:02:12.530652 containerd[1587]: 2025-09-12 06:02:12.512 [INFO][4153] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" HandleID="k8s-pod-network.67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" Workload="localhost-k8s-calico--apiserver--6bf4c55664--d9qhl-eth0" Sep 12 06:02:12.531273 containerd[1587]: 2025-09-12 06:02:12.515 [INFO][4139] cni-plugin/k8s.go 418: Populated endpoint ContainerID="67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-d9qhl" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--d9qhl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6bf4c55664--d9qhl-eth0", GenerateName:"calico-apiserver-6bf4c55664-", Namespace:"calico-apiserver", SelfLink:"", UID:"c152de20-23a4-4564-a6ad-1bc57e7a8f64", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 1, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bf4c55664", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6bf4c55664-d9qhl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0e190e0b0ed", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:12.531273 containerd[1587]: 2025-09-12 06:02:12.515 [INFO][4139] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-d9qhl" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--d9qhl-eth0" Sep 12 06:02:12.531273 containerd[1587]: 2025-09-12 06:02:12.515 [INFO][4139] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0e190e0b0ed ContainerID="67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-d9qhl" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--d9qhl-eth0" Sep 12 06:02:12.531273 containerd[1587]: 2025-09-12 06:02:12.517 [INFO][4139] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-d9qhl" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--d9qhl-eth0" Sep 12 06:02:12.531273 containerd[1587]: 2025-09-12 06:02:12.517 [INFO][4139] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-d9qhl" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--d9qhl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6bf4c55664--d9qhl-eth0", GenerateName:"calico-apiserver-6bf4c55664-", Namespace:"calico-apiserver", SelfLink:"", UID:"c152de20-23a4-4564-a6ad-1bc57e7a8f64", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 1, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bf4c55664", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31", Pod:"calico-apiserver-6bf4c55664-d9qhl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0e190e0b0ed", MAC:"2a:e3:f2:52:28:fc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:12.531273 containerd[1587]: 2025-09-12 06:02:12.527 [INFO][4139] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-d9qhl" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--d9qhl-eth0" Sep 12 06:02:12.574734 containerd[1587]: time="2025-09-12T06:02:12.574682610Z" level=info msg="connecting to shim 67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31" address="unix:///run/containerd/s/3e5622038274c3d84d53959b7ee68e39c5c558c0f0d310e37cb0d409bcd71ffc" namespace=k8s.io protocol=ttrpc version=3 Sep 12 06:02:12.582506 kubelet[2726]: I0912 06:02:12.582431 2726 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-qvxnp" podStartSLOduration=44.58241177 podStartE2EDuration="44.58241177s" podCreationTimestamp="2025-09-12 06:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 06:02:12.582215721 +0000 UTC m=+50.240988214" watchObservedRunningTime="2025-09-12 06:02:12.58241177 +0000 UTC m=+50.241184273" Sep 12 06:02:12.665381 systemd[1]: Started cri-containerd-67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31.scope - libcontainer container 67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31. Sep 12 06:02:12.687225 systemd[1]: Created slice kubepods-besteffort-pode8c81024_dc97_437f_9978_20584d2c74bf.slice - libcontainer container kubepods-besteffort-pode8c81024_dc97_437f_9978_20584d2c74bf.slice. 
Sep 12 06:02:12.688651 kubelet[2726]: I0912 06:02:12.688404 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8c81024-dc97-437f-9978-20584d2c74bf-whisker-ca-bundle\") pod \"whisker-5bcfc55986-lv299\" (UID: \"e8c81024-dc97-437f-9978-20584d2c74bf\") " pod="calico-system/whisker-5bcfc55986-lv299" Sep 12 06:02:12.688651 kubelet[2726]: I0912 06:02:12.688454 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e8c81024-dc97-437f-9978-20584d2c74bf-whisker-backend-key-pair\") pod \"whisker-5bcfc55986-lv299\" (UID: \"e8c81024-dc97-437f-9978-20584d2c74bf\") " pod="calico-system/whisker-5bcfc55986-lv299" Sep 12 06:02:12.688651 kubelet[2726]: I0912 06:02:12.688471 2726 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqcbj\" (UniqueName: \"kubernetes.io/projected/e8c81024-dc97-437f-9978-20584d2c74bf-kube-api-access-wqcbj\") pod \"whisker-5bcfc55986-lv299\" (UID: \"e8c81024-dc97-437f-9978-20584d2c74bf\") " pod="calico-system/whisker-5bcfc55986-lv299" Sep 12 06:02:12.729295 systemd-networkd[1491]: cali8bb4b5d711b: Gained IPv6LL Sep 12 06:02:12.730328 systemd-resolved[1415]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 06:02:12.796868 containerd[1587]: time="2025-09-12T06:02:12.796806881Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bf4c55664-d9qhl,Uid:c152de20-23a4-4564-a6ad-1bc57e7a8f64,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31\"" Sep 12 06:02:12.803678 containerd[1587]: time="2025-09-12T06:02:12.803621162Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 06:02:12.897862 containerd[1587]: time="2025-09-12T06:02:12.897822275Z" level=info msg="TaskExit event in podsandbox handler container_id:\"65734a1f8b995aec253a64a77afcd58977e3f943d892cbcbec32b70a91e658d5\" id:\"23654252727cf75b85b45cdfe246a68ad01a5ac4615dd15eeca0462159316f52\" pid:4273 exit_status:1 exited_at:{seconds:1757656932 nanos:896801068}" Sep 12 06:02:12.996905 containerd[1587]: time="2025-09-12T06:02:12.996619066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5bcfc55986-lv299,Uid:e8c81024-dc97-437f-9978-20584d2c74bf,Namespace:calico-system,Attempt:0,}" Sep 12 06:02:13.098336 systemd-networkd[1491]: cali21884a234e5: Link UP Sep 12 06:02:13.098734 systemd-networkd[1491]: cali21884a234e5: Gained carrier Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.037 [INFO][4382] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--5bcfc55986--lv299-eth0 whisker-5bcfc55986- calico-system e8c81024-dc97-437f-9978-20584d2c74bf 957 0 2025-09-12 06:02:12 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5bcfc55986 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-5bcfc55986-lv299 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali21884a234e5 [] [] }} ContainerID="9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" Namespace="calico-system" Pod="whisker-5bcfc55986-lv299" 
WorkloadEndpoint="localhost-k8s-whisker--5bcfc55986--lv299-" Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.037 [INFO][4382] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" Namespace="calico-system" Pod="whisker-5bcfc55986-lv299" WorkloadEndpoint="localhost-k8s-whisker--5bcfc55986--lv299-eth0" Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.065 [INFO][4390] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" HandleID="k8s-pod-network.9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" Workload="localhost-k8s-whisker--5bcfc55986--lv299-eth0" Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.065 [INFO][4390] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" HandleID="k8s-pod-network.9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" Workload="localhost-k8s-whisker--5bcfc55986--lv299-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e7b0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-5bcfc55986-lv299", "timestamp":"2025-09-12 06:02:13.065276311 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.065 [INFO][4390] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.065 [INFO][4390] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.065 [INFO][4390] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.072 [INFO][4390] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" host="localhost" Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.075 [INFO][4390] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.079 [INFO][4390] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.080 [INFO][4390] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.082 [INFO][4390] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.082 [INFO][4390] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" host="localhost" Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.083 [INFO][4390] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06 Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.086 [INFO][4390] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" host="localhost" Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.093 [INFO][4390] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" host="localhost" Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.093 [INFO][4390] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" host="localhost" Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.093 [INFO][4390] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 06:02:13.116341 containerd[1587]: 2025-09-12 06:02:13.093 [INFO][4390] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" HandleID="k8s-pod-network.9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" Workload="localhost-k8s-whisker--5bcfc55986--lv299-eth0" Sep 12 06:02:13.117157 containerd[1587]: 2025-09-12 06:02:13.096 [INFO][4382] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" Namespace="calico-system" Pod="whisker-5bcfc55986-lv299" WorkloadEndpoint="localhost-k8s-whisker--5bcfc55986--lv299-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5bcfc55986--lv299-eth0", GenerateName:"whisker-5bcfc55986-", Namespace:"calico-system", SelfLink:"", UID:"e8c81024-dc97-437f-9978-20584d2c74bf", ResourceVersion:"957", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 2, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5bcfc55986", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-5bcfc55986-lv299", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali21884a234e5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:13.117157 containerd[1587]: 2025-09-12 06:02:13.096 [INFO][4382] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" Namespace="calico-system" Pod="whisker-5bcfc55986-lv299" WorkloadEndpoint="localhost-k8s-whisker--5bcfc55986--lv299-eth0" Sep 12 06:02:13.117157 containerd[1587]: 2025-09-12 06:02:13.096 [INFO][4382] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali21884a234e5 ContainerID="9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" Namespace="calico-system" Pod="whisker-5bcfc55986-lv299" WorkloadEndpoint="localhost-k8s-whisker--5bcfc55986--lv299-eth0" Sep 12 06:02:13.117157 containerd[1587]: 2025-09-12 06:02:13.100 [INFO][4382] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" Namespace="calico-system" Pod="whisker-5bcfc55986-lv299" WorkloadEndpoint="localhost-k8s-whisker--5bcfc55986--lv299-eth0" Sep 12 06:02:13.117157 containerd[1587]: 2025-09-12 06:02:13.101 [INFO][4382] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" Namespace="calico-system" Pod="whisker-5bcfc55986-lv299" WorkloadEndpoint="localhost-k8s-whisker--5bcfc55986--lv299-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5bcfc55986--lv299-eth0", GenerateName:"whisker-5bcfc55986-", Namespace:"calico-system", SelfLink:"", UID:"e8c81024-dc97-437f-9978-20584d2c74bf", ResourceVersion:"957", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 2, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5bcfc55986", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06", Pod:"whisker-5bcfc55986-lv299", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali21884a234e5", MAC:"56:10:ad:d3:95:0a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:13.117157 containerd[1587]: 2025-09-12 06:02:13.111 [INFO][4382] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" Namespace="calico-system" Pod="whisker-5bcfc55986-lv299" WorkloadEndpoint="localhost-k8s-whisker--5bcfc55986--lv299-eth0" Sep 12 06:02:13.138928 containerd[1587]: time="2025-09-12T06:02:13.138868259Z" level=info msg="connecting to shim 9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06" address="unix:///run/containerd/s/4fd63a75c492f92923c3f2b1a101448235575efdfe32d8bfce140f2f898ebdd5" namespace=k8s.io protocol=ttrpc version=3 Sep 12 06:02:13.162268 systemd[1]: Started cri-containerd-9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06.scope - libcontainer container 9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06. 
Sep 12 06:02:13.176362 systemd-resolved[1415]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 06:02:13.212149 containerd[1587]: time="2025-09-12T06:02:13.212096123Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5bcfc55986-lv299,Uid:e8c81024-dc97-437f-9978-20584d2c74bf,Namespace:calico-system,Attempt:0,} returns sandbox id \"9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06\"" Sep 12 06:02:13.251004 systemd-networkd[1491]: vxlan.calico: Link UP Sep 12 06:02:13.251043 systemd-networkd[1491]: vxlan.calico: Gained carrier Sep 12 06:02:13.424115 containerd[1587]: time="2025-09-12T06:02:13.424062520Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84b98fd8-d6rmw,Uid:624a0d3b-047c-441f-8ae5-8082f3ac4f71,Namespace:calico-system,Attempt:0,}" Sep 12 06:02:13.424260 containerd[1587]: time="2025-09-12T06:02:13.424063662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bf4c55664-xmj8h,Uid:cec02704-586d-4027-acf1-4feef1605e56,Namespace:calico-apiserver,Attempt:0,}" Sep 12 06:02:13.553801 systemd-networkd[1491]: cali1df846089bd: Link UP Sep 12 06:02:13.555586 systemd-networkd[1491]: cali1df846089bd: Gained carrier Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.461 [INFO][4492] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--84b98fd8--d6rmw-eth0 calico-kube-controllers-84b98fd8- calico-system 624a0d3b-047c-441f-8ae5-8082f3ac4f71 812 0 2025-09-12 06:01:41 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:84b98fd8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-84b98fd8-d6rmw eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali1df846089bd [] [] }} ContainerID="e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" Namespace="calico-system" Pod="calico-kube-controllers-84b98fd8-d6rmw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--84b98fd8--d6rmw-" Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.461 [INFO][4492] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" Namespace="calico-system" Pod="calico-kube-controllers-84b98fd8-d6rmw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--84b98fd8--d6rmw-eth0" Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.502 [INFO][4519] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" HandleID="k8s-pod-network.e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" Workload="localhost-k8s-calico--kube--controllers--84b98fd8--d6rmw-eth0" Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.503 [INFO][4519] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" HandleID="k8s-pod-network.e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" Workload="localhost-k8s-calico--kube--controllers--84b98fd8--d6rmw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001386a0), 
Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-84b98fd8-d6rmw", "timestamp":"2025-09-12 06:02:13.502478643 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.503 [INFO][4519] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.503 [INFO][4519] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.503 [INFO][4519] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.510 [INFO][4519] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" host="localhost" Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.515 [INFO][4519] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.519 [INFO][4519] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.521 [INFO][4519] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.523 [INFO][4519] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.523 [INFO][4519] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" host="localhost" Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.525 [INFO][4519] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40 Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.528 [INFO][4519] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" host="localhost" Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.536 [INFO][4519] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" host="localhost" Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.536 [INFO][4519] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" host="localhost" Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.536 [INFO][4519] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
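Note: the vxlan.calico interface that came up just before these IPAM lines is Calico's VXLAN overlay device; traffic for the cali* workload interfaces is encapsulated over it when it leaves the node. Purely for illustration, a netlink sketch of creating a similarly shaped device with github.com/vishvananda/netlink; the VNI 4096 and UDP port 4789 are Calico's documented defaults and are assumptions here, not values read from this log, and the device name is a demo name so it cannot clash with the real interface:

package main

import (
	"log"

	"github.com/vishvananda/netlink"
)

func main() {
	vxlan := &netlink.Vxlan{
		LinkAttrs: netlink.LinkAttrs{Name: "vxlan.demo"},
		VxlanId:   4096,  // Calico's default VNI (assumption, not from this log)
		Port:      4789,  // standard VXLAN UDP port
		Learning:  false, // Calico programs FDB/ARP entries itself
	}
	if err := netlink.LinkAdd(vxlan); err != nil { // needs CAP_NET_ADMIN
		log.Fatal(err)
	}
	if err := netlink.LinkSetUp(vxlan); err != nil { // the "Link UP" step above
		log.Fatal(err)
	}
	log.Println("created", vxlan.Attrs().Name)
}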
Sep 12 06:02:13.575653 containerd[1587]: 2025-09-12 06:02:13.536 [INFO][4519] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" HandleID="k8s-pod-network.e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" Workload="localhost-k8s-calico--kube--controllers--84b98fd8--d6rmw-eth0" Sep 12 06:02:13.576240 containerd[1587]: 2025-09-12 06:02:13.542 [INFO][4492] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" Namespace="calico-system" Pod="calico-kube-controllers-84b98fd8-d6rmw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--84b98fd8--d6rmw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--84b98fd8--d6rmw-eth0", GenerateName:"calico-kube-controllers-84b98fd8-", Namespace:"calico-system", SelfLink:"", UID:"624a0d3b-047c-441f-8ae5-8082f3ac4f71", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 1, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"84b98fd8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-84b98fd8-d6rmw", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1df846089bd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:13.576240 containerd[1587]: 2025-09-12 06:02:13.543 [INFO][4492] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" Namespace="calico-system" Pod="calico-kube-controllers-84b98fd8-d6rmw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--84b98fd8--d6rmw-eth0" Sep 12 06:02:13.576240 containerd[1587]: 2025-09-12 06:02:13.543 [INFO][4492] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1df846089bd ContainerID="e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" Namespace="calico-system" Pod="calico-kube-controllers-84b98fd8-d6rmw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--84b98fd8--d6rmw-eth0" Sep 12 06:02:13.576240 containerd[1587]: 2025-09-12 06:02:13.560 [INFO][4492] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" Namespace="calico-system" Pod="calico-kube-controllers-84b98fd8-d6rmw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--84b98fd8--d6rmw-eth0" Sep 12 06:02:13.576240 containerd[1587]: 2025-09-12 06:02:13.561 [INFO][4492] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" Namespace="calico-system" Pod="calico-kube-controllers-84b98fd8-d6rmw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--84b98fd8--d6rmw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--84b98fd8--d6rmw-eth0", GenerateName:"calico-kube-controllers-84b98fd8-", Namespace:"calico-system", SelfLink:"", UID:"624a0d3b-047c-441f-8ae5-8082f3ac4f71", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 1, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"84b98fd8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40", Pod:"calico-kube-controllers-84b98fd8-d6rmw", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1df846089bd", MAC:"aa:24:49:dc:cf:9d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:13.576240 containerd[1587]: 2025-09-12 06:02:13.572 [INFO][4492] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" Namespace="calico-system" Pod="calico-kube-controllers-84b98fd8-d6rmw" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--84b98fd8--d6rmw-eth0" Sep 12 06:02:13.618343 containerd[1587]: time="2025-09-12T06:02:13.618168622Z" level=info msg="connecting to shim e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40" address="unix:///run/containerd/s/100d37784a4d4d77fc5543e10e28c6f4b37e3ef29e1da348c6727aa8c6468ec4" namespace=k8s.io protocol=ttrpc version=3 Sep 12 06:02:13.657712 systemd[1]: Started cri-containerd-e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40.scope - libcontainer container e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40. 
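The Calico records above finish by dumping the whole v3.WorkloadEndpoint once the MAC, host-side interface and container ID have been filled in (InterfaceName cali1df846089bd, MAC aa:24:49:dc:cf:9d, IPNetworks 192.168.88.132/32). When correlating a pod with its cali* host interface it can help to pull those fields back out of such a dump; the following is only an illustrative Python sketch over a hand-shortened copy of that record, using the standard library and nothing from Calico itself.

import re

# Hand-shortened copy of the WorkloadEndpoint dump logged above; only the
# fields this sketch extracts are kept, values copied verbatim from the log.
record = (
    'Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Node:"localhost", '
    'ContainerID:"e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40", '
    'Pod:"calico-kube-controllers-84b98fd8-d6rmw", Endpoint:"eth0", '
    'IPNetworks:[]string{"192.168.88.132/32"}, '
    'InterfaceName:"cali1df846089bd", MAC:"aa:24:49:dc:cf:9d"}'
)

def field(name, text):
    """Pull one quoted scalar field (e.g. MAC:"...") out of the Go-struct dump."""
    m = re.search(rf'{name}:"([^"]*)"', text)
    return m.group(1) if m else ""

print("pod        :", field("Pod", record))
print("container  :", field("ContainerID", record)[:12])
print("interface  :", field("InterfaceName", record))
print("mac        :", field("MAC", record))
print("ip network :", re.search(r'IPNetworks:\[\]string\{"([^"]+)"\}', record).group(1))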
Sep 12 06:02:13.664502 systemd-networkd[1491]: cali175a1332a59: Link UP Sep 12 06:02:13.666586 systemd-networkd[1491]: cali175a1332a59: Gained carrier Sep 12 06:02:13.681707 systemd-resolved[1415]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 06:02:13.689657 containerd[1587]: time="2025-09-12T06:02:13.689611486Z" level=info msg="TaskExit event in podsandbox handler container_id:\"65734a1f8b995aec253a64a77afcd58977e3f943d892cbcbec32b70a91e658d5\" id:\"e3dbbe2f99778be69326e3564bd106f567a15334d4cb363eab2fa9763f7dd408\" pid:4595 exit_status:1 exited_at:{seconds:1757656933 nanos:689137246}" Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.486 [INFO][4503] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6bf4c55664--xmj8h-eth0 calico-apiserver-6bf4c55664- calico-apiserver cec02704-586d-4027-acf1-4feef1605e56 814 0 2025-09-12 06:01:38 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6bf4c55664 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6bf4c55664-xmj8h eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali175a1332a59 [] [] }} ContainerID="b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-xmj8h" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--xmj8h-" Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.486 [INFO][4503] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-xmj8h" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--xmj8h-eth0" Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.522 [INFO][4529] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" HandleID="k8s-pod-network.b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" Workload="localhost-k8s-calico--apiserver--6bf4c55664--xmj8h-eth0" Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.523 [INFO][4529] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" HandleID="k8s-pod-network.b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" Workload="localhost-k8s-calico--apiserver--6bf4c55664--xmj8h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f030), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6bf4c55664-xmj8h", "timestamp":"2025-09-12 06:02:13.522840073 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.523 [INFO][4529] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.536 [INFO][4529] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.536 [INFO][4529] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.611 [INFO][4529] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" host="localhost" Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.616 [INFO][4529] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.624 [INFO][4529] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.627 [INFO][4529] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.630 [INFO][4529] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.630 [INFO][4529] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" host="localhost" Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.633 [INFO][4529] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06 Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.640 [INFO][4529] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" host="localhost" Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.649 [INFO][4529] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" host="localhost" Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.650 [INFO][4529] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" host="localhost" Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.651 [INFO][4529] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
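Every Calico CNI line inside these containerd entries carries the same prefix: a timestamp, a bracketed level and numeric id, and the source file plus line number that emitted the message. If you want to slice this log by IPAM step rather than read it linearly, that prefix is easy to split apart; a minimal sketch (pure standard library, the sample line shortened from the .132 assignment above):

import re
from datetime import datetime

# Shortened copy of one Calico CNI line from above (the .132 IPAM result).
line = ("2025-09-12 06:02:13.536 [INFO][4519] ipam/ipam_plugin.go 283: "
        "Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[]")

prefix = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}) "   # wall-clock timestamp
    r"\[(?P<level>\w+)\]\[(?P<id>\d+)\] "                    # level and numeric id
    r"(?P<source>\S+ \d+): (?P<msg>.*)"                      # file, line, message
)

m = prefix.match(line)
print(datetime.strptime(m.group("ts"), "%Y-%m-%d %H:%M:%S.%f"))
print(m.group("level"), m.group("id"), m.group("source"))
print(m.group("msg"))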
Sep 12 06:02:13.690872 containerd[1587]: 2025-09-12 06:02:13.651 [INFO][4529] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" HandleID="k8s-pod-network.b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" Workload="localhost-k8s-calico--apiserver--6bf4c55664--xmj8h-eth0" Sep 12 06:02:13.691594 containerd[1587]: 2025-09-12 06:02:13.660 [INFO][4503] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-xmj8h" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--xmj8h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6bf4c55664--xmj8h-eth0", GenerateName:"calico-apiserver-6bf4c55664-", Namespace:"calico-apiserver", SelfLink:"", UID:"cec02704-586d-4027-acf1-4feef1605e56", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 1, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bf4c55664", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6bf4c55664-xmj8h", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali175a1332a59", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:13.691594 containerd[1587]: 2025-09-12 06:02:13.660 [INFO][4503] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-xmj8h" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--xmj8h-eth0" Sep 12 06:02:13.691594 containerd[1587]: 2025-09-12 06:02:13.660 [INFO][4503] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali175a1332a59 ContainerID="b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-xmj8h" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--xmj8h-eth0" Sep 12 06:02:13.691594 containerd[1587]: 2025-09-12 06:02:13.664 [INFO][4503] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-xmj8h" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--xmj8h-eth0" Sep 12 06:02:13.691594 containerd[1587]: 2025-09-12 06:02:13.665 [INFO][4503] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-xmj8h" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--xmj8h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6bf4c55664--xmj8h-eth0", GenerateName:"calico-apiserver-6bf4c55664-", Namespace:"calico-apiserver", SelfLink:"", UID:"cec02704-586d-4027-acf1-4feef1605e56", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 1, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bf4c55664", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06", Pod:"calico-apiserver-6bf4c55664-xmj8h", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali175a1332a59", MAC:"ae:db:ad:2e:98:23", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:13.691594 containerd[1587]: 2025-09-12 06:02:13.682 [INFO][4503] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" Namespace="calico-apiserver" Pod="calico-apiserver-6bf4c55664-xmj8h" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bf4c55664--xmj8h-eth0" Sep 12 06:02:13.717688 containerd[1587]: time="2025-09-12T06:02:13.717621172Z" level=info msg="connecting to shim b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06" address="unix:///run/containerd/s/19e001bac834bf7c202abad39bc48e3c8e137d4cbc19162d52997c6f121c0e55" namespace=k8s.io protocol=ttrpc version=3 Sep 12 06:02:13.735405 containerd[1587]: time="2025-09-12T06:02:13.735354448Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84b98fd8-d6rmw,Uid:624a0d3b-047c-441f-8ae5-8082f3ac4f71,Namespace:calico-system,Attempt:0,} returns sandbox id \"e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40\"" Sep 12 06:02:13.755226 systemd[1]: Started cri-containerd-b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06.scope - libcontainer container b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06. 
Sep 12 06:02:13.766921 systemd-resolved[1415]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 06:02:13.794290 containerd[1587]: time="2025-09-12T06:02:13.794249058Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bf4c55664-xmj8h,Uid:cec02704-586d-4027-acf1-4feef1605e56,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06\"" Sep 12 06:02:14.136306 systemd-networkd[1491]: cali21884a234e5: Gained IPv6LL Sep 12 06:02:14.136670 systemd-networkd[1491]: cali0e190e0b0ed: Gained IPv6LL Sep 12 06:02:14.426311 containerd[1587]: time="2025-09-12T06:02:14.426089254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-mms5t,Uid:9c0e49e1-71a2-4844-908b-3dd7ba7b4800,Namespace:calico-system,Attempt:0,}" Sep 12 06:02:14.430143 kubelet[2726]: I0912 06:02:14.430077 2726 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b" path="/var/lib/kubelet/pods/bfd8e5e4-5df3-4c10-b6ba-3d2571948e7b/volumes" Sep 12 06:02:14.632653 systemd-networkd[1491]: cali70f18f1bccb: Link UP Sep 12 06:02:14.633683 systemd-networkd[1491]: cali70f18f1bccb: Gained carrier Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.569 [INFO][4715] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--mms5t-eth0 goldmane-54d579b49d- calico-system 9c0e49e1-71a2-4844-908b-3dd7ba7b4800 815 0 2025-09-12 06:01:40 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-mms5t eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali70f18f1bccb [] [] }} ContainerID="38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" Namespace="calico-system" Pod="goldmane-54d579b49d-mms5t" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mms5t-" Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.569 [INFO][4715] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" Namespace="calico-system" Pod="goldmane-54d579b49d-mms5t" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mms5t-eth0" Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.598 [INFO][4730] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" HandleID="k8s-pod-network.38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" Workload="localhost-k8s-goldmane--54d579b49d--mms5t-eth0" Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.598 [INFO][4730] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" HandleID="k8s-pod-network.38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" Workload="localhost-k8s-goldmane--54d579b49d--mms5t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cfa70), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-mms5t", "timestamp":"2025-09-12 06:02:14.598382944 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, 
MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.598 [INFO][4730] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.598 [INFO][4730] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.598 [INFO][4730] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.604 [INFO][4730] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" host="localhost" Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.607 [INFO][4730] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.611 [INFO][4730] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.612 [INFO][4730] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.614 [INFO][4730] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.614 [INFO][4730] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" host="localhost" Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.616 [INFO][4730] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62 Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.619 [INFO][4730] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" host="localhost" Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.624 [INFO][4730] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" host="localhost" Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.624 [INFO][4730] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" host="localhost" Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.624 [INFO][4730] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 06:02:14.651719 containerd[1587]: 2025-09-12 06:02:14.624 [INFO][4730] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" HandleID="k8s-pod-network.38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" Workload="localhost-k8s-goldmane--54d579b49d--mms5t-eth0" Sep 12 06:02:14.652590 containerd[1587]: 2025-09-12 06:02:14.630 [INFO][4715] cni-plugin/k8s.go 418: Populated endpoint ContainerID="38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" Namespace="calico-system" Pod="goldmane-54d579b49d-mms5t" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mms5t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--mms5t-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"9c0e49e1-71a2-4844-908b-3dd7ba7b4800", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 1, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-mms5t", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali70f18f1bccb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:14.652590 containerd[1587]: 2025-09-12 06:02:14.630 [INFO][4715] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" Namespace="calico-system" Pod="goldmane-54d579b49d-mms5t" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mms5t-eth0" Sep 12 06:02:14.652590 containerd[1587]: 2025-09-12 06:02:14.630 [INFO][4715] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali70f18f1bccb ContainerID="38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" Namespace="calico-system" Pod="goldmane-54d579b49d-mms5t" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mms5t-eth0" Sep 12 06:02:14.652590 containerd[1587]: 2025-09-12 06:02:14.634 [INFO][4715] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" Namespace="calico-system" Pod="goldmane-54d579b49d-mms5t" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mms5t-eth0" Sep 12 06:02:14.652590 containerd[1587]: 2025-09-12 06:02:14.635 [INFO][4715] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" Namespace="calico-system" Pod="goldmane-54d579b49d-mms5t" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mms5t-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--mms5t-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"9c0e49e1-71a2-4844-908b-3dd7ba7b4800", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 1, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62", Pod:"goldmane-54d579b49d-mms5t", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali70f18f1bccb", MAC:"c2:ff:6e:02:57:ad", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:14.652590 containerd[1587]: 2025-09-12 06:02:14.648 [INFO][4715] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" Namespace="calico-system" Pod="goldmane-54d579b49d-mms5t" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--mms5t-eth0" Sep 12 06:02:14.675443 containerd[1587]: time="2025-09-12T06:02:14.675384785Z" level=info msg="connecting to shim 38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62" address="unix:///run/containerd/s/4872218e07d2f6cb62635973729c27e99e92f4c955780d1474ecbd23d67abf52" namespace=k8s.io protocol=ttrpc version=3 Sep 12 06:02:14.699352 systemd[1]: Started cri-containerd-38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62.scope - libcontainer container 38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62. Sep 12 06:02:14.713653 systemd-resolved[1415]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 06:02:14.840396 systemd-networkd[1491]: vxlan.calico: Gained IPv6LL Sep 12 06:02:14.860383 containerd[1587]: time="2025-09-12T06:02:14.860327065Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-mms5t,Uid:9c0e49e1-71a2-4844-908b-3dd7ba7b4800,Namespace:calico-system,Attempt:0,} returns sandbox id \"38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62\"" Sep 12 06:02:14.904317 systemd-networkd[1491]: cali1df846089bd: Gained IPv6LL Sep 12 06:02:15.224357 systemd-networkd[1491]: cali175a1332a59: Gained IPv6LL Sep 12 06:02:15.424152 containerd[1587]: time="2025-09-12T06:02:15.424082694Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-clj9k,Uid:9181799d-61c8-4e39-8795-bc27b5674755,Namespace:calico-system,Attempt:0,}" Sep 12 06:02:16.233362 systemd[1]: Started sshd@9-10.0.0.132:22-10.0.0.1:55230.service - OpenSSH per-connection server daemon (10.0.0.1:55230). 
Sep 12 06:02:16.442248 systemd-networkd[1491]: cali70f18f1bccb: Gained IPv6LL Sep 12 06:02:16.493251 containerd[1587]: time="2025-09-12T06:02:16.493052472Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:16.494965 sshd[4813]: Accepted publickey for core from 10.0.0.1 port 55230 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:16.495476 containerd[1587]: time="2025-09-12T06:02:16.495010968Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 12 06:02:16.497581 containerd[1587]: time="2025-09-12T06:02:16.497497364Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:16.498004 sshd-session[4813]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:16.503502 containerd[1587]: time="2025-09-12T06:02:16.503428767Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:16.504223 containerd[1587]: time="2025-09-12T06:02:16.504052307Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 3.700371914s" Sep 12 06:02:16.504223 containerd[1587]: time="2025-09-12T06:02:16.504149068Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 06:02:16.505467 systemd-logind[1559]: New session 10 of user core. Sep 12 06:02:16.509015 containerd[1587]: time="2025-09-12T06:02:16.508980546Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 06:02:16.510444 containerd[1587]: time="2025-09-12T06:02:16.510397586Z" level=info msg="CreateContainer within sandbox \"67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 06:02:16.512383 systemd[1]: Started session-10.scope - Session 10 of User core. 
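The apiserver image pull above reports two sizes (47333864 bytes actually read and a 48826583-byte repo-digest size) together with a wall-clock duration of 3.700371914s, so the effective transfer rate is plain arithmetic on the logged numbers; for example:

# Values copied from the containerd "Pulled image" / "stop pulling image" lines above.
bytes_read = 47_333_864          # "active requests=0, bytes read=47333864"
digest_size = 48_826_583         # size reported alongside the repo digest
duration_s = 3.700371914         # "... in 3.700371914s"

for label, size in (("bytes read", bytes_read), ("digest size", digest_size)):
    print(f"{label:<11}: {size / duration_s / (1024 * 1024):.1f} MiB/s")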
Sep 12 06:02:16.516921 systemd-networkd[1491]: cali3a06b21773c: Link UP Sep 12 06:02:16.518331 systemd-networkd[1491]: cali3a06b21773c: Gained carrier Sep 12 06:02:16.533291 containerd[1587]: time="2025-09-12T06:02:16.533217753Z" level=info msg="Container b8a66514d2e592a30425c4d82edc32b4ad19c828d5b7a3f4deba2dde6317f738: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.423 [INFO][4798] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--clj9k-eth0 csi-node-driver- calico-system 9181799d-61c8-4e39-8795-bc27b5674755 683 0 2025-09-12 06:01:41 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-clj9k eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali3a06b21773c [] [] }} ContainerID="80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" Namespace="calico-system" Pod="csi-node-driver-clj9k" WorkloadEndpoint="localhost-k8s-csi--node--driver--clj9k-" Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.423 [INFO][4798] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" Namespace="calico-system" Pod="csi-node-driver-clj9k" WorkloadEndpoint="localhost-k8s-csi--node--driver--clj9k-eth0" Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.455 [INFO][4816] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" HandleID="k8s-pod-network.80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" Workload="localhost-k8s-csi--node--driver--clj9k-eth0" Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.455 [INFO][4816] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" HandleID="k8s-pod-network.80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" Workload="localhost-k8s-csi--node--driver--clj9k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e130), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-clj9k", "timestamp":"2025-09-12 06:02:16.45553463 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.455 [INFO][4816] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.455 [INFO][4816] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.455 [INFO][4816] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.467 [INFO][4816] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" host="localhost" Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.474 [INFO][4816] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.480 [INFO][4816] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.484 [INFO][4816] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.487 [INFO][4816] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.487 [INFO][4816] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" host="localhost" Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.488 [INFO][4816] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55 Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.495 [INFO][4816] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" host="localhost" Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.503 [INFO][4816] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" host="localhost" Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.504 [INFO][4816] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" host="localhost" Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.504 [INFO][4816] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 06:02:16.553736 containerd[1587]: 2025-09-12 06:02:16.504 [INFO][4816] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" HandleID="k8s-pod-network.80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" Workload="localhost-k8s-csi--node--driver--clj9k-eth0" Sep 12 06:02:16.555423 containerd[1587]: 2025-09-12 06:02:16.512 [INFO][4798] cni-plugin/k8s.go 418: Populated endpoint ContainerID="80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" Namespace="calico-system" Pod="csi-node-driver-clj9k" WorkloadEndpoint="localhost-k8s-csi--node--driver--clj9k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--clj9k-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"9181799d-61c8-4e39-8795-bc27b5674755", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 1, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-clj9k", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali3a06b21773c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:16.555423 containerd[1587]: 2025-09-12 06:02:16.512 [INFO][4798] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" Namespace="calico-system" Pod="csi-node-driver-clj9k" WorkloadEndpoint="localhost-k8s-csi--node--driver--clj9k-eth0" Sep 12 06:02:16.555423 containerd[1587]: 2025-09-12 06:02:16.512 [INFO][4798] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3a06b21773c ContainerID="80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" Namespace="calico-system" Pod="csi-node-driver-clj9k" WorkloadEndpoint="localhost-k8s-csi--node--driver--clj9k-eth0" Sep 12 06:02:16.555423 containerd[1587]: 2025-09-12 06:02:16.520 [INFO][4798] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" Namespace="calico-system" Pod="csi-node-driver-clj9k" WorkloadEndpoint="localhost-k8s-csi--node--driver--clj9k-eth0" Sep 12 06:02:16.555423 containerd[1587]: 2025-09-12 06:02:16.524 [INFO][4798] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" Namespace="calico-system" Pod="csi-node-driver-clj9k" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--clj9k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--clj9k-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"9181799d-61c8-4e39-8795-bc27b5674755", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 1, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55", Pod:"csi-node-driver-clj9k", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali3a06b21773c", MAC:"52:02:2d:39:7a:2c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:16.555423 containerd[1587]: 2025-09-12 06:02:16.536 [INFO][4798] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" Namespace="calico-system" Pod="csi-node-driver-clj9k" WorkloadEndpoint="localhost-k8s-csi--node--driver--clj9k-eth0" Sep 12 06:02:16.563821 containerd[1587]: time="2025-09-12T06:02:16.563673442Z" level=info msg="CreateContainer within sandbox \"67a8d6b8d134589cfb300d954e59142bbe2dfc625458a038651e7141bb0d5c31\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b8a66514d2e592a30425c4d82edc32b4ad19c828d5b7a3f4deba2dde6317f738\"" Sep 12 06:02:16.564375 containerd[1587]: time="2025-09-12T06:02:16.564346946Z" level=info msg="StartContainer for \"b8a66514d2e592a30425c4d82edc32b4ad19c828d5b7a3f4deba2dde6317f738\"" Sep 12 06:02:16.569438 containerd[1587]: time="2025-09-12T06:02:16.569226223Z" level=info msg="connecting to shim b8a66514d2e592a30425c4d82edc32b4ad19c828d5b7a3f4deba2dde6317f738" address="unix:///run/containerd/s/3e5622038274c3d84d53959b7ee68e39c5c558c0f0d310e37cb0d409bcd71ffc" protocol=ttrpc version=3 Sep 12 06:02:16.588254 containerd[1587]: time="2025-09-12T06:02:16.588210202Z" level=info msg="connecting to shim 80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55" address="unix:///run/containerd/s/b526a59f7b00d493fdf0e0839d5a3a7a529c664ae106b53dd55fb0961157c3c3" namespace=k8s.io protocol=ttrpc version=3 Sep 12 06:02:16.634380 systemd[1]: Started cri-containerd-80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55.scope - libcontainer container 80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55. 
Sep 12 06:02:16.638681 systemd[1]: Started cri-containerd-b8a66514d2e592a30425c4d82edc32b4ad19c828d5b7a3f4deba2dde6317f738.scope - libcontainer container b8a66514d2e592a30425c4d82edc32b4ad19c828d5b7a3f4deba2dde6317f738. Sep 12 06:02:16.659947 systemd-resolved[1415]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 06:02:16.687186 containerd[1587]: time="2025-09-12T06:02:16.687141203Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-clj9k,Uid:9181799d-61c8-4e39-8795-bc27b5674755,Namespace:calico-system,Attempt:0,} returns sandbox id \"80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55\"" Sep 12 06:02:16.688868 sshd[4831]: Connection closed by 10.0.0.1 port 55230 Sep 12 06:02:16.689352 sshd-session[4813]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:16.698727 systemd[1]: sshd@9-10.0.0.132:22-10.0.0.1:55230.service: Deactivated successfully. Sep 12 06:02:16.699023 systemd-logind[1559]: Session 10 logged out. Waiting for processes to exit. Sep 12 06:02:16.701379 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 06:02:16.703387 systemd-logind[1559]: Removed session 10. Sep 12 06:02:16.712663 containerd[1587]: time="2025-09-12T06:02:16.712629389Z" level=info msg="StartContainer for \"b8a66514d2e592a30425c4d82edc32b4ad19c828d5b7a3f4deba2dde6317f738\" returns successfully" Sep 12 06:02:17.603945 kubelet[2726]: I0912 06:02:17.603196 2726 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6bf4c55664-d9qhl" podStartSLOduration=35.896926424 podStartE2EDuration="39.60317126s" podCreationTimestamp="2025-09-12 06:01:38 +0000 UTC" firstStartedPulling="2025-09-12 06:02:12.802499797 +0000 UTC m=+50.461272300" lastFinishedPulling="2025-09-12 06:02:16.508744633 +0000 UTC m=+54.167517136" observedRunningTime="2025-09-12 06:02:17.602441591 +0000 UTC m=+55.261214104" watchObservedRunningTime="2025-09-12 06:02:17.60317126 +0000 UTC m=+55.261943773" Sep 12 06:02:18.361338 systemd-networkd[1491]: cali3a06b21773c: Gained IPv6LL Sep 12 06:02:18.593692 kubelet[2726]: I0912 06:02:18.593646 2726 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 06:02:18.817449 containerd[1587]: time="2025-09-12T06:02:18.817289070Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:18.819276 containerd[1587]: time="2025-09-12T06:02:18.819237787Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 12 06:02:18.821390 containerd[1587]: time="2025-09-12T06:02:18.821308262Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:18.823728 containerd[1587]: time="2025-09-12T06:02:18.823671998Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:18.824281 containerd[1587]: time="2025-09-12T06:02:18.824253530Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 2.31523929s" Sep 12 06:02:18.824336 containerd[1587]: time="2025-09-12T06:02:18.824284538Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 12 06:02:18.825124 containerd[1587]: time="2025-09-12T06:02:18.825076384Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 06:02:18.826769 containerd[1587]: time="2025-09-12T06:02:18.826726280Z" level=info msg="CreateContainer within sandbox \"9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 06:02:18.838121 containerd[1587]: time="2025-09-12T06:02:18.835593951Z" level=info msg="Container df3e5dde86dc1d1491d16e9e81ea3919f512feb88da025f1aadf805fad633741: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:02:18.845082 containerd[1587]: time="2025-09-12T06:02:18.845040288Z" level=info msg="CreateContainer within sandbox \"9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"df3e5dde86dc1d1491d16e9e81ea3919f512feb88da025f1aadf805fad633741\"" Sep 12 06:02:18.845589 containerd[1587]: time="2025-09-12T06:02:18.845551617Z" level=info msg="StartContainer for \"df3e5dde86dc1d1491d16e9e81ea3919f512feb88da025f1aadf805fad633741\"" Sep 12 06:02:18.846810 containerd[1587]: time="2025-09-12T06:02:18.846785623Z" level=info msg="connecting to shim df3e5dde86dc1d1491d16e9e81ea3919f512feb88da025f1aadf805fad633741" address="unix:///run/containerd/s/4fd63a75c492f92923c3f2b1a101448235575efdfe32d8bfce140f2f898ebdd5" protocol=ttrpc version=3 Sep 12 06:02:18.869248 systemd[1]: Started cri-containerd-df3e5dde86dc1d1491d16e9e81ea3919f512feb88da025f1aadf805fad633741.scope - libcontainer container df3e5dde86dc1d1491d16e9e81ea3919f512feb88da025f1aadf805fad633741. Sep 12 06:02:19.003124 containerd[1587]: time="2025-09-12T06:02:19.003060588Z" level=info msg="StartContainer for \"df3e5dde86dc1d1491d16e9e81ea3919f512feb88da025f1aadf805fad633741\" returns successfully" Sep 12 06:02:21.703427 systemd[1]: Started sshd@10-10.0.0.132:22-10.0.0.1:60040.service - OpenSSH per-connection server daemon (10.0.0.1:60040). Sep 12 06:02:21.775179 sshd[4989]: Accepted publickey for core from 10.0.0.1 port 60040 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:21.777233 sshd-session[4989]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:21.782812 systemd-logind[1559]: New session 11 of user core. Sep 12 06:02:21.789307 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 12 06:02:21.961715 sshd[4992]: Connection closed by 10.0.0.1 port 60040 Sep 12 06:02:21.962354 sshd-session[4989]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:21.975062 systemd[1]: sshd@10-10.0.0.132:22-10.0.0.1:60040.service: Deactivated successfully. Sep 12 06:02:21.979165 systemd[1]: session-11.scope: Deactivated successfully. Sep 12 06:02:21.980950 systemd-logind[1559]: Session 11 logged out. Waiting for processes to exit. Sep 12 06:02:21.984405 systemd-logind[1559]: Removed session 11. Sep 12 06:02:21.986504 systemd[1]: Started sshd@11-10.0.0.132:22-10.0.0.1:60048.service - OpenSSH per-connection server daemon (10.0.0.1:60048). 
Sep 12 06:02:22.042693 sshd[5006]: Accepted publickey for core from 10.0.0.1 port 60048 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:22.045059 sshd-session[5006]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:22.052895 systemd-logind[1559]: New session 12 of user core. Sep 12 06:02:22.061242 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 12 06:02:22.416653 containerd[1587]: time="2025-09-12T06:02:22.416568860Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:22.417754 containerd[1587]: time="2025-09-12T06:02:22.417711413Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 12 06:02:22.419876 containerd[1587]: time="2025-09-12T06:02:22.419425620Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:22.421219 containerd[1587]: time="2025-09-12T06:02:22.421192045Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:22.422145 containerd[1587]: time="2025-09-12T06:02:22.422072106Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.596965466s" Sep 12 06:02:22.422460 containerd[1587]: time="2025-09-12T06:02:22.422430308Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 12 06:02:22.425492 containerd[1587]: time="2025-09-12T06:02:22.425459742Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 06:02:22.435694 containerd[1587]: time="2025-09-12T06:02:22.435642549Z" level=info msg="CreateContainer within sandbox \"e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 06:02:22.451331 containerd[1587]: time="2025-09-12T06:02:22.451282103Z" level=info msg="Container 8d741bed22e6df9b124d3088639946a2ecc3cfb76f7081e79ec7b434ec8a27c9: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:02:22.460578 sshd[5009]: Connection closed by 10.0.0.1 port 60048 Sep 12 06:02:22.463221 sshd-session[5006]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:22.474065 containerd[1587]: time="2025-09-12T06:02:22.474029254Z" level=info msg="CreateContainer within sandbox \"e31683c0d8c496c4c3469d47b39557367b6c60e98d0d4ec2fc9841b88ca94d40\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"8d741bed22e6df9b124d3088639946a2ecc3cfb76f7081e79ec7b434ec8a27c9\"" Sep 12 06:02:22.475219 containerd[1587]: time="2025-09-12T06:02:22.475180394Z" level=info msg="StartContainer for \"8d741bed22e6df9b124d3088639946a2ecc3cfb76f7081e79ec7b434ec8a27c9\"" Sep 12 06:02:22.477223 systemd[1]: 
sshd@11-10.0.0.132:22-10.0.0.1:60048.service: Deactivated successfully. Sep 12 06:02:22.482120 containerd[1587]: time="2025-09-12T06:02:22.478364609Z" level=info msg="connecting to shim 8d741bed22e6df9b124d3088639946a2ecc3cfb76f7081e79ec7b434ec8a27c9" address="unix:///run/containerd/s/100d37784a4d4d77fc5543e10e28c6f4b37e3ef29e1da348c6727aa8c6468ec4" protocol=ttrpc version=3 Sep 12 06:02:22.490790 systemd[1]: session-12.scope: Deactivated successfully. Sep 12 06:02:22.494190 systemd-logind[1559]: Session 12 logged out. Waiting for processes to exit. Sep 12 06:02:22.497928 systemd[1]: Started sshd@12-10.0.0.132:22-10.0.0.1:60054.service - OpenSSH per-connection server daemon (10.0.0.1:60054). Sep 12 06:02:22.501376 systemd-logind[1559]: Removed session 12. Sep 12 06:02:22.526253 systemd[1]: Started cri-containerd-8d741bed22e6df9b124d3088639946a2ecc3cfb76f7081e79ec7b434ec8a27c9.scope - libcontainer container 8d741bed22e6df9b124d3088639946a2ecc3cfb76f7081e79ec7b434ec8a27c9. Sep 12 06:02:22.553029 sshd[5025]: Accepted publickey for core from 10.0.0.1 port 60054 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:22.554968 sshd-session[5025]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:22.560766 systemd-logind[1559]: New session 13 of user core. Sep 12 06:02:22.568450 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 12 06:02:22.713081 containerd[1587]: time="2025-09-12T06:02:22.712928606Z" level=info msg="StartContainer for \"8d741bed22e6df9b124d3088639946a2ecc3cfb76f7081e79ec7b434ec8a27c9\" returns successfully" Sep 12 06:02:22.815212 sshd[5047]: Connection closed by 10.0.0.1 port 60054 Sep 12 06:02:22.815567 sshd-session[5025]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:22.820622 systemd[1]: sshd@12-10.0.0.132:22-10.0.0.1:60054.service: Deactivated successfully. Sep 12 06:02:22.822830 systemd[1]: session-13.scope: Deactivated successfully. Sep 12 06:02:22.823612 systemd-logind[1559]: Session 13 logged out. Waiting for processes to exit. Sep 12 06:02:22.824755 systemd-logind[1559]: Removed session 13. 
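Sessions 10 through 13 above each follow the same pattern: "Accepted publickey ... port N", a session scope, then "Connection closed by ... port N" a fraction of a second later. Pairing the accept and close timestamps by source port gives the session lengths; an illustrative sketch with the four port/timestamp pairs copied from this log:

from datetime import datetime

# (port, accepted-at, closed-at) for sessions 10-13, copied from the sshd lines above.
sessions = [
    (55230, "06:02:16.494965", "06:02:16.688868"),
    (60040, "06:02:21.775179", "06:02:21.961715"),
    (60048, "06:02:22.042693", "06:02:22.460578"),
    (60054, "06:02:22.553029", "06:02:22.815212"),
]

def ts(value):
    return datetime.strptime(value, "%H:%M:%S.%f")

for port, accepted, closed in sessions:
    length = (ts(closed) - ts(accepted)).total_seconds()
    print(f"port {port}: {length:.6f} s")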
Sep 12 06:02:23.401975 containerd[1587]: time="2025-09-12T06:02:23.401922355Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:23.402877 containerd[1587]: time="2025-09-12T06:02:23.402849251Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 06:02:23.404522 containerd[1587]: time="2025-09-12T06:02:23.404494890Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 979.005673ms" Sep 12 06:02:23.404597 containerd[1587]: time="2025-09-12T06:02:23.404522773Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 06:02:23.406598 containerd[1587]: time="2025-09-12T06:02:23.405767598Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 06:02:23.406804 containerd[1587]: time="2025-09-12T06:02:23.406766399Z" level=info msg="CreateContainer within sandbox \"b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 06:02:23.415276 containerd[1587]: time="2025-09-12T06:02:23.415243356Z" level=info msg="Container d71d6194c332f487085f3965c19da39044476eeb54d8859852f27ed7c247353d: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:02:23.423920 containerd[1587]: time="2025-09-12T06:02:23.423878470Z" level=info msg="CreateContainer within sandbox \"b6f68f903e0faddb2ab34617b1c9427cb2a444065c34b62849fcbcce938ffc06\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"d71d6194c332f487085f3965c19da39044476eeb54d8859852f27ed7c247353d\"" Sep 12 06:02:23.424606 containerd[1587]: time="2025-09-12T06:02:23.424586934Z" level=info msg="StartContainer for \"d71d6194c332f487085f3965c19da39044476eeb54d8859852f27ed7c247353d\"" Sep 12 06:02:23.425649 containerd[1587]: time="2025-09-12T06:02:23.425628086Z" level=info msg="connecting to shim d71d6194c332f487085f3965c19da39044476eeb54d8859852f27ed7c247353d" address="unix:///run/containerd/s/19e001bac834bf7c202abad39bc48e3c8e137d4cbc19162d52997c6f121c0e55" protocol=ttrpc version=3 Sep 12 06:02:23.429144 containerd[1587]: time="2025-09-12T06:02:23.428288086Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-xt455,Uid:9d395707-c6f1-4aac-b0b4-5b583852339d,Namespace:kube-system,Attempt:0,}" Sep 12 06:02:23.459389 systemd[1]: Started cri-containerd-d71d6194c332f487085f3965c19da39044476eeb54d8859852f27ed7c247353d.scope - libcontainer container d71d6194c332f487085f3965c19da39044476eeb54d8859852f27ed7c247353d. 
Sep 12 06:02:23.522272 containerd[1587]: time="2025-09-12T06:02:23.522220001Z" level=info msg="StartContainer for \"d71d6194c332f487085f3965c19da39044476eeb54d8859852f27ed7c247353d\" returns successfully" Sep 12 06:02:23.552573 systemd-networkd[1491]: cali270f5bc3ff3: Link UP Sep 12 06:02:23.552785 systemd-networkd[1491]: cali270f5bc3ff3: Gained carrier Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.473 [INFO][5091] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--xt455-eth0 coredns-668d6bf9bc- kube-system 9d395707-c6f1-4aac-b0b4-5b583852339d 803 0 2025-09-12 06:01:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-xt455 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali270f5bc3ff3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" Namespace="kube-system" Pod="coredns-668d6bf9bc-xt455" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--xt455-" Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.473 [INFO][5091] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" Namespace="kube-system" Pod="coredns-668d6bf9bc-xt455" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--xt455-eth0" Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.507 [INFO][5120] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" HandleID="k8s-pod-network.f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" Workload="localhost-k8s-coredns--668d6bf9bc--xt455-eth0" Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.507 [INFO][5120] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" HandleID="k8s-pod-network.f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" Workload="localhost-k8s-coredns--668d6bf9bc--xt455-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000325490), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-xt455", "timestamp":"2025-09-12 06:02:23.506987019 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.507 [INFO][5120] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.507 [INFO][5120] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.507 [INFO][5120] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.514 [INFO][5120] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" host="localhost" Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.523 [INFO][5120] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.528 [INFO][5120] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.530 [INFO][5120] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.533 [INFO][5120] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.533 [INFO][5120] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" host="localhost" Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.535 [INFO][5120] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.539 [INFO][5120] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" host="localhost" Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.546 [INFO][5120] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" host="localhost" Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.546 [INFO][5120] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" host="localhost" Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.546 [INFO][5120] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 06:02:23.646137 containerd[1587]: 2025-09-12 06:02:23.546 [INFO][5120] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" HandleID="k8s-pod-network.f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" Workload="localhost-k8s-coredns--668d6bf9bc--xt455-eth0" Sep 12 06:02:23.646736 containerd[1587]: 2025-09-12 06:02:23.549 [INFO][5091] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" Namespace="kube-system" Pod="coredns-668d6bf9bc-xt455" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--xt455-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--xt455-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"9d395707-c6f1-4aac-b0b4-5b583852339d", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 1, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-xt455", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali270f5bc3ff3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:23.646736 containerd[1587]: 2025-09-12 06:02:23.549 [INFO][5091] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" Namespace="kube-system" Pod="coredns-668d6bf9bc-xt455" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--xt455-eth0" Sep 12 06:02:23.646736 containerd[1587]: 2025-09-12 06:02:23.549 [INFO][5091] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali270f5bc3ff3 ContainerID="f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" Namespace="kube-system" Pod="coredns-668d6bf9bc-xt455" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--xt455-eth0" Sep 12 06:02:23.646736 containerd[1587]: 2025-09-12 06:02:23.552 [INFO][5091] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" Namespace="kube-system" Pod="coredns-668d6bf9bc-xt455" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--xt455-eth0" Sep 12 06:02:23.646736 
containerd[1587]: 2025-09-12 06:02:23.552 [INFO][5091] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" Namespace="kube-system" Pod="coredns-668d6bf9bc-xt455" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--xt455-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--xt455-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"9d395707-c6f1-4aac-b0b4-5b583852339d", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 6, 1, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f", Pod:"coredns-668d6bf9bc-xt455", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali270f5bc3ff3", MAC:"06:df:a0:d7:e7:bd", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 06:02:23.646736 containerd[1587]: 2025-09-12 06:02:23.639 [INFO][5091] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" Namespace="kube-system" Pod="coredns-668d6bf9bc-xt455" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--xt455-eth0" Sep 12 06:02:23.758312 kubelet[2726]: I0912 06:02:23.757611 2726 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-84b98fd8-d6rmw" podStartSLOduration=34.073008485 podStartE2EDuration="42.757526522s" podCreationTimestamp="2025-09-12 06:01:41 +0000 UTC" firstStartedPulling="2025-09-12 06:02:13.740045343 +0000 UTC m=+51.398817846" lastFinishedPulling="2025-09-12 06:02:22.42456338 +0000 UTC m=+60.083335883" observedRunningTime="2025-09-12 06:02:23.75563055 +0000 UTC m=+61.414403053" watchObservedRunningTime="2025-09-12 06:02:23.757526522 +0000 UTC m=+61.416299015" Sep 12 06:02:23.778930 containerd[1587]: time="2025-09-12T06:02:23.778876826Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d741bed22e6df9b124d3088639946a2ecc3cfb76f7081e79ec7b434ec8a27c9\" id:\"f835f596caa39d31ee702024b4617d0de14dcf598359836a3919ba98550e7ceb\" pid:5176 exited_at:{seconds:1757656943 nanos:778514964}" Sep 12 06:02:23.785052 kubelet[2726]: I0912 06:02:23.784715 
2726 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6bf4c55664-xmj8h" podStartSLOduration=36.174591935 podStartE2EDuration="45.784696497s" podCreationTimestamp="2025-09-12 06:01:38 +0000 UTC" firstStartedPulling="2025-09-12 06:02:13.795204171 +0000 UTC m=+51.453976674" lastFinishedPulling="2025-09-12 06:02:23.405308723 +0000 UTC m=+61.064081236" observedRunningTime="2025-09-12 06:02:23.781217074 +0000 UTC m=+61.439989577" watchObservedRunningTime="2025-09-12 06:02:23.784696497 +0000 UTC m=+61.443469000" Sep 12 06:02:23.808904 containerd[1587]: time="2025-09-12T06:02:23.808847778Z" level=info msg="connecting to shim f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f" address="unix:///run/containerd/s/421bfc26ee2ad5707439c509255ff14f6cbe4890a6f5b56548a7569eb4c07fb2" namespace=k8s.io protocol=ttrpc version=3 Sep 12 06:02:23.839294 systemd[1]: Started cri-containerd-f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f.scope - libcontainer container f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f. Sep 12 06:02:23.860236 systemd-resolved[1415]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 06:02:23.890476 containerd[1587]: time="2025-09-12T06:02:23.890418380Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-xt455,Uid:9d395707-c6f1-4aac-b0b4-5b583852339d,Namespace:kube-system,Attempt:0,} returns sandbox id \"f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f\"" Sep 12 06:02:23.893322 containerd[1587]: time="2025-09-12T06:02:23.893289908Z" level=info msg="CreateContainer within sandbox \"f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 06:02:23.906825 containerd[1587]: time="2025-09-12T06:02:23.906784997Z" level=info msg="Container b87fdc8b46a64ccfcf696c2a422490297043d460d9cb8369489102974d47fcf4: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:02:23.915962 containerd[1587]: time="2025-09-12T06:02:23.915923710Z" level=info msg="CreateContainer within sandbox \"f8627c5eafcb26d97f1ba285c3f8c6a00907997acbd3ced09fea6065a908a41f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b87fdc8b46a64ccfcf696c2a422490297043d460d9cb8369489102974d47fcf4\"" Sep 12 06:02:23.916826 containerd[1587]: time="2025-09-12T06:02:23.916802936Z" level=info msg="StartContainer for \"b87fdc8b46a64ccfcf696c2a422490297043d460d9cb8369489102974d47fcf4\"" Sep 12 06:02:23.918040 containerd[1587]: time="2025-09-12T06:02:23.917972088Z" level=info msg="connecting to shim b87fdc8b46a64ccfcf696c2a422490297043d460d9cb8369489102974d47fcf4" address="unix:///run/containerd/s/421bfc26ee2ad5707439c509255ff14f6cbe4890a6f5b56548a7569eb4c07fb2" protocol=ttrpc version=3 Sep 12 06:02:23.947238 systemd[1]: Started cri-containerd-b87fdc8b46a64ccfcf696c2a422490297043d460d9cb8369489102974d47fcf4.scope - libcontainer container b87fdc8b46a64ccfcf696c2a422490297043d460d9cb8369489102974d47fcf4. 
Sep 12 06:02:23.981570 containerd[1587]: time="2025-09-12T06:02:23.981525834Z" level=info msg="StartContainer for \"b87fdc8b46a64ccfcf696c2a422490297043d460d9cb8369489102974d47fcf4\" returns successfully" Sep 12 06:02:24.632330 systemd-networkd[1491]: cali270f5bc3ff3: Gained IPv6LL Sep 12 06:02:25.448990 kubelet[2726]: I0912 06:02:25.448917 2726 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-xt455" podStartSLOduration=57.448881738 podStartE2EDuration="57.448881738s" podCreationTimestamp="2025-09-12 06:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 06:02:24.884790199 +0000 UTC m=+62.543562722" watchObservedRunningTime="2025-09-12 06:02:25.448881738 +0000 UTC m=+63.107654241" Sep 12 06:02:26.694633 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3041737387.mount: Deactivated successfully. Sep 12 06:02:27.409320 containerd[1587]: time="2025-09-12T06:02:27.409272367Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:27.410039 containerd[1587]: time="2025-09-12T06:02:27.410010656Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 12 06:02:27.411286 containerd[1587]: time="2025-09-12T06:02:27.411257169Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:27.421551 containerd[1587]: time="2025-09-12T06:02:27.421515826Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:27.422070 containerd[1587]: time="2025-09-12T06:02:27.422036775Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 4.016235462s" Sep 12 06:02:27.422070 containerd[1587]: time="2025-09-12T06:02:27.422061933Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 12 06:02:27.422942 containerd[1587]: time="2025-09-12T06:02:27.422915516Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 06:02:27.424760 containerd[1587]: time="2025-09-12T06:02:27.424735589Z" level=info msg="CreateContainer within sandbox \"38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 06:02:27.433158 containerd[1587]: time="2025-09-12T06:02:27.433118074Z" level=info msg="Container ad02408571db06eb07b5671ee986dc815de09eeb2766a3c1d81191c57e2db54e: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:02:27.446646 containerd[1587]: time="2025-09-12T06:02:27.446602675Z" level=info msg="CreateContainer within sandbox \"38675cc85d68e06c22271f9ef5c24fecc983cd3f4d61b879ef10245ad1618e62\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id 
\"ad02408571db06eb07b5671ee986dc815de09eeb2766a3c1d81191c57e2db54e\"" Sep 12 06:02:27.447133 containerd[1587]: time="2025-09-12T06:02:27.447088285Z" level=info msg="StartContainer for \"ad02408571db06eb07b5671ee986dc815de09eeb2766a3c1d81191c57e2db54e\"" Sep 12 06:02:27.448376 containerd[1587]: time="2025-09-12T06:02:27.448350278Z" level=info msg="connecting to shim ad02408571db06eb07b5671ee986dc815de09eeb2766a3c1d81191c57e2db54e" address="unix:///run/containerd/s/4872218e07d2f6cb62635973729c27e99e92f4c955780d1474ecbd23d67abf52" protocol=ttrpc version=3 Sep 12 06:02:27.472263 systemd[1]: Started cri-containerd-ad02408571db06eb07b5671ee986dc815de09eeb2766a3c1d81191c57e2db54e.scope - libcontainer container ad02408571db06eb07b5671ee986dc815de09eeb2766a3c1d81191c57e2db54e. Sep 12 06:02:27.523748 containerd[1587]: time="2025-09-12T06:02:27.523695789Z" level=info msg="StartContainer for \"ad02408571db06eb07b5671ee986dc815de09eeb2766a3c1d81191c57e2db54e\" returns successfully" Sep 12 06:02:27.824577 kubelet[2726]: I0912 06:02:27.823764 2726 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-mms5t" podStartSLOduration=35.262407474 podStartE2EDuration="47.823742041s" podCreationTimestamp="2025-09-12 06:01:40 +0000 UTC" firstStartedPulling="2025-09-12 06:02:14.861426719 +0000 UTC m=+52.520199222" lastFinishedPulling="2025-09-12 06:02:27.422761276 +0000 UTC m=+65.081533789" observedRunningTime="2025-09-12 06:02:27.823258666 +0000 UTC m=+65.482031169" watchObservedRunningTime="2025-09-12 06:02:27.823742041 +0000 UTC m=+65.482514544" Sep 12 06:02:27.833066 systemd[1]: Started sshd@13-10.0.0.132:22-10.0.0.1:60068.service - OpenSSH per-connection server daemon (10.0.0.1:60068). Sep 12 06:02:27.917376 sshd[5324]: Accepted publickey for core from 10.0.0.1 port 60068 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:27.918993 sshd-session[5324]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:27.923698 systemd-logind[1559]: New session 14 of user core. Sep 12 06:02:27.929264 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 12 06:02:28.064932 sshd[5330]: Connection closed by 10.0.0.1 port 60068 Sep 12 06:02:28.065351 sshd-session[5324]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:28.070318 systemd[1]: sshd@13-10.0.0.132:22-10.0.0.1:60068.service: Deactivated successfully. Sep 12 06:02:28.072554 systemd[1]: session-14.scope: Deactivated successfully. Sep 12 06:02:28.073448 systemd-logind[1559]: Session 14 logged out. Waiting for processes to exit. Sep 12 06:02:28.075015 systemd-logind[1559]: Removed session 14. 
Sep 12 06:02:28.852023 containerd[1587]: time="2025-09-12T06:02:28.851966633Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ad02408571db06eb07b5671ee986dc815de09eeb2766a3c1d81191c57e2db54e\" id:\"7d4bc696dbd10c7d34f3d949129f6de57b6140def69aa105b86bb30c8685fc34\" pid:5358 exit_status:1 exited_at:{seconds:1757656948 nanos:851541401}" Sep 12 06:02:28.922478 containerd[1587]: time="2025-09-12T06:02:28.922423026Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:28.923288 containerd[1587]: time="2025-09-12T06:02:28.923237871Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 12 06:02:28.924502 containerd[1587]: time="2025-09-12T06:02:28.924460165Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:28.926584 containerd[1587]: time="2025-09-12T06:02:28.926543134Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:28.927280 containerd[1587]: time="2025-09-12T06:02:28.927240452Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.504292404s" Sep 12 06:02:28.927325 containerd[1587]: time="2025-09-12T06:02:28.927278195Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 12 06:02:28.928291 containerd[1587]: time="2025-09-12T06:02:28.928230397Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 06:02:28.929550 containerd[1587]: time="2025-09-12T06:02:28.929521604Z" level=info msg="CreateContainer within sandbox \"80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 06:02:28.940723 containerd[1587]: time="2025-09-12T06:02:28.940568990Z" level=info msg="Container ba1cbc2fdd47a9fc232db95b40e8f6636cfa5c550c73503ec1a1e2bd32ba301b: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:02:29.005280 containerd[1587]: time="2025-09-12T06:02:29.005221029Z" level=info msg="CreateContainer within sandbox \"80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"ba1cbc2fdd47a9fc232db95b40e8f6636cfa5c550c73503ec1a1e2bd32ba301b\"" Sep 12 06:02:29.007146 containerd[1587]: time="2025-09-12T06:02:29.005967371Z" level=info msg="StartContainer for \"ba1cbc2fdd47a9fc232db95b40e8f6636cfa5c550c73503ec1a1e2bd32ba301b\"" Sep 12 06:02:29.007935 containerd[1587]: time="2025-09-12T06:02:29.007897210Z" level=info msg="connecting to shim ba1cbc2fdd47a9fc232db95b40e8f6636cfa5c550c73503ec1a1e2bd32ba301b" address="unix:///run/containerd/s/b526a59f7b00d493fdf0e0839d5a3a7a529c664ae106b53dd55fb0961157c3c3" protocol=ttrpc version=3 Sep 12 06:02:29.030260 systemd[1]: Started 
cri-containerd-ba1cbc2fdd47a9fc232db95b40e8f6636cfa5c550c73503ec1a1e2bd32ba301b.scope - libcontainer container ba1cbc2fdd47a9fc232db95b40e8f6636cfa5c550c73503ec1a1e2bd32ba301b. Sep 12 06:02:29.074768 containerd[1587]: time="2025-09-12T06:02:29.074720268Z" level=info msg="StartContainer for \"ba1cbc2fdd47a9fc232db95b40e8f6636cfa5c550c73503ec1a1e2bd32ba301b\" returns successfully" Sep 12 06:02:29.841093 containerd[1587]: time="2025-09-12T06:02:29.841042133Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ad02408571db06eb07b5671ee986dc815de09eeb2766a3c1d81191c57e2db54e\" id:\"acdf0f17bfd745b984389b383a2612117dd4645977fec4547c80e08d26a6a712\" pid:5418 exit_status:1 exited_at:{seconds:1757656949 nanos:840703779}" Sep 12 06:02:32.568489 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1222470813.mount: Deactivated successfully. Sep 12 06:02:33.081926 systemd[1]: Started sshd@14-10.0.0.132:22-10.0.0.1:54214.service - OpenSSH per-connection server daemon (10.0.0.1:54214). Sep 12 06:02:33.169976 sshd[5439]: Accepted publickey for core from 10.0.0.1 port 54214 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:33.171391 sshd-session[5439]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:33.175904 systemd-logind[1559]: New session 15 of user core. Sep 12 06:02:33.186403 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 12 06:02:33.340737 sshd[5442]: Connection closed by 10.0.0.1 port 54214 Sep 12 06:02:33.341029 sshd-session[5439]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:33.345750 systemd[1]: sshd@14-10.0.0.132:22-10.0.0.1:54214.service: Deactivated successfully. Sep 12 06:02:33.348203 systemd[1]: session-15.scope: Deactivated successfully. Sep 12 06:02:33.349401 systemd-logind[1559]: Session 15 logged out. Waiting for processes to exit. Sep 12 06:02:33.350769 systemd-logind[1559]: Removed session 15. 
Sep 12 06:02:33.608944 containerd[1587]: time="2025-09-12T06:02:33.608828902Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:33.609976 containerd[1587]: time="2025-09-12T06:02:33.609954650Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 12 06:02:33.611840 containerd[1587]: time="2025-09-12T06:02:33.611823100Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:33.614803 containerd[1587]: time="2025-09-12T06:02:33.614750950Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:33.616079 containerd[1587]: time="2025-09-12T06:02:33.616038992Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 4.687750493s" Sep 12 06:02:33.616232 containerd[1587]: time="2025-09-12T06:02:33.616074881Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 12 06:02:33.619867 containerd[1587]: time="2025-09-12T06:02:33.619615642Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 06:02:33.620727 containerd[1587]: time="2025-09-12T06:02:33.620702376Z" level=info msg="CreateContainer within sandbox \"9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 06:02:33.630440 containerd[1587]: time="2025-09-12T06:02:33.630411009Z" level=info msg="Container cb2ae5d0531282093a6d8a7415ee19c36324fe1ceb47b04e389161a84c7ad909: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:02:33.643742 containerd[1587]: time="2025-09-12T06:02:33.643694418Z" level=info msg="CreateContainer within sandbox \"9d2f531199ea08346fbac9320cb591fc674e8284d1fd14720bc114c6b9692e06\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"cb2ae5d0531282093a6d8a7415ee19c36324fe1ceb47b04e389161a84c7ad909\"" Sep 12 06:02:33.644220 containerd[1587]: time="2025-09-12T06:02:33.644178070Z" level=info msg="StartContainer for \"cb2ae5d0531282093a6d8a7415ee19c36324fe1ceb47b04e389161a84c7ad909\"" Sep 12 06:02:33.645171 containerd[1587]: time="2025-09-12T06:02:33.645141617Z" level=info msg="connecting to shim cb2ae5d0531282093a6d8a7415ee19c36324fe1ceb47b04e389161a84c7ad909" address="unix:///run/containerd/s/4fd63a75c492f92923c3f2b1a101448235575efdfe32d8bfce140f2f898ebdd5" protocol=ttrpc version=3 Sep 12 06:02:33.680442 systemd[1]: Started cri-containerd-cb2ae5d0531282093a6d8a7415ee19c36324fe1ceb47b04e389161a84c7ad909.scope - libcontainer container cb2ae5d0531282093a6d8a7415ee19c36324fe1ceb47b04e389161a84c7ad909. 
Sep 12 06:02:33.737384 containerd[1587]: time="2025-09-12T06:02:33.737343108Z" level=info msg="StartContainer for \"cb2ae5d0531282093a6d8a7415ee19c36324fe1ceb47b04e389161a84c7ad909\" returns successfully" Sep 12 06:02:33.783186 kubelet[2726]: I0912 06:02:33.783129 2726 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5bcfc55986-lv299" podStartSLOduration=1.3812529310000001 podStartE2EDuration="21.783114915s" podCreationTimestamp="2025-09-12 06:02:12 +0000 UTC" firstStartedPulling="2025-09-12 06:02:13.216206538 +0000 UTC m=+50.874979041" lastFinishedPulling="2025-09-12 06:02:33.618068512 +0000 UTC m=+71.276841025" observedRunningTime="2025-09-12 06:02:33.782478398 +0000 UTC m=+71.441250901" watchObservedRunningTime="2025-09-12 06:02:33.783114915 +0000 UTC m=+71.441887418" Sep 12 06:02:35.402402 containerd[1587]: time="2025-09-12T06:02:35.402348546Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:35.403303 containerd[1587]: time="2025-09-12T06:02:35.403264488Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 12 06:02:35.404692 containerd[1587]: time="2025-09-12T06:02:35.404637499Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:35.407301 containerd[1587]: time="2025-09-12T06:02:35.407260806Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 06:02:35.407753 containerd[1587]: time="2025-09-12T06:02:35.407730069Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.788083457s" Sep 12 06:02:35.407808 containerd[1587]: time="2025-09-12T06:02:35.407759204Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 12 06:02:35.430578 containerd[1587]: time="2025-09-12T06:02:35.430523828Z" level=info msg="CreateContainer within sandbox \"80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 06:02:35.448404 containerd[1587]: time="2025-09-12T06:02:35.448350642Z" level=info msg="Container d561ae843802713e378322625e88014f3159506641f43b20dab81c7a09370032: CDI devices from CRI Config.CDIDevices: []" Sep 12 06:02:35.458472 containerd[1587]: time="2025-09-12T06:02:35.458415564Z" level=info msg="CreateContainer within sandbox \"80a39f120751677eaace3693170e89885aa538a472443ed4d14ac2d89ff79d55\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"d561ae843802713e378322625e88014f3159506641f43b20dab81c7a09370032\"" Sep 12 06:02:35.459259 containerd[1587]: time="2025-09-12T06:02:35.459049094Z" level=info msg="StartContainer for 
\"d561ae843802713e378322625e88014f3159506641f43b20dab81c7a09370032\"" Sep 12 06:02:35.461049 containerd[1587]: time="2025-09-12T06:02:35.461013112Z" level=info msg="connecting to shim d561ae843802713e378322625e88014f3159506641f43b20dab81c7a09370032" address="unix:///run/containerd/s/b526a59f7b00d493fdf0e0839d5a3a7a529c664ae106b53dd55fb0961157c3c3" protocol=ttrpc version=3 Sep 12 06:02:35.483534 systemd[1]: Started cri-containerd-d561ae843802713e378322625e88014f3159506641f43b20dab81c7a09370032.scope - libcontainer container d561ae843802713e378322625e88014f3159506641f43b20dab81c7a09370032. Sep 12 06:02:35.534936 containerd[1587]: time="2025-09-12T06:02:35.534882989Z" level=info msg="StartContainer for \"d561ae843802713e378322625e88014f3159506641f43b20dab81c7a09370032\" returns successfully" Sep 12 06:02:35.549923 containerd[1587]: time="2025-09-12T06:02:35.549868026Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ad02408571db06eb07b5671ee986dc815de09eeb2766a3c1d81191c57e2db54e\" id:\"4b79445e58d60cf2fb08d7445964d8889753ee03af48b05387122d914e78c1f8\" pid:5525 exited_at:{seconds:1757656955 nanos:549560676}" Sep 12 06:02:35.790315 kubelet[2726]: I0912 06:02:35.789876 2726 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-clj9k" podStartSLOduration=36.051223371 podStartE2EDuration="54.789861244s" podCreationTimestamp="2025-09-12 06:01:41 +0000 UTC" firstStartedPulling="2025-09-12 06:02:16.690452958 +0000 UTC m=+54.349225461" lastFinishedPulling="2025-09-12 06:02:35.429090831 +0000 UTC m=+73.087863334" observedRunningTime="2025-09-12 06:02:35.788807828 +0000 UTC m=+73.447580331" watchObservedRunningTime="2025-09-12 06:02:35.789861244 +0000 UTC m=+73.448633747" Sep 12 06:02:36.512040 kubelet[2726]: I0912 06:02:36.512006 2726 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 06:02:36.512040 kubelet[2726]: I0912 06:02:36.512041 2726 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 06:02:37.056450 kubelet[2726]: I0912 06:02:37.056390 2726 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 06:02:38.352837 systemd[1]: Started sshd@15-10.0.0.132:22-10.0.0.1:54226.service - OpenSSH per-connection server daemon (10.0.0.1:54226). Sep 12 06:02:38.416450 sshd[5567]: Accepted publickey for core from 10.0.0.1 port 54226 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:38.418273 sshd-session[5567]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:38.422673 systemd-logind[1559]: New session 16 of user core. Sep 12 06:02:38.431261 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 12 06:02:38.547093 sshd[5570]: Connection closed by 10.0.0.1 port 54226 Sep 12 06:02:38.547445 sshd-session[5567]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:38.550677 systemd[1]: sshd@15-10.0.0.132:22-10.0.0.1:54226.service: Deactivated successfully. Sep 12 06:02:38.552892 systemd[1]: session-16.scope: Deactivated successfully. Sep 12 06:02:38.554478 systemd-logind[1559]: Session 16 logged out. Waiting for processes to exit. Sep 12 06:02:38.556357 systemd-logind[1559]: Removed session 16. 
Sep 12 06:02:43.563666 systemd[1]: Started sshd@16-10.0.0.132:22-10.0.0.1:47744.service - OpenSSH per-connection server daemon (10.0.0.1:47744). Sep 12 06:02:43.649179 sshd[5584]: Accepted publickey for core from 10.0.0.1 port 47744 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:43.651089 sshd-session[5584]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:43.662228 containerd[1587]: time="2025-09-12T06:02:43.662089680Z" level=info msg="TaskExit event in podsandbox handler container_id:\"65734a1f8b995aec253a64a77afcd58977e3f943d892cbcbec32b70a91e658d5\" id:\"e070167928ae791cf131cdbff963e8248d5e3a89fb302e3bdb27b70b0ca28e35\" pid:5596 exited_at:{seconds:1757656963 nanos:661671890}" Sep 12 06:02:43.664543 systemd-logind[1559]: New session 17 of user core. Sep 12 06:02:43.670354 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 12 06:02:43.843584 sshd[5611]: Connection closed by 10.0.0.1 port 47744 Sep 12 06:02:43.843953 sshd-session[5584]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:43.852597 systemd[1]: sshd@16-10.0.0.132:22-10.0.0.1:47744.service: Deactivated successfully. Sep 12 06:02:43.854382 systemd[1]: session-17.scope: Deactivated successfully. Sep 12 06:02:43.855067 systemd-logind[1559]: Session 17 logged out. Waiting for processes to exit. Sep 12 06:02:43.857569 systemd[1]: Started sshd@17-10.0.0.132:22-10.0.0.1:47746.service - OpenSSH per-connection server daemon (10.0.0.1:47746). Sep 12 06:02:43.858203 systemd-logind[1559]: Removed session 17. Sep 12 06:02:43.912600 sshd[5626]: Accepted publickey for core from 10.0.0.1 port 47746 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:43.913962 sshd-session[5626]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:43.920190 systemd-logind[1559]: New session 18 of user core. Sep 12 06:02:43.926314 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 12 06:02:44.222160 sshd[5629]: Connection closed by 10.0.0.1 port 47746 Sep 12 06:02:44.223518 sshd-session[5626]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:44.234872 systemd[1]: sshd@17-10.0.0.132:22-10.0.0.1:47746.service: Deactivated successfully. Sep 12 06:02:44.236968 systemd[1]: session-18.scope: Deactivated successfully. Sep 12 06:02:44.237880 systemd-logind[1559]: Session 18 logged out. Waiting for processes to exit. Sep 12 06:02:44.241088 systemd[1]: Started sshd@18-10.0.0.132:22-10.0.0.1:47752.service - OpenSSH per-connection server daemon (10.0.0.1:47752). Sep 12 06:02:44.242652 systemd-logind[1559]: Removed session 18. Sep 12 06:02:44.308738 sshd[5640]: Accepted publickey for core from 10.0.0.1 port 47752 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:44.310247 sshd-session[5640]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:44.314855 systemd-logind[1559]: New session 19 of user core. Sep 12 06:02:44.321263 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 12 06:02:44.835056 sshd[5643]: Connection closed by 10.0.0.1 port 47752 Sep 12 06:02:44.836320 sshd-session[5640]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:44.848547 systemd[1]: sshd@18-10.0.0.132:22-10.0.0.1:47752.service: Deactivated successfully. Sep 12 06:02:44.852259 systemd[1]: session-19.scope: Deactivated successfully. Sep 12 06:02:44.856820 systemd-logind[1559]: Session 19 logged out. 
Waiting for processes to exit. Sep 12 06:02:44.859759 systemd[1]: Started sshd@19-10.0.0.132:22-10.0.0.1:47762.service - OpenSSH per-connection server daemon (10.0.0.1:47762). Sep 12 06:02:44.863418 systemd-logind[1559]: Removed session 19. Sep 12 06:02:44.936140 sshd[5663]: Accepted publickey for core from 10.0.0.1 port 47762 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:44.938284 sshd-session[5663]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:44.943018 systemd-logind[1559]: New session 20 of user core. Sep 12 06:02:44.949256 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 12 06:02:45.212942 sshd[5666]: Connection closed by 10.0.0.1 port 47762 Sep 12 06:02:45.213537 sshd-session[5663]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:45.229893 systemd[1]: sshd@19-10.0.0.132:22-10.0.0.1:47762.service: Deactivated successfully. Sep 12 06:02:45.231882 systemd[1]: session-20.scope: Deactivated successfully. Sep 12 06:02:45.232808 systemd-logind[1559]: Session 20 logged out. Waiting for processes to exit. Sep 12 06:02:45.236126 systemd[1]: Started sshd@20-10.0.0.132:22-10.0.0.1:47766.service - OpenSSH per-connection server daemon (10.0.0.1:47766). Sep 12 06:02:45.237126 systemd-logind[1559]: Removed session 20. Sep 12 06:02:45.294589 sshd[5678]: Accepted publickey for core from 10.0.0.1 port 47766 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:45.296220 sshd-session[5678]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:45.300546 systemd-logind[1559]: New session 21 of user core. Sep 12 06:02:45.314246 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 12 06:02:45.452060 sshd[5681]: Connection closed by 10.0.0.1 port 47766 Sep 12 06:02:45.452433 sshd-session[5678]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:45.456368 systemd[1]: sshd@20-10.0.0.132:22-10.0.0.1:47766.service: Deactivated successfully. Sep 12 06:02:45.458517 systemd[1]: session-21.scope: Deactivated successfully. Sep 12 06:02:45.459293 systemd-logind[1559]: Session 21 logged out. Waiting for processes to exit. Sep 12 06:02:45.460399 systemd-logind[1559]: Removed session 21. Sep 12 06:02:50.473772 systemd[1]: Started sshd@21-10.0.0.132:22-10.0.0.1:45024.service - OpenSSH per-connection server daemon (10.0.0.1:45024). Sep 12 06:02:50.532402 sshd[5695]: Accepted publickey for core from 10.0.0.1 port 45024 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:50.534203 sshd-session[5695]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:50.539908 systemd-logind[1559]: New session 22 of user core. Sep 12 06:02:50.545401 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 12 06:02:50.659520 sshd[5698]: Connection closed by 10.0.0.1 port 45024 Sep 12 06:02:50.659922 sshd-session[5695]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:50.664696 systemd[1]: sshd@21-10.0.0.132:22-10.0.0.1:45024.service: Deactivated successfully. Sep 12 06:02:50.666845 systemd[1]: session-22.scope: Deactivated successfully. Sep 12 06:02:50.667773 systemd-logind[1559]: Session 22 logged out. Waiting for processes to exit. Sep 12 06:02:50.669136 systemd-logind[1559]: Removed session 22. 
Sep 12 06:02:53.793460 containerd[1587]: time="2025-09-12T06:02:53.793399691Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d741bed22e6df9b124d3088639946a2ecc3cfb76f7081e79ec7b434ec8a27c9\" id:\"1ff635dbd2e0f3db6834c75269b77266209e70155426c7b3646fcdb19061e350\" pid:5731 exited_at:{seconds:1757656973 nanos:793005400}" Sep 12 06:02:55.673247 systemd[1]: Started sshd@22-10.0.0.132:22-10.0.0.1:45034.service - OpenSSH per-connection server daemon (10.0.0.1:45034). Sep 12 06:02:55.729281 sshd[5742]: Accepted publickey for core from 10.0.0.1 port 45034 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:02:55.730855 sshd-session[5742]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:02:55.735049 systemd-logind[1559]: New session 23 of user core. Sep 12 06:02:55.746323 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 12 06:02:56.082398 sshd[5745]: Connection closed by 10.0.0.1 port 45034 Sep 12 06:02:56.082912 sshd-session[5742]: pam_unix(sshd:session): session closed for user core Sep 12 06:02:56.087012 systemd[1]: sshd@22-10.0.0.132:22-10.0.0.1:45034.service: Deactivated successfully. Sep 12 06:02:56.089182 systemd[1]: session-23.scope: Deactivated successfully. Sep 12 06:02:56.089944 systemd-logind[1559]: Session 23 logged out. Waiting for processes to exit. Sep 12 06:02:56.091151 systemd-logind[1559]: Removed session 23. Sep 12 06:02:59.852547 containerd[1587]: time="2025-09-12T06:02:59.852467170Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ad02408571db06eb07b5671ee986dc815de09eeb2766a3c1d81191c57e2db54e\" id:\"709bc05eb5107e9d378afa422e0513029df959ebf957e4133b29c1abaa505dd1\" pid:5770 exited_at:{seconds:1757656979 nanos:852063754}" Sep 12 06:03:00.883386 systemd[1]: Started sshd@23-10.0.0.132:22-10.0.0.1:50152.service - OpenSSH per-connection server daemon (10.0.0.1:50152). Sep 12 06:03:00.959497 sshd[5785]: Accepted publickey for core from 10.0.0.1 port 50152 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:03:00.961142 sshd-session[5785]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:03:00.967190 systemd-logind[1559]: New session 24 of user core. Sep 12 06:03:00.975233 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 12 06:03:01.495029 sshd[5788]: Connection closed by 10.0.0.1 port 50152 Sep 12 06:03:01.495438 sshd-session[5785]: pam_unix(sshd:session): session closed for user core Sep 12 06:03:01.500339 systemd[1]: sshd@23-10.0.0.132:22-10.0.0.1:50152.service: Deactivated successfully. Sep 12 06:03:01.502278 systemd[1]: session-24.scope: Deactivated successfully. Sep 12 06:03:01.503016 systemd-logind[1559]: Session 24 logged out. Waiting for processes to exit. Sep 12 06:03:01.504192 systemd-logind[1559]: Removed session 24. Sep 12 06:03:06.517072 systemd[1]: Started sshd@24-10.0.0.132:22-10.0.0.1:50158.service - OpenSSH per-connection server daemon (10.0.0.1:50158). Sep 12 06:03:06.588342 sshd[5804]: Accepted publickey for core from 10.0.0.1 port 50158 ssh2: RSA SHA256:U1JO+eJG2JU9nuyVYS4dzqqYhW7JLNNCX6TNK3ddyUk Sep 12 06:03:06.589904 sshd-session[5804]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 06:03:06.594079 systemd-logind[1559]: New session 25 of user core. Sep 12 06:03:06.600263 systemd[1]: Started session-25.scope - Session 25 of User core. 
Sep 12 06:03:06.779952 sshd[5807]: Connection closed by 10.0.0.1 port 50158 Sep 12 06:03:06.780189 sshd-session[5804]: pam_unix(sshd:session): session closed for user core Sep 12 06:03:06.785269 systemd[1]: sshd@24-10.0.0.132:22-10.0.0.1:50158.service: Deactivated successfully. Sep 12 06:03:06.787429 systemd[1]: session-25.scope: Deactivated successfully. Sep 12 06:03:06.789028 systemd-logind[1559]: Session 25 logged out. Waiting for processes to exit. Sep 12 06:03:06.790625 systemd-logind[1559]: Removed session 25.